I'm trying to study a process that produces, in theory, an equilibrium distribution where the $i$th raw moment is given by:
$$ \mu_i = \exp(-\theta_1 \sum_{j=0}^{i-1}(1 + j\theta_2)^{-\theta_3}) $$
So that
$$ \mu_1 = \exp(-\theta_1) $$ $$ \mu_2 = \exp\big(-\theta_1 (1 + (1 + \theta_2)^{-\theta_3})\big) $$ $$ \mu_3 = \exp\big(-\theta_1 (1 + (1 + \theta_2)^{-\theta_3} + (1 + 2\theta_2)^{-\theta_3})\big) $$
and so forth.
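For concreteness, here is a small Python helper for these theoretical moments (just a sketch; the function name and defaults are my own):

```python
import numpy as np

def theoretical_moments(theta1, theta2, theta3, n_moments=3):
    """Raw moments mu_i = exp(-theta1 * sum_{j=0}^{i-1} (1 + j*theta2)**(-theta3))."""
    j = np.arange(n_moments)                   # j = 0, 1, ..., n_moments - 1
    terms = (1.0 + j * theta2) ** (-theta3)    # (1 + j*theta2)^(-theta3)
    return np.exp(-theta1 * np.cumsum(terms))  # partial sums give mu_1, ..., mu_n
```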
There are three parameters I'd like to estimate, the $\theta$'s. I've set the first three sample moments equal to these expressions and solved for the three parameters (I believe this is the "method of moments"?). There's a simple equation for $\theta_1$, and the one I got for $\theta_2$ is quite complicated:
$$\theta_1 = -\ln\hat{\mu}_1 $$
$$ \frac{\ln(1+\theta_2)}{\ln(1+2\theta_2)} = \frac {\ln \Big( \frac{-\ln\hat{\mu}_2} {\theta_1} - 1 \Big) } {\ln \Big(\frac{\ln(\hat{\mu}_2/\hat{\mu}_3)}{\theta_1} \Big)} $$
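(To spell out where this comes from: the moment equations give $(1+\theta_2)^{-\theta_3} = \frac{-\ln\hat{\mu}_2}{\theta_1} - 1$ and, by subtracting the log of the third moment equation from the log of the second, $(1+2\theta_2)^{-\theta_3} = \frac{\ln(\hat{\mu}_2/\hat{\mu}_3)}{\theta_1}$; taking logs of both and dividing cancels $\theta_3$.)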
I've solved this equation numerically for $\theta_2$, and then I can go ahead and solve for $\theta_3$.
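In case the implementation matters, here is roughly what I do in Python (a sketch rather than my exact code; I'm assuming SciPy's `brentq` for the root-find, and the bracket is just a guess):

```python
import numpy as np
from scipy.optimize import brentq

def estimate_theta(mu1_hat, mu2_hat, mu3_hat):
    """Method-of-moments estimates from the first three sample raw moments."""
    theta1 = -np.log(mu1_hat)

    # Right-hand side of the theta_2 equation; depends only on the moment estimates.
    rhs = np.log(-np.log(mu2_hat) / theta1 - 1.0) / np.log(np.log(mu2_hat / mu3_hat) / theta1)

    # Solve ln(1 + t) / ln(1 + 2t) = rhs for theta_2 by bracketed root-finding.
    f = lambda t: np.log1p(t) / np.log1p(2.0 * t) - rhs
    theta2 = brentq(f, 1e-8, 1e3)  # bracket chosen arbitrarily; widen if needed

    # Back out theta_3 from (1 + theta_2)^(-theta_3) = -ln(mu2_hat)/theta1 - 1.
    theta3 = -np.log(-np.log(mu2_hat) / theta1 - 1.0) / np.log1p(theta2)
    return theta1, theta2, theta3
```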
This does seem to work with simulated data, but it is very sensitive to the moment estimates I plug in: if I add a little noise so that $\hat{\mu}_2$ changes by a small amount, I get very different estimates for $\theta_2$ and $\theta_3$ (a rough sketch of what I mean is at the end of the post). So my question is: is there a better approach to estimating these parameters, maybe one that also uses the higher moments? I know, for instance, that maximum likelihood is an alternative to the method of moments, but I have little idea how to proceed in that direction. Thank you!
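Here is the kind of experiment I mean, reusing the two sketches above (the "true" parameter values are made up):

```python
import numpy as np

true_theta = (0.5, 1.0, 0.8)           # arbitrary values, just for illustration
mu = theoretical_moments(*true_theta)  # exact first three moments

rng = np.random.default_rng(0)
for _ in range(5):
    # Perturb the moments by ~1% relative noise and re-estimate;
    # the theta_2 and theta_3 estimates jump around a lot (and the root-find
    # can even fail outright if the noisy moments leave the feasible region).
    noisy = mu * (1.0 + 0.01 * rng.standard_normal(3))
    print(estimate_theta(*noisy))
```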