Questions tagged [method-of-moments]

A method of parameter estimation that equates sample and population moments, then solves the resulting equations for the unknown parameters.

Method of Moments estimation (MoM or MME) is an approach to parameter estimation that equates sample and population moments and solves for the unknown parameters.

Loosely, the Law of Large Numbers implies that as the sample size becomes sufficiently large, the sample moments converge to the corresponding population moments; this convergence is often used to justify MoM as a general approach.

While less commonly used than maximum likelihood estimation (MLE), in part because it is often less efficient, MoM can be quite useful in many situations.
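For concreteness, here is a minimal Python sketch of the recipe for a $\text{Gamma}(k, s)$ distribution, a standard textbook case chosen purely for illustration:

```python
import numpy as np

# Simulated data from a Gamma(shape=2, scale=3) distribution.
rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=3.0, size=10_000)

# For Gamma(k, s): E[X] = k*s and Var(X) = k*s**2.
# Equating these to the sample mean and variance and solving gives
#   s_hat = var/mean,  k_hat = mean**2/var.
m, v = x.mean(), x.var()
scale_hat = v / m
shape_hat = m**2 / v
print(shape_hat, scale_hat)  # should land near (2.0, 3.0)
```

The same two-step pattern (compute sample moments, then solve the moment equations in the parameters) carries over whenever the population moments have known expressions in the parameters.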

192 questions
4
votes
1 answer

Method of moments when there's no closed form expression

I am trying to code up a method of moments algorithm for parameter estimation. I have a closed form for the moments as a function of the parameters, but these expressions are complicated, so there's no way to get a closed form expression for the…
user272429
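When the moment equations cannot be inverted analytically, as in this question, a common fallback is to solve them numerically. A hedged sketch, where the lognormal moments stand in hypothetically for the asker's complicated expressions:

```python
import numpy as np
from scipy.optimize import root

def pop_moments(theta):
    # Stand-in for the question's complicated closed-form moments.
    # Hypothetical example: first two raw moments of a lognormal(mu, sigma),
    # using E[X^n] = exp(n*mu + n**2 * sigma**2 / 2).
    mu, sigma = theta
    return np.array([np.exp(mu + sigma**2 / 2),
                     np.exp(2 * mu + 2 * sigma**2)])

def moment_gap(theta, sample_moments):
    # MoM solves pop_moments(theta) = sample_moments; root-find the gap.
    return pop_moments(theta) - sample_moments

x = np.random.default_rng(1).lognormal(mean=0.5, sigma=0.8, size=5_000)
sample_moments = np.array([x.mean(), (x**2).mean()])
sol = root(moment_gap, x0=np.array([0.0, 1.0]), args=(sample_moments,))
print(sol.x)  # numerical MoM estimates of (mu, sigma)
```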
3
votes
0 answers

Method of moments estimator

Consider $U_i \overset{iid}{\sim} \text{Bernoulli}(\pi)$. Also consider: $$Y_i \mid U_i = 0 \sim \text{Exp}(1/\gamma) \text{ and } Y_i \mid U_i = 1 \sim \text{Exp}(1/2\gamma)$$ What are the method of moments estimators of $\pi$ and $\gamma$? Here is my solution: $E(U_i)…
Sam
  • 2,184
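One hedged route, assuming $\text{Exp}(1/\gamma)$ denotes an exponential with rate $1/\gamma$ and hence mean $\gamma$ (the excerpt does not pin this down): the mixture's first two raw moments are $$E(Y_i) = (1-\pi)\gamma + \pi \cdot 2\gamma = \gamma(1+\pi), \qquad E(Y_i^2) = (1-\pi) \cdot 2\gamma^2 + \pi \cdot 2(2\gamma)^2 = 2\gamma^2(1+3\pi),$$ and equating these to $\bar{Y}$ and $\overline{Y^2}$ gives two equations to solve simultaneously for $\hat{\pi}$ and $\hat{\gamma}$.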
3
votes
1 answer

Generalize the usage of moments in method of moments?

In method of moments estimation, if there are $k$ parameters to estimate, we usually consider the $i$-th moments, $i=1,\dots,k$, so that we have $k$ equations for $k$ unknowns. I wonder if it is wise to consider more moments of different orders, i.e.…
Tim
  • 19,445
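Using more moment conditions than parameters is the over-identified case handled by the generalized method of moments (GMM), which minimizes a weighted quadratic form in the moment gaps instead of solving them exactly. A minimal sketch with an identity weight matrix and a hypothetical Gamma example:

```python
import numpy as np
from scipy.optimize import minimize

def gmm_objective(theta, x):
    k, s = theta
    # Three moment conditions for two parameters (over-identified):
    # Gamma(k, s) raw moments are E[X] = k*s, E[X^2] = k*(k+1)*s**2,
    # E[X^3] = k*(k+1)*(k+2)*s**3.
    g = np.array([
        x.mean() - k * s,
        (x**2).mean() - k * (k + 1) * s**2,
        (x**3).mean() - k * (k + 1) * (k + 2) * s**3,
    ])
    return g @ g  # identity-weighted GMM criterion

x = np.random.default_rng(2).gamma(shape=2.0, scale=3.0, size=10_000)
res = minimize(gmm_objective, x0=np.array([1.0, 1.0]), args=(x,),
               method="Nelder-Mead")
print(res.x)  # roughly (2.0, 3.0)
```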
2
votes
1 answer

How do the information in the problem statement and this solution align with the provided description of the method of moments?

I have the following problem: Let $Y_1, Y_2, \dots, Y_n$ be i.i.d. $\text{Uniform}(\theta, 1)$ random variables, and let an estimator be $\hat{\theta} = \min\{ Y_1, Y_2, \dots, Y_n \}$. You may find the following information useful when answering…
The Pointer
  • 1,932
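For context, the standard first-moment calculation here (a textbook result, not taken from the question): for $\text{Uniform}(\theta, 1)$, $E(Y_i) = (\theta + 1)/2$, so the method of moments estimator is $\hat{\theta}_{\text{MoM}} = 2\bar{Y} - 1$. The sample minimum is instead the maximum likelihood estimator, so the two constructions need not coincide.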
1
vote
0 answers

Method of Moments for a mixing proportion

Suppose we have densities $f_1, f_2$ from the random variables $W_1$ and $W_2$ where $W_i$ has known mean $\mu_i$ and variance $\sigma_i$. Consider the mixture of the two densities $$ f(x;\theta)=\theta f_1(x) + (1-\theta)f_2(x)$$ with $0< \theta <…
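For reference, the standard first-moment calculation (assuming $\mu_1 \neq \mu_2$): $E(X) = \theta \mu_1 + (1-\theta)\mu_2$, so equating this to the sample mean $\bar{X}$ and solving gives $\hat{\theta} = (\bar{X} - \mu_2)/(\mu_1 - \mu_2)$.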
1
vote
0 answers

Two supposedly equivalent approaches to the method of moments

I am reading the book "An Introduction to Econometric Theory" by A. Ronald Gallant. In the section of the book on the Method of Moments, I get a little confused about the method as I know it. I would like to understand how these two forms are…
Fam
  • 1,007
1
vote
1 answer

Does the second moment estimator of the uniform distribution parameter have the same properties as that of the first moment?

For independent and identically distributed samples $[y_1,...,y_m]$ where $y$ is uniformly distributed between $[0,\theta]$ with $0 \lt \theta \lt \infty$, finding the method of moments estimator for $\theta$ is very straightforward using the first…
Falimond
  • 141
  • 4
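For reference, both textbook estimators: $E(Y) = \theta/2$ gives $\hat{\theta}_1 = 2\bar{y}$, while $E(Y^2) = \theta^2/3$ gives $\hat{\theta}_2 = \sqrt{\tfrac{3}{m}\sum_{i=1}^{m} y_i^2}$. Both are consistent, but $\hat{\theta}_1$ is unbiased whereas $\hat{\theta}_2$ is biased downward in finite samples, since the square root is concave (Jensen's inequality).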
0
votes
1 answer

Population and sample moment notation

A population moment is often stated as $E[x]$; the sample moment counterpart is stated as $\frac{1}{n} \sum_{i=1}^{n} x_i$. I do not quite understand what is meant by $E[x]$. Why not $E[x_i]$? What does $E[x]$ say? Expected value of all observations of $x$?…
Snoopy
  • 523
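A short note on the convention, which is a fact about i.i.d. sampling rather than anything specific to this question: because the $x_i$ are identically distributed, $E[x_i]$ is the same for every $i$, so the subscript is dropped and the $k$-th population moment is written $\mu_k' = E[x^k]$, estimated by the sample moment $\frac{1}{n}\sum_{i=1}^{n} x_i^k$.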
0
votes
0 answers

Method of moments giving super sensitive estimates

I'm trying to study a process that produces, in theory, an equilibrium distribution where the $i$th raw moment is given by: $$ \mu_i = \exp(-\theta_1 \sum_{j=0}^{i-1}(1 + j\theta_2)^{-\theta_3}) $$ So that $$ \mu_1 = \exp(-\theta_1) $$ $$ \mu_2 =…
Ben S.
  • 145
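When the moments are exponentials in the parameters, as here, one hedged tactic is to match log moments by least squares, which can reduce the sensitivity; the parameter values below are made up purely for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

def log_moments(theta, orders):
    # log mu_i = -t1 * sum_{j=0}^{i-1} (1 + j*t2)**(-t3), from the question.
    t1, t2, t3 = theta
    return np.array([-t1 * sum((1 + j * t2) ** -t3 for j in range(i))
                     for i in orders])

def residuals(theta, orders, target):
    return log_moments(theta, orders) - target

# Hypothetical check with made-up parameters and noisy "sample" log moments.
orders = np.arange(1, 5)
true_theta = np.array([0.5, 0.3, 1.2])
noise = 0.01 * np.random.default_rng(3).normal(size=orders.size)
target = log_moments(true_theta, orders) + noise
fit = least_squares(residuals, x0=np.array([1.0, 0.5, 1.0]),
                    args=(orders, target), bounds=(1e-6, np.inf))
print(fit.x)  # should recover something near true_theta
```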