The $n^{\text{th}}$ raw moment of a distribution can be estimated from a vector of samples $(x_1, x_2, \ldots, x_k)$ by: $$ \frac{1}{k}\sum_{i=1}^{k} x_i^n $$ Now, let's say I've calculated the first $m$ moments of my distribution. How do I then go about doing the normal things I would do with my distribution, like finding the $\mathrm{PDF}(x)$ or $\mathrm{CDF}(x)$? If $m=2$, this is easy because it's just a Gaussian. But for any other value of $m$ I'm pretty much lost.
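(For concreteness, this is how I compute those sample moments; a minimal numpy sketch, with illustrative names:)

```python
import numpy as np

def sample_moments(x, m):
    """Estimate the first m raw moments from a 1-D array of
    samples x, via the sample average of x**n."""
    x = np.asarray(x, dtype=float)
    return [np.mean(x**n) for n in range(1, m + 1)]

# e.g. standard normal samples: first moment ~ 0, second ~ 1
rng = np.random.default_rng(0)
print(sample_moments(rng.standard_normal(100_000), 4))
```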
1 Answer
The problem you are dealing with is known as the Hamburger moment problem, which seeks to characterise and recover a distribution from a known series of moments.
It is only possible to characterise the distribution if the entire sequence of moments is specified, not just the first $m$ moments. In view of this, perhaps you could augment your problem with a rule specifying what the remaining moments in the sequence are taken to be (e.g., you could assume that all central moments after the $m$th match those of the standard normal distribution; see the sketch below). Also, just because you estimate only the first two moments does not mean you are dealing with a normal distribution. You might assume normality, but that would be an assumption, not a logical implication of your estimation problem.
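As a sketch of one such padding rule: since the standard normal has mean zero, its central and raw moments coincide, and its $n$th raw moment is $0$ for odd $n$ and $(n-1)!!$ for even $n$. The sketch below applies the rule directly to raw moments for simplicity (converting between central and raw moments for a non-standardised variable is a separate bookkeeping step); it is one possible rule, not a recommendation:

```python
from math import prod

def std_normal_moment(n):
    """n-th raw moment of the standard normal:
    0 for odd n, (n-1)!! = 1*3*5*...*(n-1) for even n."""
    if n % 2 == 1:
        return 0
    return prod(range(1, n, 2))  # double factorial (n-1)!!

def padded_moments(estimated, total):
    """Extend a list of estimated raw moments (moments 1..m)
    by assuming standard-normal moments beyond the m-th."""
    m = len(estimated)
    return estimated + [std_normal_moment(n) for n in range(m + 1, total + 1)]
```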
In any case, once you have specified the entire sequence of moments (e.g., by estimating the first $m$ and assuming the later ones), you can form the moment generating function (writing $\mu_n$ for the $n$th raw moment, to avoid a clash with $m$):
$$M(t) = \sum_{n=0}^\infty \frac{t^n}{n!} \mu_n.$$
If this function exists (i.e., if the series converges on a neighbourhood of zero) then you have a well-defined moment generating function, and the corresponding distribution can be recovered by inverse Laplace transformation: for a nonnegative random variable, $M(-s) = \mathbb{E}[e^{-sX}]$ is the Laplace transform of the density. In most cases there will be no closed-form solution, so the distribution will have to be approximated numerically.
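For instance, if the full moment sequence happened to be $\mu_n = n!$ (the raw moments of an Exponential(1) distribution), the series sums to $M(t) = 1/(1-t)$ for $t < 1$, so $M(-s) = 1/(1+s)$, and numerical inversion recovers the density $e^{-x}$. A minimal sketch of that inversion step using mpmath's numerical inverse Laplace transform (the closed-form transform here is an assumption made purely for illustration, and this route applies directly only to nonnegative variables):

```python
import mpmath as mp

# Laplace transform of the density: L{f}(s) = E[exp(-s X)] = M(-s).
# With mu_n = n! the MGF sums to M(t) = 1/(1-t), so M(-s) = 1/(1+s),
# which is the transform of the Exponential(1) density.
def M_neg(s):
    return 1 / (1 + s)

for x in [0.5, 1.0, 2.0]:
    f_x = mp.invertlaplace(M_neg, x, method='talbot')
    print(x, f_x, mp.exp(-x))  # recovered density vs. true exp(-x)
```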
Comment: For a nonparametric alternative, consider a kernel density estimate, e.g. R's density(). Another kind of nonparametric estimator is implemented in the package logcondens.
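(The comment refers to R's density() function and the R package logcondens; a roughly comparable kernel estimator in Python, offered here only as an analogue, is scipy.stats.gaussian_kde:)

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
samples = rng.standard_normal(1000)

kde = gaussian_kde(samples)   # kernel density estimate, analogous to R's density()
grid = np.linspace(-3, 3, 7)
print(kde(grid))              # estimated PDF evaluated on a grid
```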