While preparing for an upcoming exam, I came across the following past exam question, which at first seemed easy. The question is as follows.
Let $f(y_i) = (\frac{k}{y_i})^2$ be the density function of the (random) income of agent A, where $k$ denotes the minimum income. As such, the income follows a Pareto distribution with support $[k, \infty)$. Determine the expected income.
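As a side note, I checked whether the stated density is even properly normalized, since that might be the source of my confusion (this is my own calculation, not part of the exam question):

$$\int_{k}^{\infty} \left(\frac{k}{y_i}\right)^2 dy_i = k^2 \left[-\frac{1}{y_i}\right]_{k}^{\infty} = k^2 \cdot \frac{1}{k} = k,$$

so the function as written integrates to $k$, not $1$, and is a valid density only when $k = 1$. Presumably the intended density is the standard Pareto density with shape parameter $\alpha = 1$, namely $f(y_i) = \frac{k}{y_i^2}$, which does integrate to $1$; either way the constant in front does not affect the convergence question below.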
At first, my thoughts were pretty routine: apply the definition $\mathbb{E}[y_i] = \int\limits_{k}^{\infty} y_i f(y_i)\, dy_i$ and integrate. The result should be the expected income. But after doing that I got the integral $\int\limits_{k}^{\infty} \frac{k^2}{y_i}\, dy_i$, which of course does not converge. Hence the expected income would be infinite, which is not sensible, neither economically nor in an exam situation. What am I missing here?
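To convince myself the divergence is real and not an algebra slip, I computed the truncated mean $\int_k^M y_i f(y_i)\, dy_i = k^2 \ln(M/k)$ numerically and watched it grow without bound as the cutoff $M$ increases (a rough sketch of my own, with $k = 1$ so the stated density is properly normalized; the midpoint rule and step count are arbitrary choices):

```python
import math

def truncated_mean(k, M, n=100_000):
    """Numerically integrate y * f(y) = y * (k/y)^2 = k^2 / y over [k, M]
    using a simple midpoint rule; analytically this equals k^2 * ln(M/k)."""
    h = (M - k) / n
    return sum(k**2 / (k + (i + 0.5) * h) * h for i in range(n))

k = 1.0
for M in [10, 100, 1_000, 10_000]:
    # The truncated mean keeps growing like ln(M) instead of settling down,
    # which is exactly what a divergent expectation looks like numerically.
    print(f"M = {M:>6}: truncated mean = {truncated_mean(k, M):.3f}")
```

Each tenfold increase in the cutoff adds roughly $\ln 10 \approx 2.303$ to the truncated mean, so there is no finite value the full integral could converge to.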
https://stats.stackexchange.com/questions/70088/when-does-a-distribution-not-have-a-mean-or-a-variance ... and perhaps ... https://stats.stackexchange.com/questions/114511/do-mean-variance-and-median-exist-for-a-continuous-random-variable-with-continu
– Glen_b May 29 '17 at 00:16