
If $X$ is a random variable, I would like to be able to calculate something like $$E(e^{-X})$$

How can I do this? Thank you so much.

Macro
  • 44,826

1 Answer


As pointed out in the comments, your specific question can be solved by evaluating the moment generating function of $X$ at $t=-1$, but it appears you may be asking the more general question of how to calculate the expected value of a function of a random variable.

In general, if $X$ has density function $p$, then

$$ E \left( f(X) \right) = \int_{D} f(x) p(x) dx $$

where $D$ denotes the support of the random variable. For discrete random variables, the corresponding expectation is

$$ E \left( f(X) \right) = \sum_{x \in D} f(x) P(X=x) $$

These identities follow from the definition of expected value. In your example $f(X) = \exp(-X)$, so you would just plug that into the definition above.
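As a sanity check on the continuous identity, here is a minimal sketch that approximates $E(f(X))$ by a midpoint Riemann sum of $\int_D f(x)p(x)\,dx$. The distribution, truncation point, and grid size are all illustrative choices, not part of the question: it takes $X \sim {\rm Exponential}(1)$ with density $p(x) = e^{-x}$ on $[0,\infty)$ and $f(x) = e^{-x}$, for which the exact value is $\int_0^\infty e^{-2x}\,dx = 1/2$.

```python
import math

def expectation(f, p, a, b, n=200_000):
    """Midpoint Riemann-sum approximation of E[f(X)] = integral of f(x)p(x) over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * p(a + (i + 0.5) * h) for i in range(n)) * h

# X ~ Exponential(1), f(x) = exp(-x); truncating the support at 50 loses only
# a tail of size exp(-100)/2, which is negligible.
approx = expectation(lambda x: math.exp(-x), lambda x: math.exp(-x), 0.0, 50.0)
print(approx)  # close to 0.5
```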

Continuous example: suppose $X \sim N(0,1)$. Then

\begin{align*} E \left (\exp(-X) \right) &= \int_{-\infty}^{\infty} e^{-x} \frac{1}{\sqrt{2 \pi}} e^{-x^2/2} dx \\ &= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi}} e^{-(x^2 + 2x)/2} dx \\ &= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi}} e^{-(x^2 + 2x + 1)/2} e^{1/2} dx \\ &= e^{1/2} \int_{-\infty}^{\infty} \underbrace{\frac{1}{\sqrt{2 \pi}} e^{-(x+1)^2/2}}_{{\rm density \ of \ a \ N(-1,1)}} dx \\ &= e^{1/2} \end{align*}
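The closed form above can also be checked by simple Monte Carlo, replacing the integral with an average over draws of $X$. This is just an illustrative sketch (sample size and seed are arbitrary), not part of the derivation:

```python
import math
import random

random.seed(0)
n = 200_000
# Average exp(-X) over draws X ~ N(0,1); by the law of large numbers this
# approaches E[exp(-X)] = e^{1/2} ≈ 1.6487.
mc = sum(math.exp(-random.gauss(0.0, 1.0)) for _ in range(n)) / n
print(mc, math.exp(0.5))
```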

Discrete example: $X \sim {\rm Bernoulli}(p)$. Then

\begin{align*} E \left( \exp(-X) \right) &= \sum_{i=0}^{1} e^{-i} P(X=i) \\ &= (1-p)e^0 + pe^{-1} \\ &= (1-p) + p/e \end{align*}
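The discrete case is just a finite sum over the support, so it can be computed exactly. A minimal sketch with an arbitrary illustrative value of $p$:

```python
import math

p = 0.3  # arbitrary success probability, chosen only for illustration

# E[exp(-X)] = sum over the support {0, 1} of exp(-i) * P(X = i)
expect = sum(math.exp(-i) * ((1 - p) if i == 0 else p) for i in (0, 1))

# Closed form from the answer: (1 - p) + p/e
closed_form = (1 - p) + p / math.e
print(expect, closed_form)
```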

  • $Y = f(X)$ is a random variable which we assume for simplicity is a continuous random variable with density function $g_Y(y)$. Thus the definition of $E[Y]$ is $$E[Y] = \int_{-\infty}^{\infty} y\, g_Y(y)\,\mathrm dy,$$ and so the question is why is the expectation of $Y$ also given by $$E[Y] = E[f(X)] = \int_{-\infty}^{\infty} f(x)\, g_X(x)\,\mathrm dx?$$ That the two competing formulas for $E[Y]$ both give the same value is worth mentioning somewhere, possibly linking to a proof of this remarkable result which the Wikipedia link calls the Law of the Unconscious Statistician. – Dilip Sarwate Jul 10 '12 at 18:16
  • @DilipSarwate, doesn't this just follow from the change of variable theorem? – Macro Jul 10 '12 at 18:22
  • Measure theoreticaly, you prove it when $f$ is an indicator, then when $f$ is a nonnegative simple function, then when $f$ is a nonnegative measurable function (using monotone convergence), and finally when $f$ is an arbitrary measurable function. – Zen Jul 11 '12 at 01:41