
If f is a pdf, the integral of xf(x) over the entire range where f(x) > 0 gives, of course, the expected value. Suppose we integrate the same function, xf(x), from negative infinity up to t, giving a new function, G(t). So G(t) = E(x | x < t), and it is measured in the same units as x, as the expected value is (while probability measures like the CDF are unitless).
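To make this concrete, here is a minimal numerical sketch (my own illustration, assuming f is a standard normal density, in which case the integral has the closed form $-\varphi(t)$):

```python
# Sketch: G(t) = integral of x*f(x) from -inf to t, with f the standard
# normal pdf, checked against the closed form -phi(t) for that case.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def G(t):
    # numerical value of the lower partial integral of x*f(x)
    val, _ = quad(lambda x: x * norm.pdf(x), -np.inf, t)
    return val

for t in (-1.0, 0.0, 0.5, 2.0):
    print(t, G(t), -norm.pdf(t))  # the two columns should agree
```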

I believe that G must be a well-known function with a name I should recognize, and that on seeing it I will clonk myself on the head and say "Of course!". But I cannot for the life of me think what it is. Am I right that this _is_ a function with an established name? And if so, what is that name?

I have some problems to solve in this area and without a name it’s hard to search the literature. Help me out here, please.

andrewH

2 Answers


$$E(X \mid X\le t) = \int_{-\infty}^tx\frac {f_X(x)}{F_X(t)}dx$$

is the correct expression; it is called the "truncated expected value", which is shorthand for "the expected value of a random variable whose support is truncated".
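As a quick sanity check, here is a sketch (my own addition, assuming $X$ is standard normal, for which $E(X \mid X\le t) = -\varphi(t)/\Phi(t)$) that computes the truncated expected value by numerical integration:

```python
# Sketch: truncated expected value E(X | X <= t) for a standard normal X,
# computed by numerical integration and compared with -phi(t)/Phi(t).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def truncated_mean(t):
    num, _ = quad(lambda x: x * norm.pdf(x), -np.inf, t)  # integral of x*f(x)
    return num / norm.cdf(t)                               # divide by F_X(t)

for t in (-1.0, 0.0, 1.0):
    print(t, truncated_mean(t), -norm.pdf(t) / norm.cdf(t))
```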

The integral without the scaling by $F_X(t)$ (the cdf of $X$ evaluated at $t$) can appear in the following situation: define the random variable

$$Y = X\cdot \mathbf 1_{\{X\le t\}}$$

where $\mathbf 1_{\{X\le t\}}$ is the indicator function taking the value $1$ when $X\le t$, zero otherwise. So $Y$ equals $X$ if $X\le t$, and it equals $0$ otherwise (it is a "censored" version of $X$). We can write

$$E(Y) = E(Y \mid X\le t) \cdot P(X\le t) + E(Y \mid X> t) \cdot P(X> t)$$

$$=E(X\cdot \mathbf 1_{\{X\le t\}} \mid X\le t) \cdot P(X\le t) + E(X\cdot \mathbf 1_{\{X\le t\}} \mid X> t) \cdot P(X> t)$$

$$= E(X \mid X\le t) \cdot P(X\le t) + E(X\cdot 0 \mid X> t) \cdot P(X> t)$$

$$= \int_{-\infty}^tx\frac {f_X(x)}{F_X(t)}dx\cdot F_X(t) +0 = \int_{-\infty}^tx f_X(x)dx$$
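A simple Monte Carlo sketch (again my own addition, assuming $X$ is standard normal, so the right-hand side equals $-\varphi(t)$) illustrates this censoring identity:

```python
# Sketch: Monte Carlo check that E[X * 1{X <= t}] equals the integral of
# x*f(x) from -inf to t, assuming X ~ N(0, 1), where that integral is -phi(t).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
t = 0.5
x = rng.standard_normal(1_000_000)
y = np.where(x <= t, x, 0.0)   # censored version Y = X * 1{X <= t}

print(y.mean())                # Monte Carlo estimate of E(Y)
print(-norm.pdf(t))            # closed form of the unscaled integral
```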


"Conditional mean" or "constrained mean" are two variations I have seen. A more general version is "conditional moment."

soakley