I'm sorry if the title is confusing. I am working with a model of $(X, Y_1, Y_2, \dots, Y_k)$, where $X$ is a random variable (a latent factor) and the $Y$'s are generated according to:

$$ Y_i = \mu_i + \lambda_i X + \varepsilon_i$$

with all the $\varepsilon$'s and $X$ mutually independent, and all the $\mu$'s and $\lambda$'s constant parameters.
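
For concreteness, here is a minimal simulation of this generative model (a Python sketch; the value of $k$, the parameter values, and the normal distributions are arbitrary illustrative choices, not part of the model statement):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the model statement): k = 3,
# arbitrary mu's and lambda's, standard normal X and noise.
k = 3
mu = np.array([0.5, -1.0, 2.0])
lam = np.array([1.0, 0.7, -0.3])

n = 100_000
X = rng.normal(size=n)                 # latent factor
eps = rng.normal(size=(n, k))          # mutually independent noise terms
Y = mu + lam * X[:, None] + eps        # Y_i = mu_i + lambda_i * X + eps_i
```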

My question is: in this case where the $Y$'s are related to $X$ linearly, is it possible to have

$$\mathbb{E}[X|Y_1=y_1, Y_2 = y_2,...,Y_k = y_k] \neq \beta_0 + \beta_1 y_1 + ... +\beta_k y_k$$

(all $\beta$'s are constants and the lowercase $y$'s are realizations of the uppercase $Y$'s), and if so, what is an example of such a case? Any help would be appreciated!

I've played around with copulas in MATLAB, but I can't figure this one out. What am I missing here?

2 Answers

As a counterexample, just take $$k=1,\quad X \text{ uniform on } [0,1],\quad \varepsilon \text{ uniform on } [0,2],\quad Y=X+\varepsilon.$$ Then $$\mathbb{E}[X\mid Y=y]=\begin{cases} \frac{y}{2} & \text{if } 0<y<1 \\ \frac{1}{2} & \text{if } 1<y<2 \\ \frac{y-1}{2} & \text{if } 2<y<3. \end{cases}$$ This is piecewise linear but not linear: no single pair of constants $\beta_0, \beta_1$ satisfies $\mathbb{E}[X\mid Y=y] = \beta_0 + \beta_1 y$ on all of $(0,3)$.
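
A quick Monte Carlo check of this piecewise formula (a Python sketch; the sample size and window width are arbitrary smoothing choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# The counterexample: X ~ Uniform[0, 1], eps ~ Uniform[0, 2], Y = X + eps
n = 1_000_000
X = rng.uniform(0.0, 1.0, n)
Y = X + rng.uniform(0.0, 2.0, n)

def mc_cond_mean(y, width=0.02):
    """Estimate E[X | Y = y] by averaging X over a narrow window around y."""
    mask = np.abs(Y - y) < width / 2
    return X[mask].mean()

def exact(y):
    """The piecewise formula above."""
    return y / 2 if y < 1 else (0.5 if y < 2 else (y - 1) / 2)

for y in (0.5, 1.5, 2.5):
    print(f"y = {y}: MC = {mc_cond_mean(y):.3f}, exact = {exact(y):.3f}")
```

The estimates agree with the formula, and evaluating the conditional mean over a grid of $y$ values makes the kinks at $y=1$ and $y=2$ visible.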

Speltzu

Yes, it is possible. In general, when you have the distribution of $\mathbf{Y}|X$ (as here, from the model equations) and you want to know the distribution of $X|\mathbf{Y}$ (or some aspect of this distribution, such as its expected value), you need to reverse the conditioning by applying Bayes' rule, and so the result depends on the marginal distribution of $X$. Applying this rule gives the general result:

$$\mathbb{E}(X|\mathbf{Y}=\mathbf{y}) = \frac{\int_{\mathscr{X}} x \, f_{\mathbf{Y}|X}(\mathbf{y}|x) \, f_X(x) \ dx}{\int_{\mathscr{X}} f_{\mathbf{Y}|X}(\mathbf{y}|x) \, f_X(x) \ dx}.$$
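
As a numerical sketch of this ratio for $k=1$ (the exponential prior and Gaussian noise here are illustrative choices, not from the question), the resulting conditional mean is visibly nonlinear in $y$:

```python
import numpy as np

# Riemann-sum evaluation of E[X | Y = y] via the Bayes-rule ratio, with
# illustrative choices: X ~ Exponential(1), Y = X + N(0, 1) noise.
x = np.linspace(0.0, 20.0, 4001)       # grid over the support of X
f_X = np.exp(-x)                       # Exponential(1) density

def cond_mean(y):
    lik = np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)  # f_{Y|X}(y|x)
    w = lik * f_X                      # unnormalized posterior on the grid
    return (x * w).sum() / w.sum()     # grid spacing cancels in the ratio

for y in (-1.0, 0.0, 1.0, 2.0, 4.0):
    print(f"y = {y}: E[X|Y=y] = {cond_mean(y):.3f}")
```

The successive differences of these values are not constant, so no single linear function of $y$ reproduces them.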

It is simple to choose $f_X$ to give a counterexample to the linear expression. For example, if you take this distribution to be a point-mass distribution at $x=0$ (and assume that at least one of the beta coefficients is non-zero), then you have:

$$\mathbb{E}(X|\mathbf{Y}=\mathbf{y}) = 0 \neq \beta_0 + \beta_1 y_1 + \cdots + \beta_k y_k.$$

Ben