2

I assume that $X$ and $Y$ are normally distributed, each with its own mean and variance. So far, I have found that analytic expressions exist for $E[X^2+Y^2]$, $E[X^2 Y^2]$ and $E[X^2(X^2+Y^2)]$, all via the theory of quadratic forms in random variables.

However, I really can't find one for $E\left[\frac{X^2}{X^2+Y^2}\right]$. I would welcome any ideas to find an analytic solution!

Jannis
  • 310
  • 2
    Are $X$ and $Y$ still independent even though they are not identically distributed? That is, does the "not iid" in the title mean neither independent nor identically distributed? Or just that they are still independent, having already been assumed not identically distributed? Also, what is the answer for the case when they are iid? (My guess is $\frac 12$.) – Dilip Sarwate Dec 18 '22 at 14:59
  • I think both cases would be interesting and I would appreciate both a general solution for a multivariate Gaussian and one with a diagonal covariance matrix. – Jannis Dec 20 '22 at 07:26
  • Consider accepting the new answer, if it suits your purpose – dherrera Nov 03 '23 at 07:04

2 Answers

3

For the case where $X$ and $Y$ have different means ($\mu_X$ and $\mu_Y$ respectively) but the same variance ($\sigma^2$), and are uncorrelated, you can use the answer posted here to get an exact solution.

Let $Z = (X, Y) \sim \mathcal{N}(\mu, \sigma^2 \mathbf{I})$, where $\mu = (\mu_X, \mu_Y)$. Also, let $\mathbf{A} = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$. Note that the ratio is unchanged by rescaling $Z$, so we can take $\sigma = 1$ without loss of generality (for general $\sigma$, replace $\mu$ by $\mu/\sigma$ below). Using the formula in the link above, we set $n=2$, $\operatorname{tr}(\mathbf{A}) = 1$ and $\mu' \mathbf{A} \mu = \mu_X^2$.

Then, $\mathbb{E}\left(\frac{X^2}{X^2 + Y^2} \right) = \mathbb{E}\left(\frac{Z' \mathbf{A}Z}{Z'Z} \right) = \frac{1}{2} {}_1F_1\left(1; 2; \frac{-\Vert\mu\Vert^2}{2}\right) + \frac{1}{4} {}_1F_1\left(1; 3; \frac{-\Vert\mu\Vert^2}{2}\right) \mu_X^2 $

where ${}_1F_1$ is the Kummer confluent hypergeometric function.
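For anyone who wants to check this numerically, here is a quick sketch of my own (taking $\sigma = 1$; the function name `exact_ratio_mean` is mine) comparing the closed form, via `scipy.special.hyp1f1`, against a Monte Carlo estimate:

```python
import numpy as np
from scipy.special import hyp1f1

def exact_ratio_mean(mu_x, mu_y):
    """Closed-form E[X^2/(X^2+Y^2)] for X~N(mu_x,1), Y~N(mu_y,1) independent."""
    m2 = mu_x**2 + mu_y**2  # ||mu||^2
    return 0.5 * hyp1f1(1, 2, -m2 / 2) + 0.25 * hyp1f1(1, 3, -m2 / 2) * mu_x**2

# Monte Carlo sanity check
rng = np.random.default_rng(0)
mu_x, mu_y = 1.0, 0.5
x = rng.normal(mu_x, 1.0, 2_000_000)
y = rng.normal(mu_y, 1.0, 2_000_000)
mc = np.mean(x**2 / (x**2 + y**2))
print(exact_ratio_mean(mu_x, mu_y), mc)  # the two should agree to ~3 decimals
```

For $\mu = (0,0)$ the formula reduces to $1/2$, as it should by symmetry.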

For the more general case where $Z \sim \mathcal{N} (\mu, \Sigma)$ (and thus $X$ and $Y$ can have different variances and be correlated), a Taylor approximation like that described here can be used. Using that approximation, with $\Sigma = \begin{bmatrix} \sigma^2_X & \rho \\ \rho & \sigma^2_Y \end{bmatrix}$ (here $\rho$ denotes the covariance $Cov(X,Y)$), we get the formula

\begin{equation} \mathbb{E}\left(\frac{X^2}{X^2 + Y^2} \right) = \mathbb{E}\left(\frac{Z' \mathbf{A}Z}{Z'Z} \right) \approx \frac{\mu_N}{\mu_D}\left( 1 - \frac{Cov(N,D)}{\mu_N \mu_D} + \frac{Var(D)}{\mu_D^2} \right) \end{equation}

where

\begin{equation} \begin{split} & \mu_N = \sigma_X^2 + \mu_X^2 \\ & \mu_D = \sigma_X^2 + \sigma_Y^2 + \Vert\mu\Vert^2 \\ & Var(D) = 2 tr(\Sigma^2) + 4 \mu^T \Sigma \mu \\ & Cov(N,D) = 2 tr(\Sigma \mathbf{A} \Sigma) + 4 \mu^T \Sigma \mathbf{A} \mu \end{split} \end{equation}

and $N$ and $D$ stand for $Z' \mathbf{A}Z$ and $Z'Z$ respectively.
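The moment formulas above translate directly into code. Below is a sketch of my own (the function name `taylor_ratio_mean` is made up) that computes the second-order Taylor approximation and compares it to a Monte Carlo estimate; since it is only an approximation, agreement is expected to be close but not exact:

```python
import numpy as np

def taylor_ratio_mean(mu, Sigma):
    """Second-order Taylor approximation of E[X^2/(X^2+Y^2)] for (X,Y)~N(mu,Sigma)."""
    A = np.array([[1.0, 0.0], [0.0, 0.0]])
    mu = np.asarray(mu, dtype=float)
    mu_N = np.trace(A @ Sigma) + mu @ A @ mu   # E[N] = sigma_X^2 + mu_X^2
    mu_D = np.trace(Sigma) + mu @ mu           # E[D] = sigma_X^2 + sigma_Y^2 + ||mu||^2
    var_D = 2 * np.trace(Sigma @ Sigma) + 4 * mu @ Sigma @ mu
    cov_ND = 2 * np.trace(Sigma @ A @ Sigma) + 4 * mu @ Sigma @ A @ mu
    return (mu_N / mu_D) * (1 - cov_ND / (mu_N * mu_D) + var_D / mu_D**2)

# Monte Carlo comparison for a correlated, unequal-mean example
rng = np.random.default_rng(1)
mu = np.array([3.0, 2.0])
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])
z = rng.multivariate_normal(mu, Sigma, size=1_000_000)
mc = np.mean(z[:, 0]**2 / (z[:, 0]**2 + z[:, 1]**2))
print(taylor_ratio_mean(mu, Sigma), mc)
```

A nice sanity check: for $\mu = 0$ and $\Sigma = \mathbf{I}$ the correction terms cancel ($Cov(N,D)/(\mu_N\mu_D) = Var(D)/\mu_D^2 = 1$) and the approximation returns exactly $1/2$.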

dherrera
  • 1,258
  • 8
  • 26
  • 1
    In the first case, because $X$ and $Y$ are exchangeable, $X^2/(X^2+Y^2)$ and $Y^2/(X^2+Y^2)$ have the same distribution with positive, bounded expectation. Therefore each expectation will equal one-half the expectation of their sum, which is $1,$ whence the answer there is simply $1/2$ -- no hypergeometric functions are needed. – whuber Oct 31 '23 at 18:19
  • @whuber $X$ and $Y$ are not exchangeable, because they are non-centered, and can have different means, right? My gut feeling is that the formula can be heavily simplified as you say though, but I don't immediately see how. – dherrera Oct 31 '23 at 18:29
  • 1
    In your first formulation, at "$Z = (X, Y) \sim \mathcal{N}(\mu, \sigma^2 \mathbf{I}),$" you explicitly state that $X$ and $Y$ have the same distribution! – whuber Oct 31 '23 at 18:52
  • 2
    That's not what I meant, I meant that they both have the same variance (hence the $\sigma^2 \mathbf{I}$), but that $\mu \in \mathbb{R}^2$ has the mean of each one, $\mu_1$ and $\mu_2$, I'll make that clearer in the answer. – dherrera Oct 31 '23 at 18:57
2

This is not a complete answer, as I assume zero-mean (possibly dependent) marginals; in particular, $$(X,Y)\sim\mathcal{N}\left(\left[ \begin{array}{c} 0 \\ 0 \end{array}\right],\left[ \begin{array}{cc} \sigma_x^2 & \rho\sigma_x\sigma_y \\ \rho \sigma_x\sigma_y & \sigma_y^2 \end{array} \right] \right).$$ Let $U=Y/\sigma_y$ and $V=X/\sigma_x$, so that $U=\rho V + \sqrt{1-\rho^2}Z$, where $Z$ is standard Gaussian and independent of $V$. Therefore we obtain \begin{align} \frac{X^2}{X^2+Y^2}&=\frac{\sigma_x^2 V^2}{\sigma_x^2 V^2+\sigma_y^2 U^2}\\ &=\frac{\sigma_x^2 V^2}{\sigma_x^2 V^2+\sigma_y^2(\rho V + \sqrt{1-\rho^2}Z)^2}\\ &=\frac{\sigma_x^2}{\sigma_x^2+\sigma_y^2(\rho + \sqrt{1-\rho^2}\frac Z V)^2}. \end{align}

Since $Z$ and $V$ are independent standard normals, $\frac Z V\sim \mathcal{C}(0,1)$ (standard Cauchy), and we have \begin{align} E\frac{X^2}{X^2+Y^2}&=E\frac{\sigma_x^2}{\sigma_x^2+\sigma_y^2(\rho + \sqrt{1-\rho^2}\frac Z V)^2}\\ &=\int_\mathbb{R}\frac{\sigma_x^2}{\sigma_x^2+\sigma_y^2(\rho + \sqrt{1-\rho^2}c)^2}\frac 1{\pi}\frac1{1+c^2}dc. \end{align} The latter integral can be evaluated easily, e.g. with the residue theorem, and equals $$\frac{1+\frac{\sigma_y}{\sigma_x}\sqrt{1-\rho^2}}{2 \frac{\sigma_y}{\sigma_x}\sqrt{1-\rho^2}+\frac{\sigma_y^2}{\sigma_x^2}+1}.$$

If $\sigma_x=\sigma_y$, then we find the mean to be $\frac12$, as expected and mentioned in the comments.
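One can also confirm the residue computation numerically by evaluating the Cauchy integral directly. Here is a sketch of my own (function names `closed_form` and `cauchy_integral` are made up) using `scipy.integrate.quad`:

```python
import numpy as np
from scipy.integrate import quad

def closed_form(sx, sy, rho):
    """Closed form for E[X^2/(X^2+Y^2)], zero means, correlation rho."""
    r = sy / sx
    s = np.sqrt(1 - rho**2)
    return (1 + r * s) / (2 * r * s + r**2 + 1)

def cauchy_integral(sx, sy, rho):
    """The integral over the standard Cauchy density, evaluated numerically."""
    f = lambda c: (sx**2 / (sx**2 + sy**2 * (rho + np.sqrt(1 - rho**2) * c)**2)
                   / (np.pi * (1 + c**2)))
    val, _ = quad(f, -np.inf, np.inf)
    return val

print(closed_form(1.0, 2.0, 0.5), cauchy_integral(1.0, 2.0, 0.5))
```

Two more checks drop out of the closed form: for $\rho = 0$ it reduces to $\frac{\sigma_x}{\sigma_x+\sigma_y}$, and for $\sigma_x=\sigma_y$ it gives $\frac12$ for any $\rho$.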

Math-fun
  • 425
  • 2
  • 8