Suppose we want to make inferences about an unobserved realization $x$ of a random variable $\tilde x$, which is normally distributed with mean $\mu_x$ and variance $\sigma^2_x$. Suppose there is another random variable $\tilde y$ (whose unobserved realization we'll similarly call $y$) that is normally distributed with mean $\mu_y$ and variance $\sigma^2_y$. Let $\sigma_{xy}$ be the covariance of $\tilde x$ and $\tilde y$.
Now suppose we observe a signal on $x$, \begin{align}a=x+u,\end{align} where $u$ is a realization of $\tilde u\sim\mathcal{N}(0,\phi_x^2)$, and a signal on $y$, \begin{align}b=y+v,\end{align} where $v$ is a realization of $\tilde v\sim\mathcal{N}(0,\phi_y^2)$. Assume that $\tilde u$ and $\tilde v$ are independent of each other and of $\tilde x$ and $\tilde y$.
What is the distribution of $x$ conditional on $a$ and $b$?
What I know so far: Using inverse-variance weighting, \begin{align}\mathbb{E}(x\,|\,a)=\frac{\frac{1}{\sigma_x^2}\mu_x+\frac{1}{\phi_x^2}a}{\frac{1}{\sigma_x^2}+\frac{1}{\phi_x^2}},\end{align} and \begin{align} \mathbb{V}\text{ar}(x\,|\,a)=\frac{1}{\frac{1}{\sigma_x^2}+\frac{1}{\phi_x^2}}. \end{align}
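In case it helps, here is the quick Monte Carlo check I used to convince myself of the single-signal formulas above. It's only a rough sketch: the parameter values, the fixed signal value, and the conditioning window are arbitrary choices of mine, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, chosen only for this check.
mu_x, sigma_x = 1.0, 2.0
phi_x = 1.5

n = 2_000_000
x = rng.normal(mu_x, sigma_x, n)          # draws of x
a = x + rng.normal(0.0, phi_x, n)         # noisy signal a = x + u

# Approximate conditioning on a by keeping draws with a near a fixed value a0.
a0 = 2.0
x_given_a = x[np.abs(a - a0) < 0.01]

# Inverse-variance weighting predictions.
post_mean = (mu_x / sigma_x**2 + a0 / phi_x**2) / (1 / sigma_x**2 + 1 / phi_x**2)
post_var = 1 / (1 / sigma_x**2 + 1 / phi_x**2)

print(x_given_a.mean(), post_mean)   # should be close
print(x_given_a.var(), post_var)     # should be close
```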
Since $\tilde x$ and $\tilde y$ are correlated (through $\sigma_{xy}$), $b$ should carry some information about $x$. Other than realizing this, I'm stuck. Any help is appreciated!
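My guess is that the answer comes from treating $(\tilde x, a, b)$ as jointly normal and conditioning, and the sketch below is how I've been trying to sanity-check candidate answers numerically. It assumes the noise terms are independent of $(\tilde x,\tilde y)$ as above, and the parameter values and observed signals are made up for illustration; please correct me if this setup is off.

```python
import numpy as np

# Hypothetical parameters for a numerical check.
mu_x, mu_y = 1.0, -0.5
sigma_x, sigma_y, sigma_xy = 2.0, 1.0, 0.8   # sigma_xy = Cov(x, y)
phi_x, phi_y = 1.5, 0.7

# Joint covariance of (x, a, b), with a = x + u and b = y + v and
# u, v independent of each other and of (x, y):
#   Var(a) = Var(x) + phi_x^2,  Var(b) = Var(y) + phi_y^2,
#   Cov(x, a) = Var(x),  Cov(x, b) = Cov(a, b) = Cov(x, y).
Sigma = np.array([
    [sigma_x**2, sigma_x**2,            sigma_xy],
    [sigma_x**2, sigma_x**2 + phi_x**2, sigma_xy],
    [sigma_xy,   sigma_xy,              sigma_y**2 + phi_y**2],
])
mu = np.array([mu_x, mu_x, mu_y])

# Standard Gaussian conditioning of x on (a, b):
#   E[x | a, b]   = mu_x + S12 @ inv(S22) @ ([a, b] - [mu_a, mu_b])
#   Var[x | a, b] = Var(x) - S12 @ inv(S22) @ S21
S12 = Sigma[0, 1:]
S22 = Sigma[1:, 1:]
w = np.linalg.solve(S22, S12)

a_obs, b_obs = 2.0, 0.3          # example observed signals
cond_mean = mu[0] + w @ (np.array([a_obs, b_obs]) - mu[1:])
cond_var = Sigma[0, 0] - w @ S12
print(cond_mean, cond_var)
```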