I'm facing a situation where I'd like to do Bayesian conjugate updating, but both the prior and the likelihood (a Student-t) can only be approximated by a finite mixture of normals. I know that a normal likelihood with a mixture-of-normals prior is a straightforward case (see, e.g., the classic paper by Persi Diaconis and Donald Ylvisaker, or the RBesT R package). But what if the likelihood is also described by a mixture of normals? Can you do conjugate updating for each component? How do the mixture weights change a posteriori in that case?
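For concreteness, here is a minimal Python sketch of the component-wise updating I have in mind for a single observation. All names are mine, and the posterior-weight formula (prior weight × likelihood weight × prior predictive density of the observation under that component pair) is my guess at how the weights should update; please correct me if this is wrong:

```python
import math

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def mixture_posterior(prior, lik, x):
    """Component-wise conjugate update for one observation x.

    prior: list of (w_i, mu_i, var_i) -- Gaussian mixture prior on theta
    lik:   list of (v_j, tau2_j)      -- Gaussian mixture likelihood,
                                         x | theta ~ sum_j v_j N(theta, tau2_j)
    Returns the posterior as a list of (weight, mean, var): k*m components.
    """
    post = []
    for w, mu, var in prior:
        for v, tau2 in lik:
            # standard normal-normal conjugate update for this (i, j) pair
            post_var = 1.0 / (1.0 / var + 1.0 / tau2)
            post_mu = post_var * (mu / var + x / tau2)
            # unnormalised posterior weight: prior weight * likelihood weight
            # * prior predictive density of x for this component pair
            weight = w * v * normal_pdf(x, mu, var + tau2)
            post.append((weight, post_mu, post_var))
    total = sum(w for w, _, _ in post)
    return [(w / total, m, s) for w, m, s in post]
```

With a 2-component prior and a 2-component likelihood this returns 4 posterior components, which already illustrates the combinatorial growth in the number of components as observations accumulate.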
- It's a $O(k^n)$ complexity conjugate... – Xi'an Jun 20 '23 at 13:20
- Here's a related question on using Gaussian mixtures as priors for Gaussian likelihoods. – Durden Jun 20 '23 at 15:05
- Addendum: it appears that for conjugate updating with Gaussian mixture priors you will also need a likelihood that is a Gaussian mixture. In that case, the posterior is also a Gaussian mixture with updated parameters. For more details, have a look at part III of this paper. – Durden Jun 20 '23 at 20:37