I'm facing a situation where I'd like to do Bayesian conjugate updating, but both the prior and the likelihood (a Student-t) can only be approximated by finite mixtures of normals. I know that a normal likelihood with a mixture-of-normals prior is a straightforward case (see e.g. the classic paper by Persi Diaconis and Donald Ylvisaker, or the RBesT R package). But what if the likelihood is also described by a mixture of normals? Can you do conjugate updating for each pair of components? And how do the mixture weights change a posteriori in that case?
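For what it's worth, here is a minimal sketch of how I currently understand the pairwise-update idea, in Python rather than R for brevity. The assumptions are: a scalar parameter, known component variances, a prior $\sum_k p_k\,N(\theta;\mu_k,\tau_k^2)$ and a likelihood for one observation $y$ of the form $\sum_m w_m\,N(y;\theta,s_m^2)$. Multiplying the two mixtures gives $K \times M$ terms; each term is a standard normal-normal conjugate update, and its posterior weight is proportional to (prior weight) × (likelihood weight) × (marginal likelihood of $y$ under that component pair). All names (`mixture_update`, etc.) are illustrative, not from any package:

```python
import numpy as np
from scipy.stats import norm

def mixture_update(y, prior_w, prior_mu, prior_sd, lik_w, lik_sd):
    """Posterior for theta when prior and likelihood are both normal mixtures.

    Prior:      sum_k prior_w[k] * N(theta; prior_mu[k], prior_sd[k]^2)
    Likelihood: sum_m lik_w[m]   * N(y; theta, lik_sd[m]^2)
    Returns (weights, means, sds) of the K*M-component posterior mixture.
    """
    prior_w, prior_mu, prior_sd = map(np.asarray, (prior_w, prior_mu, prior_sd))
    lik_w, lik_sd = map(np.asarray, (lik_w, lik_sd))
    post_w, post_mu, post_sd = [], [], []
    for k in range(len(prior_w)):
        for m in range(len(lik_w)):
            # Normal-normal conjugate update for the (k, m) component pair:
            # posterior precision is the sum of the two precisions.
            prec = 1.0 / prior_sd[k] ** 2 + 1.0 / lik_sd[m] ** 2
            post_sd.append(np.sqrt(1.0 / prec))
            post_mu.append(
                (prior_mu[k] / prior_sd[k] ** 2 + y / lik_sd[m] ** 2) / prec
            )
            # Unnormalised posterior weight: product of the two mixture
            # weights times the marginal likelihood of y under this pair,
            # N(y; mu_k, tau_k^2 + s_m^2).
            marg = norm.pdf(
                y, prior_mu[k], np.sqrt(prior_sd[k] ** 2 + lik_sd[m] ** 2)
            )
            post_w.append(prior_w[k] * lik_w[m] * marg)
    post_w = np.array(post_w)
    return post_w / post_w.sum(), np.array(post_mu), np.array(post_sd)
```

As a sanity check, with a single component on each side this reduces to the ordinary conjugate update, e.g. `mixture_update(1.0, [1.0], [0.0], [1.0], [1.0], [1.0])` gives a single component with mean 0.5 and weight 1. What I'm unsure about is whether this weight-update rule is the accepted one when the likelihood mixture is itself only an approximation to the Student-t.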

Björn
