
I am using a variational Bayes method (without an M step, since there are no parameters) to infer my model.

My questions are:

  1. If it is working correctly, will it increase the log likelihood of the data, $\log p(y)$?
  2. If so, does it guarantee that the lower bound $\int q(z)\log \frac{p(y,z)}{q(z)}\, dz$ will also keep increasing, where $z$ denotes the hidden variables that are marginalised out in 1?

Edit: Let us say that we have two latent variables $z_1,z_2$. The mean-field VB result states that $q(z_1)=\exp(E_{q(z_2)}[\log p(y,z_1,z_2)])/C_1$ and similarly $q(z_2)=\exp(E_{q(z_1)}[\log p(y,z_1,z_2)])/C_2$, where $C_1,C_2$ are normalising constants. You simply iterate these two updates until convergence.

Thus, I haven't used the lower bound for optimisation at all; I only checked how it behaved by saving its value at each iteration. It turned out that the bound actually went down before it started increasing. Is this possible if I have done my calculations properly?
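
For concreteness, here is a minimal sketch of the two-update scheme above on an assumed toy conjugate model ($y_i \sim N(z_1+z_2,1)$ with standard normal priors), not the model in question; both coordinate updates are closed-form Gaussians, the bound is recorded after every sweep, and the exact $\log p(y)$ is available for comparison:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n = 20
y = rng.normal(1.5, 1.0, size=n)   # synthetic data
sum_y = y.sum()

# q(z1) = N(m1, v1), q(z2) = N(m2, v2); the variances are fixed by conjugacy:
# posterior precision = prior precision + n / sigma^2, with sigma^2 = 1.
v1 = v2 = 1.0 / (1.0 + n)
m1, m2 = 0.0, 0.0

def elbo(m1, v1, m2, v2):
    """Lower bound: E_q[log p(y, z1, z2)] plus the entropy of q."""
    like = (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * np.sum((y - m1 - m2) ** 2) - 0.5 * n * (v1 + v2))
    prior = -np.log(2 * np.pi) - 0.5 * (m1 ** 2 + v1 + m2 ** 2 + v2)
    entropy = 0.5 * np.log(2 * np.pi * np.e * v1) + 0.5 * np.log(2 * np.pi * np.e * v2)
    return like + prior + entropy

bounds = []
for _ in range(20):
    m1 = (sum_y - n * m2) / (1.0 + n)   # q(z1) ∝ exp(E_{q(z2)}[log p(y, z1, z2)])
    m2 = (sum_y - n * m1) / (1.0 + n)   # q(z2) ∝ exp(E_{q(z1)}[log p(y, z1, z2)])
    bounds.append(elbo(m1, v1, m2, v2))

# The exact evidence is available here because the model is jointly Gaussian:
# y ~ N(0, 2*11' + I). The bound should rise monotonically and stay below it.
log_evidence = multivariate_normal.logpdf(
    y, mean=np.zeros(n), cov=2.0 * np.ones((n, n)) + np.eye(n))
print(np.round(bounds, 4))
print("exact log p(y):", round(log_evidence, 4))
assert all(b >= a - 1e-10 for a, b in zip(bounds, bounds[1:]))
```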

sachinruk
  • Your question is confusing since "variational Bayes" implies that there are parameters and it is different from EM. – Tom Minka Nov 13 '14 at 18:23

1 Answer


If you are using the coordinate ascent approach to variational Bayes, then it is guaranteed to increase the lower bound at each step. (This is easy to remember by referring to the algorithm as "coordinate ascent".) It does not guarantee that the exact likelihood increases.
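
For reference, the relationship between the two quantities in the question is the standard decomposition

$$\log p(y) \;=\; \int q(z)\log\frac{p(y,z)}{q(z)}\,dz \;+\; \mathrm{KL}\!\left(q(z)\,\middle\|\,p(z\mid y)\right),$$

so the integral is a lower bound because the KL term is non-negative. When there are no free parameters, $\log p(y)$ is a fixed constant, and coordinate ascent only shrinks the KL gap; it never changes $\log p(y)$ itself.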

Tom Minka