I am using a variational Bayes method (without an M step, since there are no parameters) to do inference in my model.
My questions are:

1. If it is working correctly, will it increase the log-likelihood of the data, $\log p(y)$?
2. If so, does that guarantee that the lower bound $\int q(z)\log \frac{p(y,z)}{q(z)}\,dz$ will also keep increasing, where $z$ are the hidden variables that are marginalised out in 1? (The standard decomposition relating the two quantities is recalled below.)
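For reference, the identity connecting the two quantities (standard in variational inference, not specific to my model) is

$$\log p(y) \;=\; \underbrace{\int q(z)\log\frac{p(y,z)}{q(z)}\,dz}_{\text{lower bound }\mathcal{L}(q)} \;+\; \mathrm{KL}\!\left(q(z)\,\|\,p(z\mid y)\right),$$

so the gap between $\log p(y)$ and the bound is exactly the KL divergence from $q$ to the true posterior.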
Edit: Let us say that we have two latent variables, $z_1$ and $z_2$. The mean-field VB updates are $q(z_1)=\exp\!\left(\mathbb{E}_{q(z_2)}[\log p(y,z_1,z_2)]\right)/C_1$ and similarly $q(z_2)=\exp\!\left(\mathbb{E}_{q(z_1)}[\log p(y,z_1,z_2)]\right)/C_2$, where $C_1,C_2$ are normalising constants. You simply iterate between these two updates until convergence.
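For concreteness, here is a minimal sketch of this two-block iteration on a toy model (a Gaussian with unknown mean $\mu$ and precision $\tau$, conjugate priors, and a factorised $q(\mu)\,q(\tau)$); the model and all names are stand-ins for illustration, not my actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=100)   # synthetic data
N, ybar = len(y), y.mean()

# Priors: tau ~ Gamma(a0, b0), mu | tau ~ N(mu0, 1/(lam0*tau))
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

E_tau = 1.0  # initial guess for E_q[tau]
for it in range(50):
    # Update q(mu) = N(mu_N, 1/lam_N); depends on q(tau) only through E[tau]
    mu_N = (lam0 * mu0 + N * ybar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    E_mu, E_mu2 = mu_N, mu_N**2 + 1.0 / lam_N

    # Update q(tau) = Gamma(a_N, b_N); depends on q(mu) through E[mu], E[mu^2]
    a_N = a0 + (N + 1) / 2.0
    b_N = b0 + 0.5 * (np.sum(y**2) - 2.0 * E_mu * y.sum() + N * E_mu2
                      + lam0 * (E_mu2 - 2.0 * mu0 * E_mu + mu0**2))
    E_tau = a_N / b_N

print(mu_N, E_tau)  # approximate posterior means of mu and tau
```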
Thus I haven't used the lower bound for optimisation at all; I only monitored it by saving its value at each iteration. It turned out that the bound actually went down before it started increasing. Is this possible if I have done my calculations properly?
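To show concretely how I track the bound: a minimal sketch of evaluating $\int q(z)\log\frac{p(y,z)}{q(z)}\,dz$ by Monte Carlo for the toy model above (the bound for that model also has a closed form; the estimator and the placeholder parameter values are just for illustration, and the estimate is noisy):

```python
import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=100)    # synthetic data as above
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0          # same priors as above

def log_joint(mu, tau):
    # log p(y, mu, tau) = log p(tau) + log p(mu|tau) + log p(y|mu,tau)
    lp = gamma.logpdf(tau, a=a0, scale=1.0 / b0)
    lp += norm.logpdf(mu, loc=mu0, scale=1.0 / np.sqrt(lam0 * tau))
    lp += norm.logpdf(y, loc=mu, scale=1.0 / np.sqrt(tau)).sum()
    return lp

def elbo_mc(mu_N, lam_N, a_N, b_N, n=5000):
    # Monte Carlo estimate of E_q[log p(y,z) - log q(z)] with z = (mu, tau);
    # q factorises, so mu and tau are sampled independently
    mu = rng.normal(mu_N, 1.0 / np.sqrt(lam_N), size=n)
    tau = rng.gamma(a_N, 1.0 / b_N, size=n)
    log_q = (norm.logpdf(mu, mu_N, 1.0 / np.sqrt(lam_N))
             + gamma.logpdf(tau, a=a_N, scale=1.0 / b_N))
    log_p = np.array([log_joint(m, t) for m, t in zip(mu, tau)])
    return float(np.mean(log_p - log_q))

# Call once per iteration with the current variational parameters,
# e.g. the (mu_N, lam_N, a_N, b_N) from the loop above, and save the value.
print(elbo_mc(1.0, 50.0, 51.5, 200.0))          # placeholder parameters
```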