
In neural networks, there are two concepts: batch learning and sequential learning.

On page 75 of "Machine Learning: A Probabilistic Perspective", Kevin Patrick Murphy uses these terms in the context of naive Bayes:

[excerpt from the book, shown as an image]

Could someone please explain the difference between sequential mode and batch mode in the context of naive Bayes?

chl
JJJohn

1 Answer


Bayesian updating can be done all at once, or sequentially. Sequential updating means applying Bayes' theorem one sample at a time, turning the prior into a posterior at each step: you take sample $x_1$, use Bayes' theorem to get the posterior given this sample, use this posterior as the prior for sample $x_2$, and so on. Under some mild assumptions (the observations being conditionally i.i.d.), this is equivalent to looking at all the data and its joint likelihood to update the prior in one step.

Batch mode means the all-at-once update (or updating with a chunk of the data at a time), as opposed to the sequential, one-sample-at-a-time approach.
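As a concrete illustration (my own sketch, not from the book), here is the equivalence in a Beta-Bernoulli model, where the conjugate update has a closed form: a Beta$(a, b)$ prior updated with coin flips gives a Beta$(a + \text{heads}, b + \text{tails})$ posterior. Processing the samples one at a time lands on the same posterior as processing them all at once:

```python
def update(a, b, data):
    """Update a Beta(a, b) prior with a batch of 0/1 observations."""
    heads = sum(data)
    return a + heads, b + len(data) - heads

data = [1, 0, 1, 1, 0, 1]

# Batch mode: condition on the whole dataset in one step.
batch_posterior = update(1, 1, data)

# Sequential mode: each step's posterior becomes the next step's prior.
a, b = 1, 1
for x in data:
    a, b = update(a, b, [x])

assert (a, b) == batch_posterior  # same posterior either way
print(batch_posterior)  # (5, 3)
```

The same holds for naive Bayes with conjugate priors on its per-feature parameters, which is why Murphy can describe both modes interchangeably.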

Tim