8

I am learning to use HMMs and I am trying to solve the following problem. There is a robot moving around the nodes of a graph. The robot can move to adjacent nodes with certain probabilities. Each time the robot steps into a new node, a noisy piece of information about the node is generated; that is, I do not know the exact node. I have the following data:

  • Each node is a hidden state (there are finitely many)
  • A transition matrix $A$ that defines the probabilities of the transitions between nodes

  • Emission probabilities for each hidden node

Using standard HMM functions I should be able to infer the hidden state at time $t$ given the observations from $1,\dots,t$, i.e. $P(X_t|O_{1,\dots,t})$. Is there a way to predict the next move, that is, the hidden state at time $t+1$? Given all observations up to time $t$, is it possible to predict the most probable hidden state at time $t+1$? Which HMM principle should I use?

Rasoul
  • 262
andreSmol
  • 537
  • 2
    To get $P(X_{t+1})$ simply multiply your distribution over the hidden states at time $t$ by the transition probabilities. – jerad Oct 04 '13 at 22:51
  • 1
    Rabiner's review paper on HMMs is an amazing source for starting to work with HMMs. All the important algorithms are described (forward algorithm, backward algorithm, and EM algorithm). Take a look: http://www.cs.cornell.edu/Courses/cs4758/2012sp/materials/hmm_paper_rabiner.pdf – bdeonovic Oct 09 '13 at 19:59
  • 1
    My current interest is the same kind of HMM model. I will not give a detailed explanation, because there is a better alternative: I have learned whatever I know from this paper: http://www1.se.cuhk.edu.hk/~hcheng/paper/sdm2013.pdf Focus especially on section 3.2 (3.2.1 and 3.2.2). – Dorukhan Arslan Jan 03 '16 at 04:18

2 Answers

3

You use the forward algorithm to obtain $P(X_t|Y_{1:t})$, and then one transition step to predict $P(X_{t+1}|Y_{1:t})$:

$P(X_{t+1}|Y_{1:t}) = \sum_{X_t} P(X_{t+1}|X_t) \cdot P(X_t|Y_{1:t})$

So you use the same principle as for estimating $P(X_t|Y_{1:t})$, but without being able to incorporate $Y_{t+1}$, since it has not been observed yet.
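
For concreteness, here is a minimal NumPy sketch of the filter-then-predict step (the transition matrix, emission matrix, and observation sequence below are made-up assumptions, not taken from the question):

```python
import numpy as np

# Toy HMM (hypothetical numbers): two hidden nodes, two observation symbols.
# A[i, j] = P(X_{t+1} = j | X_t = i)   (transition matrix, rows sum to 1)
# B[i, k] = P(Y_t = k   | X_t = i)     (emission probabilities)
# pi[i]   = P(X_1 = i)                 (initial distribution)
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])

obs = [0, 1, 1, 0]  # observed symbols y_1, ..., y_t

# Forward algorithm with per-step normalisation; after the loop,
# belief[i] = P(X_t = i | Y_{1:t}).
belief = pi * B[:, obs[0]]
belief /= belief.sum()
for y in obs[1:]:
    belief = (belief @ A) * B[:, y]
    belief /= belief.sum()

# One-step prediction: P(X_{t+1} | Y_{1:t}) = sum over X_t of
# P(X_{t+1} | X_t) * P(X_t | Y_{1:t}); the most probable next state is its argmax.
prediction = belief @ A
print("filtered  P(X_t   | Y_1:t):", belief)
print("predicted P(X_t+1 | Y_1:t):", prediction)
print("most probable next state:  ", prediction.argmax())
```

The last line implements exactly the sum above: `belief @ A` computes $\sum_{X_t} P(X_{t+1}|X_t)\,P(X_t|Y_{1:t})$.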

Zhubarb
  • 8,269
  • 1
    How would we calculate $P(X_t|Y_{1:t})$? It is the opposite of the emission probabilities; I have $P(Y_t|X_t)$, which are the emission probabilities. – Arpit Sisodia Dec 23 '16 at 05:34
0

To get the probability distribution over the hidden states at time $t+1$, just multiply your posterior at time $t$ by your transition matrix.

https://stackoverflow.com/questions/15554923/how-to-perform-a-prediction-with-matlabs-hidden-markov-model-statistics-toolbo
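
In code, assuming `posterior` is the length-$N$ vector $P(X_t|Y_{1:t})$ and `A[i, j]` $= P(X_{t+1}=j|X_t=i)$ with rows summing to one (hypothetical numbers below), this is a single vector-matrix product:

```python
import numpy as np

# Hypothetical filtered posterior at time t and row-stochastic transition matrix.
posterior = np.array([0.8, 0.2])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])

prediction = posterior @ A  # P(X_{t+1} | Y_{1:t})
```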

AnoAPI
  • 1