The usual informal definition of the Markov property is that the next state depends only on the present state and on no past states. However, the mathematical definition I usually see (e.g. https://stats.stackexchange.com/a/2463/225553) says that P(X_t = x | X_{t-1}, X_{t-2}, X_{t-3}, ..., X_{t_0}) = P(X_t = x | X_{t-1}). That seems to say that the present state depends only on the immediately prior state. Are these equivalent definitions, or am I misunderstanding the notation?
1 Answer
You correctly wrote
$$P(X_t = x \mid X_{t-1}, X_{t-2}, X_{t-3}, \dots, X_{t_0}) = P(X_t = x \mid X_{t-1}).$$
However, this is equivalent to the following, obtained by shifting the time index by 1:
$$P(X_{t+1} = x \mid X_{t}, X_{t-1}, X_{t-2}, \dots, X_{t_0}) = P(X_{t+1} = x \mid X_{t}).$$
The first form says "the present state depends only on the immediate past state"; the second says "the next state depends only on the present state." They are the same condition, just read at different time indices.
In words: Given the present, the future is independent of the past.
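If it helps to see this numerically, here is a minimal simulation sketch (the two-state chain and its transition matrix are made up for illustration). It estimates P(X_{t+1} = 1 | X_t = 1) twice: once conditioning only on the present state, and once conditioning additionally on the past state X_{t-1}. For a Markov chain the extra conditioning changes nothing, so the two estimates agree up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up transition matrix for a two-state chain:
# P[i, j] = P(X_{t+1} = j | X_t = i)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Simulate one long trajectory of the chain.
n = 500_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.random() < P[x[t - 1], 1]  # next state depends only on x[t - 1]

# Estimate P(X_{t+1} = 1 | X_t = 1), ignoring everything before time t.
present = x[1:-1] == 1
p_given_present = x[2:][present].mean()

# Estimate P(X_{t+1} = 1 | X_t = 1, X_{t-1} = 0).
present_and_past = (x[1:-1] == 1) & (x[:-2] == 0)
p_given_both = x[2:][present_and_past].mean()

print(p_given_present, p_given_both)  # both close to P[1, 1] = 0.6
```

Both printed numbers land near 0.6 (the entry P[1, 1] of the transition matrix), which is exactly the claim that, given the present, the future is independent of the past.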
Hope this helps. If not, feel free to comment and ask for clarification.
– SecretAgentMan
$$P(X_t = x \mid X_{t-1}, X_{t-2}, X_{t-3}, \dots, X_{t_0}) = P(X_t = x \mid X_{t-1})$$
So $X_{t-1}$ is considered the "present"? I am still unsure whether "The future state depends only on the present state" is equivalent to "The present state depends only on the immediate past state".
– coderunner Nov 02 '18 at 05:29