A Markov chain is a mathematical system that transitions from one state to another according to fixed probabilistic rules. The next state of the system depends only on its current state, not on the sequence of states that preceded it. This property is known as the Markov property.
Formally, a Markov chain is a sequence of random states \(X_0, X_1, X_2, \dots\) satisfying the Markov property:

\[
P(X_{n+1} = j \mid X_0, X_1, \dots, X_n) = P(X_{n+1} = j \mid X_n)
\]
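The memoryless behavior described by this property can be sketched with a small simulation. The two-state "weather" chain below, with its particular states and probabilities, is an illustrative assumption rather than an example from the text; the key point is that each step samples the next state from the current state alone.

```python
import random

random.seed(0)  # reproducible sample path

# Hypothetical two-state weather chain; states and probabilities
# are illustrative assumptions, not taken from the text.
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Sample the next state using only the current state (the Markov property)."""
    states = list(P[state])
    weights = [P[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, n_steps):
    """Generate a sample path of length n_steps + 1 starting from `start`."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))
```

Note that `step` never looks at the path history, only its single `state` argument, which is exactly the conditional-independence statement in the equation above.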
If you’re interested in learning more about Markov chains, the book “Markov Chains” by J.R. Norris is a standard reference. A PDF version is available online, and it’s a great resource for anyone looking to learn about this important topic.
The matrix \(P = (p_{ij})\) is called the transition matrix of the Markov chain, where each entry is the one-step transition probability

\[
p_{ij} = P(X_{n+1} = j \mid X_n = i)
\]
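A useful consequence of this definition is that multi-step probabilities come from powers of the transition matrix: entry \((i, j)\) of \(P^2\) gives the probability of going from state \(i\) to state \(j\) in exactly two steps. A minimal sketch, using a 2x2 matrix chosen here as an illustrative assumption:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Illustrative transition matrix (rows sum to 1), not taken from the text.
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Two-step transition probabilities: entry (i, j) of P @ P is
# sum over k of p_ik * p_kj.
P2 = mat_mul(P, P)
print(P2[0][0])  # 0.9*0.9 + 0.1*0.5, about 0.86
```

Each row of \(P^2\) still sums to 1, as it must for a valid transition matrix.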