Markov chain/Definition

A Markov chain is a Markov process whose state space is finite or countably infinite.
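
The definition can be illustrated with a minimal simulation. This sketch uses a hypothetical two-state chain with invented transition probabilities (not from the source); the state space {"sunny", "rainy"} is finite, and each step depends only on the current state.

```python
import random

# Hypothetical two-state chain for illustration; the states and
# transition probabilities below are invented, not from the source.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state from the current state alone (Markov property)."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps transitions and return the visited path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 10)
```

Because the state space is a finite set, the chain's dynamics can be written as a transition matrix whose rows sum to one, as in the dictionary above.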