Markov chain
WordNet

noun


(1)   A Markov process for which the parameter takes discrete time values
Wiktionary

Noun


Markov chain
  1. A discrete-time stochastic process with the Markov property.
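
A minimal formal sketch (not part of either dictionary entry; the state symbols X_0, X_1, ... are introduced here for illustration): for a discrete-time chain, the Markov property referred to in both definitions can be written as

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n),
\]

i.e. the distribution of the next state depends only on the current state, not on the earlier history of the process.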
 