Noun: Markov chain
  1. A Markov process in which the time parameter takes only discrete values
    - Markoff chain

Derived forms: Markov chains

Type of: Markoff process, Markov process

Encyclopedia: Markov chain
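
A minimal sketch illustrating the definition above: a discrete-time Markov chain whose next state depends only on the current state. The two-state "sunny"/"rainy" example and its transition probabilities are illustrative assumptions, not part of the dictionary entry.

  import random

  # Transition probabilities P(next state | current state).
  # The states and numbers below are made-up assumptions for illustration.
  TRANSITIONS = {
      "sunny": {"sunny": 0.8, "rainy": 0.2},
      "rainy": {"sunny": 0.4, "rainy": 0.6},
  }

  def step(state):
      """Draw the next state; it depends only on the current state (Markov property)."""
      r = random.random()
      cumulative = 0.0
      for next_state, p in TRANSITIONS[state].items():
          cumulative += p
          if r < cumulative:
              return next_state
      return next_state  # guard against floating-point rounding

  def simulate(start, n_steps):
      """Advance the chain through n_steps discrete time steps."""
      states = [start]
      for _ in range(n_steps):
          states.append(step(states[-1]))
      return states

  print(simulate("sunny", 10))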