Noun: Markov process
  1. A simple stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at the present state
    - Markoff process
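
In symbols, the defining (Markov) property for a discrete-time process X_0, X_1, X_2, ... can be stated as follows (a standard textbook formulation added for illustration; it is not part of the original entry):

  P(X_{n+1} = x \mid X_n = x_n, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

That is, conditioning on the full history yields the same distribution of the next state as conditioning on the present state alone.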

Derived forms: Markov processes

Type of: stochastic process

Encyclopedia: Markov process