Noun: Markoff chain
  1. A Markov process in which the time parameter takes only discrete values
    - Markov chain

Type of: Markoff process, Markov process

Encyclopedia: Markoff chain
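
The definition is easiest to see with a small simulation: below is a minimal Python sketch of a discrete-time Markov chain, where the state at each integer time step depends only on the state at the previous step via fixed transition probabilities. The two-state "weather" model and its probabilities are invented purely for illustration.

import random

# Transition probabilities: rows are the current state, columns the next state.
# States: 0 = "sunny", 1 = "rainy" (illustrative values only).
TRANSITIONS = [
    [0.9, 0.1],  # from sunny: 90% stay sunny, 10% turn rainy
    [0.5, 0.5],  # from rainy: 50% turn sunny, 50% stay rainy
]

def step(state: int) -> int:
    """Draw the next state using the current state's transition row."""
    return random.choices([0, 1], weights=TRANSITIONS[state])[0]

def simulate(start: int, n_steps: int) -> list:
    """Run the chain for n_steps discrete time steps."""
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1]))
    return states

if __name__ == "__main__":
    # Example: ten steps starting from the "sunny" state.
    print(simulate(start=0, n_steps=10))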