Dictionary of English language
ABC-Word.com

Definition of the word Markov Chain


      • A model for a sequence of random variables in which the probability that a variable takes any specific value depends only on the values of a fixed number of the most recent variables preceding it (source: Nature).
      • A sequence of random variables (Xn) satisfying the Markov property: Xn+1 (the future) depends only on Xn (the present) and not on Xk for k < n (the past).
      • A Markov process whose time parameter takes discrete values.
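      The definitions above can be illustrated with a minimal sketch of a discrete-time Markov chain in Python. The two weather states and their transition probabilities are hypothetical, chosen only to show the Markov property: the distribution of the next state depends only on the current state.

      ```python
      import random

      # Hypothetical two-state chain: transitions[s] gives the probability
      # distribution of the next state when the current state is s.
      transitions = {
          "sunny": {"sunny": 0.8, "rainy": 0.2},
          "rainy": {"sunny": 0.4, "rainy": 0.6},
      }

      def step(state, rng):
          """Draw the next state using only the current state (Markov property)."""
          states = list(transitions[state])
          weights = [transitions[state][s] for s in states]
          return rng.choices(states, weights=weights)[0]

      def simulate(start, n, seed=0):
          """Generate a sequence X0, X1, ..., Xn of chain states."""
          rng = random.Random(seed)
          chain = [start]
          for _ in range(n):
              chain.append(step(chain[-1], rng))
          return chain

      print(simulate("sunny", 5))
      ```

      Because each step reads only the last element of the sequence, the simulation never consults the earlier history, which is exactly the property the second definition states.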

    Hypernyms for the word Markov Chain

      • Markoff process
      • Markov process
