Markov chain

From AMS Glossary
Revision as of 20:28, 26 January 2012 by Perlwikibot



A stochastic process with a finite number of states in which the probability of occurrence of a future state is conditional only upon the current state; past states are inconsequential.
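The memoryless property in the definition above can be sketched in a few lines of code. This is a minimal illustration, not a meteorological model: the three states and transition probabilities here are invented for the example.

```python
import random

STATES = ["dry", "drizzle", "rain"]
# P[s][t] = probability of moving from state s to state t.
# Each row sums to 1; the values are illustrative only.
P = {
    "dry":     {"dry": 0.7, "drizzle": 0.2, "rain": 0.1},
    "drizzle": {"dry": 0.3, "drizzle": 0.4, "rain": 0.3},
    "rain":    {"dry": 0.2, "drizzle": 0.3, "rain": 0.5},
}

def step(state, rng=random):
    """Draw the next state. The draw depends only on `state`,
    never on earlier history -- the Markov property."""
    r, cum = rng.random(), 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Run the chain for n steps from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("dry", 5))
```

Note that `step` receives only the current state; a longer history would add nothing, which is exactly the sense in which "past states are inconsequential."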

In meteorology, Markov chains have been used to describe a raindrop size distribution in which the state at time step n + 1 is determined only by collisions between pairs of drops comprising the size distribution at time step n.
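The collision picture above can be caricatured as follows. This is a toy sketch, not the published raindrop model: the single-collision-per-step rule, the coalescence probability, and the initial unit-mass drops are all illustrative assumptions. The state at step n + 1 is computed only from the drop sizes at step n.

```python
import random

def collide(sizes, p_coalesce=0.5, rng=random):
    """One Markov step: pick a random pair of drops from the current
    size distribution; with probability p_coalesce they merge.
    (Toy assumption: one candidate collision per time step.)"""
    sizes = list(sizes)
    if len(sizes) < 2:
        return sizes
    i, j = rng.sample(range(len(sizes)), 2)
    if rng.random() < p_coalesce:
        merged = sizes[i] + sizes[j]          # coalescence conserves mass
        sizes = [s for k, s in enumerate(sizes) if k not in (i, j)]
        sizes.append(merged)
    return sizes

rng = random.Random(1)
state = [1.0] * 8  # eight unit-mass drops (illustrative initial state)
for n in range(10):
    state = collide(state, rng=rng)
print(sorted(state))
```

Because each step reads only the current list of sizes, the sequence of size distributions is a Markov chain in the sense of the definition above.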
