
Meaning of Markov chain | Babel Free

Noun CEFR B2

Definitions

A discrete-time stochastic process that satisfies the Markov property: the probability of the next state depends only on the current state, not on the sequence of states that preceded it.
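The Markov property can be illustrated with a short simulation sketch. The two-state weather model, its transition probabilities, and all names below are illustrative assumptions, not part of this entry; the point is only that each step samples the next state from the current state alone.

```python
import random

# Illustrative transition matrix (assumed for this sketch, not from the entry):
# each state maps to a list of (next_state, probability) pairs.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    options = [s for s, _ in TRANSITIONS[state]]
    weights = [p for _, p in TRANSITIONS[state]]
    return rng.choices(options, weights=weights, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps transitions and return the visited states."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain
```

For example, `simulate("sunny", 5)` returns a list of six states, each drawn from the distribution determined solely by the state before it.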

Examples

“The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.”

CEFR level

B2
Upper Intermediate
This word is part of the CEFR B2 vocabulary — upper intermediate level.
