Meaning of Markov chain | Babel Free
Definitions
A discrete-time stochastic process with the Markov property: the distribution of the next state depends only on the current state, not on the earlier history.
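The definition can be illustrated with a minimal simulation. The sketch below uses a hypothetical two-state weather chain with illustrative transition probabilities (the state names and numbers are assumptions, not part of the entry); each step samples the next state from the current state alone, which is exactly the Markov property.

```python
import random

# Hypothetical two-state chain; probabilities are illustrative only.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point shortfall

def simulate(start, n):
    """Generate a length-n trajectory of the chain."""
    states = [start]
    for _ in range(n - 1):
        states.append(step(states[-1]))
    return states

trajectory = simulate("sunny", 10)
print(trajectory)
```

Note that no step consults anything but the most recent state; the whole past is summarized by the present.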
Equivalents
العربية
سِلْسِلَةُ مَارْكُوف
Čeština
Markovův řetězec
Deutsch
Markow-Kette
Español
cadena de Márkov
Suomi
Markovin ketju
日本語
マルコフ連鎖
Русский
цепь Ма́ркова
Examples
“The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.”
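The example sentence refers to Markov chain Monte Carlo. As a simplified sketch of the underlying idea (plain single-chain Metropolis sampling, not the Metropolis-coupled, multi-chain variant the sentence names), the successive samples below form a Markov chain whose long-run distribution approximates the target density; the target and parameters are assumptions for illustration.

```python
import math
import random

def metropolis(log_density, start, n_samples, step_size=1.0):
    """Plain Metropolis sampler: propose a symmetric Gaussian move and
    accept it with probability min(1, density ratio). Each sample depends
    only on the previous one, so the samples form a Markov chain."""
    x = start
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step_size)
        if math.log(random.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: a standard normal, via its log density up to a constant.
samples = metropolis(lambda x: -0.5 * x * x, start=0.0, n_samples=5000)
mean = sum(samples) / len(samples)
```

The Metropolis-coupled version in the quotation runs several such chains at different "temperatures" and swaps states between them to improve mixing.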
CEFR level
B2
Upper Intermediate
This word is part of the CEFR B2 vocabulary — upper intermediate level.