English Dictionary

MARKOV CHAIN


 Dictionary entry overview: What does Markov chain mean? 

MARKOV CHAIN (noun)
  The noun MARKOV CHAIN has 1 sense:

1. a Markov process in which the time parameter takes discrete values

  Familiarity information: MARKOV CHAIN used as a noun is very rare.


 Dictionary entry details 


MARKOV CHAIN (noun)


Sense 1

Meaning:

A Markov process in which the time parameter takes discrete values

Classified under:

Nouns denoting natural processes

Synonyms:

Markoff chain; Markov chain

Hypernyms ("Markov chain" is a kind of...):

Markoff process; Markov process (a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state)
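Illustration:

The definitions above can be made concrete with a short simulation. Below is a minimal Python sketch of a discrete-time Markov chain, using a hypothetical two-state "weather" example (the state names and transition probabilities are illustrative, not part of the dictionary entry). It shows both defining features: time advances in discrete steps, and the next state depends only on the current state.

    import random

    # Hypothetical two-state chain: each row gives the probability
    # distribution over the NEXT state, conditioned only on the
    # CURRENT state (the Markov property).
    transition = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        # Draw the next state using the current state alone;
        # earlier history is never consulted.
        states, probs = zip(*transition[state].items())
        return random.choices(states, weights=probs, k=1)[0]

    def simulate(start, n_steps):
        # Advance the chain through n_steps discrete time steps.
        chain = [start]
        for _ in range(n_steps):
            chain.append(step(chain[-1]))
        return chain

    print(simulate("sunny", 10))

A continuous-time Markov process would instead let transitions occur at arbitrary real-valued times; restricting the time parameter to discrete steps, as above, is what makes the process a Markov chain in the sense defined here.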


