Word: Markov process
Definition

Definition of Markov process in English:

Markov process

noun
Mathematics
  • Any stochastic process for which the probabilities, at any one time, of the different future states depend only on the existing state and not on how that state was arrived at.
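
In conventional mathematical notation (symbols introduced here only for illustration, not taken from the entry), this property can be stated for a discrete-time process $X_0, X_1, X_2, \ldots$ as

$$P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n),$$

that is, the conditional distribution of the next state depends only on the present state, not on the path by which it was reached.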

Origin

1930s; earliest use found in Transactions of the American Mathematical Society. After German Markoffsche Prozess.
