Word
Markov chain
Definition
Mar·kov chain
/ˈmɑːkɒf/
【Statistics】 马尔可夫链 (a system with discrete states, in which the transition from one state to another occurs with a fixed probability; also called a Markov model)
[< Andrei Markov (1856–1922), Russian mathematician]
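The definition above — discrete states with fixed transition probabilities — can be sketched in a few lines of Python. The two-state "weather" chain below is purely illustrative; the state names and probabilities are assumptions, not part of the dictionary entry.

```python
import random

# Illustrative fixed transition probabilities between discrete states.
# Each row sums to 1: given the current state, the next state is drawn
# from a fixed distribution (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng=random):
    """Move to the next state according to the fixed probabilities."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def walk(start, n):
    """Generate a chain of n transitions starting from `start`."""
    states = [start]
    for _ in range(n):
        states.append(step(states[-1]))
    return states

print(walk("sunny", 5))
```

Because the next state depends only on the current state (never on earlier history), the whole process is captured by the transition table alone.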
This English-Chinese bilingual dictionary contains 548,777 entries, covering the translations and usage of virtually all common words, making it a useful tool for English learners.
Copyright © 2004-2022 Newdu.com All Rights Reserved
Last updated: 2026/3/8 18:15:22