Word
Markov chain
Definition
Markov chain
['mɑ:kɔ:f]
[Statistics] Markov chain (a sequence of random events in which each event is determined solely by the event immediately preceding it) [also Markoff chain]
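The defining property above — each event depends only on the event immediately before it — can be sketched with a small simulation. The two states and transition probabilities below are hypothetical illustrations, not part of the dictionary entry:

```python
import random

# Hypothetical two-state chain: from each current state, the next
# state is drawn using only that state's transition probabilities.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state: str) -> str:
    """Pick the next state from the current state's transition row."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start: str, n: int) -> list[str]:
    """Generate a chain of n states, starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `step` never looks at any earlier history — only the current state — which is exactly the "determined by the immediately preceding event" property the definition describes.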
This English–Chinese bilingual dictionary contains 483,723 entries, covering the translations and usage of essentially all common words — a useful tool for English learners.
Copyright © 2004-2022 Newdu.com All Rights Reserved
Last updated: 2025/4/27 12:22:39