Word
Markov chain
Definition
Markov chain
['mɑ:kɔ:f]
[Statistics] Markov chain (a sequence of random events in which each event depends only on the event immediately preceding it) [also Markoff chain]
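The defining property above — the next event depends only on the current one — can be sketched in a few lines of Python. The states and transition probabilities below (a toy two-state weather model) are hypothetical, chosen purely for illustration.

```python
import random

# Hypothetical transition table: for each state, a list of
# (next_state, probability) pairs. Probabilities in each row sum to 1.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Draw the next state using only the current state's distribution."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def walk(state, n):
    """Simulate n steps of the chain starting from `state`."""
    path = [state]
    for _ in range(n):
        state = step(state)
        path.append(state)
    return path

print(walk("sunny", 5))
```

Note that `step` never inspects the history of earlier states — that restriction is exactly the Markov property the definition describes.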
This English–Chinese bilingual dictionary contains 483,723 entries, covering the translations and usage of essentially all common words — a useful tool for English learners.
Copyright © 2004-2022 Newdu.com All Rights Reserved
Last updated: 2025/10/15 18:16:27