
Markov chain

Chinese translation

【Computing】 马尔可夫链

Markov chain: English definition

noun markov chain:

  1. a Markov process for which the parameter is discrete time values
    Synonym: Markoff chain
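
To illustrate the definition above, here is a minimal Python sketch of a discrete-time Markov chain; the two-state "weather" transition matrix and the function names are hypothetical, chosen only to show that the time parameter advances in discrete steps and that the next state depends solely on the current one.

```python
import random

# Hypothetical two-state chain used only to illustrate the definition:
# time advances in discrete steps, and the next state depends only on
# the current state via a fixed transition distribution.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Draw the next state from the current state's transition distribution."""
    outcomes, weights = zip(*TRANSITIONS[state])
    return random.choices(outcomes, weights=weights, k=1)[0]

def simulate(start, n_steps):
    """Run the chain for n_steps discrete time values and return the path."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))
```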
Related entries
terminating Markov chain, irreducible Markov chain, homogeneous Markov chain, markov property, markov process, markov network, markov model, markov machine, Markov algorithm, markov, ergodic Markov process, discrete-time Markov process