
markov process

Chinese translation

【Computing】 马尔可夫过程 (Markov process), 马尔可夫处理 (Markov processing)

markov process English definition

noun markov process:

  1. a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
    Synonym: Markoff process
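
The definition above is the Markov property. As a minimal sketch (the discrete-time notation X_n is an assumption added here, not part of this entry), it can be written as

    P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

that is, the conditional distribution of the next state given the entire history equals the conditional distribution given only the current state.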
Recommended for you
ergodic Markov process · discrete-time Markov process · terminating Markov chain · markov property · markov network · markov model · markov machine · Markov chain · Markov algorithm · markov · irreducible Markov chain · homogeneous Markov chain