The Q-function used in the EM algorithm is based on the log likelihood, so the standard procedure can be regarded as the log-EM algorithm. The use of the log likelihood can be generalized to that of the α-log likelihood ratio: the α-log likelihood ratio of the observed data can then be expressed exactly as an equality by using the Q-function of the α-log likelihood ratio together with the α-divergence. Obtaining this Q-function is a generalized E step, and its maximization is a generalized M step. This pair is called the α-EM algorithm, which contains the log-EM algorithm as a subclass. Thus, the α-EM algorithm by Yasuo Matsuyama is an exact generalization of the log-EM algorithm. No computation of the gradient or Hessian matrix is needed. By choosing an appropriate α, the α-EM algorithm converges faster than the log-EM algorithm, and it leads to a faster version of the hidden Markov model estimation algorithm, the α-HMM.
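The generalization replaces the natural logarithm with the α-logarithm. As a minimal sketch (assuming Matsuyama's parametrization of the α-logarithm; scaling conventions vary between references), writing <math>r</math> for a likelihood ratio, the α-logarithm can be taken as

<math>
L^{(\alpha)}(r) = \frac{2}{1+\alpha}\left(r^{\frac{1+\alpha}{2}} - 1\right),
\qquad
\lim_{\alpha \to -1} L^{(\alpha)}(r) = \log r ,
</math>

so the log-EM algorithm is recovered in the limit α → −1, while other choices of α can yield the faster convergence mentioned above.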
 
    
<small>This page was moved from [[wikipedia:en:Expectation–maximization algorithm]]. Its edit history can be viewed at [[EM算法/edithistory]]</small>
 
 
[[Category:待整理页面]]
 