Changes

Removed 41 bytes, 7 February 2021 (Sun) 19:36

Line 322:
The Kullback–Leibler divergence formulation of the mutual information is predicated on the assumption that one is interested in comparing <math>p(x,y)</math> to the fully factorized [[outer product]] <math>p(x) \cdot p(y)</math>. In many problems, such as [[non-negative matrix factorization]], one is interested in less extreme factorizations; specifically, one wishes to compare <math>p(x,y)</math> to a low-rank matrix approximation in some unknown variable <math>w</math>; that is, to what degree one might have
    
:<math>p(x,y)\approx \sum_w p^\prime (x,w) p^{\prime\prime}(w,y)</math>
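The contrast above can be illustrated numerically: the KL divergence from <math>p(x,y)</math> to the outer product of its marginals is exactly the mutual information, while fitting the less extreme rank-<math>k</math> factorization <math>\sum_w p'(x,w)\,p''(w,y)</math> can only reduce that divergence, since the outer product is itself a rank-1 member of the family. The sketch below (the joint distribution, the rank, and the use of multiplicative KL-NMF updates are all illustrative assumptions, not part of the article) fits a rank-2 approximation and compares the two divergences:

```python
import numpy as np

# Hypothetical joint distribution p(x, y) (rows: x, columns: y); any
# non-negative matrix summing to 1 would do.
p = np.array([[0.25, 0.05],
              [0.05, 0.25],
              [0.10, 0.30]])

def kl(p, q):
    """KL divergence sum p log(p/q) over the shared support of p."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Fully factorized comparison: outer product of the marginals.
# This KL divergence is exactly the mutual information (in nats).
px = p.sum(axis=1)
py = p.sum(axis=0)
mi = kl(p, np.outer(px, py))

# Less extreme factorization: p(x,y) ~ sum_w p'(x,w) p''(w,y), fitted
# here with Lee-Seung multiplicative updates for the KL objective.
rng = np.random.default_rng(0)
k = 2
a = rng.random((p.shape[0], k))   # plays the role of p'(x, w)
b = rng.random((k, p.shape[1]))   # plays the role of p''(w, y)
for _ in range(500):
    q = a @ b
    a *= ((p / q) @ b.T) / b.sum(axis=1)
    q = a @ b
    b *= (a.T @ (p / q)) / a.sum(axis=0)[:, None]
q = a @ b
q /= q.sum()  # renormalize the fitted approximation to a distribution

print(f"mutual information: {mi:.4f} nats")
print(f"KL to rank-{k} fit:  {kl(p, q):.4f} nats")
```

Since the rank-2 family contains the rank-1 outer product, the fitted divergence comes out no larger than the mutual information; the gap between the two measures how much of the dependence a rank-2 hidden variable <math>w</math> can explain.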