Line 321:
The Kullback-Leibler divergence formulation of the mutual information is predicated on the assumption that one is interested in comparing <math>p(x,y)</math> to the fully factorized [[outer product]] <math>p(x) \cdot p(y)</math>. In many problems, such as [[non-negative matrix factorization]], one is interested in less extreme factorizations; specifically, one wishes to compare <math>p(x,y)</math> to a low-rank matrix approximation in some unknown variable <math>w</math>; that is, to what degree one might have
:<math>p(x,y)\approx \sum_w p^\prime (x,w) p^{\prime\prime}(w,y)</math>
Line 329:
<math>p(x,y)\approx \sum_w p^\prime (x,w) p^{\prime\prime}(w,y)</math>
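A minimal numerical sketch of such a low-rank approximation, assuming a small randomly generated joint table and using the standard Lee–Seung multiplicative updates for the generalized Kullback-Leibler cost (the arrays `A`, `B` and all data below are illustrative, not taken from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# A small illustrative joint probability table p(x, y).
p = rng.random((4, 5))
p /= p.sum()

# Rank-k factorization p(x, y) ~ sum_w A[x, w] * B[w, y], fitted with the
# Lee-Seung multiplicative updates that decrease the generalized
# Kullback-Leibler divergence D(p || A @ B) at every step.
k = 2
A = rng.random((4, k))
B = rng.random((k, 5))
for _ in range(500):
    A *= ((p / (A @ B)) @ B.T) / B.sum(axis=1)
    B *= (A.T @ (p / (A @ B))) / A.sum(axis=0)[:, None]

def gen_kl(p, q):
    """Generalized KL divergence D(p || q); reduces to the ordinary
    KL divergence when both arguments sum to one."""
    return float(np.sum(p * np.log(p / q) - p + q))

# The fully factorized outer product p(x) * p(y) is the rank-1 baseline
# that the ordinary mutual information compares against.
outer = np.outer(p.sum(axis=1), p.sum(axis=0))

d_outer = gen_kl(p, outer)  # this equals I(X; Y) in nats
d_fact = gen_kl(p, A @ B)   # divergence from the fitted rank-2 table
```

Since the outer product of the marginals is itself one particular rank-1 table, a well-fitted higher-rank factorization should typically leave `d_fact` no larger than `d_outer`.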
Alternately, one might be interested in knowing how much more information <math>p(x,y)</math> carries over its factorization. In such a case, the excess information that the full distribution <math>p(x,y)</math> carries over the matrix factorization is given by the Kullback-Leibler divergence
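Written out from the standard definition of the Kullback-Leibler divergence (the explicit display is omitted in this excerpt, so this reconstruction is an assumption), the excess information is:

:<math>\operatorname{D}_{\mathrm{KL}}\left( p(x,y) \,\Big\|\, \sum_w p^\prime (x,w) p^{\prime\prime}(w,y) \right) = \sum_x \sum_y p(x,y) \log \frac{p(x,y)}{\sum_w p^\prime (x,w) p^{\prime\prime}(w,y)}</math>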
Line 366:
In the extreme case in which the process <math>w</math> has only one value, the traditional definition of the mutual information is recovered.
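This collapse can be checked numerically: with a single latent value, the factorization reduces to a rank-1 table whose best KL fit is the outer product of the marginals, and the divergence from it equals the mutual information computed via entropies. The 2×2 table below is an assumed illustration:

```python
import numpy as np

# An assumed 2x2 joint distribution p(x, y), chosen for illustration.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
px = p.sum(axis=1)  # marginal p(x)
py = p.sum(axis=0)  # marginal p(y)

# With only one value of w, sum_w p'(x, w) p''(w, y) is a rank-1 table;
# the KL-optimal rank-1 factorization of a distribution is the outer
# product of its marginals.
rank1 = np.outer(px, py)
excess = float(np.sum(p * np.log(p / rank1)))

def entropy(q):
    """Shannon entropy in nats of a (possibly joint) distribution."""
    return -float(np.sum(q * np.log(q)))

# Traditional definition: I(X; Y) = H(X) + H(Y) - H(X, Y).
mi = entropy(px) + entropy(py) - entropy(p)
```

The two quantities `excess` and `mi` agree to floating-point precision, illustrating that the matrix-factorization divergence specializes to the usual mutual information.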
== Variations ==