Let <math>(X,Y)</math> be a pair of random variables with values over the space <math>\mathcal{X}\times\mathcal{Y}</math>. If their joint distribution is <math>P_{(X,Y)}</math> and the marginal distributions are <math>P_X</math> and <math>P_Y</math>, the mutual information between them is defined as:
{{Equation box 1
|indent =
|title=
|equation = <math>
I(X;Y) = D_{\mathrm{KL}}( P_{(X,Y)} \| P_{X} \otimes P_{Y} )
</math>
|cellpadding= 1
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

where <math>D_{\mathrm{KL}}</math> is the [[Kullback–Leibler divergence]].

Notice, by a property of the [[Kullback–Leibler divergence]], that <math>I(X;Y)</math> is equal to zero precisely when the joint distribution coincides with the product of the marginals, i.e. when <math>X</math> and <math>Y</math> are independent (so that observing <math>Y</math> tells you nothing about <math>X</math>). In general, <math>I(X;Y)</math> is non-negative; it is a measure of the price of encoding <math>(X,Y)</math> as a pair of independent random variables when in reality they are not.
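As a minimal numerical sketch of the definition above (not part of the article; the joint distribution values are hypothetical), the mutual information of a discrete pair <math>(X,Y)</math> can be computed directly as the KL divergence between the joint distribution and the product of its marginals, using NumPy:

```python
import numpy as np

# Hypothetical joint distribution of (X, Y) over a 2x2 space
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)       # marginal distribution of X
p_y = p_xy.sum(axis=0)       # marginal distribution of Y
p_prod = np.outer(p_x, p_y)  # product of marginals, P_X (x) P_Y

# I(X;Y) = D_KL( P_(X,Y) || P_X (x) P_Y ), in nats
mi = np.sum(p_xy * np.log(p_xy / p_prod))
print(mi)  # positive, since X and Y are dependent here

# For an independent pair, the joint equals the product of marginals,
# so every log term vanishes and I(X;Y) = 0.
p_ind = np.outer([0.5, 0.5], [0.7, 0.3])
mi_ind = np.sum(p_ind * np.log(p_ind / np.outer(p_ind.sum(axis=1),
                                                p_ind.sum(axis=0))))
print(mi_ind)  # ~ 0, up to floating-point error
```

A fully supported joint table is assumed here; entries with <math>P_{(X,Y)}(x,y)=0</math> would contribute nothing to the sum and should be masked out before taking the logarithm.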