For random variables <math>X</math>, <math>Y</math>, and <math>Z</math> with [[Support (mathematics)|support sets]] <math>\mathcal{X}</math>, <math>\mathcal{Y}</math> and <math>\mathcal{Z}</math>, we define the conditional mutual information as
 
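:<math>I(X;Y|Z) = \int_{\mathcal{Z}} D_{\mathrm{KL}}\bigl( P_{(X,Y)|Z} \,\|\, P_{X|Z} \otimes P_{Y|Z} \bigr) \, dP_Z ,</math>

i.e., as the expectation with respect to <math>Z</math> of the Kullback–Leibler divergence described in the next paragraph; this is one standard way of writing the definition.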
Thus <math>I(X;Y|Z)</math> is the expected (with respect to <math>Z</math>) [[Kullback–Leibler divergence]] from the conditional joint distribution <math>P_{(X,Y)|Z}</math> to the product of the conditional marginals <math>P_{X|Z}</math> and <math>P_{Y|Z}</math>. Compare with the definition of [[mutual information]].
 
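For finite alphabets this expectation can be evaluated directly from the joint probability mass function. The following is a minimal Python sketch, assuming the joint pmf is supplied as a 3-D NumPy array <code>p_xyz[x, y, z]</code>; the array layout and the function name are illustrative choices only.

<syntaxhighlight lang="python">
import numpy as np

def conditional_mutual_information(p_xyz):
    """I(X;Y|Z) in nats, computed as E_Z[ D_KL( P_{(X,Y)|Z} || P_{X|Z} * P_{Y|Z} ) ]."""
    p_xyz = np.asarray(p_xyz, dtype=float)
    p_z = p_xyz.sum(axis=(0, 1))                      # marginal distribution of Z
    cmi = 0.0
    for z in range(p_xyz.shape[2]):
        if p_z[z] == 0:
            continue                                  # z outside the support contributes nothing
        p_xy_z = p_xyz[:, :, z] / p_z[z]              # conditional joint P_{(X,Y)|Z=z}
        p_x_z = p_xy_z.sum(axis=1)                    # conditional marginal P_{X|Z=z}
        p_y_z = p_xy_z.sum(axis=0)                    # conditional marginal P_{Y|Z=z}
        prod = np.outer(p_x_z, p_y_z)                 # product of the conditional marginals
        mask = p_xy_z > 0
        kl = np.sum(p_xy_z[mask] * np.log(p_xy_z[mask] / prod[mask]))  # D_KL for this z
        cmi += p_z[z] * kl                            # expectation with respect to Z
    return cmi

# Example: X and Y conditionally independent given Z, so I(X;Y|Z) should be 0.
p = np.zeros((2, 2, 2))
for z, p_x in ((0, [0.5, 0.5]), (1, [0.9, 0.1])):
    p[:, :, z] = 0.5 * np.outer(p_x, [0.3, 0.7])      # P(Z=z) = 0.5 for both z
print(conditional_mutual_information(p))              # prints approximately 0.0
</syntaxhighlight>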
==In terms of pmf's for discrete distributions==
 