In an expression such as <math>I(A;B|C)</math>, <math>A</math>, <math>B</math> and <math>C</math> need not necessarily be restricted to representing individual random variables; they may also represent the joint distribution of any collection of random variables defined on the same probability space. As is common in probability theory, we may use the comma to denote such a joint distribution, e.g. <math>I(A_0,A_1;B_1,B_2,B_3|C_0,C_1)</math>. Hence the use of the semicolon (or occasionally a colon, or even a wedge <math>\wedge</math>) to separate the principal arguments of the mutual information symbol. (No such distinction is necessary in the symbol for [[joint entropy]], since the joint entropy of any number of random variables is the same as the entropy of their joint distribution.)
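For example, grouping <math>A_0</math> and <math>A_1</math> into a single argument means that the pair is treated as one random variable, so that
:<math>I(A_0,A_1;B) = H(A_0,A_1) - H(A_0,A_1|B),</math>
where <math>H(A_0,A_1)</math> is the joint entropy of <math>A_0</math> and <math>A_1</math>.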
== Properties ==
=== Nonnegativity ===
It is always true that
:<math>I(X;Y|Z) \ge 0</math>,
for discrete, jointly distributed random variables <math>X</math>, <math>Y</math> and <math>Z</math>. This result has been used as a basic building block for proving other [[inequalities in information theory]], in particular, those known as Shannon-type inequalities. Conditional mutual information is also non-negative for continuous random variables under certain regularity conditions.<ref>{{cite book |last1=Polyanskiy |first1=Yury |last2=Wu |first2=Yihong |title=Lecture notes on information theory |date=2017 |page=30 |url=http://people.lids.mit.edu/yp/homepage/data/itlectures_v5.pdf}}</ref>
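For discrete variables the inequality can be checked numerically. The following is a minimal Python sketch (the function name and the array layout are our own illustrative choices, not from any standard library) that computes <math>I(X;Y|Z)</math> in bits directly from a joint pmf:

<syntaxhighlight lang="python">
import numpy as np

def conditional_mutual_information(p_xyz):
    """I(X;Y|Z) in bits, for a joint pmf given as a 3-D array p[x, y, z]."""
    p_xyz = np.asarray(p_xyz, dtype=float)
    p_z = p_xyz.sum(axis=(0, 1))    # p(z)
    p_xz = p_xyz.sum(axis=1)        # p(x, z)
    p_yz = p_xyz.sum(axis=0)        # p(y, z)
    cmi = 0.0
    for x, y, z in np.ndindex(*p_xyz.shape):
        if p_xyz[x, y, z] > 0:
            # I(X;Y|Z) = sum p(x,y,z) * log2[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
            cmi += p_xyz[x, y, z] * np.log2(
                p_z[z] * p_xyz[x, y, z] / (p_xz[x, z] * p_yz[y, z]))
    return cmi

# Any valid joint pmf gives a non-negative value (up to floating-point error).
rng = np.random.default_rng(0)
p = rng.random((2, 3, 4))
p /= p.sum()
assert conditional_mutual_information(p) >= -1e-12
</syntaxhighlight>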
=== Interaction information ===
Conditioning on a third random variable may either increase or decrease the mutual information: that is, the difference <math>I(X;Y) - I(X;Y|Z)</math>, called the [[interaction information]], may be positive, negative, or zero. This is the case even when random variables are pairwise independent. Such is the case when: <math display="block">X \sim \mathrm{Bernoulli}(0.5), Z \sim \mathrm{Bernoulli}(0.5), \quad Y=\left\{\begin{array}{ll} X & \text{if }Z=0\\ 1-X & \text{if }Z=1 \end{array}\right.</math>in which case <math>X</math>, <math>Y</math> and <math>Z</math> are pairwise independent and in particular <math>I(X;Y)=0</math>, but <math>I(X;Y|Z)=1.</math>
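The numbers in this example can be verified directly. Here is a self-contained Python sketch (the helper name is ours) that builds the joint pmf for the construction above and evaluates both quantities in bits:

<syntaxhighlight lang="python">
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits for a 2-D joint pmf p[x, y]."""
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)
    mi = 0.0
    for x, y in np.ndindex(*p_xy.shape):
        if p_xy[x, y] > 0:
            mi += p_xy[x, y] * np.log2(p_xy[x, y] / (p_x[x] * p_y[y]))
    return mi

# Joint pmf p[x, y, z]: X and Z are independent fair coins and Y = X XOR Z,
# which is exactly Y = X if Z = 0, and Y = 1 - X if Z = 1.
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for z in (0, 1):
        p[x, x ^ z, z] = 0.25

print(mutual_information(p.sum(axis=2)))   # I(X;Y)   = 0.0 bits

# I(X;Y|Z) = sum_z p(z) * I(X;Y | Z=z): each slice contributes a full bit.
cmi = sum(p[:, :, z].sum() * mutual_information(p[:, :, z] / p[:, :, z].sum())
          for z in (0, 1))
print(cmi)                                 # I(X;Y|Z) = 1.0 bit
</syntaxhighlight>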
=== Chain rule for mutual information ===
:<math>I(X;Y,Z) = I(X;Z) + I(X;Y|Z)</math>
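This identity can be derived by writing each mutual information in terms of entropies, using <math>I(X;W)=H(X)-H(X|W)</math>, and letting the conditional-entropy terms telescope:
:<math>I(X;Y,Z) = H(X) - H(X|Y,Z) = \big(H(X) - H(X|Z)\big) + \big(H(X|Z) - H(X|Y,Z)\big) = I(X;Z) + I(X;Y|Z).</math>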