Intuitively, mutual information measures the information that <math>X</math> and <math>Y</math> share: it measures how much knowing one of these variables reduces uncertainty about the other. For example, if <math>X</math> and <math>Y</math> are independent, then knowing <math>X</math> gives no information about <math>Y</math> and vice versa, so their mutual information is zero. At the other extreme, if <math>X</math> is a deterministic function of <math>Y</math> and <math>Y</math> is a deterministic function of <math>X</math>, then all information conveyed by <math>X</math> is shared with <math>Y</math>: knowing <math>X</math> determines the value of <math>Y</math> and vice versa. As a result, in this case the mutual information is the same as the uncertainty contained in <math>Y</math> (or <math>X</math>) alone, namely the entropy of <math>Y</math> (or <math>X</math>); indeed, it then equals both the entropy of <math>X</math> and the entropy of <math>Y</math>. (A very special case of this is when <math>X</math> and <math>Y</math> are the same random variable.)
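
For instance, if <math>X</math> and <math>Y</math> both record the outcome of the same fair coin flip, each variable determines the other, so the mutual information equals the one bit of entropy of the flip:

:<math>\operatorname{I}(X;Y) = \operatorname{H}(X) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1\ \text{bit}.</math>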

Mutual information is a measure of the inherent dependence expressed in the joint distribution of <math>X</math> and <math>Y</math> relative to the joint distribution of <math>X</math> and <math>Y</math> under the assumption of independence, i.e. the product <math>p_X \cdot p_Y</math> of the marginal distributions. Mutual information therefore measures dependence in the following sense: <math>\operatorname{I}(X;Y)=0</math> if and only if <math>X</math> and <math>Y</math> are independent random variables. This is easy to see in one direction: if <math>X</math> and <math>Y</math> are independent, then <math>p_{(X,Y)}(x,y)=p_X(x) \cdot p_Y(y)</math>, and therefore:

:<math> \log{ \left( \frac{p_{(X,Y)}(x,y)}{p_X(x)\,p_Y(y)} \right) } = \log 1 = 0 .</math>
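
Substituting this into the defining double sum, every term vanishes, so the mutual information of independent variables is indeed zero:

:<math>\operatorname{I}(X;Y) = \sum_{x,y} p_{(X,Y)}(x,y) \log{ \left( \frac{p_{(X,Y)}(x,y)}{p_X(x)\,p_Y(y)} \right) } = \sum_{x,y} p_{(X,Y)}(x,y) \cdot 0 = 0 .</math>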

Moreover, mutual information is nonnegative (i.e. <math>\operatorname{I}(X;Y) \ge 0</math>; see below) and symmetric (i.e. <math>\operatorname{I}(X;Y) = \operatorname{I}(Y;X)</math>; see below).
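
Symmetry is immediate, since the defining sum is unchanged when the roles of <math>x</math> and <math>y</math> are exchanged. Nonnegativity follows, as a one-line sketch of the argument referenced below, from Jensen's inequality applied to the concave logarithm:

:<math>-\operatorname{I}(X;Y) = \sum_{x,y} p_{(X,Y)}(x,y) \log{ \left( \frac{p_X(x)\,p_Y(y)}{p_{(X,Y)}(x,y)} \right) } \le \log{ \left( \sum_{x,y} p_X(x)\,p_Y(y) \right) } \le \log 1 = 0 .</math>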
 
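The defining sum is also easy to evaluate numerically. The following is a minimal sketch, assuming NumPy; the joint table <code>p_xy</code> holds arbitrary illustrative numbers (not from the article). It checks nonnegativity, symmetry, and the independent case directly:

<syntaxhighlight lang="python">
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits for a discrete joint distribution given as a 2-D table."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal distribution of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal distribution of Y
    mask = p_xy > 0                         # convention: 0 * log 0 = 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])).sum())

# Hypothetical joint table; any nonnegative table summing to 1 works.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

print(mutual_information(p_xy))     # nonnegative, here about 0.278 bits
print(mutual_information(p_xy.T))   # symmetric: same value with X and Y swapped

# A product of marginals (independent variables) gives zero mutual information.
p_indep = np.outer([0.5, 0.5], [0.3, 0.7])
print(mutual_information(p_indep))  # 0.0 up to floating-point rounding
</syntaxhighlight>
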
== Relation to other quantities ==