Mutual information is a measure of the inherent dependence expressed in the [[joint distribution]] of <math>X</math> and <math>Y</math> relative to the joint distribution of <math>X</math> and <math>Y</math> under the assumption of independence. Mutual information therefore measures dependence in the following sense: <math>\operatorname{I}(X;Y)=0</math> [[if and only if]] <math>X</math> and <math>Y</math> are independent random variables. This is easy to see in one direction: if <math>X</math> and <math>Y</math> are independent, then <math>p_{(X,Y)}(x,y)=p_X(x) \cdot p_Y(y)</math>, and therefore every term in the defining sum vanishes, as the short derivation below shows.
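A one-line check of this direction, assuming the standard discrete definition <math>\operatorname{I}(X;Y)=\sum_{y}\sum_{x} p_{(X,Y)}(x,y)\,\log\frac{p_{(X,Y)}(x,y)}{p_X(x)\,p_Y(y)}</math> (the definition itself is not part of this excerpt): substituting the factorization into the logarithm of each term gives

:<math>\log\frac{p_{(X,Y)}(x,y)}{p_X(x)\,p_Y(y)} = \log\frac{p_X(x)\,p_Y(y)}{p_X(x)\,p_Y(y)} = \log 1 = 0,</math>

so every summand is zero and <math>\operatorname{I}(X;Y)=0</math>.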
 