Notice that, by a property of the [[Kullback–Leibler divergence]], <math>I(X;Y)</math> is equal to zero precisely when the joint distribution coincides with the product of the marginals, i.e. when <math>X</math> and <math>Y</math> are independent (and hence observing <math>Y</math> tells you nothing about <math>X</math>). In general, <math>I(X;Y)</math> is non-negative; it is a measure of the price of encoding <math>(X,Y)</math> as a pair of independent random variables when in reality they are not.
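
For concreteness, the property invoked above can be written as a single identity; the notation <math>P_{(X,Y)}</math> for the joint distribution and <math>P_X \otimes P_Y</math> for the product of the marginals is used here only for this sketch and may differ from the conventions adopted elsewhere in the article:

:<math>I(X;Y) \;=\; D_{\mathrm{KL}}\!\left(P_{(X,Y)} \,\big\|\, P_X \otimes P_Y\right) \;\ge\; 0,</math>

with equality if and only if <math>P_{(X,Y)} = P_X \otimes P_Y</math>, that is, exactly when <math>X</math> and <math>Y</math> are independent.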
 
== In terms of PMFs for discrete distributions ==
 