<!-- This paragraph is incorrect; the last line is not the KL divergence between any two distributions, since p(x) is [in general] not a valid distribution over the domains of X and Y. The last formula above is the [[Kullback-Leibler divergence]], also known as relative entropy. Relative entropy is always positive, and vanishes if and only if <math>p(x,y) = p(x)</math>. This is when knowing <math>x</math> tells us everything about <math>y</math>.  ADDED: Could this comment be out of date since the KL divergence is not mentioned above? November 2014 -->
 
==Properties==
 
===Conditional entropy equals zero===
 
<math>\Eta(Y|X)=0</math> if and only if the value of <math>Y</math> is completely determined by the value of <math>X</math>.
 
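
As an illustrative sketch (the particular variables are chosen here for concreteness and are not fixed by the definition): suppose <math>X</math> is uniform on <math>\{1,2,3\}</math> and <math>Y=X^2</math>, so the value of <math>Y</math> is fixed once <math>X</math> is known. Then <math>p(y|x)=1</math> whenever <math>p(x,y)>0</math>, and with the convention <math>0\log 0 = 0</math>,

:<math>\Eta(Y|X) = -\sum_{x,y} p(x,y)\,\log p(y|x) = -\sum_{x} p(x)\log 1 = 0.</math>

Conversely, if some <math>x</math> with <math>p(x)>0</math> admits more than one possible value of <math>y</math>, at least one term has <math>p(y|x)<1</math>, so the sum is strictly positive.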