{{Information theory}}
 
[[Image:Entropy-mutual-information-relative-entropy-relation-diagram.svg|thumb|256px|right|[[Venn diagram]] showing additive and subtractive relationships among various [[Quantities of information|information measures]] associated with correlated variables <math>X</math> and <math>Y</math>. The area contained by both circles is the [[joint entropy]] <math>\Eta(X,Y)</math>. The circle on the left (red and violet) is the [[Entropy (information theory)|individual entropy]] <math>\Eta(X)</math>, with the red being the [[conditional entropy]] <math>\Eta(X|Y)</math>. The circle on the right (blue and violet) is <math>\Eta(Y)</math>, with the blue being <math>\Eta(Y|X)</math>. The violet is the [[mutual information]] <math>\operatorname{I}(X;Y)</math>.]]
    
In [[information theory]], the '''conditional entropy''' quantifies the amount of information needed to describe the outcome of a [[random variable]] <math>Y</math> given that the value of another random variable <math>X</math> is known. Here, information is measured in [[Shannon (unit)|shannon]]s, [[Nat (unit)|nat]]s, or [[Hartley (unit)|hartley]]s. The ''entropy of <math>Y</math> conditioned on <math>X</math>'' is written as <math>\Eta(Y|X)</math>.
 
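For concreteness, one common form of the definition (assuming jointly discrete <math>X</math> and <math>Y</math> with joint probability mass function <math>p(x,y)</math> and supports <math>\mathcal{X}</math>, <math>\mathcal{Y}</math>; this notation is used here only for illustration) is

:<math>\Eta(Y|X) = -\sum_{x\in\mathcal{X},\, y\in\mathcal{Y}} p(x,y)\log p(y\mid x),</math>

which satisfies <math>\Eta(Y|X) = \Eta(X,Y) - \Eta(X)</math>, the subtractive relationship depicted in the Venn diagram above.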