[[File:Entropy-mutual-information-relative-entropy-relation-diagram.svg|thumb|256px|right|[[Venn diagram]] showing additive and subtractive relationships among various information measures associated with correlated variables <math>X</math> and <math>Y</math>. The area contained by both circles is the [[joint entropy]] <math>H(X,Y)</math>. The circle on the left (red and violet) is the [[Entropy (information theory)|individual entropy]] <math>H(X)</math>, with the red being the [[conditional entropy]] <math>H(X|Y)</math>. The circle on the right (blue and violet) is <math>H(Y)</math>, with the blue being <math>H(Y|X)</math>. The violet is the [[mutual information]] <math>\operatorname{I}(X;Y)</math>.]]
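For reference, the additive and subtractive relationships shown in the diagram can be written as the standard identities

:<math>\operatorname{I}(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X),</math>

i.e. the mutual information is what remains of either variable's individual entropy after the corresponding conditional entropy is subtracted.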
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons, commonly called bits) obtained about one random variable through observing the other random variable. The concept of mutual information is intricately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
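As a concrete illustration of this definition, the following is a minimal Python sketch (not part of the original article) that computes <math>\operatorname{I}(X;Y)</math> in bits directly from a small joint probability table, using <math>\operatorname{I}(X;Y)=\sum_{x,y} p(x,y)\log_2\tfrac{p(x,y)}{p(x)\,p(y)}</math>. The table values here are hypothetical, chosen only to make the two variables visibly dependent.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical joint distribution p(x, y) of two binary random variables.
# Rows index x, columns index y; the entries must sum to 1.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal distribution p(x)
p_y = p_xy.sum(axis=0)  # marginal distribution p(y)

# I(X;Y) = sum over x, y of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ),
# measured in shannons (bits); zero-probability cells contribute nothing.
mi = sum(
    p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
    for i in range(p_xy.shape[0])
    for j in range(p_xy.shape[1])
    if p_xy[i, j] > 0
)

print(f"I(X;Y) = {mi:.4f} bits")  # about 0.278 bits for this table
</syntaxhighlight>

If the two variables were independent, every cell would satisfy p(x,y) = p(x)p(y) and the sum would be exactly zero; the more the joint table deviates from the product of its marginals, the larger the mutual information.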
     