:<math>I(X;Y,Z) = I(X;Z) + I(X;Y|Z)</math>
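As a check, the chain rule above can be verified numerically on a small discrete distribution, writing each mutual-information term as a combination of joint entropies. A minimal Python sketch; the joint pmf is an arbitrary strictly positive example chosen for illustration, not taken from the article:

```python
import math

# Arbitrary strictly positive joint pmf p(x, y, z) over three binary
# variables (an assumed example for illustration only).
p = {(0,0,0): 0.2, (0,0,1): 0.1, (0,1,0): 0.05, (0,1,1): 0.15,
     (1,0,0): 0.1, (1,0,1): 0.1, (1,1,0): 0.2, (1,1,1): 0.1}

def H(keep):
    """Joint entropy in bits of the variables whose axes are in `keep`."""
    m = {}
    for xyz, pr in p.items():
        k = tuple(xyz[i] for i in keep)
        m[k] = m.get(k, 0.0) + pr
    return -sum(pr * math.log2(pr) for pr in m.values())

# Each term of the chain rule, expressed through entropies:
i_x_yz  = H((0,)) + H((1, 2)) - H((0, 1, 2))              # I(X;Y,Z)
i_x_z   = H((0,)) + H((2,)) - H((0, 2))                   # I(X;Z)
i_xy_gz = H((0, 2)) + H((1, 2)) - H((2,)) - H((0, 1, 2))  # I(X;Y|Z)

print(abs(i_x_yz - (i_x_z + i_xy_gz)) < 1e-12)  # True: chain rule holds
```

Since the chain rule is an identity, the equality holds for any valid joint pmf; the check only confirms the entropy bookkeeping.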
==Multivariate mutual information==

{{main|Multivariate mutual information}}
The conditional mutual information can be used to inductively define a '''multivariate mutual information''' in a set- or [[Information theory and measure theory|measure-theoretic sense]] in the context of '''[[information diagram]]s'''. In this sense we define the multivariate mutual information as follows:
:<math>I(X_1;\ldots;X_{n+1}) = I(X_1;\ldots;X_n) - I(X_1;\ldots;X_n|X_{n+1}),</math>

where
:<math>I(X_1;\ldots;X_n|X_{n+1}) = \mathbb{E}_{X_{n+1}} [D_{\mathrm{KL}}( P_{(X_1,\ldots,X_n)|X_{n+1}} \| P_{X_1|X_{n+1}} \otimes\cdots\otimes P_{X_n|X_{n+1}} )].</math>
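For two variables and one conditioning variable, the expectation-of-KL form above can be checked against the chain-rule form I(X;Y|Z) = I(X;Y,Z) − I(X;Z). A sketch in Python; the joint pmf is an arbitrary assumed example:

```python
import math

# Arbitrary strictly positive joint pmf p(x, y, z) (assumed example).
p = {(0,0,0): 0.2, (0,0,1): 0.1, (0,1,0): 0.05, (0,1,1): 0.15,
     (1,0,0): 0.1, (1,0,1): 0.1, (1,1,0): 0.2, (1,1,1): 0.1}

def marg(keep):
    """Marginal pmf over the axes listed in `keep`."""
    m = {}
    for xyz, pr in p.items():
        k = tuple(xyz[i] for i in keep)
        m[k] = m.get(k, 0.0) + pr
    return m

pz, pxz, pyz = marg((2,)), marg((0, 2)), marg((1, 2))

# E_Z[ D_KL( P_{X,Y|Z} || P_{X|Z} (x) P_{Y|Z} ) ], in bits.
kl_form = 0.0
for (x, y, z), pr in p.items():
    p_xy_given_z = pr / pz[(z,)]
    p_x_given_z = pxz[(x, z)] / pz[(z,)]
    p_y_given_z = pyz[(y, z)] / pz[(z,)]
    kl_form += pz[(z,)] * p_xy_given_z * math.log2(
        p_xy_given_z / (p_x_given_z * p_y_given_z))

def mi(a, b):
    """I(A;B) in bits between the axis groups a and b."""
    pa, pb, pab = marg(a), marg(b), marg(a + b)
    return sum(pr * math.log2(pr / (pa[k[:len(a)]] * pb[k[len(a):]]))
               for k, pr in pab.items())

# Chain-rule form of the same quantity.
chain_form = mi((0,), (1, 2)) - mi((0,), (2,))
print(abs(kl_form - chain_form) < 1e-12)  # True
```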
This definition is identical to that of [[interaction information]] except for a change in sign in the case of an odd number of random variables.  A complication is that this multivariate mutual information (as well as the interaction information) can be positive, negative, or zero, which makes this quantity difficult to interpret intuitively.  In fact, for <math>n</math> random variables, there are <math>2^n-1</math> degrees of freedom for how they might be correlated in an information-theoretic sense, corresponding to each non-empty subset of these variables. These degrees of freedom are bounded by various Shannon- and non-Shannon-type [[inequalities in information theory]].
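A standard illustration of a negative value (an assumed example, not part of the article's text) takes X and Y to be independent fair bits and Z = X XOR Y. Then I(X;Y;Z) = I(X;Y) − I(X;Y|Z) = 0 − 1 = −1 bit, which the recursive definition reproduces numerically:

```python
import math

# Joint pmf for X, Y independent fair bits and Z = X XOR Y (assumed example).
p = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def marg(keep):
    """Marginal pmf over the axes listed in `keep`."""
    m = {}
    for xyz, pr in p.items():
        k = tuple(xyz[i] for i in keep)
        m[k] = m.get(k, 0.0) + pr
    return m

def mi(a, b):
    """I(A;B) in bits between the axis groups a and b."""
    pa, pb, pab = marg(a), marg(b), marg(a + b)
    return sum(pr * math.log2(pr / (pa[k[:len(a)]] * pb[k[len(a):]]))
               for k, pr in pab.items())

def cond_mi(a, b, c):
    """I(A;B|C) via the chain rule I(A;B|C) = I(A;B,C) - I(A;C)."""
    return mi(a, b + c) - mi(a, c)

# Recursive definition: I(X;Y;Z) = I(X;Y) - I(X;Y|Z).
i_xyz = mi((0,), (1,)) - cond_mi((0,), (1,), (2,))
print(i_xyz)  # -1.0 bits: pairwise independent, yet jointly dependent
```

Here X and Y are marginally independent (I(X;Y) = 0), but conditioning on Z makes them fully dependent (I(X;Y|Z) = 1 bit), so the multivariate mutual information comes out negative.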
==References==