|background colour=#F5FFFA}}
 
== Some identities ==
 
Alternatively, we may write conditional mutual information in terms of joint and conditional [[Entropy (information theory)|entropies]] as<ref>{{cite book |last1=Cover |first1=Thomas |author-link1=Thomas M. Cover |last2=Thomas |first2=Joy A. |title=Elements of Information Theory |edition=2nd |location=New York |publisher=[[Wiley-Interscience]] |date=2006 |isbn=0-471-24195-4}}</ref>
 
:<math>\begin{align}
I(X;Y|Z) &= H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) \\
         &= H(X|Z) - H(X|Y,Z) \\
         &= H(X|Z) + H(Y|Z) - H(X,Y|Z).
\end{align}</math>
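
As a quick numerical sanity check (an illustration, not part of the original text), the identity above can be verified on a small discrete distribution. The toy pmf <code>p</code> and the helpers <code>marginal</code> and <code>H</code> below are assumptions chosen for the example:

<syntaxhighlight lang="python">
import math

# Illustrative joint pmf p(x, y, z) over three binary variables
# (assumed values; any strictly positive pmf summing to 1 works).
p = {
    (0, 0, 0): 0.1250, (0, 0, 1): 0.0625, (0, 1, 0): 0.0625, (0, 1, 1): 0.2500,
    (1, 0, 0): 0.0625, (1, 0, 1): 0.1875, (1, 1, 0): 0.1250, (1, 1, 1): 0.1250,
}

def marginal(axes):
    """Marginal pmf of the variables at the given positions (0=X, 1=Y, 2=Z)."""
    out = {}
    for xyz, prob in p.items():
        key = tuple(xyz[a] for a in axes)
        out[key] = out.get(key, 0.0) + prob
    return out

def H(axes):
    """Shannon entropy in bits of the marginal over the given positions."""
    return -sum(q * math.log2(q) for q in marginal(axes).values() if q > 0)

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
lhs = H((0, 2)) + H((1, 2)) - H((0, 1, 2)) - H((2,))

# ... = H(X|Z) - H(X|Y,Z), using H(A|B) = H(A,B) - H(B)
rhs = (H((0, 2)) - H((2,))) - (H((0, 1, 2)) - H((1, 2)))

assert abs(lhs - rhs) < 1e-12  # the two expressions agree
print(lhs)
</syntaxhighlight>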
 
This can be rewritten to show its relationship to mutual information
 
:<math>I(X;Y|Z) = I(X;Y,Z) - I(X;Z)</math>
 
usually rearranged as '''the chain rule for mutual information'''
 
:<math>I(X;Y,Z) = I(X;Z) + I(X;Y|Z)</math>
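
A similar sketch checks the chain rule numerically, expressing each mutual information via entropies; the toy pmf and helpers are the same assumed ones as above:

<syntaxhighlight lang="python">
import math

# Same illustrative toy pmf as in the sketch above (assumed, not from the text).
p = {
    (0, 0, 0): 0.1250, (0, 0, 1): 0.0625, (0, 1, 0): 0.0625, (0, 1, 1): 0.2500,
    (1, 0, 0): 0.0625, (1, 0, 1): 0.1875, (1, 1, 0): 0.1250, (1, 1, 1): 0.1250,
}

def marginal(axes):
    out = {}
    for xyz, prob in p.items():
        key = tuple(xyz[a] for a in axes)
        out[key] = out.get(key, 0.0) + prob
    return out

def H(axes):
    return -sum(q * math.log2(q) for q in marginal(axes).values() if q > 0)

i_x_z        = H((0,)) + H((2,)) - H((0, 2))                   # I(X;Z)
i_x_yz       = H((0,)) + H((1, 2)) - H((0, 1, 2))              # I(X;Y,Z)
i_xy_given_z = H((0, 2)) + H((1, 2)) - H((0, 1, 2)) - H((2,))  # I(X;Y|Z)

# Chain rule: I(X;Y,Z) = I(X;Z) + I(X;Y|Z)
assert abs(i_x_yz - (i_x_z + i_xy_given_z)) < 1e-12
</syntaxhighlight>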
 
Another equivalent form of the above is<ref>[https://math.stackexchange.com/q/1863993 Decomposition on Math.StackExchange]</ref>
 
:<math>\begin{align}
I(X;Y|Z) &= H(Z|X) + H(X) + H(Z|Y) + H(Y) - H(Z|X,Y) - H(X,Y) - H(Z) \\
         &= I(X;Y) + H(Z|X) + H(Z|Y) - H(Z|X,Y) - H(Z)
\end{align}</math>
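
The rearranged form can be checked the same way, with each conditional entropy expanded as H(A|B) = H(A,B) - H(B); the pmf and helper names are again illustrative assumptions:

<syntaxhighlight lang="python">
import math

# Same illustrative toy pmf as before (an assumption for the check).
p = {
    (0, 0, 0): 0.1250, (0, 0, 1): 0.0625, (0, 1, 0): 0.0625, (0, 1, 1): 0.2500,
    (1, 0, 0): 0.0625, (1, 0, 1): 0.1875, (1, 1, 0): 0.1250, (1, 1, 1): 0.1250,
}

def marginal(axes):
    out = {}
    for xyz, prob in p.items():
        key = tuple(xyz[a] for a in axes)
        out[key] = out.get(key, 0.0) + prob
    return out

def H(axes):
    return -sum(q * math.log2(q) for q in marginal(axes).values() if q > 0)

def Hc(a, b):
    """Conditional entropy H(A|B) = H(A,B) - H(B) (positions 0=X, 1=Y, 2=Z)."""
    return H(tuple(sorted(set(a) | set(b)))) - H(b)

i_xy = H((0,)) + H((1,)) - H((0, 1))                            # I(X;Y)
rhs  = i_xy + Hc((2,), (0,)) + Hc((2,), (1,)) - Hc((2,), (0, 1)) - H((2,))
lhs  = H((0, 2)) + H((1, 2)) - H((0, 1, 2)) - H((2,))           # I(X;Y|Z)
assert abs(lhs - rhs) < 1e-12
</syntaxhighlight>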
    
Like mutual information, conditional mutual information can be expressed as a [[Kullback–Leibler divergence]]:
    
:<math> I(X;Y|Z) = D_{\mathrm{KL}}[ p(X,Y,Z) \| p(X|Z)p(Y|Z)p(Z) ]. </math>
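
To check this form directly, one can construct q(x,y,z) = p(x|z)p(y|z)p(z) = p(x,z)p(y,z)/p(z) and compute the divergence; the sketch below (same assumed toy pmf) reproduces the entropy-based value:

<syntaxhighlight lang="python">
import math

# Same illustrative toy pmf as in the earlier sketches (assumed).
p = {
    (0, 0, 0): 0.1250, (0, 0, 1): 0.0625, (0, 1, 0): 0.0625, (0, 1, 1): 0.2500,
    (1, 0, 0): 0.0625, (1, 0, 1): 0.1875, (1, 1, 0): 0.1250, (1, 1, 1): 0.1250,
}

def marginal(axes):
    out = {}
    for xyz, prob in p.items():
        key = tuple(xyz[a] for a in axes)
        out[key] = out.get(key, 0.0) + prob
    return out

pxz, pyz, pz = marginal((0, 2)), marginal((1, 2)), marginal((2,))

# q(x,y,z) = p(x|z) p(y|z) p(z) = p(x,z) p(y,z) / p(z)
q = {(x, y, z): pxz[(x, z)] * pyz[(y, z)] / pz[(z,)] for (x, y, z) in p}

# I(X;Y|Z) = D_KL(p || q), in bits
i_xy_given_z = sum(pr * math.log2(pr / q[k]) for k, pr in p.items() if pr > 0)
print(i_xy_given_z)
</syntaxhighlight>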
    
Or as an expected value of simpler Kullback–Leibler divergences:
 
:<math> I(X;Y|Z) = \sum_{z \in \mathcal{Z}} p( Z=z ) D_{\mathrm{KL}}[ p(X,Y|z) \| p(X|z)p(Y|z) ]</math>,
:<math> I(X;Y|Z) = \sum_{y \in \mathcal{Y}} p( Y=y ) D_{\mathrm{KL}}[ p(X,Z|y) \| p(X|Z)p(Z|y) ]</math>.
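
The first of these expected-value forms can likewise be checked by conditioning on each value of Z; the pmf and the <code>kl</code> helper are assumptions for the illustration:

<syntaxhighlight lang="python">
import math

# Same illustrative toy pmf as before (assumed).
p = {
    (0, 0, 0): 0.1250, (0, 0, 1): 0.0625, (0, 1, 0): 0.0625, (0, 1, 1): 0.2500,
    (1, 0, 0): 0.0625, (1, 0, 1): 0.1875, (1, 1, 0): 0.1250, (1, 1, 1): 0.1250,
}

def kl(pd, qd):
    """D_KL(pd || qd) in bits; assumes qd covers pd's support."""
    return sum(pv * math.log2(pv / qd[k]) for k, pv in pd.items() if pv > 0)

i_xy_given_z = 0.0
for z in sorted({zz for (_, _, zz) in p}):
    p_z = sum(pr for (_, _, zz), pr in p.items() if zz == z)             # p(Z=z)
    pxy = {(x, y): pr / p_z for (x, y, zz), pr in p.items() if zz == z}  # p(x,y|z)
    px, py = {}, {}
    for (x, y), pr in pxy.items():                                       # p(x|z), p(y|z)
        px[x] = px.get(x, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    prod = {(x, y): px[x] * py[y] for (x, y) in pxy}                     # p(x|z)p(y|z)
    i_xy_given_z += p_z * kl(pxy, prod)

print(i_xy_given_z)  # matches the entropy- and KL-based values above
</syntaxhighlight>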