== Properties ==

=== Conditional entropy equals zero ===
<math>\Eta(Y|X)=0</math> if and only if the value of <math>Y</math> is completely determined by the value of <math>X</math>.
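
A minimal numerical sketch of this property in Python; the joint distribution and the deterministic rule <math>Y = X \bmod 2</math> are illustrative assumptions, not taken from the source:

<syntaxhighlight lang="python">
import math

# Hypothetical joint distribution in which Y is completely determined
# by X (here Y = X mod 2), so H(Y|X) should come out to exactly zero.
p_x = {0: 0.5, 1: 0.25, 2: 0.25}
joint = {(x, x % 2): px for x, px in p_x.items()}  # p(x,y) = p(x) when y = f(x)

def conditional_entropy(joint):
    """H(Y|X) = -sum_{x,y} p(x,y) * log2(p(x,y) / p(x)), in bits."""
    p_x = {}
    for (x, _), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    return -sum(p * math.log2(p / p_x[x])
                for (x, _), p in joint.items() if p > 0)

print(conditional_entropy(joint))  # -0.0: Y is a function of X
</syntaxhighlight>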
=== Conditional entropy of independent random variables ===
 
Conversely, <math>\Eta(Y|X) = \Eta(Y)</math> if and only if <math>Y</math> and <math>X</math> are [[independent random variables]].
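
A quick numerical check, sketched in Python under an assumed product distribution <math>p(x,y) = p(x)p(y)</math>; the specific marginals are arbitrary illustrations:

<syntaxhighlight lang="python">
import math
from itertools import product

def entropy(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def conditional_entropy(joint):
    """H(Y|X) = -sum_{x,y} p(x,y) * log2(p(x,y) / p(x)), in bits."""
    p_x = {}
    for (x, _), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    return -sum(p * math.log2(p / p_x[x])
                for (x, _), p in joint.items() if p > 0)

# Independent X and Y: p(x,y) = p(x) * p(y), so H(Y|X) should equal H(Y).
p_x = {0: 0.3, 1: 0.7}
p_y = {0: 0.6, 1: 0.4}
joint = {(x, y): p_x[x] * p_y[y] for x, y in product(p_x, p_y)}

print(conditional_entropy(joint))  # ~0.971 bits
print(entropy(p_y))                # ~0.971 bits, the same value
</syntaxhighlight>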
=== Chain rule ===

Assume that the combined system determined by two random variables <math>X</math> and <math>Y</math> has [[joint entropy]] <math>\Eta(X,Y)</math>, that is, we need <math>\Eta(X,Y)</math> bits of information on average to describe its exact state. Now if we first learn the value of <math>X</math>, we have gained <math>\Eta(X)</math> bits of information. Once <math>X</math> is known, we only need <math>\Eta(X,Y)-\Eta(X)</math> bits to describe the state of the whole system. This quantity is exactly <math>\Eta(Y|X)</math>, which gives the ''chain rule'' of conditional entropy:
    
:<math>\Eta(Y|X) = \Eta(X,Y) - \Eta(X).</math><ref name=cover1991 />{{rp|17}}
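This identity can be checked numerically; a sketch in Python, using an arbitrary (hypothetical) correlated joint distribution:

<syntaxhighlight lang="python">
import math

def entropy(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution p(x, y) with correlated X and Y.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
p_x = {0: 0.5, 1: 0.5}  # marginal distribution of X

h_xy = entropy(joint)  # H(X,Y)
h_x = entropy(p_x)     # H(X)
# H(Y|X) computed directly from its definition:
h_y_given_x = -sum(p * math.log2(p / p_x[x]) for (x, _), p in joint.items())

print(abs(h_y_given_x - (h_xy - h_x)) < 1e-9)  # True: H(Y|X) = H(X,Y) - H(X)
</syntaxhighlight>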
    
The chain rule follows from the above definition of conditional entropy:
:<math>\begin{align}
\Eta(Y|X) &= \sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log \left(\frac{p(x)}{p(x,y)}\right) \\
  & = \sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)(\log (p(x)) - \log (p(x,y))) \\
  & = -\sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log (p(x,y)) + \sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log (p(x)) \\
  & = \Eta(X,Y) + \sum_{x \in \mathcal X} p(x)\log (p(x)) \\
  & = \Eta(X,Y) - \Eta(X).
\end{align}</math>

In general, a chain rule for multiple random variables holds:
:<math> \Eta(X_1,X_2,\ldots,X_n) = \sum_{i=1}^n \Eta(X_i | X_1, \ldots, X_{i-1}) </math><ref name=cover1991 />{{rp|22}}
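A sketch of the general chain rule in Python, using a randomly generated (purely illustrative) joint distribution over three binary variables and computing each conditional entropy directly from its definition:

<syntaxhighlight lang="python">
import math
import itertools
import random

def entropy(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, k):
    """Marginal distribution over the first k coordinates."""
    out = {}
    for outcome, p in joint.items():
        key = outcome[:k]
        out[key] = out.get(key, 0.0) + p
    return out

def cond_entropy(joint, i):
    """H(X_{i+1} | X_1, ..., X_i) from the definition, in bits."""
    num = marginal(joint, i + 1)  # p(x_1, ..., x_{i+1})
    den = marginal(joint, i)      # p(x_1, ..., x_i)
    return -sum(p * math.log2(p / den[key[:-1]])
                for key, p in num.items() if p > 0)

# Random joint distribution over three binary variables (illustrative only).
random.seed(0)
w = [random.random() for _ in range(8)]
total = sum(w)
joint = {xs: wi / total
         for xs, wi in zip(itertools.product((0, 1), repeat=3), w)}

lhs = entropy(joint)                                  # H(X1,X2,X3)
rhs = sum(cond_entropy(joint, i) for i in range(3))   # sum of H(Xi | prefix)
print(abs(lhs - rhs) < 1e-9)  # True: the chain rule holds
</syntaxhighlight>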
    
It has a similar form to [[Chain rule (probability)|chain rule]] in probability theory, except that addition instead of multiplication is used.
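For comparison, the chain rule of probability theory reads

:<math> p(x_1,x_2,\ldots,x_n) = \prod_{i=1}^n p(x_i \mid x_1, \ldots, x_{i-1}), </math>

and taking the expectation of the negative logarithm of both sides turns this product into the sum above.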
    
=== Bayes' rule ===