where <math>x_1,...,x_n</math> are particular values of <math>X_1,...,X_n</math>, respectively, <math>P(x_1, ..., x_n)</math> is the probability of these values occurring jointly, and <math>P(x_1, ..., x_n) \log_2[P(x_1, ..., x_n)]</math> is defined to be 0 if <math>P(x_1, ..., x_n)=0</math>.
 
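As a quick illustration, this sum can be computed directly from a joint probability mass function. The following is a minimal sketch (the `joint_pmf` representation and the example distribution are illustrative assumptions, not part of the article):

```python
from math import log2

def joint_entropy(joint_pmf):
    """Joint entropy in bits: -sum over outcomes of P * log2(P).

    Terms with P = 0 are skipped, implementing the convention
    that P * log2(P) is defined to be 0 when P = 0.
    """
    return -sum(p * log2(p) for p in joint_pmf.values() if p > 0)

# Hypothetical joint distribution: two binary variables, uniform over
# the four outcomes, so H(X, Y) = log2(4) = 2 bits.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(joint_entropy(pmf))  # 2.0
```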
==Properties==

===Nonnegativity===
 
The joint entropy of a set of random variables is a nonnegative number.
 
 
:<math>\Eta(X,Y) \geq 0</math>
 
:<math>\Eta(X_1,\ldots, X_n) \geq 0</math>
 
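A minimal numerical check of nonnegativity, using a small hypothetical joint-entropy helper: every term <math>-P \log_2 P</math> is nonnegative for <math>0 \le P \le 1</math>, and a deterministic joint outcome attains the minimum of 0.

```python
from math import log2

def joint_entropy(joint_pmf):
    # -sum P*log2(P), skipping zero-probability outcomes (0*log2(0) := 0)
    return -sum(p * log2(p) for p in joint_pmf.values() if p > 0)

# Each term -P*log2(P) is >= 0 whenever 0 <= P <= 1, so the sum cannot
# be negative.  A deterministic joint outcome attains the minimum, 0.
deterministic = {(0, 0): 1.0}          # hypothetical: (X, Y) always (0, 0)
skewed = {(0, 0): 0.9, (1, 1): 0.1}    # hypothetical skewed distribution
print(joint_entropy(deterministic) == 0.0)  # True
print(joint_entropy(skewed) >= 0)           # True
```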
===Greater than individual entropies===
    
The joint entropy of a set of variables is greater than or equal to the maximum of all of the individual entropies of the variables in the set.
 
    
:<math>\Eta(X,Y) \geq \max \left[\Eta(X),\Eta(Y) \right]</math>
 
:<math>\Eta(X_1,\ldots,X_n) \geq \max_{1 \le i \le n}
     \Bigl\{ \Eta\bigl(X_i\bigr) \Bigr\}</math>
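This bound can be checked numerically by marginalising a joint pmf. The helper functions and the example distribution below are illustrative assumptions, not part of the article:

```python
from math import log2
from collections import defaultdict

def entropy(pmf):
    # Shannon entropy in bits, skipping zero-probability outcomes
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint_pmf, axis):
    # Marginal pmf of one coordinate: sum the joint pmf over the others.
    out = defaultdict(float)
    for outcome, p in joint_pmf.items():
        out[outcome[axis]] += p
    return out

# Hypothetical correlated pair: Y tends to copy X.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
h_xy = entropy(joint)
h_x = entropy(marginal(joint, 0))   # marginal of X is uniform: 1 bit
h_y = entropy(marginal(joint, 1))
print(h_xy >= max(h_x, h_y))  # True
```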

===Less than or equal to the sum of individual entropies===
    
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set.  This is an example of [[subadditivity]].  This inequality is an equality if and only if <math>X</math> and <math>Y</math> are [[statistically independent]].<ref name=cover1991 />{{rp|30}}
 
    
:<math>\Eta(X,Y) \leq \Eta(X) + \Eta(Y)</math>
 
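The subadditivity bound and its equality condition can both be demonstrated numerically. This sketch (with hypothetical helper functions and example distributions) compares an independent pair, where the joint pmf factorises and equality holds, against a perfectly correlated pair, where the inequality is strict:

```python
from math import log2, isclose
from collections import defaultdict

def entropy(pmf):
    # Shannon entropy in bits, skipping zero-probability outcomes
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint_pmf, axis):
    # Marginal pmf of one coordinate: sum the joint pmf over the others.
    out = defaultdict(float)
    for outcome, p in joint_pmf.items():
        out[outcome[axis]] += p
    return out

def joint_vs_sum(joint):
    # Returns (H(X,Y), H(X) + H(Y)) for a two-variable joint pmf.
    return (entropy(joint),
            entropy(marginal(joint, 0)) + entropy(marginal(joint, 1)))

# Independent fair bits: the joint pmf factorises, so equality holds.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
# Perfectly correlated bits (Y = X): strict inequality, 1 bit < 2 bits.
correlated = {(0, 0): 0.5, (1, 1): 0.5}

h1, s1 = joint_vs_sum(independent)
h2, s2 = joint_vs_sum(correlated)
print(isclose(h1, s1))  # True
print(h2 < s2)          # True
```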