A python package is available for computing all multivariate joint entropies, mutual informations, conditional mutual informations, total correlations, and information distances in a dataset of n variables.

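Such quantities can be estimated directly from data with plug-in (empirical frequency) estimates. The following sketch is illustrative only; the function and variable names are hypothetical and do not reflect the package's actual API:

```python
# Plug-in estimates of discrete joint entropy and mutual information.
# Illustrative sketch; names are hypothetical, not the package's API.
from collections import Counter
from math import log2

def joint_entropy(samples):
    """Plug-in estimate of H(X1,...,Xn) in bits from joint samples (tuples)."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

data = [(0, 0), (0, 1), (1, 0), (1, 1)]      # two fair, independent bits
h_xy = joint_entropy(data)                    # H(X,Y) = 2 bits
h_x = joint_entropy([(x,) for x, _ in data])  # H(X) = 1 bit
h_y = joint_entropy([(y,) for _, y in data])  # H(Y) = 1 bit
mutual_info = h_x + h_y - h_xy                # I(X;Y) = 0 bits (independent)
```

The same counting scheme extends to the conditional and multivariate measures listed above by combining entropies of the appropriate marginals.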
==Joint differential entropy==
===Definition===

The above definition is for discrete random variables and is equally valid for continuous random variables. The continuous analogue of discrete joint entropy is called ''joint differential (or continuous) entropy''. Let <math>X</math> and <math>Y</math> be continuous random variables with a [[joint probability density function]] <math>f(x,y)</math>. The differential joint entropy <math>h(X,Y)</math> is defined as<ref name=cover1991 />{{rp|249}}

{{Equation box 1
|indent =
|title=
|equation = <math>h(X,Y) = -\int_{\mathcal X , \mathcal Y} f(x,y)\log f(x,y)\,dx dy</math>
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

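As a concrete check of this definition (an illustrative sketch, not part of the article): for a bivariate Gaussian the integral has the closed form <math>\tfrac{1}{2}\ln\left((2\pi e)^2 |\Sigma|\right)</math>, and a Monte Carlo average of <math>-\ln f(x,y)</math> over samples drawn from <math>f</math> reproduces it:

```python
# Monte Carlo estimate of differential joint entropy for a bivariate
# Gaussian, compared against the closed form (1/2) ln((2*pi*e)^2 |Sigma|).
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.6], [0.6, 2.0]])
samples = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)

# log f(x,y) for a zero-mean Gaussian, evaluated at each sample
cov_inv = np.linalg.inv(cov)
quad = np.einsum('ni,ij,nj->n', samples, cov_inv, samples)
log_f = -0.5 * (2 * np.log(2 * np.pi) + np.log(np.linalg.det(cov)) + quad)

h_mc = -log_f.mean()  # Monte Carlo estimate of h(X,Y), in nats
h_exact = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))
```

With 200,000 samples the Monte Carlo estimate agrees with the closed form to a few decimal places.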
For more than two continuous random variables <math>X_1, ..., X_n</math> the definition is generalized to:

{{Equation box 1
|indent =
|title=
|equation = <math>h(X_1, \ldots,X_n) = -\int f(x_1, \ldots,x_n)\log f(x_1, \ldots,x_n)\,dx_1 \ldots dx_n</math>
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

The [[integral]] is taken over the support of <math>f</math>. It is possible that the integral does not exist, in which case we say that the differential entropy is not defined.

===Properties===
As in the discrete case, the joint differential entropy of a set of random variables is smaller than or equal to the sum of the entropies of the individual random variables:
:<math>h(X_1,X_2, \ldots,X_n) \le \sum_{i=1}^n h(X_i)</math><ref name=cover1991 />{{rp|253}}

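A numeric illustration of this inequality (a sketch added here, not from the article, using the standard closed-form Gaussian entropies): for a Gaussian vector with covariance <math>\Sigma</math>, the inequality reduces to Hadamard's inequality <math>|\Sigma| \le \prod_i \Sigma_{ii}</math>:

```python
# For a Gaussian vector, h = (1/2) ln((2*pi*e)^n |Sigma|) and each marginal
# has h(X_i) = (1/2) ln(2*pi*e*Sigma_ii), so subadditivity of joint
# differential entropy reduces to Hadamard's inequality on Sigma.
import numpy as np

cov = np.array([[2.0, 0.8, 0.3],
                [0.8, 1.0, 0.5],
                [0.3, 0.5, 1.5]])  # a positive-definite covariance
n = cov.shape[0]

h_joint = 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(cov))
h_marginals = sum(0.5 * np.log(2 * np.pi * np.e * cov[i, i]) for i in range(n))
print(h_joint <= h_marginals)  # True; equality only for diagonal covariance
```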
The following chain rule holds for two random variables:
:<math>h(X,Y) = h(X|Y) + h(Y)</math>

In the case of more than two random variables this generalizes to:<ref name=cover1991 />{{rp|253}}
:<math>h(X_1,X_2, \ldots,X_n) = \sum_{i=1}^n h(X_i|X_1,X_2, \ldots,X_{i-1})</math>

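The two-variable chain rule can be verified in closed form for jointly Gaussian variables (an illustrative sketch; the conditional variance formula <math>\sigma_{X|Y}^2 = \sigma_X^2(1-\rho^2)</math> is a standard Gaussian fact, not something stated in this article):

```python
# Chain rule check for jointly Gaussian X, Y with correlation rho:
# h(X|Y) = (1/2) ln(2*pi*e * var(X|Y)), var(X|Y) = sx^2 * (1 - rho^2),
# and h(X,Y) should equal h(X|Y) + h(Y).
import numpy as np

sx, sy, rho = 1.5, 0.7, 0.4
cov = np.array([[sx**2, rho * sx * sy],
                [rho * sx * sy, sy**2]])

h_joint = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))
h_y = 0.5 * np.log(2 * np.pi * np.e * sy**2)
h_x_given_y = 0.5 * np.log(2 * np.pi * np.e * sx**2 * (1 - rho**2))
print(np.isclose(h_joint, h_x_given_y + h_y))  # True
```

The identity holds exactly here because <math>|\Sigma| = \sigma_X^2 \sigma_Y^2 (1-\rho^2)</math>.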
Joint differential entropy is also used in the definition of the [[mutual information]] between continuous random variables:
:<math>\operatorname{I}(X,Y)=h(X)+h(Y)-h(X,Y)</math>

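For a bivariate Gaussian with correlation <math>\rho</math>, this identity yields the known closed form <math>\operatorname{I}(X,Y) = -\tfrac{1}{2}\ln(1-\rho^2)</math>. A quick numeric check (an illustrative sketch, not from the article):

```python
# I(X,Y) = h(X) + h(Y) - h(X,Y) for a bivariate Gaussian equals
# -(1/2) ln(1 - rho^2), independent of the marginal variances.
import numpy as np

sx, sy, rho = 2.0, 0.5, 0.6
cov = np.array([[sx**2, rho * sx * sy],
                [rho * sx * sy, sy**2]])

h_x = 0.5 * np.log(2 * np.pi * np.e * sx**2)
h_y = 0.5 * np.log(2 * np.pi * np.e * sy**2)
h_xy = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))
mi = h_x + h_y - h_xy
print(np.isclose(mi, -0.5 * np.log(1 - rho**2)))  # True
```

Note that the marginal variances cancel: only the correlation <math>\rho</math> contributes to the mutual information.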