|background colour=#F5FFFA}}
=== Properties ===
In contrast to the conditional entropy for discrete random variables, the conditional differential entropy may be negative.
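
For instance (a standard illustration, not taken from the cited source), if <math>X</math> is uniformly distributed on <math>[0,\tfrac{1}{2}]</math> and <math>Y</math> is independent of <math>X</math>, then
:<math>h(X|Y)=h(X)=\log\tfrac{1}{2}<0</math>
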
| + | |
| + | 与离散随机变量的条件熵相比,条件微分熵可能为负。 |
| + | |
| + | |
| | | |
As in the discrete case there is a chain rule for differential entropy:
| + | |
| + | 与离散情况一样,微分熵也有链式法则: |
| + | |
| + | |
:<math>h(Y|X)\,=\,h(X,Y)-h(X)</math><ref name=cover1991 />{{rp|253}}
| + | |
| + | |
Note, however, that this rule may fail if the differential entropies involved do not exist or are infinite.
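
As a quick sketch of the chain rule in a case where all terms are finite (a standard Gaussian computation, not taken from the cited source; here <math>\sigma_X^2</math>, <math>\sigma_Y^2</math> and <math>\rho</math> denote the variances and the correlation coefficient), let <math>(X,Y)</math> be bivariate normal. Then
:<math>h(X,Y)=\tfrac{1}{2}\log\!\left((2\pi e)^2\sigma_X^2\sigma_Y^2(1-\rho^2)\right),\qquad h(X)=\tfrac{1}{2}\log\!\left(2\pi e\,\sigma_X^2\right),</math>
so the chain rule yields
:<math>h(Y|X)=h(X,Y)-h(X)=\tfrac{1}{2}\log\!\left(2\pi e\,\sigma_Y^2(1-\rho^2)\right),</math>
which is precisely the differential entropy of the conditional law of <math>Y</math> given <math>X</math>, a normal distribution with variance <math>\sigma_Y^2(1-\rho^2)</math>.
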
| + | |
| + | 但是请注意,如果所涉及的微分熵不存在或无限,则此规则可能不成立。 |
| + | |
| + | |
| | | |
Joint differential entropy is also used in the definition of the [[mutual information]] between continuous random variables:
| + | |
| + | 联合微分熵也用于定义连续随机变量之间的共享信息: |
| + | |
| + | |
:<math>\operatorname{I}(X,Y)=h(X)-h(X|Y)=h(Y)-h(Y|X)</math>
| + | |
| | | |
<math>h(X|Y) \le h(X)</math> with equality if and only if <math>X</math> and <math>Y</math> are independent.<ref name=cover1991 />{{rp|253}}
| + | |
| + | 当且仅当X和Y是独立的时,<math>h(X|Y) \le h(X)</math>才相等。 |
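
Continuing the bivariate normal sketch from above (again a standard computation, not from the cited source), the mutual information evaluates to
:<math>\operatorname{I}(X,Y)=h(Y)-h(Y|X)=\tfrac{1}{2}\log\!\left(2\pi e\,\sigma_Y^2\right)-\tfrac{1}{2}\log\!\left(2\pi e\,\sigma_Y^2(1-\rho^2)\right)=-\tfrac{1}{2}\log\!\left(1-\rho^2\right),</math>
which is nonnegative and vanishes exactly when <math>\rho=0</math>, i.e. when <math>X</math> and <math>Y</math> are independent, in agreement with the inequality above.
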
===Relation to estimator error===