Line 12:
| Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions. Unfortunately, Shannon did not derive this formula, and rather just assumed it was the correct continuous analogue of discrete entropy, but it is not. |
− | <font color="#ff8000"> 微分熵Differential entropy</font>(也被称为连续熵)是信息论中的一个概念,来源于香农尝试将他的熵的概念扩展到连续的概率分布。香农熵是衡量一个随机变量的平均惊异程度的指标。可惜的是,香农只是假设它是离散熵的正确连续模拟而并没有推导出公式,但事实上它并不是离散熵的正确连续模拟。 | + | <font color="#ff8000"> 微分熵Differential entropy</font>(也被称为连续熵)是信息论中的一个概念,其来源于香农尝试将他的香农熵的概念扩展到连续的概率分布。香农熵是衡量一个随机变量的平均惊异程度的指标。可惜的是,香农只是假设它是离散熵的正确连续模拟而并没有推导出公式,但事实上它并不是离散熵的正确连续模拟。 |
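One concrete way to see the mismatch: differential entropy can be negative, and the discrete entropy of a binned continuous variable exceeds the differential entropy by a <math>-\log\delta</math> term (for bin width <math>\delta</math>) that diverges as the bins shrink. A minimal numeric sketch of both facts, assuming a uniform density on <math>[0, 1/2]</math> for concreteness:

<syntaxhighlight lang="python">
import numpy as np

# Differential entropy of U(0, w) is log(w), negative for w < 1 --
# impossible for discrete entropy. Binning the variable into cells of
# width delta gives discrete entropy H = h - log(delta), which diverges
# as delta -> 0; that missing log(delta) term is exactly the mismatch.
w = 0.5                           # support width of the uniform density
h = np.log(w)                     # differential entropy, in nats
print(f"h(U(0,{w})) = {h:.4f} nats (negative)")

for delta in (0.1, 0.01, 0.001):  # bin widths
    n = int(w / delta)            # number of equal-probability bins
    H = np.log(n)                 # discrete entropy of the binned variable
    print(f"delta={delta}: H = {H:.4f}, h - log(delta) = {h - np.log(delta):.4f}")
</syntaxhighlight>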
Line 34:
| In particular, for a constant <math>a</math> |
− | Especially for a constant
+ | In particular, for a constant
| |indent = |
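The fragment above belongs to the article's scaling property for a constant <math>a</math>, namely <math>h(aX) = h(X) + \log|a|</math> (the formula itself lies outside this hunk). A minimal numeric check of that property, assuming <math>X \sim N(0,1)</math> and truncating the negligible Gaussian tails for the quadrature:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# h(X) = -∫ f(x) log f(x) dx, computed by quadrature for X ~ N(0,1) and
# for aX ~ N(0, a^2); the two results should differ by log|a|.
a = 3.0
h_X = quad(lambda x: -norm.pdf(x) * norm.logpdf(x), -20, 20)[0]
h_aX = quad(lambda x: -norm.pdf(x, scale=a) * norm.logpdf(x, scale=a),
            -20 * a, 20 * a)[0]
print(f"h(aX) = {h_aX:.4f}")                     # ~ 2.5176
print(f"h(X) + log|a| = {h_X + np.log(a):.4f}")  # ~ 2.5176
</syntaxhighlight>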
Line 69:
| |background colour=#F5FFFA}} |
− |
Line 78: | Line 77:
| For probability distributions which don't have an explicit density function expression, but have an explicit [[quantile function]] expression, <math>Q(p)</math>, then <math>h(Q)</math> can be defined in terms of the derivative of <math>Q(p)</math> i.e. the quantile density function <math>Q'(p)</math> as <ref>{{Citation |last1=Vasicek |first1=Oldrich |year=1976 |title=A Test for Normality Based on Sample Entropy |journal=[[Journal of the Royal Statistical Society, Series B]] |volume=38 |issue=1 |jstor=2984828 |postscript=. }}</ref>{{rp|54–59}} |
+ | --[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|talk]]) [Review] The plain (unformatted) English text and its translation are missing here. Supplement:
| :<math>h(Q) = \int_0^1 \log Q'(p)\,dp</math>. |
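A quick numerical sanity check of this quantile-based form, assuming the exponential distribution with rate <math>\lambda</math>: its quantile function <math>Q(p) = -\ln(1-p)/\lambda</math> has quantile density <math>Q'(p) = 1/(\lambda(1-p))</math>, and its differential entropy is known to be <math>1 - \ln\lambda</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

# h(Q) = ∫_0^1 log Q'(p) dp for Exp(lam), where Q(p) = -log(1 - p)/lam
# and Q'(p) = 1 / (lam * (1 - p)). The integrand has an integrable
# singularity at p = 1 that quad handles; the result should match the
# known differential entropy 1 - log(lam).
lam = 2.0
h_Q = quad(lambda p: np.log(1.0 / (lam * (1.0 - p))), 0.0, 1.0)[0]
print(f"h(Q) = {h_Q:.4f}, 1 - log(lam) = {1.0 - np.log(lam):.4f}")  # both ~0.3069
</syntaxhighlight>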