Line 130:
| | | |
| Let <math>g(x)</math> be a [[Normal distribution|Gaussian]] [[Probability density function|PDF]] with mean <math>\mu</math> and variance <math>\sigma^2</math>, and let <math>f(x)</math> be an arbitrary [[Probability density function|PDF]] with the same variance. Since differential entropy is translation invariant, we can assume that <math>f(x)</math> has the same mean <math>\mu</math> as <math>g(x)</math>. |
| + | |
| + | --[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]]) [Review] Supplementary translation: Let <math>g(x)</math> be a [[正态分布|Gaussian]] [[概率密度函数|PDF]] with mean <math>\mu</math> and variance <math>\sigma^2</math>, and let <math>f(x)</math> be an arbitrary [[概率密度函数|PDF]] with the same variance. Since differential entropy is translation invariant, we can assume that <math>f(x)</math> has the same mean <math>\mu</math> as <math>g(x)</math>. |
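For reference, the translation invariance invoked above is just the change of variables <math>y = x + c</math> in the defining integral; a minimal sketch:

:<math>h(X+c) = -\int_{-\infty}^\infty f(y-c)\log f(y-c)\,dy = -\int_{-\infty}^\infty f(x)\log f(x)\,dx = h(X).</math>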
| | | |
| Consider the [[Kullback–Leibler divergence]] between the two distributions |
Line 143:
Line 145:
| \end{align}</math> |
| | | |
| + | --[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]]) [Review] Supplementary translation: Consider the [[Kullback–Leibler散度|Kullback–Leibler divergence]] between the two distributions |
| + | |
| + | :<math>0 \leq D_{KL}(f\|g) = \int_{-\infty}^\infty f(x)\log\left(\frac{f(x)}{g(x)}\right)dx = -h(f) - \int_{-\infty}^\infty f(x)\log(g(x))\,dx.</math> |
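The nonnegativity of the divergence on the left-hand side is Gibbs' inequality; a minimal sketch via [[Jensen's inequality]] for the concave logarithm:

:<math>-D_{KL}(f\|g) = \int_{-\infty}^\infty f(x)\log\frac{g(x)}{f(x)}\,dx \leq \log\int_{-\infty}^\infty f(x)\,\frac{g(x)}{f(x)}\,dx = \log 1 = 0.</math>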
| + | |
| + | Now note that
| + | :<math>\begin{align} |
| + | \int_{-\infty}^\infty f(x)\log(g(x)) dx &= \int_{-\infty}^\infty f(x)\log\left( \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}\right) dx \\ |
| + | &= \int_{-\infty}^\infty f(x) \log\frac{1}{\sqrt{2\pi\sigma^2}} dx + \log(e)\int_{-\infty}^\infty f(x)\left( -\frac{(x-\mu)^2}{2\sigma^2}\right) dx \\ |
| + | &= -\tfrac{1}{2}\log(2\pi\sigma^2) - \log(e)\frac{\sigma^2}{2\sigma^2} \\ |
| + | &= -\tfrac{1}{2}\left(\log(2\pi\sigma^2) + \log(e)\right) \\ |
| + | &= -\tfrac{1}{2}\log(2\pi e \sigma^2) \\ |
| + | &= -h(g) |
| + | \end{align}</math> |
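The middle step uses that <math>f</math> has mean <math>\mu</math> and variance <math>\sigma^2</math>, so <math>\int_{-\infty}^\infty f(x)(x-\mu)^2\,dx = \sigma^2</math>. As a numerical sanity check of the identity <math>\int_{-\infty}^\infty f(x)\log(g(x))\,dx = -h(g)</math>, here is a minimal Python sketch (assuming SciPy is available; the Laplace density stands in for an arbitrary <math>f(x)</math> with matched mean and variance):

<syntaxhighlight lang="python">
import numpy as np
from scipy import integrate, stats

mu, sigma = 0.0, 1.3
# A Laplace density with scale b has variance 2*b**2,
# so b = sigma / sqrt(2) matches the Gaussian variance sigma^2.
f = stats.laplace(loc=mu, scale=sigma / np.sqrt(2))
g = stats.norm(loc=mu, scale=sigma)

# Cross term: integral of f(x) * log(g(x)); logpdf avoids tail underflow.
cross, _ = integrate.quad(lambda x: f.pdf(x) * g.logpdf(x), -np.inf, np.inf)
h_g = 0.5 * np.log(2 * np.pi * np.e * sigma**2)   # closed-form h(g)
h_f, _ = integrate.quad(lambda x: -f.pdf(x) * f.logpdf(x), -np.inf, np.inf)

print(np.isclose(cross, -h_g))   # True: the cross term equals -h(g)
print(h_g > h_f)                 # True: the Gaussian entropy is larger
</syntaxhighlight>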
| | | |
| because the result does not depend on <math>f(x)</math> other than through the variance. Combining the two results yields |
| + | |
| + | --[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]]) [Review] Supplementary translation: because the result does not depend on <math>f(x)</math> except through the variance. Combining the two results yields |
| | | |
| :<math> h(g) - h(f) \geq 0 \!</math> |
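Spelled out, the combination substitutes the cross term computed above into the divergence bound:

:<math>0 \leq D_{KL}(f\|g) = -h(f) - \int_{-\infty}^\infty f(x)\log(g(x))\,dx = -h(f) + h(g).</math>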
Line 150:
Line 167:
| with equality when <math>f(x)=g(x)</math>, following from the properties of Kullback–Leibler divergence. |
| | | |
− | | + | --[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]]) [Review] Supplementary translation: with equality when <math>f(x)=g(x)</math>, following from the properties of Kullback–Leibler divergence. |
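The equality case rests on the standard property of the divergence:

:<math>D_{KL}(f\|g) = 0 \iff f = g \text{ almost everywhere.}</math>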
| | | |
| ===Alternative proof=== |