In information theory and statistics, negentropy is used as a measure of distance to normality. Out of all distributions with a given mean and variance, the normal or Gaussian distribution is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant under any linear invertible change of coordinates, and vanishes if and only if the signal is Gaussian.
 
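In symbols (a compact restatement of the definition above; the symbols <math>J</math>, <math>S</math> and <math>\varphi_x</math> are chosen here for illustration rather than taken from this excerpt), the negentropy of a distribution with density <math>p_x</math> is

<math>J(p_x) = S(\varphi_x) - S(p_x),</math>

where <math>\varphi_x</math> is the Gaussian density with the same mean and variance as <math>p_x</math> and <math>S</math> denotes differential entropy. Because the Gaussian maximizes <math>S</math> for a fixed mean and variance, <math>J(p_x) \ge 0</math>, with equality exactly when <math>p_x</math> is Gaussian.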
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis.
 
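To make the link to independent component analysis concrete, here is a small numerical sketch (illustrative only, not taken from the article): it uses the log-cosh contrast popularized by Hyvärinen for FastICA, which yields a quantity proportional to negentropy rather than negentropy itself, and the function name <code>negentropy_logcosh</code> is invented here.

<syntaxhighlight lang="python">
import numpy as np

def negentropy_logcosh(x, n_gauss=100_000, seed=0):
    """Approximate negentropy of a 1-D sample via the log-cosh contrast.

    Uses J(y) ~ (E[G(y)] - E[G(nu)])**2 with G(u) = log(cosh(u)) and nu a
    standard Gaussian variable; the result is proportional to negentropy.
    The sample is standardized first because the approximation assumes
    zero mean and unit variance.
    """
    x = np.asarray(x, dtype=float)
    y = (x - x.mean()) / x.std()           # standardize to zero mean, unit variance
    rng = np.random.default_rng(seed)
    nu = rng.standard_normal(n_gauss)       # Monte Carlo reference for E[G(nu)]
    g_y = np.log(np.cosh(y)).mean()
    g_nu = np.log(np.cosh(nu)).mean()
    return (g_y - g_nu) ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    print(negentropy_logcosh(rng.standard_normal(50_000)))  # ~0 for Gaussian data
    print(negentropy_logcosh(rng.uniform(-1, 1, 50_000)))   # > 0 for non-Gaussian data
    print(negentropy_logcosh(rng.laplace(size=50_000)))     # > 0 for non-Gaussian data
</syntaxhighlight>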
The negentropy of a distribution is equal to the Kullback–Leibler divergence between <math>p_x</math> and a Gaussian distribution with the same mean and variance as <math>p_x</math> (see Differential entropy#Maximization in the normal distribution for a proof). In particular, it is always nonnegative.
 
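Spelled out (using the same <math>p_x</math> as above; <math>\varphi_x</math> denotes the Gaussian density with the same mean and variance, a symbol introduced for this note rather than taken from the excerpt):

<math>J(p_x) = D_{\mathrm{KL}}\!\left(p_x \,\|\, \varphi_x\right) = \int p_x(u)\,\log\frac{p_x(u)}{\varphi_x(u)}\,du \;\ge\; 0,</math>

with equality if and only if <math>p_x</math> is Gaussian. As a hand-worked check (not from the article): for the uniform distribution on <math>[0,1]</math> the differential entropy is <math>0</math> nats and the variance is <math>1/12</math>, so the Gaussian with the same mean and variance has entropy <math>\tfrac{1}{2}\ln\!\left(\tfrac{2\pi e}{12}\right) \approx 0.177</math> nats, giving a negentropy of about <math>0.177</math> nats.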
     