The negentropy of a distribution is equal to the Kullback–Leibler divergence between <math>p_x</math> and a Gaussian distribution with the same mean and variance as <math>p_x</math> (see Differential entropy#Maximization in the normal distribution for a proof). In particular, it is always nonnegative.
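
A one-step derivation sketch of this identity (the symbols <math>\varphi</math>, <math>\mu</math>, <math>\sigma^2</math>, and <math>J</math> are introduced here, following the standard definition of negentropy as <math>J(p_x) = H(\varphi) - H(p_x)</math>, where <math>\varphi</math> is the Gaussian density with the same mean <math>\mu</math> and variance <math>\sigma^2</math> as <math>p_x</math>):

<math>
D_{\mathrm{KL}}(p_x \parallel \varphi) = \int p_x(t)\,\log\frac{p_x(t)}{\varphi(t)}\,dt = -H(p_x) - \int p_x(t)\,\log\varphi(t)\,dt .
</math>

Since <math>\log\varphi(t) = -\tfrac{(t-\mu)^2}{2\sigma^2} - \tfrac{1}{2}\log(2\pi\sigma^2)</math> is quadratic in <math>t</math>, the last integral depends on <math>p_x</math> only through its mean and variance, which it shares with <math>\varphi</math>; hence <math>\int p_x \log\varphi \,dt = \int \varphi \log\varphi \,dt = -H(\varphi)</math>, and

<math>
D_{\mathrm{KL}}(p_x \parallel \varphi) = H(\varphi) - H(p_x) = J(p_x) \ge 0 ,
</math>

with the final inequality following from the nonnegativity of Kullback–Leibler divergence.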
 
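As a numerical sanity check, the identity can be verified by Monte Carlo. The sketch below is illustrative and not from the article: it assumes SciPy is available, and the exponential test distribution with rate <code>lam</code> is an arbitrary choice. It estimates <math>D_{\mathrm{KL}}(p_x \parallel \varphi)</math> by sampling and compares it with the negentropy <math>H(\varphi) - H(p_x)</math>.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Illustrative sketch (not from the article): estimate D_KL(p || phi)
# for an exponential p by Monte Carlo and compare it with the
# negentropy J = H(phi) - H(p), where phi is the Gaussian with the
# same mean and variance as p.

rng = np.random.default_rng(0)
lam = 2.0                       # rate of the exponential (arbitrary choice)
p = stats.expon(scale=1 / lam)  # p: mean 1/lam, variance 1/lam**2
phi = stats.norm(loc=p.mean(), scale=p.std())  # matched Gaussian

# Monte Carlo estimate of D_KL(p || phi) = E_p[log p(x) - log phi(x)]
x = p.rvs(size=1_000_000, random_state=rng)
kl_estimate = np.mean(p.logpdf(x) - phi.logpdf(x))

# Closed-form negentropy: difference of differential entropies
negentropy = phi.entropy() - p.entropy()

print(f"Monte Carlo KL : {kl_estimate:.4f}")
print(f"Negentropy     : {negentropy:.4f}")
# Both are approximately 0.4189 and, as the identity requires, nonnegative.
</syntaxhighlight>

For the exponential distribution both quantities reduce to <math>\tfrac{1}{2}\log(2\pi e) - 1 \approx 0.4189</math>, independent of the rate, which the printed values should match to Monte Carlo accuracy.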