The negentropy of a distribution is equal to the Kullback–Leibler divergence between <math>p_x</math> and a Gaussian distribution with the same mean and variance as <math>p_x</math> (see Differential entropy#Maximization in the normal distribution for a proof). In particular, it is always nonnegative.
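A brief sketch of why this identity holds (writing <math>\varphi</math> for the Gaussian density with the same mean and variance as <math>p_x</math>, and <math>H</math> for differential entropy; both symbols are introduced here only for illustration):

<math display="block">
D_{\mathrm{KL}}(p_x \,\|\, \varphi)
= \int p_x(u) \log \frac{p_x(u)}{\varphi(u)} \, du
= -H(p_x) - \int p_x(u) \log \varphi(u) \, du .
</math>

Because <math>\log \varphi(u)</math> is a quadratic polynomial in <math>u</math>, and <math>p_x</math> and <math>\varphi</math> share the same first and second moments, the last integral equals <math>\int \varphi(u) \log \varphi(u) \, du = -H(\varphi)</math>. Hence <math>D_{\mathrm{KL}}(p_x \,\|\, \varphi) = H(\varphi) - H(p_x)</math>, which is the negentropy of <math>p_x</math> and is nonnegative because the Kullback–Leibler divergence is always nonnegative (Gibbs' inequality).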