A modification of differential entropy that addresses these drawbacks is the relative information entropy, also known as the Kullback–Leibler divergence, which includes an invariant measure factor (see limiting density of discrete points).
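As a brief sketch (with $m(x)$ denoting the invariant measure, i.e. a reference density not fixed by this passage), the relative entropy of a density $p$ with respect to $m$ takes the standard form

$$D_{\mathrm{KL}}(p \,\|\, m) = \int p(x)\,\log\frac{p(x)}{m(x)}\,dx .$$

Because $p$ and $m$ transform with the same Jacobian under a change of variables, the ratio $p(x)/m(x)$ is unchanged, so this quantity, unlike plain differential entropy, is invariant under reparametrization of $x$.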