[[Image:Entropy-mutual-information-relative-entropy-relation-diagram.svg|thumb|256px|right|A misleading<ref>{{Cite book|author=David J.C. MacKay|title=Information Theory, Inference, and Learning Algorithms}}{{rp|141}}</ref> [[Venn diagram]] showing additive and subtractive relationships between various [[Quantities of information|information measures]] associated with correlated variables X and Y. The area contained by both circles is the [[joint entropy]] H(X,Y). The circle on the left (red and violet) is the [[Entropy (information theory)|individual entropy]] H(X), with the red being the [[conditional entropy]] H(X{{!}}Y). The circle on the right (blue and violet) is H(Y), with the blue being H(Y{{!}}X). The violet is the [[mutual information]] I(X;Y).]]
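The additive and subtractive relationships depicted in the diagram correspond to the standard identities relating these quantities:

:<math>\operatorname{I}(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y),</math>

so that the joint entropy decomposes as

:<math>H(X,Y) = H(X \mid Y) + \operatorname{I}(X;Y) + H(Y \mid X).</math>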