Venn diagram showing additive and subtractive relationships among various information measures associated with correlated variables <math>X</math> and <math>Y</math>. The area contained by both circles is the joint entropy <math>H(X,Y)</math>. The circle on the left (red and violet) is the individual entropy <math>H(X)</math>, with the red being the conditional entropy <math>H(X|Y)</math>. The circle on the right (blue and violet) is <math>H(Y)</math>, with the blue being <math>H(Y|X)</math>. The violet is the mutual information <math>\operatorname{I}(X;Y)</math>.
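The additive and subtractive relationships in the diagram can be checked numerically. The sketch below, using an arbitrarily chosen joint distribution for illustration, computes the entropies from first principles and verifies that <math>\operatorname{I}(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y)</math>:

```python
import math

# Illustrative joint distribution p(x, y); the specific numbers are arbitrary.
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

H_xy = H(p_xy)               # joint entropy H(X,Y): area of both circles
H_x, H_y = H(p_x), H(p_y)    # individual entropies H(X), H(Y)
H_x_given_y = H_xy - H_y     # conditional entropy H(X|Y): the red region
H_y_given_x = H_xy - H_x     # conditional entropy H(Y|X): the blue region
I_xy = H_x + H_y - H_xy      # mutual information I(X;Y): the violet overlap

# The regions of the diagram add up as the caption describes.
assert abs(I_xy - (H_x - H_x_given_y)) < 1e-12
assert abs(I_xy - (H_y - H_y_given_x)) < 1e-12
assert abs(H_xy - (H_x_given_y + I_xy + H_y_given_x)) < 1e-12
```

Because the variables here are correlated, the overlap <math>\operatorname{I}(X;Y)</math> comes out strictly positive; for independent variables it would be zero and the circles would be disjoint.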