The mutual information of two jointly discrete random variables <math>X</math> and <math>Y</math> is calculated as a double sum:<ref name=cover1991>{{cite book|last1=Cover|first1=T.M.|last2=Thomas|first2=J.A.|title=Elements of Information Theory|url=https://archive.org/details/elementsofinform0000cove|url-access=registration|date=1991|isbn=978-0-471-24195-9|edition=Wiley}}</ref>{{rp|20}}

:<math>\operatorname{I}(X;Y) = \sum_{y \in \mathcal{Y}} \sum_{x \in \mathcal{X}} P_{(X,Y)}(x,y) \log \frac{P_{(X,Y)}(x,y)}{P_X(x)\,P_Y(y)},</math>

where <math>P_{(X,Y)}</math> is the joint probability mass function of <math>X</math> and <math>Y</math>, and <math>P_X</math> and <math>P_Y</math> are the marginal probability mass functions of <math>X</math> and <math>Y</math> respectively.
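The double sum above translates directly into code. The following is a minimal NumPy sketch (the function name <code>mutual_information</code> and the example joint pmfs are illustrative, not from the source): it computes the marginals by summing the joint pmf along each axis and then evaluates the sum over all pairs with nonzero joint probability, using base-2 logarithms so the result is in bits.

```python
import numpy as np

def mutual_information(p_xy: np.ndarray) -> float:
    """Mutual information in bits of a discrete joint pmf, via the double sum."""
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal P_X(x), shape (|X|, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal P_Y(y), shape (1, |Y|)
    mask = p_xy > 0                        # terms with p(x,y) = 0 contribute 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])))

# Two independent uniform bits: the joint pmf factors, so I(X;Y) = 0
print(mutual_information(np.array([[0.25, 0.25],
                                   [0.25, 0.25]])))  # 0.0
# Two perfectly correlated bits: knowing X determines Y, so I(X;Y) = 1 bit
print(mutual_information(np.array([[0.5, 0.0],
                                   [0.0, 0.5]])))    # 1.0
```

Zero-probability pairs are excluded from the sum, matching the usual convention <math>0 \log 0 = 0</math>.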