Mutual information is a measure of the inherent dependence expressed in the joint distribution of 𝑋 and 𝑌 relative to the joint distribution of 𝑋 and 𝑌 under the assumption of independence (that is, the product of their marginal distributions). Mutual information therefore measures dependence in the following sense: I(𝑋;𝑌) = 0 if and only if 𝑋 and 𝑌 are independent random variables. This is easy to see in one direction: if 𝑋 and 𝑌 are independent, then 𝑝(𝑋,𝑌)(𝑥,𝑦) = 𝑝𝑋(𝑥)⋅𝑝𝑌(𝑦), and therefore:

log( 𝑝(𝑋,𝑌)(𝑥,𝑦) / (𝑝𝑋(𝑥)⋅𝑝𝑌(𝑦)) ) = log 1 = 0.

Since this log-ratio vanishes for every pair (𝑥,𝑦), every term in the sum (or integral) defining I(𝑋;𝑌) is zero, and hence I(𝑋;𝑌) = 0.
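As an illustrative sketch (not part of the original text), the discrete definition I(𝑋;𝑌) = Σ 𝑝(𝑥,𝑦) log( 𝑝(𝑥,𝑦) / (𝑝(𝑥)𝑝(𝑦)) ) can be checked numerically. The helper name mutual_information and the two example joint tables below are made up for the demonstration: one table is exactly the product of its marginals (independent case), the other is not.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in nats) of a discrete joint distribution given as a 2-D array."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()             # normalize to a probability table
    px = joint.sum(axis=1, keepdims=True)   # marginal distribution of X (rows)
    py = joint.sum(axis=0, keepdims=True)   # marginal distribution of Y (columns)
    mask = joint > 0                        # convention: 0 * log 0 = 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px * py)[mask])))

# Independent case: the joint is exactly the product of its marginals, so every
# log-ratio is log 1 = 0 and the mutual information is 0.
px = np.array([0.3, 0.7])
py = np.array([0.6, 0.4])
independent = np.outer(px, py)
print(mutual_information(independent))   # ~0.0 (up to floating-point error)

# Dependent case: X and Y are always equal, so knowing one determines the other;
# here the mutual information equals H(X) = log 2 ≈ 0.693 nats.
dependent = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
print(mutual_information(dependent))
```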