When <math>X</math> and <math>Y</math> are limited to a discrete number of states, the observed data can be summarized in a [[contingency table]], with row variable <math>X</math> (or <math>i</math>) and column variable <math>Y</math> (or <math>j</math>). Mutual information is one of the measures of [[association (statistics)|association]] or [[correlation and dependence|correlation]] between the row and column variables. Other measures of association include the [[Pearson's chi-squared test]] statistic, the [[G-test]] statistic, etc. In fact, when the mutual information is measured in [[nat (unit)|nats]] (i.e., using the natural logarithm), it is equal to the [[G-test]] statistic divided by <math>2N</math>, where <math>N</math> is the sample size.
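This relation can be seen by rewriting the G-statistic in terms of the empirical joint distribution of the table. Writing <math>O_{ij}</math> for the observed cell counts, <math>E_{ij} = \tfrac{1}{N}\left(\textstyle\sum_j O_{ij}\right)\left(\textstyle\sum_i O_{ij}\right)</math> for the expected counts under independence, and <math>p_{ij} = O_{ij}/N</math> for the empirical probabilities with marginals <math>p_{i\cdot}</math> and <math>p_{\cdot j}</math>, one has

:<math>G = 2\sum_{i,j} O_{ij} \ln\frac{O_{ij}}{E_{ij}} = 2N \sum_{i,j} p_{ij} \ln\frac{p_{ij}}{p_{i\cdot}\, p_{\cdot j}} = 2N \,\operatorname{I}(X;Y),</math>

where <math>\operatorname{I}(X;Y)</math> denotes the mutual information (in nats) computed from the empirical joint distribution of the table.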