The multivariate mutual-information functions generalize the pairwise case, in which <math>X_1</math> and <math>X_2</math> are independent if and only if <math>I(X_1;X_2)=0</math>, to arbitrarily many variables: <math>n</math> variables are mutually independent if and only if all <math>2^n-n-1</math> mutual-information functions vanish, i.e. <math>I(X_1;...;X_k)=0</math> for every subset of <math>k</math> variables with <math>n \ge k \ge 2</math> (theorem 2 <ref name=e21090869/>). In this sense, the conditions <math>I(X_1;...;X_k)=0</math> can be used as a refined statistical-independence criterion.
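For discrete variables with a known joint distribution, this criterion can be sketched as follows. The helper names (`multi_info`, `marginal`) and the XOR example are illustrative, not from the source; the multivariate mutual information is computed by inclusion–exclusion over subset entropies, <math>I(X_1;...;X_k) = -\sum_{\emptyset \ne T \subseteq \{1,...,k\}} (-1)^{|T|} H(X_T)</math>, which for <math>k=2</math> reduces to the familiar <math>I(X;Y)=H(X)+H(Y)-H(X,Y)</math>:

```python
import itertools
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, idx):
    """Marginal pmf over the variable positions in idx, from a joint pmf
    whose keys are outcome tuples."""
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        m[key] = m.get(key, 0.0) + p
    return m

def multi_info(joint, idx):
    """Multivariate mutual information I(X_{i_1};...;X_{i_k}) by
    inclusion-exclusion over the entropies of all nonempty subsets."""
    total = 0.0
    for r in range(1, len(idx) + 1):
        for subset in itertools.combinations(idx, r):
            total += (-1) ** (r + 1) * entropy(marginal(joint, subset))
    return total

def mutually_independent(joint, n, tol=1e-12):
    """Check all 2^n - n - 1 conditions I(X_{i_1};...;X_{i_k}) = 0, k >= 2."""
    return all(
        abs(multi_info(joint, subset)) < tol
        for k in range(2, n + 1)
        for subset in itertools.combinations(range(n), k)
    )
```

The XOR distribution over three fair bits (Z = X xor Y) illustrates why the higher-order terms are needed: every pairwise <math>I</math> vanishes, yet <math>I(X;Y;Z)=-1</math> bit, so the triple condition correctly rejects mutual independence.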