=== Absolute mutual information ===<!-- This section is linked from [[Kolmogorov complexity]] -->
Using the ideas of [[Kolmogorov complexity]], one can consider the mutual information of two sequences independent of any probability distribution.
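The formula itself was cut from this excerpt; a sketch of the standard definition from the algorithmic-information-theory literature, reconstructed here rather than quoted from the article, is

:<math>I_K(X; Y) = K(X) - K(X \mid Y),</math>

where <math>K(X)</math> is the [[Kolmogorov complexity]] of <math>X</math> and <math>K(X \mid Y)</math> is the complexity of <math>X</math> given <math>Y</math> as auxiliary input. By the chain rule for Kolmogorov complexity, this quantity is symmetric in <math>X</math> and <math>Y</math> up to a logarithmic additive term.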
=== For discrete data ===
When <math>X</math> and <math>Y</math> are limited to a discrete number of states, the observed data can be summarized in a [[contingency table]], with row variable <math>X</math> (or <math>i</math>) and column variable <math>Y</math> (or <math>j</math>). Mutual information is one of the measures of [[association (statistics)|association]] or [[correlation and dependence|correlation]] between the row and column variables. Other measures of association include the [[Pearson's chi-squared test|Pearson chi-squared]] statistic, the [[G-test]] statistic, etc. In fact, mutual information (in nats) is equal to the [[G-test]] statistic divided by <math>2N</math>, where <math>N</math> is the sample size.
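As a quick numerical check of this identity, the sketch below computes the mutual information (in nats) directly from a table of counts and compares it with the G-test statistic returned by SciPy's <code>chi2_contingency</code> (passing <code>lambda_="log-likelihood"</code> selects the G statistic); the table of counts is invented purely for illustration.

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import chi2_contingency

def mutual_information(counts):
    """Mutual information (in nats) of a contingency table of raw counts."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    pxy = counts / n                      # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)   # row marginals p(x)
    py = pxy.sum(axis=0, keepdims=True)   # column marginals p(y)
    nz = pxy > 0                          # skip empty cells (0 log 0 = 0)
    return float((pxy[nz] * np.log(pxy[nz] / (px * py)[nz])).sum())

# Hypothetical observed counts: rows are states of X, columns are states of Y.
table = np.array([[10, 15, 5],
                  [20, 10, 40]])

mi = mutual_information(table)
n = table.sum()

# G-test statistic; correction=False avoids the Yates continuity correction.
g, _, _, _ = chi2_contingency(table, correction=False, lambda_="log-likelihood")

print(f"MI = {mi:.6f} nats, G / (2N) = {g / (2 * n):.6f}")
assert np.isclose(mi, g / (2 * n))  # mutual information equals G / (2N)
</syntaxhighlight>

Note that the identity holds only when the mutual information is measured in nats; computing it with <math>\log_2</math> (bits) introduces a factor of <math>\ln 2</math>.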