=== 对于离散数据 For discrete data ===
 
When <math>X</math> and <math>Y</math> are limited to a discrete number of states, the observed data are summarized in a [[contingency table]], with row variable <math>X</math> (or <math>i</math>) and column variable <math>Y</math> (or <math>j</math>). Mutual information is one of the measures of [[association (statistics)|association]] or [[correlation and dependence|correlation]] between the row and column variables. Other measures of association include the [[Pearson's chi-squared test]] statistic, the [[G-test]] statistic, etc. In fact, mutual information is equal to the [[G-test]] statistic divided by <math>2N</math>, where <math>N</math> is the sample size.
当<math>X</math>和<math>Y</math>被限制为离散状态时,观测数据汇总在'''<font color="#ff8000">列联表 Contingency table</font>'''中,行变量为<math>X</math>(或<math>i</math>),列变量为<math>Y</math>(或<math>j</math>)。互信息是行和列变量之间关联或相关性的度量之一。其他关联度量包括Pearson卡方检验统计量、'''<font color="#ff8000">G检验 G-test</font>'''统计量等。事实上,互信息等于G检验统计量除以<math>2N</math>,其中<math>N</math>为样本量。
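The identity stated above, <math>I(X;Y) = G / (2N)</math>, can be checked numerically. The sketch below uses a made-up 2×2 contingency table (the counts are illustrative, not from the text) and computes mutual information in nats alongside the G-test statistic:

```python
import numpy as np

# Hypothetical 2x2 contingency table of observed counts:
# rows index X, columns index Y.
table = np.array([[30.0, 10.0],
                  [20.0, 40.0]])

N = table.sum()                        # sample size
p_xy = table / N                       # joint distribution p(x, y)
p_x = p_xy.sum(axis=1, keepdims=True)  # row marginals p(x)
p_y = p_xy.sum(axis=0, keepdims=True)  # column marginals p(y)

# Mutual information in nats: sum over nonzero cells of
# p(x,y) * ln( p(x,y) / (p(x) p(y)) ).
nz = p_xy > 0
mi = np.sum(p_xy[nz] * np.log((p_xy / (p_x * p_y))[nz]))

# G-test statistic: G = 2 * sum of observed * ln(observed / expected),
# where expected counts assume independence of rows and columns.
expected = p_x * p_y * N
g = 2.0 * np.sum(table[nz] * np.log((table / expected)[nz]))

# The identity from the text: I(X;Y) = G / (2N).
print(mi, g / (2.0 * N))
```

The identity is exact (not approximate) because both quantities are the same sum over cells, scaled by <math>2N</math>; it holds for any table of counts, not just this example.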
    
== 应用 Applications ==
 