Line 587: |
Line 587: |
| Several generalizations of mutual information to more than two random variables have been proposed, such as total correlation (or multi-information) and interaction information. The expression and study of multivariate higher-degree mutual-information was achieved in two seemingly independent works: McGill (1954) who called these functions “interaction information”, and Hu Kuo Ting (1962) who also first proved the possible negativity of mutual-information for degrees higher than 2 and justified algebraically the intuitive correspondence to Venn diagrams | | Several generalizations of mutual information to more than two random variables have been proposed, such as total correlation (or multi-information) and interaction information. The expression and study of multivariate higher-degree mutual-information was achieved in two seemingly independent works: McGill (1954) who called these functions “interaction information”, and Hu Kuo Ting (1962) who also first proved the possible negativity of mutual-information for degrees higher than 2 and justified algebraically the intuitive correspondence to Venn diagrams |
| | | |
− | Generalizations of mutual information to more than two random variables have been proposed, such as total correlation (or multi-information) and interaction information. The expression and study of multivariate higher-degree mutual information was achieved in two seemingly independent works: McGill (1954), who called these functions "interaction information", and Hu Kuo Ting (1962), who also first proved the possible negativity of mutual information for degrees higher than 2 and justified algebraically the intuitive correspondence to Venn diagrams.
| + | Several generalizations of mutual information to more than two random variables have been proposed, such as '''<font color="#ff8000">total correlation</font>''' (or '''<font color="#ff8000">multi-information</font>''') and '''<font color="#ff8000">interaction information</font>'''. The expression and study of multivariate higher-degree mutual information was achieved in two seemingly independent works: McGill (1954), who in reference [8] called these functions "interaction information", and Hu Kuo Ting (1962), who in reference [9] first proved the possible negativity of mutual information for degrees higher than 2 and in reference [10] justified algebraically the intuitive correspondence between mutual information and Venn diagrams. |
| | | |
| | | |
Line 623: |
Line 623: |
| where (as above) we define | | where (as above) we define |
| | | |
− | In (as stated above) we define:
| + | where (as above) we define: |
| | | |
| | | |
Line 637: |
Line 637: |
| (This definition of multivariate mutual information is identical to that of interaction information except for a change in sign when the number of random variables is odd.) | | (This definition of multivariate mutual information is identical to that of interaction information except for a change in sign when the number of random variables is odd.) |
| | | |
− | (This definition of multivariate mutual information is identical to that of mutual information, except for a change of sign when the number of random variables is odd.)
| + | (This definition of multivariate mutual information is identical to that of interaction information, except for a change of sign when the number of random variables is odd.) |
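For concreteness, a minimal worked case for three variables, assuming the standard recursive definition <math>I(X_1;\ldots;X_n) = I(X_1;\ldots;X_{n-1}) - I(X_1;\ldots;X_{n-1} \mid X_n)</math> (stated before this excerpt):

<math>I(X_1;X_2;X_3) = I(X_1;X_2) - I(X_1;X_2 \mid X_3),</math>

whereas McGill's interaction information for three variables is <math>I(X_1;X_2 \mid X_3) - I(X_1;X_2)</math>, the same quantity with opposite sign, as the parenthetical above states for an odd number of variables.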
| | | |
| | | |
Line 650: |
Line 650: |
| | | |
| | | |
− | The multivariate mutual-information functions generalize the pairwise case, in which <math>X_1, X_2</math> are independent if and only if <math>I(X_1;X_2)=0</math>, to arbitrarily many variables. The <math>n</math> variables are mutually independent if and only if the <math>2^n-n-1</math> mutual-information functions vanish, <math>I(X_1;...;X_k)=0</math> with <math>n \ge 2</math> (Theorem 2). In this sense, <math>I(X_1;...;X_k)=0</math> can be used as a refined statistical independence criterion.
| + | The multivariate mutual-information functions generalize the pairwise case, in which <math>X_1, X_2</math> are independent if and only if <math>I(X_1;X_2)=0</math>, to arbitrarily many variables: the <math>n</math> variables are mutually independent if and only if all <math>2^n-n-1</math> mutual-information functions vanish, <math>I(X_1;...;X_k)=0</math> with <math>n \ge k \ge 2</math> (Theorem 2). In this sense, <math>I(X_1;...;X_k)=0</math> can be used as a refined statistical independence criterion. |
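A minimal sketch, assuming discrete observations, of how <math>I(X_1;\ldots;X_k)</math> can be estimated as the alternating inclusion-exclusion sum of joint entropies (the Venn-diagram sign convention discussed above); the names multivariate_mi and _entropy_bits are illustrative, not from the cited works:

<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations

def _entropy_bits(counts):
    """Shannon entropy in bits of an empirical distribution given by counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def multivariate_mi(samples):
    """Estimate I(X_1;...;X_k) for discrete data: the alternating
    inclusion-exclusion sum of joint entropies over all non-empty
    subsets of variables (Venn-diagram sign convention)."""
    _, k = samples.shape  # samples has shape (n_observations, k)
    total = 0.0
    for r in range(1, k + 1):
        for subset in combinations(range(k), r):
            _, counts = np.unique(samples[:, list(subset)], axis=0,
                                  return_counts=True)
            total += (-1) ** (r + 1) * _entropy_bits(counts)
    return total
</syntaxhighlight>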
| | | |
− | ==== Applications ==== | + | ==== Applications ====
| | | |
| For 3 variables, Brenner et al. applied multivariate mutual information to neural coding and called its negativity "synergy" <ref>{{cite journal | last1 = Brenner | first1 = N. | last2 = Strong | first2 = S. | last3 = Koberle | first3 = R. | last4 = Bialek | first4 = W. | year = 2000 | title = Synergy in a Neural Code | doi = 10.1162/089976600300015259 | pmid = 10935917 | journal = Neural Comput | volume = 12 | issue = 7 | pages = 1531–1552 }}</ref> and Watkinson et al. applied it to genetic expression <ref>{{cite journal | last1 = Watkinson | first1 = J. | last2 = Liang | first2 = K. | last3 = Wang | first3 = X. | last4 = Zheng | first4 = T.| last5 = Anastassiou | first5 = D. | year = 2009 | title = Inference of Regulatory Gene Interactions from Expression Data Using Three-Way Mutual Information | doi = 10.1111/j.1749-6632.2008.03757.x | pmid = 19348651 | journal = Chall. Syst. Biol. Ann. N. Y. Acad. Sci. | volume = 1158 | issue = 1 | pages = 302–313 | bibcode = 2009NYASA1158..302W | url = https://semanticscholar.org/paper/cb09223a34b08e6dcbf696385d9ab76fd9f37aa4 }}</ref>. For arbitrary k variables, Tapia et al. applied multivariate mutual information to gene expression <ref name=s41598>{{cite journal|last1=Tapia|first1=M.|last2=Baudot|first2=P.|last3=Formizano-Treziny|first3=C.|last4=Dufour|first4=M.|last5=Goaillard|first5=J.M.|year=2018|title=Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons|doi= 10.1038/s41598-018-31765-z|pmid=30206240|pmc=6134142|journal=Sci. Rep.|volume=8|issue=1|pages=13637|bibcode=2018NatSR...813637T}}</ref> <ref name=e21090869/>. It can be zero, positive, or negative <ref>{{cite journal | last1 = Hu| first1 = K.T. | year = 1962 | title = On the Amount of Information | journal = Theory Probab. Appl. | volume = 7 | issue = 4 | pages = 439–447 | doi = 10.1137/1107041 }}</ref>. The positivity corresponds to relations generalizing the pairwise correlations, nullity corresponds to a refined notion of independence, and negativity detects high dimensional "emergent" relations and clusterized datapoints <ref name=s41598/>. | | For 3 variables, Brenner et al. applied multivariate mutual information to neural coding and called its negativity "synergy" <ref>{{cite journal | last1 = Brenner | first1 = N. | last2 = Strong | first2 = S. | last3 = Koberle | first3 = R. | last4 = Bialek | first4 = W. | year = 2000 | title = Synergy in a Neural Code | doi = 10.1162/089976600300015259 | pmid = 10935917 | journal = Neural Comput | volume = 12 | issue = 7 | pages = 1531–1552 }}</ref> and Watkinson et al. applied it to genetic expression <ref>{{cite journal | last1 = Watkinson | first1 = J. | last2 = Liang | first2 = K. | last3 = Wang | first3 = X. | last4 = Zheng | first4 = T.| last5 = Anastassiou | first5 = D. | year = 2009 | title = Inference of Regulatory Gene Interactions from Expression Data Using Three-Way Mutual Information | doi = 10.1111/j.1749-6632.2008.03757.x | pmid = 19348651 | journal = Chall. Syst. Biol. Ann. N. Y. Acad. Sci. | volume = 1158 | issue = 1 | pages = 302–313 | bibcode = 2009NYASA1158..302W | url = https://semanticscholar.org/paper/cb09223a34b08e6dcbf696385d9ab76fd9f37aa4 }}</ref>. For arbitrary k variables, Tapia et al. applied multivariate mutual information to gene expression <ref name=s41598>{{cite journal|last1=Tapia|first1=M.|last2=Baudot|first2=P.|last3=Formizano-Treziny|first3=C.|last4=Dufour|first4=M.|last5=Goaillard|first5=J.M.|year=2018|title=Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons|doi= 10.1038/s41598-018-31765-z|pmid=30206240|pmc=6134142|journal=Sci. Rep.|volume=8|issue=1|pages=13637|bibcode=2018NatSR...813637T}}</ref> <ref name=e21090869/>. It can be zero, positive, or negative <ref>{{cite journal | last1 = Hu| first1 = K.T. | year = 1962 | title = On the Amount of Information | journal = Theory Probab. Appl. | volume = 7 | issue = 4 | pages = 439–447 | doi = 10.1137/1107041 }}</ref>. The positivity corresponds to relations generalizing the pairwise correlations, nullity corresponds to a refined notion of independence, and negativity detects high dimensional "emergent" relations and clusterized datapoints <ref name=s41598/>. |
| | | |
− | For 3 variables, Brenner et al. applied multivariate mutual information to neural coding and called its negativity "synergy" and Watkinson et al. applied it to genetic expression. For arbitrary k variables, Tapia et al. applied multivariate mutual information to gene expression. The positivity corresponds to relations generalizing the pairwise correlations, nullity corresponds to a refined notion of independence, and negativity detects high dimensional "emergent" relations and clusterized datapoints. | + | For 3 variables, Brenner et al. applied multivariate mutual information to neural coding and called its negativity "synergy" and Watkinson et al. applied it to genetic expression. For arbitrary k variables, Tapia et al. applied multivariate mutual information to gene expression. '''<font color="#32CD32">The positivity corresponds to relations generalizing the pairwise correlations, nullity corresponds to a refined notion of independence, and negativity detects high dimensional "emergent" relations and clusterized datapoints</font>'''.
| | | |
− | For 3 variables, Brenner et al. applied multivariate mutual information to neural coding and called its negativity "synergy", and Watkinson et al. applied it to gene expression. For arbitrary k variables, Tapia et al. applied multivariate mutual information to gene expression. Positivity corresponds to relations generalizing the pairwise correlations, nullity corresponds to a refined notion of independence, and negativity detects high-dimensional "emergent" relations and clustered data points.
| + | For 3 variables, Brenner et al. applied multivariate mutual information to neural coding and called its negativity '''<font color="#ff8000">"synergy"</font>'''; Watkinson et al. then applied it to gene expression. For arbitrary k variables, Tapia et al. applied multivariate mutual information to gene expression: '''<font color="#32CD32">positivity corresponds to relations generalizing the pairwise correlations, nullity corresponds to a refined notion of independence, and negativity detects high-dimensional "emergent" relations and clustered data points</font>'''. |
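As a hypothetical numerical check of this negativity ("synergy"), using the multivariate_mi sketch given earlier: with independent fair bits <math>X, Y</math> and <math>Z = X \oplus Y</math> (XOR), every pair of variables is independent while the triple is fully determined, and the estimate comes out near <math>-1</math> bit:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 100_000)
y = rng.integers(0, 2, 100_000)
z = x ^ y  # pairwise independent of x and of y, yet jointly determined

# multivariate_mi is the inclusion-exclusion sketch defined above.
print(multivariate_mi(np.column_stack([x, y, z])))  # ~ -1.0: negative ("synergy")
</syntaxhighlight>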
| | | |
| | | |
Line 669: |
Line 669: |
| One high-dimensional generalization scheme which maximizes the mutual information between the joint distribution and other target variables is found to be useful in feature selection. | | One high-dimensional generalization scheme which maximizes the mutual information between the joint distribution and other target variables is found to be useful in feature selection. |
| | | |
− | A high-dimensional generalization scheme that maximizes the mutual information between the joint distribution and other target variables has been proposed; it can be used for feature selection.
| + | A high-dimensional generalization scheme that maximizes the mutual information between the joint distribution and other target variables has been proposed; this method is useful for '''<font color="#ff8000">feature selection</font>'''. |
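As a minimal illustration of mutual-information-driven feature selection (a generic univariate ranking, not the specific high-dimensional scheme cited here), using scikit-learn's MI estimator on a synthetic dataset:

<syntaxhighlight lang="python">
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

# Synthetic data: 20 features, only 4 of which carry class information.
X, y = make_classification(n_samples=500, n_features=20, n_informative=4,
                           random_state=0)

mi = mutual_info_classif(X, y, random_state=0)  # one MI estimate per feature
top_k = np.argsort(mi)[::-1][:4]                # keep the 4 highest-MI features
print("selected feature indices:", top_k)
</syntaxhighlight>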
| | | |
| Mutual information is also used in the area of signal processing as a [[Similarity measure|measure of similarity]] between two signals. For example, FMI metric<ref>{{cite journal | last1 = Haghighat | first1 = M. B. A. | last2 = Aghagolzadeh | first2 = A. | last3 = Seyedarabi | first3 = H. | year = 2011 | title = A non-reference image fusion metric based on mutual information of image features | doi = 10.1016/j.compeleceng.2011.07.012 | journal = Computers & Electrical Engineering | volume = 37 | issue = 5| pages = 744–756 }}</ref> is an image fusion performance measure that makes use of mutual information in order to measure the amount of information that the fused image contains about the source images. The [[Matlab]] code for this metric can be found at.<ref>{{cite web|url=http://www.mathworks.com/matlabcentral/fileexchange/45926-feature-mutual-information-fmi-image-fusion-metric|title=Feature Mutual Information (FMI) metric for non-reference image fusion - File Exchange - MATLAB Central|author=|date=|website=www.mathworks.com|accessdate=4 April 2018}}</ref> | | Mutual information is also used in the area of signal processing as a [[Similarity measure|measure of similarity]] between two signals. For example, FMI metric<ref>{{cite journal | last1 = Haghighat | first1 = M. B. A. | last2 = Aghagolzadeh | first2 = A. | last3 = Seyedarabi | first3 = H. | year = 2011 | title = A non-reference image fusion metric based on mutual information of image features | doi = 10.1016/j.compeleceng.2011.07.012 | journal = Computers & Electrical Engineering | volume = 37 | issue = 5| pages = 744–756 }}</ref> is an image fusion performance measure that makes use of mutual information in order to measure the amount of information that the fused image contains about the source images. The [[Matlab]] code for this metric can be found at.<ref>{{cite web|url=http://www.mathworks.com/matlabcentral/fileexchange/45926-feature-mutual-information-fmi-image-fusion-metric|title=Feature Mutual Information (FMI) metric for non-reference image fusion - File Exchange - MATLAB Central|author=|date=|website=www.mathworks.com|accessdate=4 April 2018}}</ref> |
Line 675: |
Line 675: |
| Mutual information is also used in the area of signal processing as a measure of similarity between two signals. For example, FMI metric is an image fusion performance measure that makes use of mutual information in order to measure the amount of information that the fused image contains about the source images. The Matlab code for this metric can be found at. | | Mutual information is also used in the area of signal processing as a measure of similarity between two signals. For example, FMI metric is an image fusion performance measure that makes use of mutual information in order to measure the amount of information that the fused image contains about the source images. The Matlab code for this metric can be found at. |
| | | |
− | Mutual information is also used in the area of signal processing to measure the similarity between two signals. For example, the FMI metric is an image-fusion performance measure that uses mutual information to measure the amount of information the fused image contains about the source images. The Matlab code for this metric can be found in reference [17].
| + | Mutual information is also used in the area of signal processing as a '''<font color="#ff8000">similarity measure</font>''' between two signals. For example, the FMI metric is an image-fusion performance measure that uses mutual information to measure the amount of information the fused image contains about the source images. The Matlab code for this metric can be found in reference [17]. |
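A sketch of mutual information as a similarity measure between two signals or images, estimated from a joint histogram of their values; note that the FMI metric proper computes the mutual information of extracted image ''features'', per the cited paper, so this shows only the generic idea:

<syntaxhighlight lang="python">
import numpy as np

def mi_similarity(a, b, bins=64):
    """Mutual information (bits) between two equal-size signals/images,
    estimated from the joint histogram of their sample values."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()               # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)     # marginal of a
    py = pxy.sum(axis=0, keepdims=True)     # marginal of b
    nz = pxy > 0                            # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Example: a noisy copy of a signal is more similar to it than fresh noise is.
rng = np.random.default_rng(0)
s = rng.normal(size=10_000)
print(mi_similarity(s, s + 0.1 * rng.normal(size=s.size)))  # relatively high
print(mi_similarity(s, rng.normal(size=s.size)))            # near zero
</syntaxhighlight>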
| | | |
| === Directed information === | | === Directed information ===