:<math>
\operatorname{I}(X_1;X_1) = H(X_1)
</math>

and for <math>n > 1,</math>

:<math>
\operatorname{I}(X_1;\,...\,;X_n)
= \operatorname{I}(X_1;\,...\,;X_{n-1})
- \operatorname{I}(X_1;\,...\,;X_{n-1}|X_n),
</math>
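
For instance, unrolling this recursion once for three variables gives the following worked instance (this is just the <math>n = 3</math> case of the definition above, not an additional identity):

:<math>
\operatorname{I}(X_1;X_2;X_3) = \operatorname{I}(X_1;X_2) - \operatorname{I}(X_1;X_2|X_3).
</math>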

The conditional mutual information appearing in this recursion is given by

:<math>
I(X_1;\ldots;X_{n-1}|X_{n}) = \mathbb{E}_{X_{n}} [D_{\mathrm{KL}}( P_{(X_1,\ldots,X_{n-1})|X_{n}} \| P_{X_1|X_{n}} \otimes\cdots\otimes P_{X_{n-1}|X_{n}} )].
</math>
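
The following is a minimal illustrative sketch in Python (assuming the joint distribution is supplied explicitly as an ''n''-dimensional NumPy array of probabilities; the helper names <code>entropy</code> and <code>multi_information</code> are ours, not part of any standard library) that evaluates this recursion for discrete variables, in nats:

<syntaxhighlight lang="python">
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in nats of a 1-D probability vector."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def multi_information(p):
    """Multivariate mutual information I(X_1;...;X_n) of a discrete joint
    distribution given as an n-dimensional array `p` summing to 1, using the
    recursion above: the base case is I(X_1;X_1) = H(X_1), and the conditional
    term is the expectation over X_n of the same quantity under
    P(X_1,...,X_{n-1} | X_n)."""
    if p.ndim == 1:
        return entropy(p)                              # I(X_1;X_1) = H(X_1)
    p_last = p.sum(axis=tuple(range(p.ndim - 1)))      # marginal of X_n
    joint_rest = p.sum(axis=-1)                        # joint of X_1,...,X_{n-1}
    conditional = sum(
        p_last[k] * multi_information(p[..., k] / p_last[k])
        for k in range(p.shape[-1]) if p_last[k] > 0
    )
    return multi_information(joint_rest) - conditional

# Example: X_1, X_2 independent fair bits and X_3 = X_1 XOR X_2.  With the
# sign convention of the recursion above this gives -log(2) ≈ -0.693 nats.
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1 ^ x2] = 0.25
print(multi_information(p))
</syntaxhighlight>

Note that sign conventions for multivariate mutual information vary in the literature; the sketch simply follows the recursion as written above.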

One high-dimensional generalization scheme, which maximizes the mutual information between the joint distribution and other target variables, has been found to be useful in [[feature selection]].<ref>{{cite book |author1=Christopher D. Manning |author2=Prabhakar Raghavan |author3=Hinrich Schütze |title=An Introduction to Information Retrieval |publisher=[[Cambridge University Press]] |year=2008 |isbn=978-0-521-86571-5}}</ref>
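
As a small illustration of the idea (a per-feature simplification, not the joint-distribution scheme described above), the following sketch uses scikit-learn, assuming it is installed, to keep the features with the highest estimated mutual information with a class label:

<syntaxhighlight lang="python">
# Minimal sketch of mutual-information-based feature selection with
# scikit-learn.  mutual_info_classif estimates I(feature; target) for each
# feature separately, a simpler variant of the joint scheme described above.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_iris(return_X_y=True)
selector = SelectKBest(score_func=mutual_info_classif, k=2)
X_reduced = selector.fit_transform(X, y)   # keep the 2 highest-scoring features
print(selector.scores_)                    # estimated MI of each feature with y
</syntaxhighlight>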

Mutual information is also used in the area of signal processing as a [[Similarity measure|measure of similarity]] between two signals. For example, the FMI metric<ref>{{cite journal |last1=Haghighat |first1=M. B. A. |last2=Aghagolzadeh |first2=A. |last3=Seyedarabi |first3=H. |year=2011 |title=A non-reference image fusion metric based on mutual information of image features |doi=10.1016/j.compeleceng.2011.07.012 |journal=Computers & Electrical Engineering |volume=37 |issue=5 |pages=744–756}}</ref> is an image fusion performance measure that uses mutual information to quantify how much information the fused image contains about the source images. [[Matlab]] code for this metric is available online.<ref>{{cite web |url=http://www.mathworks.com/matlabcentral/fileexchange/45926-feature-mutual-information-fmi-image-fusion-metric |title=Feature Mutual Information (FMI) metric for non-reference image fusion - File Exchange - MATLAB Central |website=www.mathworks.com |accessdate=4 April 2018}}</ref>
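
As a generic illustration of mutual information as a similarity measure between two signals (a plain histogram-based estimate, not the FMI metric from the cited paper), one might write:

<syntaxhighlight lang="python">
import numpy as np

def mi_similarity(a, b, bins=32):
    """Histogram-based estimate of I(A;B) in nats between two equally sized
    signals or images, used here as a similarity measure."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    mask = p_ab > 0
    return float(np.sum(p_ab[mask] * np.log(p_ab[mask] / (p_a * p_b)[mask])))

# A signal shares more information with a noisy copy of itself than with
# unrelated noise, so the first estimate is the larger of the two.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
print(mi_similarity(x, x + 0.1 * rng.normal(size=x.size)))
print(mi_similarity(x, rng.normal(size=x.size)))
</syntaxhighlight>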

=== Directed information ===