:<math>H(X_1,\ldots, X_n) \leq H(X_1) + \ldots + H(X_n)</math>

==Relations to other entropy measures==

Joint entropy is used in the definition of [[conditional entropy]]<ref name=cover1991 />{{rp|22}}
:<math>\Eta(X|Y) = \Eta(X,Y) - \Eta(Y)\,</math>,

and <math display="block">\Eta(X_1,\dots,X_n) = \sum_{k=1}^n \Eta(X_k|X_{k-1},\dots, X_1)</math>

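These identities can be checked numerically. Below is a minimal sketch using a small, arbitrary joint distribution (the specific probabilities are illustrative only): it computes <math>\Eta(X|Y)</math> directly from its definition and confirms it equals <math>\Eta(X,Y) - \Eta(Y)</math>, which for two variables is exactly the chain rule above.

```python
import numpy as np

# Illustrative joint pmf p(x, y) for two binary variables (values are arbitrary).
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

def entropy(p):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

H_XY = entropy(p_xy)               # joint entropy H(X, Y)
H_Y = entropy(p_xy.sum(axis=0))    # marginal entropy H(Y)

# Conditional entropy computed directly: H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y)
p_y = p_xy.sum(axis=0)
H_X_given_Y = -sum(p_xy[x, y] * np.log2(p_xy[x, y] / p_y[y])
                   for x in range(2) for y in range(2))

# H(X|Y) = H(X,Y) - H(Y), equivalently the n = 2 chain rule H(X,Y) = H(Y) + H(X|Y)
assert np.isclose(H_X_given_Y, H_XY - H_Y)
```

The same check works for any valid joint pmf, since both sides are identities rather than approximations.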
It is also used in the definition of [[mutual information]]<ref name=cover1991 />{{rp|21}}
:<math>\operatorname{I}(X;Y) = \Eta(X) + \Eta(Y) - \Eta(X,Y)\,</math>

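This identity can likewise be verified numerically. The sketch below (again with an arbitrary illustrative joint pmf) computes mutual information from the three entropies and cross-checks it against the direct definition <math>\operatorname{I}(X;Y) = \sum_{x,y} p(x,y) \log_2 \tfrac{p(x,y)}{p(x)p(y)}</math>.

```python
import numpy as np

# Illustrative joint pmf for two correlated binary variables (values arbitrary).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
H_X, H_Y, H_XY = entropy(p_x), entropy(p_y), entropy(p_xy)

I_XY = H_X + H_Y - H_XY   # mutual information via the entropy identity above

# Cross-check against the direct definition as a KL divergence from independence.
direct = sum(p_xy[x, y] * np.log2(p_xy[x, y] / (p_x[x] * p_y[y]))
             for x in range(2) for y in range(2))
assert np.isclose(I_XY, direct)
assert I_XY >= 0   # mutual information is non-negative
```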
In [[quantum information theory]], the joint entropy is generalized into the [[joint quantum entropy]].

===Applications===

A Python package for computing all multivariate joint entropies, mutual informations, conditional mutual informations, total correlations, and information distances in a dataset of ''n'' variables is available.<ref>{{cite web|url=https://infotopo.readthedocs.io/en/latest/index.html|title=InfoTopo: Topological Information Data Analysis. Deep statistical unsupervised and supervised learning - File Exchange - Github|author=|date=|website=github.com/pierrebaudot/infotopopy/|accessdate=26 September 2020}}</ref>

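The core quantity such tools estimate is the empirical joint entropy of several discrete variables. The following is a plain-NumPy sketch of that estimate (it does not use the InfoTopo API; the dataset is a made-up example): each distinct row of the data matrix is one joint outcome, and the plug-in estimator applies the joint-entropy formula to the observed row frequencies.

```python
import numpy as np
from collections import Counter

# Hypothetical dataset: 4 samples of n = 3 discrete variables (one per column).
data = np.array([[0, 1, 1],
                 [1, 0, 1],
                 [0, 1, 0],
                 [0, 1, 1]])

# Frequency of each observed joint outcome (x_1, ..., x_n).
counts = Counter(map(tuple, data))
p = np.array(list(counts.values()), dtype=float) / len(data)

# Plug-in estimate of the joint entropy H(X_1, ..., X_n) in bits.
H_joint = float(-np.sum(p * np.log2(p)))
```

Marginal and conditional entropies, and hence mutual informations, follow by applying the same estimator to subsets of the columns.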
| ==Joint differential entropy== | | ==Joint differential entropy== |