In [[information theory]], '''joint [[entropy (information theory)|entropy]]''' is a measure of the uncertainty associated with a set of [[random variables|variables]].<ref name=korn>{{cite book |author1=Theresa M. Korn |author2=Korn, Granino Arthur |title=Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review |publisher=Dover Publications |location=New York |isbn=0-486-41147-8}}</ref>
==Definition==
where <math>x_1,...,x_n</math> are particular values of <math>X_1,...,X_n</math>, respectively, <math>P(x_1, ..., x_n)</math> is the probability of these values occurring together, and <math>P(x_1, ..., x_n) \log_2[P(x_1, ..., x_n)]</math> is defined to be 0 if <math>P(x_1, ..., x_n)=0</math>.
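The definition can be sketched in a few lines of Python. This is an illustrative example, not part of the article: the joint pmf and the helper name `joint_entropy` are hypothetical, and the zero-probability convention above is handled by skipping terms with <math>p = 0</math>.

```python
import math

def joint_entropy(pmf):
    """Shannon joint entropy in bits; terms with p == 0 contribute 0 by convention."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Hypothetical joint pmf of (X, Y); probabilities sum to 1.
joint_pmf = {
    ('a', 0): 0.25,
    ('a', 1): 0.25,
    ('b', 0): 0.5,
}

print(joint_entropy(joint_pmf))  # 1.5 bits
```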
==Properties==
:<math>H \bigl(X_1,\ldots, X_n \bigr) \geq \max_{1 \le i \le n} \Bigl\{H\bigl(X_i\bigr) \Bigr\}</math>
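This lower bound can be checked numerically by marginalizing a joint pmf and comparing entropies. A minimal sketch, assuming a small hypothetical joint distribution of two variables:

```python
import math
from collections import defaultdict

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Hypothetical joint pmf of (X, Y).
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 1): 0.25}

# Marginals are obtained by summing the joint pmf over the other variable.
px, py = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

h_joint = entropy(joint)
# Joint entropy is at least the largest individual entropy.
assert h_joint >= max(entropy(px), entropy(py))
```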
===Less than or equal to the sum of individual entropies===
:<math>H(X_1,\ldots, X_n) \leq H(X_1) + \ldots + H(X_n)</math>
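This upper bound holds with equality exactly when the variables are independent. A sketch contrasting the two extremes, using hypothetical example distributions:

```python
import math
from itertools import product

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

px = {0: 0.5, 1: 0.5}
py = {0: 0.25, 1: 0.75}

# Independent case: the joint pmf factorizes, so H(X,Y) = H(X) + H(Y).
indep = {(x, y): px[x] * py[y] for x, y in product(px, py)}
assert math.isclose(entropy(indep), entropy(px) + entropy(py))

# Fully dependent case (Y = X): H(X,Y) = H(X), strictly below H(X) + H(Y).
dep = {(0, 0): 0.5, (1, 1): 0.5}
assert entropy(dep) < entropy(px) + entropy(px)
```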
==Relations to other entropy measures==
A Python package is available for computing all multivariate joint entropies, mutual informations, conditional mutual informations, total correlations, and information distances in a data set of ''n'' variables.
==Joint differential entropy==