=== Metric ===

Many applications require a [[metric (mathematics)|metric]], that is, a distance measure between pairs of points. The quantity
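
:<math>d(X,Y) = H(X,Y) - I(X;Y)</math>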

satisfies the properties of a metric (triangle inequality, non-negativity, indiscernibility and symmetry). This distance metric is also known as the [[variation of information]].

If <math>X, Y</math> are discrete random variables then all the entropy terms are non-negative, so <math>0 \le d(X,Y) \le H(X,Y)</math> and one can define a normalized distance
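
:<math>D(X,Y) = \frac{d(X,Y)}{H(X,Y)} \le 1.</math>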

The metric <math>D</math> is a universal metric, in that if any other distance measure places <math>X</math> and <math>Y</math> close-by, then <math>D</math> will also judge them close.<ref>{{cite journal|arxiv=q-bio/0311039|last1=Kraskov|first1=Alexander|title=Hierarchical Clustering Based on Mutual Information|last2=Stögbauer|first2=Harald|last3=Andrzejak|first3=Ralph G.|last4=Grassberger|first4=Peter|year=2003|bibcode=2003q.bio....11039K}}</ref>{{dubious|see talk page|date=November 2014}}

Plugging in the definitions shows that
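
:<math>D(X,Y) = 1 - \frac{I(X;Y)}{H(X,Y)}.</math>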

In a set-theoretic interpretation of information (see the figure for [[Conditional entropy]]), this is effectively the [[Jaccard index|Jaccard distance]] between <math>X</math> and <math>Y</math>.
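
The analogous quantity, normalized instead by the maximum of the two marginal entropies,

:<math>D'(X,Y) = 1 - \frac{I(X;Y)}{\max\{H(X), H(Y)\}}</math>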

is also a metric.
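
As a concrete illustration, the following is a minimal sketch (assuming NumPy; the joint distribution and function names are illustrative and not part of the article) of how <math>d(X,Y)</math> and <math>D(X,Y)</math> can be computed from a joint probability table:

<syntaxhighlight lang="python">
import numpy as np

def entropies(p_xy):
    """Return H(X,Y), H(X), H(Y) in bits for a joint distribution p_xy."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)   # marginal distribution of X (rows)
    p_y = p_xy.sum(axis=0)   # marginal distribution of Y (columns)

    def H(p):
        p = p[p > 0]         # skip zero-probability outcomes (0 log 0 = 0)
        return -(p * np.log2(p)).sum()

    return H(p_xy.ravel()), H(p_x), H(p_y)

def vi_distance(p_xy):
    """d(X,Y) = H(X,Y) - I(X;Y) and the normalized D(X,Y) = d(X,Y)/H(X,Y)."""
    h_xy, h_x, h_y = entropies(p_xy)
    i_xy = h_x + h_y - h_xy   # mutual information I(X;Y)
    d = h_xy - i_xy           # variation of information
    return d, d / h_xy        # normalized distance satisfies 0 <= D <= 1

# Hypothetical joint distribution over two binary variables.
p = [[0.4, 0.1],
     [0.1, 0.4]]
d, D = vi_distance(p)
print(f"d(X,Y) = {d:.3f} bits, D(X,Y) = {D:.3f}")
</syntaxhighlight>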

=== Conditional mutual information ===