For further discussions on the [[Approximate Dynamical Reversibility of Markov chains]], refer to the entry on [[Approximate Dynamical Reversibility]] and the relevant paper:<ref name="zhang_reversibility" />
==EI and JS Divergence==
According to the expression {{EquationNote|2}}, EI is in fact a generalized [[JS divergence]], i.e., [[Jensen-Shannon divergence]].

The [[JS divergence]] measures the difference between two probability distributions defined on the same support set. Given two probability distributions [math]P[/math] and [math]Q[/math] on the support set [math]\mathcal{X}[/math], the JS divergence between them is defined as:
<math>
JSD(P||Q)=\frac{1}{2}D_{KL}(P||M)+\frac{1}{2}D_{KL}(Q||M)
</math>
where [math]M=\frac{P+Q}{2}[/math], i.e., [math]M(x)=\frac{1}{2}\left[P(x)+Q(x)\right][/math] for every [math]x\in\mathcal{X}[/math], is the average distribution of P and Q, and [math]D_{KL}[/math] is the [[KL divergence]].

Compared with the [[KL divergence]], the [[JS divergence]] is a symmetric measure, i.e., [math]JSD(P||Q)=JSD(Q||P)[/math], whereas the KL divergence is asymmetric.
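As a concrete sketch of this definition, the short Python snippet below (the helper names <code>kl_divergence</code> and <code>js_divergence</code> are ours, chosen for illustration) computes the JS divergence directly from the formula above and illustrates its symmetry:

<syntaxhighlight lang="python">
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) in bits; terms with p(x) = 0 contribute nothing."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js_divergence(p, q):
    """JSD(P||Q) = 1/2 D_KL(P||M) + 1/2 D_KL(Q||M), with M = (P + Q) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = (p + q) / 2
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

P = np.array([0.9, 0.1])
Q = np.array([0.2, 0.8])
print(js_divergence(P, Q))  # ~0.3973 bits
print(js_divergence(Q, P))  # same value: JSD is symmetric
</syntaxhighlight>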
This formula clearly resembles {{EquationNote|2}}. It is not difficult to verify that when P and Q are both 2-dimensional probability vectors forming the rows of a Markov transition matrix K, the EI of K is exactly the JS divergence of P and Q, as the sketch below illustrates.
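The following minimal sketch checks this claim numerically. It assumes that {{EquationNote|2}} takes the common form in which EI is the average KL divergence of each row of K from the mean row under a uniform intervention; the variable names are ours:

<syntaxhighlight lang="python">
import numpy as np

def kl(p, q):
    """D_KL(p || q) in bits; terms with p(x) = 0 contribute nothing."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# A 2x2 Markov transition matrix whose rows are the distributions P and Q.
K = np.array([[0.9, 0.1],
              [0.2, 0.8]])
P, Q = K[0], K[1]
M = K.mean(axis=0)  # average of the rows: M = (P + Q) / 2

# EI in the form assumed here: mean KL divergence of each row from M.
ei = np.mean([kl(row, M) for row in K])

# JS divergence of the two rows, from the definition above.
jsd = 0.5 * kl(P, M) + 0.5 * kl(Q, M)

print(ei, jsd)  # both ~0.3973 bits: for a 2x2 matrix, EI(K) = JSD(P||Q)
</syntaxhighlight>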
Furthermore, in the literature<ref name="GJS_divergence">{{cite journal|author=Jianhua Lin|title=Divergence Measures Based on the Shannon Entropy|journal=IEEE Transactions on Information Theory|volume=37|issue=1|pages=145-151|year=1991}}</ref>, the author proposed the [[generalized JS divergence]] as: {{NumBlk|:|