Comparing with formula {{EquationNote|tow_terms}}, it is easy to see that when [math]\pi_i=\frac{1}{n}[/math], [math]JSD_{\pi}[/math] degenerates into EI.
 
In <ref name="GJSD">{{cite conference|author1=Erik Englesson|author2=Hossein Azizpour|title=Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels|conference=35th Conference on Neural Information Processing Systems (NeurIPS 2021)|year=2021}}</ref>, the authors discuss applying the generalized JS divergence to measure classification diversity. EI can therefore also be understood as a measure of the diversity of the row vectors.
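To make this connection concrete, here is a minimal numerical sketch (our own, not from the cited paper; the function names <code>generalized_jsd</code> and <code>ei</code>, the toy matrix, and base-2 logarithms are all assumptions for illustration). It computes [math]JSD_{\pi}[/math] over the rows of a transition matrix and checks that uniform weights [math]\pi_i=\frac{1}{n}[/math] reproduce EI, written here in the equivalent form of the average KL divergence of each row from the mean row:

<syntaxhighlight lang="python">
import numpy as np

def entropy(p):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def generalized_jsd(P, pi):
    """JSD_pi(P_1, ..., P_n) = H(sum_i pi_i P_i) - sum_i pi_i H(P_i),
    where the P_i are the rows of the transition matrix P."""
    mixture = pi @ P                                  # weighted mixture of the rows
    return entropy(mixture) - pi @ np.array([entropy(row) for row in P])

def ei(P):
    """EI as the average KL divergence of each row from the mean row,
    which coincides with JSD_pi for uniform weights pi_i = 1/n."""
    mean_row = P.mean(axis=0)
    def kl(p, q):
        mask = p > 0                                  # mean_row[i] > 0 wherever any row is positive
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))
    return np.mean([kl(row, mean_row) for row in P])

# A toy 3-state transition probability matrix (rows sum to 1).
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.5, 0.5, 0.0]])
uniform = np.full(len(P), 1.0 / len(P))
print(generalized_jsd(P, uniform))   # ~0.6667 bits
print(ei(P))                         # same value: the two coincide for pi_i = 1/n
</syntaxhighlight>

The equality of the two printed values is exactly the degeneration described above: with uniform weights, the mixture [math]\sum_i \pi_i P_i[/math] is the mean row, and the entropy difference equals the average KL divergence of the rows from it.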
 
Furthermore, if we regard the [[Shannon Entropy]] [math]H(P_i)[/math] as a function of the distribution [math]P_i[/math], it is not difficult to verify that H is a [[Concave Function]], so formula {{EquationNote|GJSD}} is in fact the [[Jensen Gap]] of H. Since H is concave, Jensen's inequality guarantees that this gap is non-negative, i.e. [math]JSD_{\pi}\geq 0[/math]. The mathematical properties of this gap, including estimates of its upper and lower bounds, have been discussed in numerous papers.<ref name="Gao et al.">{{cite journal | last1 = Gao | first1 = Xiang | last2 = Sitharam | first2 = Meera | last3 = Roitberg | first3 = Adrian | year = 2019 | title = Bounds on the Jensen Gap, and Implications for Mean-Concentrated Distributions | journal = The Australian Journal of Mathematical Analysis and Applications | arxiv = 1712.05267 | url = https://ajmaa.org/searchroot/files/pdf/v16n2/v16i2p14.pdf | volume = 16 | issue = 2 }}</ref>
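As a quick numerical illustration of the Jensen-gap reading (our own sketch, not from the cited reference; random Dirichlet draws and base-2 logarithms are assumptions): for random stochastic rows and random mixture weights, the gap [math]H\left(\sum_i \pi_i P_i\right) - \sum_i \pi_i H(P_i)[/math] is always non-negative, which is precisely the statement [math]JSD_{\pi}\geq 0[/math].

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Jensen gap of the concave function H: H(sum_i pi_i P_i) - sum_i pi_i H(P_i).
# By Jensen's inequality it is >= 0, and it equals JSD_pi from formula GJSD.
for _ in range(5):
    P = rng.dirichlet(np.ones(4), size=6)    # six random distributions over four states
    pi = rng.dirichlet(np.ones(6))           # random mixture weights summing to 1
    gap = entropy(pi @ P) - pi @ np.array([entropy(row) for row in P])
    print(f"gap = {gap:.4f}  (non-negative: {gap >= 0})")
</syntaxhighlight>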
=References=
 
<references />
 
=Editor's Recommendations=
 
Below are some links that can help readers better understand causal emergence:
 
===Causal Emergence Reading Group===
 
*[https://pattern.swarma.org/study_group_issue/490 Causal Emergence Reading Group, Season 3, Episode 1: Emergence, Causality, and Artificial Intelligence]
 