In the literature,<ref name="GJSD">{{cite conference|author1=Erik Englesson|author2=Hossein Azizpour|title=Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels|conference=35th Conference on Neural Information Processing Systems (NeurIPS 2021)|year=2021}}</ref> the generalized JS divergence has been used to measure classification diversity. EI can therefore also be understood as a measure of the diversity of the row vectors.
 
Furthermore, if we regard the [[Shannon Entropy]] [math]H(P_i)[/math] as a function of the distribution [math]P_i[/math], it is easy to verify that H is a [[Concave Function]], so formula {{EquationNote|GJSD}} is exactly the [[Jensen Gap]] of H. Many papers discuss the mathematical properties of this gap, including estimates of its upper and lower bounds.<ref name="Gao et al.">{{cite journal | last1 = Gao | first1 = Xiang | last2 = Sitharam | first2 = Meera | last3 = Roitberg | first3 = Adrian | year = 2019 | title = Bounds on the Jensen Gap, and Implications for Mean-Concentrated Distributions | journal = The Australian Journal of Mathematical Analysis and Applications | arxiv = 1712.05267 | url = https://ajmaa.org/searchroot/files/pdf/v16n2/v16i2p14.pdf | volume = 16 | issue = 2 }}</ref>
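
This connection can be checked numerically. The following is a minimal sketch (assuming a row-stochastic matrix whose rows [math]P_i[/math] are probability distributions, uniform weights [math]\pi_i = 1/n[/math], and illustrative function names) that computes formula {{EquationNote|GJSD}} directly as the Jensen gap of the Shannon entropy: identical rows give a gap of zero, while fully distinct deterministic rows give the maximal value [math]\log_2 n[/math] bits.

<syntaxhighlight lang="python">
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in bits; zero-probability entries contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def generalized_jsd(P, weights=None):
    """Generalized JS divergence of the rows of P, computed as the Jensen gap
    H(sum_i pi_i * P_i) - sum_i pi_i * H(P_i); uniform weights by default."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    mixture_entropy = shannon_entropy(w @ P)          # H of the weighted average row
    mean_row_entropy = np.sum(w * np.array([shannon_entropy(p) for p in P]))
    return mixture_entropy - mean_row_entropy         # nonnegative because H is concave

# Identical rows: no diversity, gap = 0.
print(generalized_jsd([[0.5, 0.5], [0.5, 0.5]]))   # 0.0
# Fully distinct deterministic rows: maximal diversity, gap = log2(2) = 1 bit.
print(generalized_jsd([[1.0, 0.0], [0.0, 1.0]]))   # 1.0
</syntaxhighlight>

With uniform weights this Jensen gap equals the average KL divergence of the rows from their mean distribution, which is why it can be read as a measure of how diverse the row vectors are.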
    
=References=
 