</math> is a quasinorm of P<ref>Schatten norm from Wikipedia. https://en.wikipedia.org/wiki/Schatten_norm</ref><ref>Recht, B., Fazel, M., Parrilo, P.A.: Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization. SIAM Review 52(3), 471–501 (2010)</ref><ref>Chi, Y., Lu, Y.M., Chen, Y.: Nonconvex optimization meets low-rank matrix factorization: An overview. IEEE Transactions on Signal Processing 67(20), 5239–5269 (2019)</ref><ref name="Cui">Cui, S., Wang, S., Zhuo, J., Li, L., Huang, Q., Tian, Q.: Towards discriminability and diversity: Batch nuclear-norm maximization under label insufficient situations. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 3941–3950 (2020)</ref>.
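For concreteness, the Schatten p-(quasi)norm of a matrix is <math>\|P\|_p=\left(\sum_i\sigma_i^p\right)^{1/p}</math>, where <math>\sigma_i</math> are the singular values; for <math>0<p<1</math> the triangle inequality holds only up to a constant, which is why it is a quasinorm rather than a norm. A minimal sketch of the computation (the function name is ours, not from the text):

```python
import numpy as np

def schatten_quasinorm(P, p):
    """Schatten p-(quasi)norm: (sum of singular values**p)**(1/p).

    For p >= 1 this is the usual Schatten norm (p = 1 gives the
    nuclear norm); for 0 < p < 1 it is only a quasinorm.
    """
    s = np.linalg.svd(P, compute_uv=False)  # singular values of P
    return float((s ** p).sum() ** (1.0 / p))

# Example: diag(1, 2) has singular values {2, 1}; with p = 1/2 the
# value is (sqrt(2) + 1)^2.
P = np.array([[1.0, 0.0], [0.0, 2.0]])
print(schatten_quasinorm(P, 0.5))
```

With p = 1 the same function returns the nuclear norm (sum of singular values), which is the case maximized in the batch nuclear-norm work cited above.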
</math> is not the unique minimum point of EI: any TPM satisfying <math>P_{i}=P_{j},\forall{i,j}\in{[1,N]}</math> attains EI = 0. Moreover, both the upper and lower bounds of EI are linear in <math>\log{\Gamma_{\alpha}}</math>.
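As a sanity check on the minimum, note that under the standard definition of the effective information of a TPM (assumed here, since the definition falls outside this excerpt), EI is the average KL divergence of each row <math>P_i</math> from the mean row <math>\bar{P}=\frac{1}{N}\sum_{j}P_j</math>. When all rows are identical, <math>\bar{P}=P_i</math> for every <math>i</math>, so each divergence term vanishes:

:<math>EI=\frac{1}{N}\sum_{i=1}^{N}D_{KL}\left(P_i\,\middle\|\,\bar{P}\right)=\frac{1}{N}\sum_{i=1}^{N}D_{KL}\left(P_i\,\middle\|\,P_i\right)=0.</math>

This is why the minimum is not unique: every TPM with identical rows, not just one particular matrix, reaches EI = 0.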