The first work to do this, which also showed how to do Bayesian estimation of many other information-theoretic properties besides mutual information, was <ref>{{cite journal | last1 = Wolpert | first1 = D.H. | last2 = Wolf | first2 = D.R. | year = 1995 | title = Estimating functions of probability distributions from a finite set of samples | journal = Physical Review E | volume = 52 | issue = 6 | pages = 6841–6854 | doi = 10.1103/PhysRevE.52.6841 | pmid = 9964199 | citeseerx = 10.1.1.55.7122 | bibcode = 1995PhRvE..52.6841W }}</ref>. Subsequent researchers have rederived<ref>{{cite journal | last1 = Hutter | first1 = M. | year = 2001 | title = Distribution of Mutual Information | journal = Advances in Neural Information Processing Systems 2001 }}</ref> and extended<ref>{{cite journal | last1 = Archer | first1 = E. | last2 = Park | first2 = I.M. | last3 = Pillow | first3 = J. | year = 2013 | title = Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data | journal = Entropy | volume = 15 | issue = 12 | pages = 1738–1755 | doi = 10.3390/e15051738 | citeseerx = 10.1.1.294.4690 | bibcode = 2013Entrp..15.1738A }}</ref> this analysis.
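The flavor of such Bayesian estimators can be shown with a minimal sketch: place a symmetric Dirichlet prior over the joint distribution of two discrete variables and plug the posterior-mean probabilities into the definition of mutual information. This pseudocount smoother is far simpler than the estimators in the works cited above; the function name and the choice of a symmetric Dirichlet(alpha) prior are illustrative assumptions, not the cited methods.

```python
import numpy as np

def mi_dirichlet_plugin(x, y, alpha=1.0):
    """Plug-in estimate of I(X;Y) in nats for discrete samples,
    using posterior-mean probabilities under a symmetric
    Dirichlet(alpha) prior on the joint distribution.
    A toy smoother, not the estimators from the cited papers."""
    x, y = np.asarray(x), np.asarray(y)
    xs, ys = np.unique(x), np.unique(y)
    counts = np.zeros((len(xs), len(ys)))
    for xi, yi in zip(x, y):
        counts[np.searchsorted(xs, xi), np.searchsorted(ys, yi)] += 1
    # Posterior mean of the joint distribution: add alpha pseudocounts
    p = (counts + alpha) / (counts.sum() + alpha * counts.size)
    px = p.sum(axis=1, keepdims=True)  # marginal of X
    py = p.sum(axis=0, keepdims=True)  # marginal of Y
    return float(np.sum(p * np.log(p / (px * py))))

# Perfectly correlated samples give an estimate near log 2 nats;
# the pseudocounts shrink it slightly below the true value.
x = [0, 0, 1, 1] * 50
print(mi_dirichlet_plugin(x, x))
print(mi_dirichlet_plugin(x, [0, 1, 0, 1] * 50))  # independent pattern
```

The smoothing toward the uniform distribution is what distinguishes this from the naive empirical (maximum-likelihood) plug-in estimate, which is known to be biased upward for finite samples.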
A more recent paper [5] is based on a prior tailored specifically to the estimation of mutual information itself.
       
In addition, an estimation method that accommodates continuous and multivariate outputs, <math>Y</math>, was recently proposed in <ref>{{citation| journal = [[PLOS Computational Biology]]|volume = 15|issue = 7|pages = e1007132|doi = 10.1371/journal.pcbi.1007132|pmid = 31299056|pmc = 6655862|title=Information-theoretic analysis of multivariate single-cell signaling responses|author1= Tomasz Jetka|author2= Karol Nienaltowski|author3= Tomasz Winarski|author4= Slawomir Blonski|author5= Michal Komorowski|year=2019|bibcode = 2019PLSCB..15E7132J|arxiv = 1808.05581}}</ref>.
    
=== Independence assumptions ===