Transfer entropy
This entry was translated by Henry and reviewed by Vicky.
Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes.[1][2][3] The transfer entropy from a process X to a process Y is the reduction in uncertainty about future values of Y gained from knowing the past values of X, given the past values of Y. More specifically, if Xt and Yt (t ∈ N) denote two random processes and the amount of information is measured using Shannon entropy, the transfer entropy can be written as:
[math]\displaystyle{ T_{X\rightarrow Y} = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right) }[/math],
where H(X) is the Shannon entropy of X. The above definition of transfer entropy has been extended to other types of entropy measures, such as Rényi entropy.[3][4]
Transfer entropy is the conditional mutual information,[5][6] conditioned on the history of the influenced variable Yt−1:t−L:
[math]\displaystyle{ T_{X\rightarrow Y} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}). }[/math]
For vector auto-regressive processes, transfer entropy reduces to Granger causality.[7] Hence, it is advantageous when the model assumptions of Granger causality do not hold, for example in the analysis of non-linear signals.[8][9] However, it usually requires more samples for accurate estimation.
The probabilities in the entropy formula can be estimated using different approaches, such as binning or nearest neighbors, or, to reduce complexity, using a non-uniform embedding.[10]
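For discrete-valued series, the simplest of these estimators is the plug-in (binning) approach: count the empirical joint frequencies and evaluate the conditional-mutual-information form of the transfer entropy directly. A minimal sketch with history length L = 1 follows; the function name and the toy coupled process are illustrative assumptions, not from the original:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of T_{X->Y} with history length L = 1, computed as
    the conditional mutual information I(Y_t ; X_{t-1} | Y_{t-1}) from
    empirical joint frequencies of discrete symbols (in bits)."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))  # (y_t, y_{t-1}, x_{t-1})
    n = len(triples)
    p_xyz = Counter(triples)                              # (y_t, y_{t-1}, x_{t-1})
    p_yz = Counter((yt, yp) for yt, yp, _ in triples)     # (y_t, y_{t-1})
    p_zx = Counter((yp, xp) for _, yp, xp in triples)     # (y_{t-1}, x_{t-1})
    p_z = Counter(yp for _, yp, _ in triples)             # (y_{t-1},)
    te = 0.0
    for (yt, yp, xp), c in p_xyz.items():
        # CMI term: p(a,b,c) * log2[ p(a,b,c) p(c) / (p(a,c) p(b,c)) ]
        te += (c / n) * np.log2(c * p_z[yp] / (p_yz[(yt, yp)] * p_zx[(yp, xp)]))
    return te

# Toy example: Y copies X with a one-step delay, so X strongly drives Y.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1]                    # y_t = x_{t-1}
print(transfer_entropy(x, y))     # close to 1 bit
print(transfer_entropy(y, x))     # close to 0 bits
```

This is equivalent to binning with one bin per discrete symbol; for continuous data one would first discretize or switch to a nearest-neighbor estimator.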
While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables[11] or considering transfer from a collection of sources,[12] although these forms again require more samples.
Transfer entropy has been used to estimate the functional connectivity of neurons[13][14] and social influence in social networks.[8]
Transfer entropy is a finite version of the directed information, which was defined in 1990 by James Massey[15] as
[math]\displaystyle{ I(X^n\rightarrow Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}) }[/math],
where Xn denotes the vector X1, X2, ..., Xn and Yn denotes Y1, Y2, ..., Yn. Directed information plays an important role in characterizing the fundamental limits (channel capacity) of communication channels with and without feedback[16][17] and in gambling with causal side information.[18]
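For short discrete sequences observed over many independent trials, Massey's sum of conditional mutual informations can be estimated term by term by plug-in counting over the growing histories. A minimal sketch; the helper names and the toy memoryless channel Y_i = X_i are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def cmi(samples):
    """Plug-in conditional mutual information I(A;B|C), in bits, from a
    list of (a, b, c) tuples whose entries are hashable tuples."""
    n = len(samples)
    p_abc = Counter(samples)
    p_ac = Counter((a, c) for a, _, c in samples)
    p_bc = Counter((b, c) for _, b, c in samples)
    p_c = Counter(c for _, _, c in samples)
    return sum((k / n) * np.log2(k * p_c[c] / (p_ac[(a, c)] * p_bc[(b, c)]))
               for (a, b, c), k in p_abc.items())

def directed_information(X, Y):
    """I(X^n -> Y^n) = sum_i I(X^i ; Y_i | Y^{i-1}), estimated from many
    independent realizations. X, Y: integer arrays of shape (trials, n)."""
    n = X.shape[1]
    di = 0.0
    for i in range(n):
        samples = [(tuple(xr[:i + 1]), (yr[i],), tuple(yr[:i]))
                   for xr, yr in zip(X, Y)]
        di += cmi(samples)
    return di

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(50000, 3))
Y = X.copy()                          # memoryless channel: Y_i = X_i
print(directed_information(X, Y))     # close to 3 bits (1 bit per step)
```

Because each term conditions on the full history, the state space grows exponentially in n; this direct estimator is only practical for very short sequences with many trials.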
References
- ↑ Schreiber, Thomas (1 July 2000). "Measuring information transfer". Physical Review Letters. 85 (2): 461–464. arXiv:nlin/0001042. Bibcode:2000PhRvL..85..461S. doi:10.1103/PhysRevLett.85.461. PMID 10991308.
- ↑ Seth, Anil (2007). "Granger causality". Scholarpedia. Vol. 2. p. 1667. Bibcode:2007SchpJ...2.1667S. doi:10.4249/scholarpedia.1667.
- ↑ 3.0 3.1 Hlaváčková-Schindler, Katerina; Palus, M; Vejmelka, M; Bhattacharya, J (1 March 2007). "Causality detection based on information-theoretic approaches in time series analysis". Physics Reports. 441 (1): 1–46. Bibcode:2007PhR...441....1H. CiteSeerX 10.1.1.183.1617. doi:10.1016/j.physrep.2006.12.004.
- ↑ Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad (2012-05-15). "Rényi's information transfer between financial time series". Physica A: Statistical Mechanics and Its Applications (in English). 391 (10): 2971–2989. arXiv:1106.5913. Bibcode:2012PhyA..391.2971J. doi:10.1016/j.physa.2011.12.064. ISSN 0378-4371.
- ↑ Wyner, A. D. (1978). "A definition of conditional mutual information for arbitrary ensembles". Information and Control. 38 (1): 51–59. doi:10.1016/s0019-9958(78)90026-8.
- ↑ Dobrushin, R. L. (1959). "General formulation of Shannon's main theorem in information theory". Uspekhi Mat. Nauk. 14: 3–104.
- ↑ Barnett, Lionel (1 December 2009). "Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables". Physical Review Letters. 103 (23): 238701. arXiv:0910.4514. Bibcode:2009PhRvL.103w8701B. doi:10.1103/PhysRevLett.103.238701. PMID 20366183.
- ↑ 8.0 8.1 Ver Steeg, Greg; Galstyan, Aram (2012). Information transfer in social media. ACM. pp. 509–518. arXiv:1110.2724. Bibcode:2011arXiv1110.2724V.
- ↑ Lungarella, M.; Ishiguro, K.; Kuniyoshi, Y.; Otsu, N. (1 March 2007). "Methods for quantifying the causal structure of bivariate time series". International Journal of Bifurcation and Chaos. 17 (3): 903–921. Bibcode:2007IJBC...17..903L. CiteSeerX 10.1.1.67.3585. doi:10.1142/S0218127407017628.
- ↑ Montalto, A; Faes, L; Marinazzo, D (Oct 2014). "MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy". PLOS ONE. 9 (10): e109462. Bibcode:2014PLoSO...9j9462M. doi:10.1371/journal.pone.0109462. PMC 4196918. PMID 25314003.
- ↑ Lizier, Joseph; Prokopenko, Mikhail; Zomaya, Albert (2008). "Local information transfer as a spatiotemporal filter for complex systems". Physical Review E. 77 (2): 026110. arXiv:0809.3275. Bibcode:2008PhRvE..77b6110L. doi:10.1103/PhysRevE.77.026110. PMID 18352093.
- ↑ Lizier, Joseph; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail (2011). "Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity". Journal of Computational Neuroscience. 30 (1): 85–107. doi:10.1007/s10827-010-0271-2. PMID 20799057.
- ↑ Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon (February 2011). "Transfer entropy—a model-free measure of effective connectivity for the neurosciences". Journal of Computational Neuroscience. 30 (1): 45–67. doi:10.1007/s10827-010-0262-3. PMC 3040354. PMID 20706781.
- ↑ Shimono, Masanori; Beggs, John (October 2014). "Functional clusters, hubs, and communities in the cortical microconnectome". Cerebral Cortex. 25 (10): 3743–57. doi:10.1093/cercor/bhu252. PMC 4585513. PMID 25336598.
- ↑ Massey, James (1990). "Causality, Feedback And Directed Information". ISITA. CiteSeerX 10.1.1.36.5688.
- ↑ Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. arXiv:cs/0608070. doi:10.1109/TIT.2008.2009849.
- ↑ Kramer, G. (January 2003). "Capacity results for the discrete memoryless network". IEEE Transactions on Information Theory. 49 (1): 4–21. doi:10.1109/TIT.2002.806135.
- ↑ Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory. 57 (6): 3248–3259. arXiv:0912.4872. doi:10.1109/TIT.2011.2136270.
External links
- "Transfer Entropy Toolbox" (Google Code): a toolbox, developed in C++ and MATLAB, for computation of transfer entropy between spike trains.
- "Java Information Dynamics Toolkit (JIDT)" (GitHub, 2019-01-16): a toolbox, developed in Java and usable in MATLAB, GNU Octave and Python, for computation of transfer entropy and related information-theoretic measures in both discrete and continuous-valued data.
- "Multivariate Transfer Entropy (MuTE) toolbox" (GitHub, 2019-01-09): a toolbox, developed in MATLAB, for computation of transfer entropy with different estimators.
This Chinese entry is maintained by 不是海绵宝宝; comments are welcome on the discussion page.
The content of this entry is sourced from Wikipedia and public materials and is released under the CC 3.0 license.