Transfer entropy



Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes.[1][2][3] Transfer entropy from a process X to another process Y is the amount of uncertainty reduced in future values of Y by knowing the past values of X given past values of Y. More specifically, if [math]\displaystyle{ X_t }[/math] and [math]\displaystyle{ Y_t }[/math] for [math]\displaystyle{ t\in \mathbb{N} }[/math] denote two random processes and the amount of information is measured using Shannon's entropy, the transfer entropy can be written as:



[math]\displaystyle{ T_{X\rightarrow Y} = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right), }[/math]


where H(X) is the Shannon entropy of X. The above definition of transfer entropy has been extended using other types of entropy measures, such as Rényi entropy.[3][4]

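For discrete or pre-binned data, this definition can be estimated by plugging empirical frequencies into the identity H(A | B) = H(A,B) - H(B). The following minimal sketch assumes a discrete alphabet and history length L; the names H and transfer_entropy are illustrative rather than taken from any library:

<syntaxhighlight lang="python">
import numpy as np
from collections import Counter

def H(samples):
    """Plug-in Shannon entropy (in bits) of a sequence of hashable outcomes."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def transfer_entropy(x, y, L=1):
    """Estimate T_{X->Y} = H(Y_t | Y_past) - H(Y_t | Y_past, X_past) by
    expanding each conditional entropy as a difference of joint entropies."""
    n = min(len(x), len(y))
    y_now  = [y[t] for t in range(L, n)]
    y_past = [tuple(y[t - L:t]) for t in range(L, n)]
    x_past = [tuple(x[t - L:t]) for t in range(L, n)]
    return (H(list(zip(y_now, y_past))) - H(y_past)
            - H(list(zip(y_now, y_past, x_past))) + H(list(zip(y_past, x_past))))

# Toy check: Y copies X with a one-step delay, so the past of X fully
# determines Y_t and T_{X->Y} approaches H(X_t) = 1 bit, while T_{Y->X} ~ 0.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1)                  # y[t] = x[t-1]
print(transfer_entropy(x, y))      # close to 1.0
print(transfer_entropy(y, x))      # close to 0.0 (small positive bias)
</syntaxhighlight>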


Transfer entropy is conditional mutual information,[5][6] with the history of the influenced variable [math]\displaystyle{ Y_{t-1:t-L} }[/math] in the condition:



[math]\displaystyle{ T_{X\rightarrow Y} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}). }[/math]
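Expanding the conditional mutual information expresses the transfer entropy purely in terms of joint entropies, which makes the agreement with the entropy-difference definition above explicit and is the form most plug-in estimators actually compute:

[math]\displaystyle{ T_{X\rightarrow Y} = H\left(Y_t, Y_{t-1:t-L}\right) + H\left(Y_{t-1:t-L}, X_{t-1:t-L}\right) - H\left(Y_{t-1:t-L}\right) - H\left(Y_t, Y_{t-1:t-L}, X_{t-1:t-L}\right). }[/math]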


Transfer entropy reduces to Granger causality for vector auto-regressive processes.[7] Hence, it is advantageous when the model assumption of Granger causality doesn't hold, for example, in the analysis of non-linear signals.[8][9] However, it usually requires more samples for accurate estimation.[10]

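For stationary Gaussian processes this equivalence yields a simple estimator: the transfer entropy (in nats) is half the log-ratio of the residual variances of the restricted and full regressions used in the Granger test.[7] A minimal sketch of this linear-Gaussian special case (gaussian_te is an illustrative name, not a library function):

<syntaxhighlight lang="python">
import numpy as np

def gaussian_te(x, y, L=1):
    """Linear-Gaussian transfer entropy (nats): half the log-ratio of the
    residual variances of the restricted and full autoregressions of y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = min(len(x), len(y))
    target = y[L:n]
    y_past = np.column_stack([y[L-j:n-j] for j in range(1, L+1)])
    x_past = np.column_stack([x[L-j:n-j] for j in range(1, L+1)])
    ones = np.ones((len(target), 1))

    def resid_var(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        return (target - design @ beta).var()

    restricted = np.hstack([ones, y_past])           # y's own past only
    full       = np.hstack([ones, y_past, x_past])   # ... plus x's past
    return 0.5 * np.log(resid_var(restricted) / resid_var(full))

# Toy AR system where x drives y but not vice versa:
rng = np.random.default_rng(1)
n = 20_000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t-1] + 0.8 * x[t-1] + 0.1 * rng.standard_normal()
print(gaussian_te(x, y))   # clearly positive
print(gaussian_te(y, x))   # near zero
</syntaxhighlight>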

The probabilities in the entropy formula can be estimated using different approaches (binning, nearest neighbors) or, in order to reduce complexity, using a non-uniform embedding.[11]

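For continuous-valued signals, nearest-neighbor estimators are usually preferred over naive binning. Below is a compact sketch of a Kraskov-style nearest-neighbor estimate of the conditional mutual information form (the Frenzel-Pompe estimator), assuming SciPy is available; te_knn is an illustrative name, and in practice one would typically rely on a tested toolbox such as MuTE.[11]

<syntaxhighlight lang="python">
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def te_knn(x, y, L=1, k=4):
    """Nearest-neighbor (Frenzel-Pompe) estimate, in nats, of
    T_{X->Y} = I(Y_t ; X_past | Y_past) for continuous series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = min(len(x), len(y))
    yt = y[L:n, None]
    yp = np.column_stack([y[L-j:n-j] for j in range(1, L+1)])  # past of y
    xp = np.column_stack([x[L-j:n-j] for j in range(1, L+1)])  # past of x
    joint = np.hstack([yt, yp, xp])
    # max-norm distance to the k-th neighbor in the full joint space
    eps = cKDTree(joint).query(joint, k=k+1, p=np.inf)[0][:, -1]

    def counts(data):
        # number of neighbors strictly within eps in a marginal subspace
        tree = cKDTree(data)
        return np.array([len(tree.query_ball_point(pt, e - 1e-12, p=np.inf)) - 1
                         for pt, e in zip(data, eps)])

    n_yz = counts(np.hstack([yt, yp]))   # (Y_t, Y_past) marginal
    n_xz = counts(np.hstack([yp, xp]))   # (Y_past, X_past) marginal
    n_z  = counts(yp)                    # Y_past marginal
    return digamma(k) + np.mean(digamma(n_z + 1)
                                - digamma(n_yz + 1) - digamma(n_xz + 1))
</syntaxhighlight>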

While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables[12] or considering transfer from a collection of sources,[13] although these forms again require more samples.

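The first of these variants is commonly written as a conditional transfer entropy, adding the history of another potential source [math]\displaystyle{ Z }[/math] to the condition so that information about [math]\displaystyle{ Y_t }[/math] already available from [math]\displaystyle{ Z }[/math]'s past is discounted:

[math]\displaystyle{ T_{X\rightarrow Y \mid Z} = I\left(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}, Z_{t-1:t-L}\right). }[/math]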


Transfer entropy has been used for estimation of functional connectivity of neurons[13][14][15] and social influence in social networks.[8]


Transfer entropy is a finite version of the directed information, which was defined in 1990 by James Massey[16] as


[math]\displaystyle{ I(X^n\to Y^n) =\sum_{i=1}^n I(X^i;Y_i|Y^{i-1}) }[/math], where [math]\displaystyle{ X^n }[/math] denotes the vector [math]\displaystyle{ X_1,X_2,...,X_n }[/math] and [math]\displaystyle{ Y^n }[/math] denotes [math]\displaystyle{ Y_1,Y_2,...,Y_n }[/math]. The directed information plays an important role in characterizing the fundamental limits (channel capacity) of communication channels with or without feedback[17][18] and gambling with causal side information.[19]

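A standard identity in this literature rewrites the sum as an ordinary mutual information with the full conditioning on [math]\displaystyle{ X^n }[/math] replaced by causal conditioning, in which [math]\displaystyle{ Y_i }[/math] may depend only on the inputs observed up to time [math]\displaystyle{ i }[/math]:

[math]\displaystyle{ I(X^n\to Y^n) = H(Y^n) - H(Y^n \| X^n), \qquad H(Y^n \| X^n) = \sum_{i=1}^n H\left(Y_i \mid Y^{i-1}, X^{i}\right). }[/math]

Transfer entropy can then be read as the stationary, finite-history version of the per-step terms in this sum.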


See also

Conditional mutual information

Causality

Causality (physics)

Structural equation model

Rubin causal model

Mutual information


References


1. Schreiber, Thomas (1 July 2000). "Measuring information transfer". Physical Review Letters. 85 (2): 461–464. arXiv:nlin/0001042. doi:10.1103/PhysRevLett.85.461. PMID 10991308.
2. Seth, Anil (2007). "Granger causality". Scholarpedia. 2: 1667. doi:10.4249/scholarpedia.1667.
3. Hlaváčková-Schindler, Katerina; Palus, M; Vejmelka, M; Bhattacharya, J (1 March 2007). "Causality detection based on information-theoretic approaches in time series analysis". Physics Reports. 441 (1): 1–46. doi:10.1016/j.physrep.2006.12.004.
4. Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad (2012-05-15). "Rényi's information transfer between financial time series". Physica A: Statistical Mechanics and Its Applications. 391 (10): 2971–2989. arXiv:1106.5913. doi:10.1016/j.physa.2011.12.064.
5. Wyner, A. D. (1978). "A definition of conditional mutual information for arbitrary ensembles". Information and Control. 38 (1): 51–59. doi:10.1016/s0019-9958(78)90026-8.
6. Dobrushin, R. L. (1959). "General formulation of Shannon's main theorem in information theory". Uspekhi Mat. Nauk. 14: 3–104.
7. Barnett, Lionel (1 December 2009). "Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables". Physical Review Letters. 103 (23): 238701. arXiv:0910.4514. doi:10.1103/PhysRevLett.103.238701. PMID 20366183.
8. Ver Steeg, Greg; Galstyan, Aram (2012). Information transfer in social media. ACM. pp. 509–518. arXiv:1110.2724.
9. Lungarella, M.; Ishiguro, K.; Kuniyoshi, Y.; Otsu, N. (1 March 2007). "Methods for quantifying the causal structure of bivariate time series". International Journal of Bifurcation and Chaos. 17 (3): 903–921. doi:10.1142/S0218127407017628.
10. Pereda, E; Quiroga, RQ; Bhattacharya, J (Sep–Oct 2005). "Nonlinear multivariate analysis of neurophysiological signals". Progress in Neurobiology. 77 (1–2): 1–37. arXiv:nlin/0510077. doi:10.1016/j.pneurobio.2005.10.003. PMID 16289760.
11. Montalto, A; Faes, L; Marinazzo, D (Oct 2014). "MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy". PLOS ONE. 9 (10): e109462. doi:10.1371/journal.pone.0109462. PMC 4196918. PMID 25314003.
12. Lizier, Joseph; Prokopenko, Mikhail; Zomaya, Albert (2008). "Local information transfer as a spatiotemporal filter for complex systems". Physical Review E. 77 (2): 026110. arXiv:0809.3275. doi:10.1103/PhysRevE.77.026110. PMID 18352093.
13. Lizier, Joseph; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail (2011). "Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity". Journal of Computational Neuroscience. 30 (1): 85–107. doi:10.1007/s10827-010-0271-2. PMID 20799057.
14. Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon (February 2011). "Transfer entropy—a model-free measure of effective connectivity for the neurosciences". Journal of Computational Neuroscience. 30 (1): 45–67. doi:10.1007/s10827-010-0262-3. PMC 3040354. PMID 20706781.
15. Shimono, Masanori; Beggs, John (October 2014). "Functional clusters, hubs, and communities in the cortical microconnectome". Cerebral Cortex. 25 (10): 3743–57. doi:10.1093/cercor/bhu252. PMC 4585513. PMID 25336598.
16. Massey, James (1990). "Causality, Feedback And Directed Information". Proceedings of the International Symposium on Information Theory and Its Applications (ISITA). CiteSeerX 10.1.1.36.5688.
17. Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. arXiv:cs/0608070. doi:10.1109/TIT.2008.2009849.
18. Kramer, G. (January 2003). "Capacity results for the discrete memoryless network". IEEE Transactions on Information Theory. 49 (1): 4–21. doi:10.1109/TIT.2002.806135.
19. Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory. 57 (6): 3248–3259. arXiv:0912.4872. doi:10.1109/TIT.2011.2136270.



Category:Causality

Category:Nonlinear time series analysis

Category:Nonparametric statistics

Category:Entropy and information


This page was moved from wikipedia:en:Transfer entropy. Its edit history can be viewed at 转移熵/edithistory