Transfer entropy
This entry has been provisionally translated by Henry and reviewed by Vicky.
Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes.[1][2][3] Transfer entropy from a process X to another process Y is the amount by which knowing the past values of X reduces the uncertainty in the current value of Y, given the past values of Y itself. More specifically, if X_t and Y_t for t ∈ N denote two random processes and the amount of information is measured using Shannon entropy, the transfer entropy can be written as:
[math]\displaystyle{ T_{X\rightarrow Y} = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L},\, X_{t-1:t-L}\right), }[/math]
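Equivalently, written out in terms of the underlying probabilities, as in Schreiber's original formulation,[1] the same quantity is the expected log-ratio of the predictive distributions for Y_t with and without the history of the source:

[math]\displaystyle{ T_{X\rightarrow Y} = \sum p\left(y_t,\, y_{t-1:t-L},\, x_{t-1:t-L}\right) \log \frac{p\left(y_t \mid y_{t-1:t-L},\, x_{t-1:t-L}\right)}{p\left(y_t \mid y_{t-1:t-L}\right)}, }[/math]

where the sum runs over all observed states.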
where H(X) is the Shannon entropy of X. The above definition of transfer entropy has been extended to other types of entropy measures, such as Rényi entropy.[3][4]
Transfer entropy is the conditional mutual information,[5][6] with the history of the influenced variable Y_{t−1:t−L} in the condition:
[math]\displaystyle{ T_{X\rightarrow Y} = I\left(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}\right). }[/math]
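Expanding this conditional mutual information into conditional entropies confirms that it coincides with the first definition:

[math]\displaystyle{ I\left(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}\right) = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L},\, X_{t-1:t-L}\right). }[/math]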
Transfer entropy reduces to Granger causality for vector auto-regressive processes.[7] Hence, it is advantageous when the model assumptions of Granger causality do not hold, for example in the analysis of non-linear signals.[8][9] However, it usually requires more samples for accurate estimation.
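For jointly Gaussian variables this correspondence can be made explicit: the Granger causality statistic, i.e. the log-ratio of the residual variances of the restricted model (past of Y only) and the full model (pasts of Y and X), equals exactly twice the transfer entropy measured in nats:[7]

[math]\displaystyle{ \mathcal{F}_{X \rightarrow Y} = \ln \frac{\operatorname{var}\left(\varepsilon_{Y \mid Y^{-}}\right)}{\operatorname{var}\left(\varepsilon_{Y \mid Y^{-},\, X^{-}}\right)} = 2\, T_{X \rightarrow Y}, }[/math]

where [math]\displaystyle{ \varepsilon }[/math] denotes the prediction residual of the corresponding autoregression (the shorthand [math]\displaystyle{ Y^{-}, X^{-} }[/math] for the past histories is used here for brevity).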
The probabilities in the entropy formula can be estimated using different approaches (binning, nearest neighbors) or, in order to reduce complexity, using a non-uniform embedding.[10]
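As a concrete illustration of the simplest of these, here is a minimal sketch of a plug-in (binning) estimator in Python; the function name, the toy data, and the choice of history length L = 1 are assumptions for the example, not part of any toolbox listed below:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in (binning) estimate of T_{X->Y} in bits, with history length L = 1.

    Implements T_{X->Y} = sum p(y_t, y_{t-1}, x_{t-1})
                          * log2[ p(y_t | y_{t-1}, x_{t-1}) / p(y_t | y_{t-1}) ].
    """
    # Discretize both series into equal-width bins (the "binning" step).
    x = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    y = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])

    # Empirical joint counts of the triples (y_t, y_{t-1}, x_{t-1}).
    joint = Counter(zip(y[1:], y[:-1], x[:-1]))
    n = sum(joint.values())

    te = 0.0
    for (yt, yp, xp), c in joint.items():
        p_full = c / n  # p(y_t, y_{t-1}, x_{t-1})
        # Marginals obtained by summing the triple distribution.
        p_ypxp = sum(v for (a, b, d), v in joint.items() if (b, d) == (yp, xp)) / n
        p_ytyp = sum(v for (a, b, d), v in joint.items() if (a, b) == (yt, yp)) / n
        p_yp = sum(v for (a, b, d), v in joint.items() if b == yp) / n
        te += p_full * np.log2((p_full / p_ypxp) / (p_ytyp / p_yp))
    return te

# Toy check: X drives Y at lag 1, so T_{X->Y} should clearly exceed T_{Y->X}.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * np.concatenate(([0.0], x[:-1])) + 0.5 * rng.normal(size=5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```

Plug-in estimates of this kind are biased upward for finite samples, so in practice the estimate is often compared against surrogates in which the source series is time-shuffled.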
While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables[12] or considering transfer from a collection of sources,[13] although these forms again require more samples.
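A common conditional form restricts attention to the information in X about Y that is not already available from a third process (written here, as a sketch, with an assumed extra source Z and the same history convention as above):

[math]\displaystyle{ T_{X\rightarrow Y \mid Z} = I\left(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L},\, Z_{t-1:t-L}\right). }[/math]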
Transfer entropy has been used for the estimation of functional connectivity of neurons[14][15] and for quantifying social influence in social networks.[8]
Transfer entropy is a finite version of the directed information, which was defined in 1990 by James Massey[16] as

[math]\displaystyle{ I(X^n \to Y^n) = \sum_{i=1}^n I\left(X^i ; Y_i \mid Y^{i-1}\right), }[/math]

where [math]\displaystyle{ X^n }[/math] denotes the vector [math]\displaystyle{ X_1, X_2, \ldots, X_n }[/math] and [math]\displaystyle{ Y^n }[/math] denotes [math]\displaystyle{ Y_1, Y_2, \ldots, Y_n }[/math]. Directed information plays an important role in characterizing the fundamental limits (channel capacity) of communication channels with or without feedback[18][20] and in gambling with causal side information.[21]
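Comparing summands makes the sense of "finite version" concrete: truncating the conditioning histories of the i-th summand to the last L samples and excluding the instantaneous input X_i turns it into the transfer-entropy term (a schematic correspondence, not an identity):

[math]\displaystyle{ I\left(X^i ; Y_i \mid Y^{i-1}\right) \;\longrightarrow\; I\left(X_{i-1:i-L} ; Y_i \mid Y_{i-1:i-L}\right) = T_{X\rightarrow Y}. }[/math]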
See also

- Conditional mutual information
- Causality
- Causality (physics)
- Structural equation modeling
- Rubin causal model
- Mutual information
References
- ↑ Schreiber, Thomas (1 July 2000). "Measuring information transfer". Physical Review Letters. 85 (2): 461–464. arXiv:nlin/0001042. Bibcode:2000PhRvL..85..461S. doi:10.1103/PhysRevLett.85.461. PMID 10991308.
- ↑ Seth, Anil (2007). "Granger causality". Scholarpedia. Vol. 2. p. 1667. Bibcode:2007SchpJ...2.1667S. doi:10.4249/scholarpedia.1667.
- ↑ 3.0 3.1 Hlaváčková-Schindler, Katerina; Palus, M; Vejmelka, M; Bhattacharya, J (1 March 2007). "Causality detection based on information-theoretic approaches in time series analysis". Physics Reports. 441 (1): 1–46. Bibcode:2007PhR...441....1H. CiteSeerX 10.1.1.183.1617. doi:10.1016/j.physrep.2006.12.004.
- ↑ Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad (2012-05-15). "Rényi's information transfer between financial time series". Physica A: Statistical Mechanics and Its Applications (in English). 391 (10): 2971–2989. arXiv:1106.5913. Bibcode:2012PhyA..391.2971J. doi:10.1016/j.physa.2011.12.064. ISSN 0378-4371.
- ↑ Wyner, A. D. (1978). "A definition of conditional mutual information for arbitrary ensembles". Information and Control. 38 (1): 51–59. doi:10.1016/s0019-9958(78)90026-8.
- ↑ Dobrushin, R. L. (1959). "General formulation of Shannon's main theorem in information theory". Uspekhi Mat. Nauk. 14: 3–104.
- ↑ Barnett, Lionel (1 December 2009). "Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables". Physical Review Letters. 103 (23): 238701. arXiv:0910.4514. Bibcode:2009PhRvL.103w8701B. doi:10.1103/PhysRevLett.103.238701. PMID 20366183.
- ↑ 8.0 8.1 Ver Steeg, Greg; Galstyan, Aram (2012). "Information transfer in social media". Proceedings of the 21st International Conference on World Wide Web (WWW '12). ACM. pp. 509–518. arXiv:1110.2724. Bibcode:2011arXiv1110.2724V.
- ↑ Lungarella, M.; Ishiguro, K.; Kuniyoshi, Y.; Otsu, N. (1 March 2007). "Methods for quantifying the causal structure of bivariate time series". International Journal of Bifurcation and Chaos. 17 (3): 903–921. Bibcode:2007IJBC...17..903L. CiteSeerX 10.1.1.67.3585. doi:10.1142/S0218127407017628.
- ↑ Montalto, A; Faes, L; Marinazzo, D (Oct 2014). "MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy". PLOS ONE. 9 (10): e109462. Bibcode:2014PLoSO...9j9462M. doi:10.1371/journal.pone.0109462. PMC 4196918. PMID 25314003.
- ↑ Lizier, Joseph; Prokopenko, Mikhail; Zomaya, Albert (2008). "Local information transfer as a spatiotemporal filter for complex systems". Physical Review E. 77 (2): 026110. arXiv:0809.3275. Bibcode:2008PhRvE..77b6110L. doi:10.1103/PhysRevE.77.026110. PMID 18352093.
- ↑ Lizier, Joseph; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail (2011). "Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity". Journal of Computational Neuroscience. 30 (1): 85–107. doi:10.1007/s10827-010-0271-2. PMID 20799057.
- ↑ Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon (February 2011). "Transfer entropy—a model-free measure of effective connectivity for the neurosciences". Journal of Computational Neuroscience. 30 (1): 45–67. doi:10.1007/s10827-010-0262-3. PMC 3040354. PMID 20706781.
- ↑ Shimono, Masanori; Beggs, John (October 2014). "Functional clusters, hubs, and communities in the cortical microconnectome". Cerebral Cortex. 25 (10): 3743–57. doi:10.1093/cercor/bhu252. PMC 4585513. PMID 25336598.
- ↑ Massey, James (1990). "Causality, Feedback And Directed Information" (ISITA). CiteSeerX 10.1.1.36.5688.
- ↑ Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. arXiv:cs/0608070. doi:10.1109/TIT.2008.2009849.
- ↑ Kramer, G. (January 2003). "Capacity results for the discrete memoryless network". IEEE Transactions on Information Theory. 49 (1): 4–21. doi:10.1109/TIT.2002.806135.
- ↑ Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory. 57 (6): 3248–3259. arXiv:0912.4872. doi:10.1109/TIT.2011.2136270.
External links
- "Transfer Entropy Toolbox". Google Code., a toolbox, developed in C++ and MATLAB, for computation of transfer entropy between spike trains.
- "Java Information Dynamics Toolkit (JIDT)". GitHub. 2019-01-16., a toolbox, developed in Java and usable in MATLAB, GNU Octave and Python, for computation of transfer entropy and related information-theoretic measures in both discrete and continuous-valued data.
- "Multivariate Transfer Entropy (MuTE) toolbox". GitHub. 2019-01-09., a toolbox, developed in MATLAB, for computation of transfer entropy with different estimators.
This Chinese entry was contributed by 不是海绵宝宝; comments are welcome on the discussion page.
The content of this entry is sourced from Wikipedia and other public materials, and is licensed under CC 3.0.