'''Transfer entropy''' is a [[non-parametric statistics|non-parametric statistic]] measuring the amount of directed (time-asymmetric) transfer of [[information]] between two [[random process]]es.<ref>{{cite journal|last=Schreiber|first=Thomas|title=Measuring information transfer|journal=Physical Review Letters|date=1 July 2000|volume=85|issue=2|pages=461–464|doi=10.1103/PhysRevLett.85.461|pmid=10991308|arxiv=nlin/0001042|bibcode=2000PhRvL..85..461S}}</ref><ref name=Scholarpedia >{{cite encyclopedia |year= 2007 |title = Granger causality |volume = 2 |issue = 7 |pages = 1667 |last= Seth |first=Anil|encyclopedia=[[Scholarpedia]] |url=http://www.scholarpedia.org/article/Granger_causality|doi=10.4249/scholarpedia.1667 |bibcode=2007SchpJ...2.1667S|doi-access= free }}</ref><ref name=Schindler07>{{cite journal|last=Hlaváčková-Schindler|first=Katerina|author2=Palus, M |author3=Vejmelka, M |author4= Bhattacharya, J |title=Causality detection based on information-theoretic approaches in time series analysis|journal=Physics Reports|date=1 March 2007|volume=441|issue=1|pages=1–46|doi=10.1016/j.physrep.2006.12.004|bibcode=2007PhR...441....1H|citeseerx=10.1.1.183.1617}}</ref> Transfer entropy from a process ''X'' to another process ''Y'' is the amount of uncertainty reduced in future values of ''Y''  by knowing the past values of ''X'' given past values of ''Y''. More specifically, if  <math> X_t </math>  and  <math> Y_t </math>  for  <math> t\in \mathbb{N} </math>  denote two random processes and the amount of information is measured using [[Shannon's entropy]], the transfer entropy can be written as:
 
:<math>
T_{X\rightarrow Y} = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right),
</math>
            
where ''H''(''X'') is the Shannon entropy of ''X''. The above definition of transfer entropy has been extended to other types of [[entropy (information theory)|entropy]] measures such as [[Rényi entropy]].<ref name="Schindler07"/><ref>{{Cite journal|last=Jizba|first=Petr|last2=Kleinert|first2=Hagen|last3=Shefaat|first3=Mohammad|date=2012-05-15|title=Rényi's information transfer between financial time series|journal=Physica A: Statistical Mechanics and Its Applications|language=en|volume=391|issue=10|pages=2971–2989|doi=10.1016/j.physa.2011.12.064|issn=0378-4371|arxiv=1106.5913|bibcode=2012PhyA..391.2971J}}</ref>
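
The definition can be made concrete with a short illustrative sketch (not drawn from the cited references): it assumes discrete-valued, equal-length sequences, uses the naive plug-in (histogram) entropy estimator, and all function names are illustrative.

<syntaxhighlight lang="python">
import numpy as np
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (in bits) of a sequence of hashable symbols."""
    counts = Counter(samples)
    total = len(samples)
    probs = np.array([c / total for c in counts.values()])
    return -np.sum(probs * np.log2(probs))

def joint(*cols):
    """Zip several symbol sequences into a single sequence of joint symbols."""
    return list(zip(*cols))

def transfer_entropy(x, y, L=1):
    """T_{X->Y} = H(Y_t | Y_{t-1:t-L}) - H(Y_t | Y_{t-1:t-L}, X_{t-1:t-L}),
    computed via the identity H(A|B) = H(A,B) - H(B) with plug-in estimates."""
    yt    = [(y[t],)         for t in range(L, len(y))]
    ypast = [tuple(y[t-L:t]) for t in range(L, len(y))]
    xpast = [tuple(x[t-L:t]) for t in range(L, len(x))]
    h_y_given_ypast = entropy(joint(yt, ypast)) - entropy(ypast)
    h_y_given_both  = entropy(joint(yt, ypast, xpast)) - entropy(joint(ypast, xpast))
    return h_y_given_ypast - h_y_given_both

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1)                      # y_t = x_{t-1}: X fully drives Y
print(transfer_entropy(x, y))          # close to 1 bit
print(transfer_entropy(y, x))          # close to 0 bits
</syntaxhighlight>

The plug-in estimator is biased upward in small samples, which is one reason the choice of estimator, discussed below, matters in practice.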
 
            
Transfer entropy is [[conditional mutual information]],<ref name = Wyner1978>{{cite journal|last=Wyner|first=A. D. |title=A definition of conditional mutual information for arbitrary ensembles|journal=Information and Control|year=1978|volume=38|issue=1|pages=51–59|doi=10.1016/s0019-9958(78)90026-8|doi-access=free}}</ref><ref name = Dobrushin1959>{{cite journal|last=Dobrushin|first=R. L. |title=General formulation of Shannon's main theorem in information theory|journal=Uspekhi Mat. Nauk|year=1959|volume=14|pages=3–104}}</ref> with the history of the influenced variable <math>Y_{t-1:t-L}</math> in the condition:
 
:<math>
T_{X\rightarrow Y} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}).
</math>
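
Continuing the sketch above (same assumptions and caveats), the equivalence of the two formulations can be checked numerically by estimating the conditional mutual information directly from four joint entropies:

<syntaxhighlight lang="python">
def cmi(a, b, c):
    """I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C), plug-in estimate."""
    return (entropy(joint(a, c)) + entropy(joint(b, c))
            - entropy(joint(a, b, c)) - entropy(c))

L = 1
yt    = [(y[t],)         for t in range(L, len(y))]
ypast = [tuple(y[t-L:t]) for t in range(L, len(y))]
xpast = [tuple(x[t-L:t]) for t in range(L, len(x))]
# Identical to the entropy-difference form, up to floating-point rounding:
assert np.isclose(cmi(yt, xpast, ypast), transfer_entropy(x, y, L))
</syntaxhighlight>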
 
Transfer entropy reduces to [[Granger causality]] for [[Autoregressive model|vector auto-regressive processes]].<ref name=Equal>{{cite journal|last=Barnett|first=Lionel|title=Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables|journal=Physical Review Letters|date=1 December 2009|volume=103|issue=23|doi=10.1103/PhysRevLett.103.238701|bibcode=2009PhRvL.103w8701B|pmid=20366183|page=238701|arxiv=0910.4514}}</ref> Hence, it is advantageous when the model assumption of Granger causality does not hold, for example, in the analysis of [[non-linear regression|non-linear signals]].<ref name=Greg/><ref>{{cite journal|last=Lungarella|first=M.|author2=Ishiguro, K. |author3=Kuniyoshi, Y. |author4= Otsu, N. |title=Methods for quantifying the causal structure of bivariate time series|journal=International Journal of Bifurcation and Chaos|date=1 March 2007|volume=17|issue=3|pages=903–921|doi=10.1142/S0218127407017628|bibcode=2007IJBC...17..903L|citeseerx=10.1.1.67.3585}}</ref> However, it usually requires more samples for accurate estimation.<ref>{{cite journal|last=Pereda|first=E|author2=Quiroga, RQ |author3=Bhattacharya, J |title=Nonlinear multivariate analysis of neurophysiological signals.|journal=Progress in Neurobiology|date=Sep–Oct 2005|volume=77|issue=1–2|pages=1–37|pmid=16289760|doi=10.1016/j.pneurobio.2005.10.003|arxiv=nlin/0510077|bibcode=2005nlin.....10077P}}</ref>
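
For Gaussian variables the correspondence is exact, with the Granger causality statistic equal to twice the transfer entropy measured in nats.<ref name=Equal/> The following sketch illustrates this special case only, assuming a simulated VAR(1) process and ordinary least squares; the closed form used here does not apply outside the linear-Gaussian setting.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):                          # VAR(1): X drives Y, not vice versa
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.6 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def ols_residuals(design, target):
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    return target - design @ beta

yt, y1, x1 = y[1:], y[:-1], x[:-1]
ones = np.ones(n - 1)
restricted = ols_residuals(np.column_stack([ones, y1]), yt)       # y_t ~ y_{t-1}
full       = ols_residuals(np.column_stack([ones, y1, x1]), yt)   # y_t ~ y_{t-1}, x_{t-1}

granger = np.log(restricted.var() / full.var())  # Granger causality statistic
te      = 0.5 * granger                          # Gaussian transfer entropy, in nats
print(granger, te)
</syntaxhighlight>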
 
      
The probabilities in the entropy formula can be estimated using different approaches (binning, nearest neighbors) or, in order to reduce complexity, using a non-uniform embedding.<ref>{{cite journal|last=Montalto|first=A|author2=Faes, L |author3=Marinazzo, D |title=MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy.|journal=PLOS ONE|date=Oct 2014|pmid=25314003|doi=10.1371/journal.pone.0109462|volume=9|issue=10|pmc=4196918|page=e109462|bibcode=2014PLoSO...9j9462M}}</ref>
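
As one illustration of the binning approach, a continuous-valued signal can be discretized into equiprobable (quantile) bins before applying a plug-in estimator such as the sketch above; the number of bins is a free parameter and strongly affects the estimate.

<syntaxhighlight lang="python">
import numpy as np

def quantile_bin(z, n_bins=8):
    """Map a continuous series to integer symbols using equiprobable bins."""
    edges = np.quantile(z, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(z, edges)

# e.g. transfer_entropy(quantile_bin(x_cont), quantile_bin(y_cont)),
# where x_cont and y_cont are continuous-valued series (names illustrative).
</syntaxhighlight>

Nearest-neighbour estimators avoid fixed bins and are often preferred for continuous data, at a higher computational cost.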
 
      
While it was originally defined for [[bivariate analysis]], transfer entropy has been extended to [[Multivariate analysis|multivariate]] forms, either conditioning on other potential source variables<ref>{{cite journal|last=Lizier|first=Joseph|author2=Prokopenko, Mikhail |author3=Zomaya, Albert |title=Local information transfer as a spatiotemporal filter for complex systems|journal=Physical Review E|year=2008|volume=77|issue=2|pages=026110|doi=10.1103/PhysRevE.77.026110|pmid=18352093|arxiv=0809.3275|bibcode=2008PhRvE..77b6110L}}</ref> or considering transfer from a collection of sources,<ref name = Lizier2011>{{cite journal|last=Lizier|first=Joseph|author2=Heinzle, Jakob |author3=Horstmann, Annette |author4=Haynes, John-Dylan |author5= Prokopenko, Mikhail |title=Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity|journal=Journal of Computational Neuroscience|year=2011|volume=30|issue=1|pages=85–107|doi=10.1007/s10827-010-0271-2|pmid=20799057}}</ref> although these forms require more samples again.
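
As an illustrative sketch of the conditional (multivariate) form, the past of a third potential source ''Z'' is simply appended to the conditioning set; the <code>cmi()</code> helper from the sketch above is reused, and the same caveats apply.

<syntaxhighlight lang="python">
def conditional_transfer_entropy(x, y, z, L=1):
    """T_{X->Y|Z} = I(Y_t ; X_{t-1:t-L} | Y_{t-1:t-L}, Z_{t-1:t-L})."""
    yt    = [(y[t],)         for t in range(L, len(y))]
    ypast = [tuple(y[t-L:t]) for t in range(L, len(y))]
    xpast = [tuple(x[t-L:t]) for t in range(L, len(x))]
    zpast = [tuple(z[t-L:t]) for t in range(L, len(z))]
    cond  = [yp + zp for yp, zp in zip(ypast, zpast)]  # joint conditioning set
    return cmi(yt, xpast, cond)
</syntaxhighlight>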
 
            
Transfer entropy has been used for estimation of [[functional connectivity]] of [[neurons]]<ref name=Lizier2011 /><ref>{{cite journal|last=Vicente|first=Raul|author2=Wibral, Michael |author3=Lindner, Michael |author4= Pipa, Gordon |title=Transfer entropy—a model-free measure of effective connectivity for the neurosciences |journal=Journal of Computational Neuroscience|date=February 2011|volume=30|issue=1|pages=45–67|doi=10.1007/s10827-010-0262-3|pmid=20706781|pmc=3040354}}</ref><ref name = Shimono2014>{{cite journal|last=Shimono|first=Masanori|author2=Beggs, John |title=Functional clusters, hubs, and communities in the cortical microconnectome |url=https://cercor.oxfordjournals.org/content/early/2014/10/21/cercor.bhu252.full |journal=Cerebral Cortex|date= October 2014|volume=25|issue=10|pages=3743–57|doi=10.1093/cercor/bhu252 |pmid=25336598 |pmc=4585513}}</ref> and [[social influence]] in [[social networks]].<ref name=Greg>{{cite conference |arxiv=1110.2724|title= Information transfer in social media|last1= Ver Steeg |first1= Greg|last2=Galstyan|first2=  Aram  |year= 2012|publisher= [[Association for Computing Machinery|ACM]]|booktitle= Proceedings of the 21st international conference on World Wide Web (WWW '12) |pages= 509–518 |bibcode=2011arXiv1110.2724V}}</ref>
 
      
Transfer entropy is a finite version of the [[directed information]], which was defined in 1990 by [[James Massey]]<ref>{{cite journal|last1=Massey|first1=James|title=Causality, Feedback And Directed Information|date=1990|issue=ISITA|citeseerx=10.1.1.36.5688}}</ref> as
 
      
<math>I(X^n\to Y^n) =\sum_{i=1}^n I(X^i;Y_i|Y^{i-1})</math>, where <math>X^n</math> denotes the vector <math>X_1,X_2,...,X_n</math> and <math>Y^n</math> denotes <math>Y_1,Y_2,...,Y_n</math>. The [[directed information]] plays an important role in characterizing the fundamental limits ([[channel capacity]]) of communication channels with or without feedback<ref>{{cite journal|last1=Permuter|first1=Haim Henry|last2=Weissman|first2=Tsachy|last3=Goldsmith|first3=Andrea J.|title=Finite State Channels With Time-Invariant Deterministic Feedback|journal=IEEE Transactions on Information Theory|date=February 2009|volume=55|issue=2|pages=644–662|doi=10.1109/TIT.2008.2009849|arxiv=cs/0608070}}</ref>
<ref>{{cite journal|last1=Kramer|first1=G.|title=Capacity results for the discrete memoryless network|journal=IEEE Transactions on Information Theory|date=January 2003|volume=49|issue=1|pages=4–21|doi=10.1109/TIT.2002.806135}}</ref> and [[gambling]] with causal side information.<ref>{{cite journal|last1=Permuter|first1=Haim H.|last2=Kim|first2=Young-Han|last3=Weissman|first3=Tsachy|title=Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing|journal=IEEE Transactions on Information Theory|date=June 2011|volume=57|issue=6|pages=3248–3259|doi=10.1109/TIT.2011.2136270|arxiv=0912.4872}}</ref>
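
The definition above can be illustrated numerically: directed information is a sum of per-step conditional mutual informations over growing prefixes, which in principle can be estimated from an ensemble of independent sequence pairs. The following sketch reuses the <code>cmi()</code> helper from above and is illustrative only, since estimating the prefix distributions requires very many realizations.

<syntaxhighlight lang="python">
def directed_information(X, Y):
    """Plug-in estimate of I(X^n -> Y^n) = sum_{i=1}^n I(X^i ; Y_i | Y^{i-1}).
    Rows of X and Y are independent realizations of the paired processes."""
    n = X.shape[1]
    total = 0.0
    for i in range(n):
        xi   = [tuple(row[:i + 1]) for row in X]   # prefix X^i
        yi   = [(row[i],)          for row in Y]   # Y_i
        ypre = [tuple(row[:i])     for row in Y]   # prefix Y^{i-1}
        total += cmi(yi, xi, ypre)
    return total
</syntaxhighlight>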
 
== See also ==
* [[Conditional mutual information]]
* [[Causality]]
* [[Causality (physics)]]
* [[Structural equation modeling]]
* [[Rubin causal model]]
* [[Mutual information]]

== References ==
{{Reflist|2}}

== External links ==
* {{cite web|title=Transfer Entropy Toolbox|url=http://code.google.com/p/transfer-entropy-toolbox/|publisher=[[Google Code]]}}, a toolbox, developed in [[C++]] and [[MATLAB]], for computation of transfer entropy between spike trains.
* {{cite web|title=Java Information Dynamics Toolkit (JIDT)|url=https://github.com/jlizier/jidt|publisher=[[GitHub]]|date=2019-01-16}}, a toolbox, developed in [[Java (programming language)|Java]] and usable in [[MATLAB]], [[GNU Octave]] and [[Python (programming language)|Python]], for computation of transfer entropy and related information-theoretic measures in both discrete and continuous-valued data.
* {{cite web|title=Multivariate Transfer Entropy (MuTE) toolbox|url=https://github.com/montaltoalessandro/MuTE|publisher=[[GitHub]]|date=2019-01-09}}, a toolbox, developed in [[MATLAB]], for computation of transfer entropy with different estimators.

[[Category:Causality]]
[[Category:Nonlinear time series analysis]]
[[Category:Nonparametric statistics]]
[[Category:Entropy and information]]