Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes. Transfer entropy from a process X to another process Y is the amount by which knowing the past values of X reduces uncertainty about future values of Y, given the past values of Y. More specifically, if <math> X_t </math> and <math> Y_t </math> for <math> t\in \mathbb{N} </math> denote two random processes and the amount of information is measured using Shannon's entropy, the transfer entropy can be written as: