|description=This article is one of the few comprehensive introductions available online to effective information (EI), a measure from integrated information theory that is central to the theory of causal emergence. It covers the origin of effective information, its definition and decomposition, practical examples, its extension to continuous variables, and its relationship to other causal metrics and to dynamical reversibility.
}}
Effective Information (EI) is a core concept in the theory of [[Causal Emergence]], used to measure the strength of [[Causal Effects]] in [[Markov Dynamics]]. Here, causal effect refers to the extent to which different input distributions lead to different output distributions when the dynamics is viewed as a black box; the stronger this dependence, the stronger the causal effect. EI can typically be decomposed into two components: Determinism and Degeneracy. Determinism measures how reliably the system's next state can be predicted from its current state, while Degeneracy measures the extent to which different current states converge to the same next state, so that the current state cannot be recovered from the next one. A system with higher Determinism or lower Degeneracy has higher Effective Information. On this page, all [math]\log[/math] operations use base 2.
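For the discrete case, this decomposition can be sketched numerically. The snippet below is a minimal illustration, not code from this article: the function name is made up here, and it follows the standard formulation in which EI is the mutual information between cause and effect under a uniform intervention on the current state.

```python
import numpy as np

def ei_decomposition(tpm):
    """Effective information of a transition probability matrix (TPM),
    computed as mutual information under the intervention do(X_t ~ uniform).
    Returns (EI, determinism, degeneracy) in bits, with EI = determinism - degeneracy."""
    P = np.asarray(tpm, dtype=float)
    n = P.shape[0]

    def entropy(p):                      # Shannon entropy in bits, ignoring zero entries
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    avg_row_entropy = np.mean([entropy(row) for row in P])  # <H(P_i)>: average noise in the effects
    p_bar = P.mean(axis=0)               # effect distribution when causes are intervened uniformly
    determinism = np.log2(n) - avg_row_entropy
    degeneracy = np.log2(n) - entropy(p_bar)
    return determinism - degeneracy, determinism, degeneracy

# A deterministic permutation of 4 states is maximally informative: EI = log2(4) = 2 bits
ei, det, deg = ei_decomposition(np.eye(4)[[1, 2, 3, 0]])
print(ei, det, deg)  # 2.0 2.0 0.0
```

Noisier rows lower the determinism term, while rows that pile probability onto the same successor states raise the degeneracy term; either change reduces EI.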
=Historical Background=
The concept of Effective Information (EI) was first introduced by [[Giulio Tononi]] in 2003 <ref name=tononi_2003>{{cite journal |last1=Tononi|first1=G.|last2=Sporns|first2=O.|title=Measuring information integration|journal=BMC Neuroscience|volume=4 |issue=31 |year=2003|url=https://doi.org/10.1186/1471-2202-4-31}}</ref>, as a key measure in [[Integrated Information Theory]]. A system is said to have a high degree of integration when there is a strong causal connection among its components, and EI is the metric used to quantify this degree of causal connection.
In 2013, [[Giulio Tononi]]'s student, [[Erik Hoel]], further refined the concept of EI to quantitatively characterize emergence, leading to the development of the theory of [[Causal Emergence]]<ref name=hoel_2013>{{cite journal|last1=Hoel|first1=Erik P.|last2=Albantakis|first2=L.|last3=Tononi|first3=G.|title=Quantifying causal emergence shows that macro can beat micro|journal=Proceedings of the National Academy of Sciences|volume=110|issue=49|page=19790–19795|year=2013|url=https://doi.org/10.1073/pnas.1314922110}}</ref>. In this theory, Hoel used [[Judea Pearl]]'s [[do operator]] to modify the general [[Mutual Information]] metric <ref name="pearl_causality">{{cite book|title=因果论——模型、推理和推断|author1=Judea Pearl|author2=刘礼|author3=杨矫云|author4=廖军|author5=李廉|publisher=机械工业出版社|year=2022|month=4}}</ref>, which made EI fundamentally different from [[Mutual Information]]. While [[Mutual Information]] measures correlation, EI—due to the use of the [[do operator]]—measures causality. The article also introduced a [[Normalized Version of EI]], referred to as Eff.
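As a gloss on that normalization (the formula follows Hoel's 2013 definition and is supplied here for concreteness; it does not appear in this excerpt): for a system with [math]n[/math] states,

[math]\mathrm{Eff} = \frac{EI}{\log_2 n} \in [0, 1],[/math]

so Eff reaches 1 only for fully deterministic, non-degenerate dynamics.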
Traditionally, EI was primarily applied to discrete-state Markov chains. To extend it to continuous domains, P. Chvykov and E. Hoel proposed the theory of [[Causal Geometry]] in 2020, expanding EI's definition to function mappings over continuous state variables. Drawing on [[Information Geometry]], they explored a perturbative form of EI and compared it with [[Fisher Information]], arriving at the concept of Causal Geometry. However, this way of computing EI for continuous variables requires assuming that the normally distributed variables have infinitesimal variance, an overly stringent condition.