In 2022, to address the calculation of EI in general [[Feedforward Neural Networks]], [[Zhang Jiang]] and [[Liu Kaiwei]] removed the variance constraint from the [[Causal Geometry]] approach and explored a more general form of EI.<ref name=zhang_nis>{{cite journal|title=Neural Information Squeezer for Causal Emergence|first1=Jiang|last1=Zhang|first2=Kaiwei|last2=Liu|journal=Entropy|year=2022|volume=25|issue=1|page=26|url=https://api.semanticscholar.org/CorpusID:246275672}}</ref><ref name=yang_nis+>{{cite journal|title=Finding emergence in data by maximizing effective information|author1=Mingzhe Yang|author2=Zhipeng Wang|author3=Kaiwei Liu|author4=Yingqi Rong|author5=Bing Yuan|author6=Jiang Zhang|journal=arXiv|page=2308.09952|year=2024}}</ref><ref name=liu_exact>{{cite journal|title=An Exact Theory of Causal Emergence for Stochastic Iterative Systems|author1=Kaiwei Liu|author2=Bing Yuan|author3=Jiang Zhang|journal=arXiv|page=2405.09207|year=2024}}</ref> Nonetheless, a limitation remained: because the uniform distribution of variables in the real-number domain is strictly defined over an infinite space, the calculation of EI involved a parameter [math]L[/math], representing the range of the uniform distribution. To avoid this issue and enable comparisons of EI at different levels of [[Granularity]], the authors proposed the concept of [[Dimension-averaged EI]]. They found that the [[Measure of Causal Emergence]] defined by [[Dimension-averaged EI]] was solely dependent on the determinant of the [[Neural Network]]'s [[Jacobian Matrix]] and the variances of the random variables at the two compared dimensions, independent of other parameters such as [math]L[/math]. Additionally, [[Dimension-averaged EI]] could be viewed as a [[Normalized EI]], or Eff.
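Schematically, and assuming Gaussian noise with covariance [math]\Sigma[/math] and inputs distributed uniformly on [math][-L/2, L/2]^n[/math], the EI of a neural network dynamics [math]y = f(x) + \varepsilon[/math] takes roughly the form given in the cited work<ref name=zhang_nis />:

[math]
EI \approx \ln\frac{L^n}{(2\pi e)^{n/2}\,\det(\Sigma)^{1/2}} + \mathbb{E}_{x}\!\left[\ln\left|\det\frac{\partial f(x)}{\partial x}\right|\right].
[/math]

The [[Dimension-averaged EI]] is then [math]\mathcal{J} = EI/n[/math], and in the difference [math]\Delta\mathcal{J} = \mathcal{J}_{\mathrm{macro}} - \mathcal{J}_{\mathrm{micro}}[/math] the [math]\ln L[/math] and [math]\tfrac{1}{2}\ln(2\pi e)[/math] contributions are the same at both scales and cancel, leaving only the Jacobian-determinant and noise-variance terms.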
Essentially, EI is a quantity that depends only on the [[Dynamics]] of a [[Markov Dynamic System]]—specifically on the [[Markov State Probability Transition Matrix]]<ref name=review>{{cite journal|last1=Yuan|first1=Bing|last2=Zhang|first2=Jiang|last3=Lyu|first3=Aobo|last4=Wu|first4=Jiaying|last5=Wang|first5=Zhipeng|last6=Yang|first6=Mingzhe|last7=Liu|first7=Kaiwei|last8=Mou|first8=Muyun|last9=Cui|first9=Peng|year=2024|title=Emergence and Causality in Complex Systems: A Survey of Causal Emergence and Related Quantitative Studies|journal=Entropy|volume=26|issue=2|page=108|url=https://doi.org/10.3390/e26020108}}</ref>. In their latest work on [[Dynamical Reversibility]] and [[Causal Emergence]], [[Zhang Jiang]] and colleagues pointed out that EI is actually a characterization of the reversibility of the underlying [[Markov Transition Matrix]], and they attempted to directly characterize the reversibility of Markov chain dynamics as a replacement for EI<ref name=zhang_reversibility>{{cite journal|author1=Jiang Zhang|author2=Ruyi Tao|author3=Keng Hou Leong|author4=Mingzhe Yang|author5=Bing Yuan|year=2024|title=Dynamical reversibility and a new theory of causal emergence|url=https://arxiv.org/abs/2402.15054|journal=arXiv}}</ref>.
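As a rough sketch of this alternative measure (the precise definition and normalization follow the cited work<ref name=zhang_reversibility />), the approximate dynamical reversibility of a transition probability matrix [math]P[/math] with singular values [math]\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_N[/math] can be written as

[math]
\Gamma_{\alpha} = \sum_{i=1}^{N}\sigma_i^{\alpha}, \qquad \alpha \in (0, 2),
[/math]

which is large when [math]P[/math] is close to a permutation matrix (a deterministic, invertible dynamics) and small when the dynamics is noisy or many-to-one; in this sense, EI and the reversibility of the [[Markov Transition Matrix]] track each other.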