Some early works have attempted to analyze emergence quantitatively. The computational mechanics theory proposed by Crutchfield et al. [27] considers causal states; it discusses related concepts based on a division of the state space and is very similar to Erik Hoel's causal emergence theory. Seth et al., on the other hand, proposed the G-emergence theory [28], which quantifies emergence using Granger causality.

The computational mechanics theory attempts to express the causal laws of emergence in a quantitative framework, that is, to construct a coarse-grained causal model from a random process such that this model can generate the observed time series of that process [27].
 
Here, the random process is denoted by <math>\overleftrightarrow{s}</math>. At any time <math>t</math>, it can be divided into two parts: the history before time <math>t</math>, <math>\overleftarrow{s_t}</math>, and the future from time <math>t</math> on, <math>\overrightarrow{s_t}</math>. Computational mechanics denotes the set of all possible histories <math>\overleftarrow{s_t}</math> by <math>\overleftarrow{S}</math> and the set of all possible futures by <math>\overrightarrow{S}</math>.
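
The split into a history and a future can be illustrated with a short, purely hypothetical Python sketch; the sequence and the cut point <math>t</math> below are made up for illustration:

<syntaxhighlight lang="python">
# A minimal sketch (not from the original text): one observed realization of a binary
# random process is cut at time t into a history (backward arrow) and a future
# (forward arrow).
import random

random.seed(0)
s = [random.randint(0, 1) for _ in range(20)]  # one sample path of the process

t = 10
history = tuple(s[:t])   # the part of the realization before time t
future = tuple(s[t:])    # the part of the realization from time t onward

print("history:", history)
print("future: ", future)
</syntaxhighlight>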
 
The goal of computational mechanics is to establish a model that reconstructs and predicts the observed random sequence to a certain degree of accuracy. However, the randomness of the sequence makes a perfect reconstruction impossible. We therefore need a coarse-grained mapping to capture the ordered structure in the random sequence. This mapping can be characterized by a partitioning function <math>\eta: \overleftarrow{S}\rightarrow\mathcal{R}</math>, which divides <math>\overleftarrow{S}</math> into several mutually exclusive subsets whose union is the whole set; the resulting set of subsets is denoted <math>\mathcal{R}</math>.
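
As a rough illustration of such a partitioning function, the sketch below groups all length-3 binary histories by a hypothetical rule (the last observed symbol); this rule is an arbitrary choice made for illustration, not one prescribed by computational mechanics:

<syntaxhighlight lang="python">
# A hypothetical partitioning function eta: every history is mapped to a macroscopic
# state, so the set of histories is split into mutually exclusive subsets that
# together cover all histories.
from itertools import product

def eta(history):
    """Illustrative coarse-graining: the macro state is the last observed symbol."""
    return history[-1]

histories = list(product([0, 1], repeat=3))   # all length-3 binary histories
partition = {}                                # macro state -> subset of histories
for h in histories:
    partition.setdefault(eta(h), []).append(h)

print(partition)   # two blocks: histories ending in 0 and histories ending in 1
</syntaxhighlight>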
 
Computational mechanics regards each subset <math>R \in \mathcal{R}</math> as a macroscopic state. For a set of macroscopic states <math>\mathcal{R}</math>, computational mechanics uses Shannon entropy to define its statistical complexity <math>C_\mu</math>, which measures the complexity of the states:
 
<math>
C_\mu(\mathcal{R})\triangleq -\sum_{\rho\in \mathcal{R}} P(\mathcal{R}=\rho)\log_2 P(\mathcal{R}=\rho)
</math>
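
A minimal Python sketch of this definition, with a made-up distribution over macroscopic states, might look as follows:

<syntaxhighlight lang="python">
# Statistical complexity C_mu as the Shannon entropy of the distribution over
# macroscopic states; the probabilities below are invented for illustration.
from math import log2

def statistical_complexity(p):
    """Shannon entropy (in bits) of a distribution over macro states."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

p_macro = {"A": 0.5, "B": 0.25, "C": 0.25}    # hypothetical P(R = rho)
print(statistical_complexity(p_macro))        # 1.5 bits
</syntaxhighlight>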
 
It can be proved that when a set of states is used to build a prediction model, statistical complexity is approximately equivalent to the size of the prediction model.
 
Furthermore, in order to achieve the best balance between predictability and simplicity for the set of macroscopic states, computational mechanics defines the notion of causal equivalence: two histories <math>\overleftarrow{s}</math> and <math>{\overleftarrow{s}}'</math> are causally equivalent if <math>P\left ( \overrightarrow{s}|\overleftarrow{s}\right )=P\left ( \overrightarrow{s}|{\overleftarrow{s}}'\right )</math>. This equivalence relation divides all histories into equivalence classes, which are defined as causal states. The causal state of a history <math>\overleftarrow{s}</math> is given by the mapping <math>\epsilon \left ( \overleftarrow{s} \right )</math>, where <math>\epsilon: \overleftarrow{\mathcal{S}}\rightarrow 2^{\overleftarrow{\mathcal{S}}}</math> maps the history <math>\overleftarrow{s}</math> to its causal state <math>\epsilon(\overleftarrow{s})\in 2^{\overleftarrow{\mathcal{S}}}</math>.
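
The grouping of histories into causal states can be sketched as follows; the toy predictive rule used here (the future depends only on the last symbol) is an assumption made purely for illustration:

<syntaxhighlight lang="python">
# Histories that induce the same conditional distribution over futures are causally
# equivalent and fall into the same causal state. The future_distribution rule below
# is a toy assumption: it depends only on the last symbol of the history.
from collections import defaultdict

def future_distribution(history):
    """Hypothetical P(next symbol | history)."""
    return (0.9, 0.1) if history[-1] == 0 else (0.2, 0.8)

histories = [(0, 0, 1), (1, 0, 1), (0, 1, 0), (1, 1, 0)]
causal_states = defaultdict(list)        # plays the role of the mapping epsilon
for h in histories:
    causal_states[future_distribution(h)].append(h)

for dist, members in causal_states.items():
    print(dist, "->", members)
</syntaxhighlight>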
 
Further, we can denote the transition probability between two causal states <math>S_i</math> and <math>S_j</math> while emitting the symbol <math>s</math> as <math>T_{ij}^{\left ( s \right )}</math>; these transitions play the role of a coarse-grained macroscopic dynamics. The <math>\epsilon</math>-machine of a random process is defined as the ordered pair <math>\left \{ \epsilon,T \right \}</math>. It is a pattern discovery machine that achieves prediction by learning the <math>\epsilon</math> and <math>T</math> functions. This amounts to defining the so-called identification problem of emergent causality, and the <math>\epsilon</math>-machine is a machine that attempts to discover emergent causality in data.
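
Estimating such transition probabilities from data can be sketched as follows; identifying the state with the current symbol is a toy simplification, not the actual <math>\epsilon</math>-machine reconstruction:

<syntaxhighlight lang="python">
# Empirical transition probabilities T[i, j] between states, estimated from a symbol
# sequence. Treating the current symbol itself as the state is a toy simplification.
from collections import Counter
import random

random.seed(1)
s = [random.randint(0, 1) for _ in range(10_000)]    # i.i.d. fair-coin sequence

pair_counts = Counter(zip(s[:-1], s[1:]))            # counts of (state_i, state_j) pairs
state_counts = Counter(s[:-1])
T = {(i, j): pair_counts[(i, j)] / state_counts[i] for (i, j) in pair_counts}

print(T)   # all four entries are close to 0.5 for this i.i.d. sequence
</syntaxhighlight>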
 
Computational mechanics can prove that the causal states obtained through the <math>\epsilon</math>-machine have three important characteristics: "maximum predictability", "minimum statistical complexity", and "minimum randomness", which verify that it is optimal in a certain sense. In addition, the authors introduce a hierarchical machine reconstruction algorithm that can compute causal states and <math>\epsilon</math>-machines from observational data. Although this algorithm may not be applicable to all scenarios, the authors take chaotic dynamics, hidden Markov models, and cellular automata as examples and give numerical results together with the corresponding machine reconstruction paths.
 
Although the original computational mechanics does not give an explicit definition or quantitative theory of emergence, some researchers later advanced the theory further. Shalizi et al.<ref name="The_calculi_of_emergence"></ref> discussed the relationship between computational mechanics and emergence in their work: if process <math>{\overleftarrow{s}}'</math> has higher prediction efficiency than process <math>\overleftarrow{s}</math>, then emergence occurs in process <math>{\overleftarrow{s}}'</math>. The prediction efficiency <math>e</math> of a process is defined as the ratio of its excess entropy to its statistical complexity (<math>e=\frac{E}{C_{\mu}}</math>); it is a real number between 0 and 1 and can be regarded as the part of the historical memory stored in the process that is useful for prediction. <math>C_{\mu}=0</math> in two cases: when the process is completely uniform and deterministic, and when it is independently and identically distributed. In both cases no interesting predictions are possible, so we set <math>e=0</math>. The authors also explain that emergence can be understood as a dynamical process in which a pattern gains the ability to adapt to different environments.
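
A small sketch of this definition, including the convention <math>e=0</math> when <math>C_{\mu}=0</math>, might look as follows (the numerical values are placeholders, not measured quantities):

<syntaxhighlight lang="python">
# Prediction efficiency e = E / C_mu, with e set to 0 when C_mu = 0 (fully
# deterministic or i.i.d. processes, where no interesting prediction is possible).
def prediction_efficiency(excess_entropy, statistical_complexity):
    if statistical_complexity == 0:
        return 0.0
    return excess_entropy / statistical_complexity

print(prediction_efficiency(0.8, 2.0))   # 0.4
print(prediction_efficiency(0.0, 0.0))   # 0.0 by convention
</syntaxhighlight>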
 
The causal emergence framework has many similarities with computational mechanics: the histories <math>\overleftarrow{s}</math> can be regarded as microscopic states; each <math>R \in \mathcal{R}</math> corresponds to a macroscopic state; the function <math>\eta</math> can be understood as a possible coarse-graining function; the causal state <math>\epsilon \left ( \overleftarrow{s} \right )</math> is a special state that has at least the same predictive power as the microscopic state <math>\overleftarrow{s}</math>, so <math>\epsilon</math> can be understood as an effective coarse-graining strategy; and the causal transition <math>T</math> corresponds to the effective macroscopic dynamics. The characteristic of minimum randomness characterizes the determinism of the macroscopic dynamics and can be measured by effective information in causal emergence.