===EI and Φ===
The integrated information (or degree of integration) <math>\Phi</math> can be defined as the minimum value of EI over all bipartitions of a system. Suppose the system is 𝑋 and 𝑆 is a subset of 𝑋 that is partitioned into two parts, 𝐴 and 𝐵; there are causal interactions among 𝐴, 𝐵, and the rest of 𝑋. [[文件:OriginalEI.png|350x350px|Bipartition in integrated information theory|替代=|缩略图|链接=https://wiki.swarma.org/index.php/%E6%96%87%E4%BB%B6:OriginalEI.png]] In this scenario, we can measure the strength of these causal interactions. First, we calculate the EI from 𝐴 to 𝐵: we intervene on 𝐴 so that it follows the maximum entropy distribution, and then measure the mutual information between 𝐴 and 𝐵:
<math>
EI(A\rightarrow B) = I(A^{H^{max}}: B)
</math>

Here, <math>A^{H^{max}}</math> denotes the maximum entropy distribution on 𝐴, i.e., the uniform distribution discussed earlier. Although not written explicitly, this formula implicitly involves a causal mechanism from 𝐴 to 𝐵, namely <math>Pr(B|A)</math>, which remains unchanged under the intervention. Thus, if different states of 𝐴 lead to very different changes in 𝐵, this EI will be high; conversely, if 𝐵 is barely affected no matter how 𝐴 varies, the EI will be low. Clearly, this measure is directional: the EI from 𝐴 to 𝐵 can differ greatly from the EI from 𝐵 to 𝐴. Summing the two directions gives the EI of 𝑆 under a given bipartition:

<math>
EI(A\leftrightarrow B) = EI(A\rightarrow B) + EI(B\rightarrow A)
</math>

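The directed term <math>EI(A\rightarrow B) = I(A^{H^{max}}:B)</math> can be sketched in code for a discrete system whose mechanism is given as a conditional probability matrix <math>Pr(B|A)</math>. The function name and example matrices below are illustrative, not from the original text; <math>EI(B\rightarrow A)</math> is computed analogously from <math>Pr(A|B)</math>, and the two directed terms are summed to obtain the bidirectional EI:

```python
from math import log2

def ei_directed(p_b_given_a):
    """EI(A -> B): the mutual information I(A^{H^max} ; B), i.e. the mutual
    information between A and B when A is intervened on to follow the
    maximum-entropy (uniform) distribution while the mechanism Pr(B|A)
    is held fixed.  p_b_given_a[i][j] = Pr(B = j | do(A = i))."""
    n_a = len(p_b_given_a)
    n_b = len(p_b_given_a[0])
    p_a = 1.0 / n_a                                   # uniform intervention on A
    p_b = [sum(row[j] for row in p_b_given_a) * p_a   # marginal of B
           for j in range(n_b)]
    ei = 0.0
    for row in p_b_given_a:
        for j, p in enumerate(row):
            if p > 0:
                ei += p_a * p * log2(p / p_b[j])      # sum p(a,b) log2 p(b|a)/p(b)
    return ei

# A perfect copy mechanism transmits 1 bit; pure noise transmits none.
print(ei_directed([[1.0, 0.0], [0.0, 1.0]]))  # 1.0
print(ei_directed([[0.5, 0.5], [0.5, 0.5]]))  # 0.0
```

Note that the intervention only replaces the distribution of 𝐴; the mechanism <math>Pr(B|A)</math> enters unchanged, exactly as described above.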
Traversing all bipartitions, if there exists a bipartition of 𝑆 for which EI is 0, then 𝑆 can be regarded as two causally independent parts, and its degree of integration should also be 0. This special case shows that we should focus on the bipartition with the minimum effective information. Of course, different bipartitions give 𝐴 and 𝐵 different state spaces, so a normalization is needed: we divide by the smaller of the maximum entropies of 𝐴 and 𝐵. This yields the minimum information bipartition (MIB), and the degree of integration <math>\Phi</math> is defined as:

<math>
\Phi(S) = EI(MIB(S))
</math>

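The MIB search can be sketched as an exhaustive enumeration of bipartitions. The sketch below assumes binary units (so the maximum entropy of a part equals its size in bits) and a black-box function returning <math>EI(A\leftrightarrow B)</math> for a candidate bipartition; the toy stand-in, in which units 0 and 1 interact while unit 2 is independent, is purely hypothetical:

```python
from itertools import combinations

def find_mib(units, ei_bidir):
    """Enumerate all bipartitions of `units`, score each by EI(A <-> B)
    normalized by min{H_max(A), H_max(B)} (for binary units, the size of
    the smaller part, in bits), and return (A, B, Phi) for the minimum
    information bipartition. Phi is the unnormalized EI of that cut."""
    units = list(units)
    best = None
    for r in range(1, len(units) // 2 + 1):
        for a in combinations(units, r):
            b = tuple(u for u in units if u not in a)
            if len(a) == len(b) and a > b:      # skip mirror-image duplicates
                continue
            score = ei_bidir(a, b) / min(len(a), len(b))
            if best is None or score < best[0]:
                best = (score, a, b, ei_bidir(a, b))
    return best[1], best[2], best[3]

# Hypothetical stand-in: EI is 2 bits only when the cut separates the
# coupled units 0 and 1; unit 2 is causally independent of both.
def toy_ei(a, b):
    return 2.0 if (0 in a) != (1 in a) else 0.0

print(find_mib([0, 1, 2], toy_ei))  # ((2,), (0, 1), 0.0): unit 2 is cut off, Phi = 0
```

Because the independent unit can be severed without losing any effective information, the MIB cuts it off and the system's degree of integration is 0, matching the special case described above.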
This is the relationship between [[integrated information capacity]] and effective information.
===Distinction===
It is worth noting that, unlike the EI computed for a Markov chain, the EI here measures the causal connection between two parts of the same system, whereas the EI of a Markov chain measures the strength of the causal connection between the states of a single system at two successive moments in time.
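To make the contrast concrete: applying the same maximum-entropy intervention to the entire state of a Markov chain at time 𝑡, while holding the transition matrix fixed, yields the temporal EI discussed earlier in this entry. A minimal sketch (the function name is illustrative):

```python
from math import log2

def ei_markov(tpm):
    """Temporal EI of a Markov chain: intervene so that X_t is uniform over
    all n states, keep the transition matrix fixed, and compute the mutual
    information I(X_t ; X_{t+1}).  tpm[i][j] = Pr(X_{t+1} = j | X_t = i)."""
    n = len(tpm)
    p_next = [sum(row[j] for row in tpm) / n for j in range(n)]
    return sum((row[j] / n) * log2(row[j] / p_next[j])
               for row in tpm for j in range(n) if row[j] > 0)

# A deterministic 4-state cycle is maximally informative: EI = log2(4) = 2 bits.
cycle = [[0, 1, 0, 0],
         [0, 0, 1, 0],
         [0, 0, 0, 1],
         [1, 0, 0, 0]]
print(ei_markov(cycle))  # 2.0
```

Here the intervened variable is the whole system state at time 𝑡, not one part of a spatial bipartition, which is exactly the distinction drawn above.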
==EI and Other Causal Measures==
EI is a measure of the strength of the causal connection between the cause and effect variables in a causal mechanism. Before EI was proposed, several other causal measures had already been introduced. What, then, is the relationship between EI and these measures? In fact, as Comolatti and Hoel pointed out in their 2022 article, these causal measures, including EI, can all be expressed uniformly as combinations of two basic elements<ref name=":0">Comolatti, R., & Hoel, E. (2022). Causal emergence is widespread across measures of causation. ''arXiv preprint arXiv:2202.01854''.</ref>. These two basic elements, called "causal primitives", represent the '''sufficiency''' and '''necessity''' of a causal relationship, respectively.