==Effective Information as the Distribution Difference==
 
In the literature<ref name='tononi_2008'>{{cite journal|author=Giulio Tononi|title=Consciousness as Integrated Information: a Provisional Manifesto|journal=Biol. Bull.|volume=215|pages=216–242|year=2008}}</ref>, the author defines effective information in a different way. This form of effective information depends on the state of the outcome variable [math]Y[/math]: after intervening on [math]X[/math] so that it is uniformly distributed, the resulting outcome variable [math]\tilde{Y}[/math] is observed to take a given value [math]Y_0[/math]. Under this condition, effective information is defined as the [[KL Divergence]] between two probability distributions over the cause variable [math]X[/math]. The first is the prior distribution of [math]X[/math], i.e. the uniform distribution [math]U[/math] on [math]\mathcal{X}[/math]. The second arises from the causal mechanism f from [math]X[/math] to [math]Y[/math], which turns the outcome variable [math]Y[/math] into the intervened variable [math]\tilde{Y}[/math]: given the observation that [math]\tilde{Y}[/math] takes the value [math]Y_0[/math], we can infer in reverse the posterior distribution of [math]\tilde{X}[/math], i.e. [math]P(\tilde{X}|\tilde{Y}=Y_0,f)[/math].
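A minimal numerical sketch of this definition (not from the cited text; the transition matrix <code>tpm</code>, the outcome index <code>y0</code>, and the function name are illustrative assumptions) can make the construction concrete: the uniform intervention prior over [math]X[/math] is inverted through the mechanism by Bayes' rule to obtain [math]P(\tilde{X}|\tilde{Y}=Y_0,f)[/math], and the effective information is the KL divergence from the uniform prior to this posterior.

<syntaxhighlight lang="python">
import numpy as np

def effective_information(tpm, y0):
    """
    Sketch of effective information for observing outcome y0 under a
    hypothetical causal mechanism f given as a transition matrix, where
    tpm[x, y] = P(Y = y | do(X = x)).

    EI(Y = y0) = KL( P(X | Y = y0, f) || U(X) ),
    with U the uniform (maximum-entropy) intervention prior over X.
    """
    n = tpm.shape[0]
    prior = np.full(n, 1.0 / n)                # uniform intervention on X
    likelihood = tpm[:, y0]                    # P(Y = y0 | X = x) for each x
    evidence = np.dot(prior, likelihood)       # P(Y = y0) under the uniform prior
    posterior = prior * likelihood / evidence  # Bayes: P(X = x | Y = y0, f)
    # KL divergence D(posterior || prior), measured in bits
    mask = posterior > 0
    return float(np.sum(posterior[mask] * np.log2(posterior[mask] / prior[mask])))

# Toy mechanism with 4 states of X and 2 states of Y (an assumed example)
tpm = np.array([
    [1.0, 0.0],
    [1.0, 0.0],
    [0.0, 1.0],
    [0.0, 1.0],
])
print(effective_information(tpm, y0=1))  # 1.0 bit
</syntaxhighlight>

In this toy mechanism, observing [math]Y=1[/math] rules out half of the states of [math]X[/math], so the posterior concentrates on two of the four states and its divergence from the uniform prior is exactly 1 bit.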
    
The difference between this prior probability distribution and the posterior probability distribution is the effective information generated by the causal mechanism f, and can be defined as: