*''P'' contains (but need not consist entirely of) a directed chain, <math> u \cdots \leftarrow m \leftarrow \cdots v</math> or <math> u \cdots \rightarrow m \rightarrow \cdots v</math>, such that the middle node ''m'' is in ''Z'',
*''P'' contains a fork, <math> u \cdots \leftarrow m \rightarrow \cdots v</math>, such that the middle node ''m'' is in ''Z'', or
*''P'' contains an inverted fork (or collider), <math> u \cdots \rightarrow m \leftarrow \cdots v</math>, such that the middle node ''m'' is not in ''Z'' and no descendant of ''m'' is in ''Z'' (a short sketch of this blocking test follows the list).
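
The three blocking conditions above can be illustrated with a short sketch. The following Python code is only a minimal illustration, assuming a hypothetical representation in which the DAG is stored as a dictionary mapping each node to its set of parents and a path is given as the ordered list of nodes it visits; the function and variable names are invented for the example and are not from the article.

<syntaxhighlight lang="python">
# Sketch of the path-blocking test behind d-separation.
# dag: {node: set of parents}; path: ordered list of nodes on an undirected path.

def descendants(dag, node):
    """All descendants of `node` (children, grandchildren, ...)."""
    children = {v for v in dag if node in dag[v]}
    result = set(children)
    for child in children:
        result |= descendants(dag, child)
    return result

def path_is_blocked(dag, path, Z):
    """Return True if the conditioning set Z blocks the given path."""
    Z = set(Z)
    for i in range(1, len(path) - 1):
        prev, m, nxt = path[i - 1], path[i], path[i + 1]
        arrow_in_from_prev = prev in dag[m]   # edge prev -> m
        arrow_in_from_next = nxt in dag[m]    # edge nxt -> m
        if arrow_in_from_prev and arrow_in_from_next:
            # inverted fork (collider): blocks unless m or a descendant of m is in Z
            if m not in Z and not (descendants(dag, m) & Z):
                return True
        else:
            # directed chain or fork: blocks if the middle node m is in Z
            if m in Z:
                return True
    return False

# Chain A -> B -> C is blocked by conditioning on B.
chain = {"A": set(), "B": {"A"}, "C": {"B"}}
print(path_is_blocked(chain, ["A", "B", "C"], {"B"}))     # True

# Collider A -> B <- C is blocked by the empty set but opened by conditioning on B.
collider = {"A": set(), "C": set(), "B": {"A", "C"}}
print(path_is_blocked(collider, ["A", "B", "C"], set()))  # True
print(path_is_blocked(collider, ["A", "B", "C"], {"B"}))  # False
</syntaxhighlight>
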
X is a Bayesian network with respect to G if, for any two nodes u, v:

Although Bayesian networks are often used to represent causal relationships, this need not be the case: a directed edge from u to v does not require that X<sub>v</sub> be causally dependent on X<sub>u</sub>. This is demonstrated by the fact that Bayesian networks on the graphs:

A causal network is a Bayesian network with the requirement that the relationships be causal. The additional semantics of causal networks specify that if a node X is actively caused to be in a given state x (an action written as do(X&nbsp;=&nbsp;x)), then the probability density function changes to that of the network obtained by cutting the links from the parents of X to X, and setting X to the caused value x. Using these semantics, the impact of external interventions from data obtained prior to intervention can be predicted.

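The "cut the links from the parents" operation can be shown with a small numerical sketch. The example below is hypothetical (a three-node rain/sprinkler/wet-grass network with made-up conditional probability tables, not taken from this section); under do(Sprinkler&nbsp;=&nbsp;true) the factor P(Sprinkler&nbsp;|&nbsp;Rain) is replaced by an indicator on the forced value, which is the intervention semantics described above.

<syntaxhighlight lang="python">
# Hypothetical network: Rain -> Sprinkler, Rain -> WetGrass, Sprinkler -> WetGrass.
P_rain = {True: 0.2, False: 0.8}                       # P(Rain)
P_sprinkler = {True: {True: 0.01, False: 0.99},        # P(Sprinkler | Rain)
               False: {True: 0.4, False: 0.6}}
P_wet = {(True, True): {True: 0.99, False: 0.01},      # P(WetGrass | Sprinkler, Rain)
         (True, False): {True: 0.9, False: 0.1},
         (False, True): {True: 0.8, False: 0.2},
         (False, False): {True: 0.0, False: 1.0}}

def joint(rain, sprinkler, wet, do_sprinkler=None):
    """Joint probability; under do(Sprinkler = s) the Sprinkler factor is
    replaced by an indicator, i.e. the link from its parent Rain is cut."""
    if do_sprinkler is None:
        p_s = P_sprinkler[rain][sprinkler]
    else:
        p_s = 1.0 if sprinkler == do_sprinkler else 0.0
    return P_rain[rain] * p_s * P_wet[(sprinkler, rain)][wet]

# P(WetGrass = true | do(Sprinkler = true)): sum out Rain with Sprinkler clamped.
p = sum(joint(rain, True, True, do_sprinkler=True) for rain in (True, False))
print(round(p, 3))  # 0.918 with these made-up tables
</syntaxhighlight>
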
In 1990, while working at Stanford University on large bioinformatic applications, Cooper proved that exact inference in Bayesian networks is NP-hard.<ref>
{{cite journal | first = Gregory F. | last = Cooper | name-list-format = vanc | title = The Computational Complexity of Probabilistic Inference Using Bayesian Belief Networks | url = https://stat.duke.edu/~sayan/npcomplete.pdf | journal = Artificial Intelligence | volume = 42 | issue = 2–3 | date = 1990 | pages = 393–405 | doi = 10.1016/0004-3702(90)90060-d }}
</ref> This result prompted research on approximation algorithms with the aim of developing a tractable approximation to probabilistic inference. In 1993, Dagum and Luby proved two surprising results on the complexity of approximation of probabilistic inference in Bayesian networks.<ref>
{{cite journal | vauthors = Dagum P, Luby M | author-link1 = Paul Dagum | author-link2 = Michael Luby | title = Approximating probabilistic inference in Bayesian belief networks is NP-hard | journal = Artificial Intelligence | volume = 60 | issue = 1 | date = 1993 | pages = 141–153 | doi = 10.1016/0004-3702(93)90036-b | citeseerx = 10.1.1.333.1586 }}

At about the same time, Roth proved that exact inference in Bayesian networks is in fact #P-complete (and thus as hard as counting the number of satisfying assignments of a conjunctive normal form formula (CNF)) and that approximate inference within a factor 2<sup>n<sup>1−ɛ</sup></sup> for every ɛ > 0, even for Bayesian networks with restricted architecture, is NP-hard.

In practical terms, these complexity results suggested that while Bayesian networks were rich representations for AI and machine learning applications, their use in large real-world applications would need to be tempered by either topological structural constraints, such as naïve Bayes networks, or by restrictions on the conditional probabilities. The bounded variance algorithm was the first provable fast approximation algorithm to efficiently approximate probabilistic inference in Bayesian networks with guarantees on the error approximation. This powerful algorithm required the minor restriction on the conditional probabilities of the Bayesian network to be bounded away from zero and one by 1/p(n) where p(n) was any polynomial on the number of nodes in the network&nbsp;n.

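Written out as a formula (a restatement of the condition in the preceding sentence, not an additional requirement), the restriction is that every conditional probability entry of the network satisfies

<math>\frac{1}{p(n)} \le P\left(X_i = x_i \mid \text{parents}(X_i)\right) \le 1 - \frac{1}{p(n)},</math>

where <math>p(n)</math> is any polynomial in the number of nodes <math>n</math>.
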
* [[Just another Gibbs sampler]] (JAGS) – Open-source alternative to WinBUGS. Uses Gibbs sampling.
* [[OpenBUGS]] – Open-source development of WinBUGS.
* [[SPSS Modeler]] – Commercial software that includes an implementation for Bayesian networks.
* [[Stan (software)]] – Stan is an open-source package for obtaining Bayesian inference using the No-U-Turn sampler (NUTS),<ref>{{Cite document |arxiv = 1111.4246|bibcode = 2011arXiv1111.4246H|last1 = Hoffman|first1 = Matthew D.|last2 = Gelman|first2 = Andrew|title = The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo|year = 2011}}</ref> a variant of Hamiltonian Monte Carlo.
* [[PyMC3]] – A Python library implementing an embedded domain specific language to represent Bayesian networks, and a variety of samplers (including NUTS); see the sketch after this list.
* [[WinBUGS]] – One of the first computational implementations of MCMC samplers. No longer maintained.
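
As an illustration of the embedded modelling language mentioned in the PyMC3 entry above, the following is a minimal hypothetical sketch: a two-node network in which a latent mean ''mu'' is the parent of observed data ''y''. The variable names and data values are invented for the example; pm.sample uses NUTS by default for continuous variables.

<syntaxhighlight lang="python">
# Minimal PyMC3 sketch of a two-node network  mu -> y, sampled with NUTS.
import pymc3 as pm

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)                         # parent node (prior)
    y = pm.Normal("y", mu=mu, sigma=1.0, observed=[4.9, 5.1, 5.3])   # child node (likelihood)
    trace = pm.sample(1000, tune=1000)                               # NUTS is the default sampler

print(pm.summary(trace))
</syntaxhighlight>
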
The term Bayesian network was coined by Judea Pearl in 1985 to emphasize:<ref>{{cite conference |last=Pearl |first=J. | name-list-format = vanc  |authorlink=Judea Pearl |year=1985 |title=Bayesian Networks: A Model of Self-Activated Memory for Evidential Reasoning |conference=Proceedings of the 7th Conference of the Cognitive Science Society, University of California, Irvine, CA
|pages=329&ndash;334 |url=http://ftp.cs.ucla.edu/tech-report/198_-reports/850017.pdf|access-date=2009-05-01 |format=UCLA Technical Report CSD-850017}}</ref>

*the often subjective nature of the input information
*the reliance on Bayes' conditioning as the basis for updating information
*the distinction between causal and evidential modes of reasoning<ref>{{Cite journal | last = Bayes | first = T. | name-list-format = vanc | authorlink = Thomas Bayes | year = 1763 | title = An Essay towards solving a Problem in the Doctrine of Chances | journal = [[Philosophical Transactions of the Royal Society]] | volume = 53 | pages = 370–418 | doi = 10.1098/rstl.1763.0053 | last2 = Price | title-link = An Essay towards solving a Problem in the Doctrine of Chances | doi-access = free }}</ref>