==Inference complexity and approximation algorithms==
In 1990, while working at Stanford University on large bioinformatic applications, Cooper proved that exact inference in Bayesian networks is [[NP-hard]].<ref>
{{cite journal | first = Gregory F. | last = Cooper | name-list-style = vanc | title = The Computational Complexity of Probabilistic Inference Using Bayesian Belief Networks | url = https://stat.duke.edu/~sayan/npcomplete.pdf | journal = Artificial Intelligence | volume = 42 | issue = 2–3 | date = 1990 | pages = 393–405 | doi = 10.1016/0004-3702(90)90060-d }}
</ref> This result prompted research on approximation algorithms with the aim of developing a tractable approximation to probabilistic inference. In 1993, Dagum and [[Michael Luby|Luby]] proved two surprising results on the complexity of approximation of probabilistic inference in Bayesian networks.<ref>
{{cite journal | vauthors = Dagum P, Luby M | author-link1 = Paul Dagum | author-link2 = Michael Luby | title = Approximating probabilistic inference in Bayesian belief networks is NP-hard | journal = Artificial Intelligence | volume = 60 | issue = 1 | date = 1993 | pages = 141–153 | doi = 10.1016/0004-3702(93)90036-b | citeseerx = 10.1.1.333.1586 }}
</ref> First, they proved that no tractable [[deterministic algorithm]] can approximate probabilistic inference to within an [[absolute error]] ''ɛ'' < 1/2. Second, they proved that no tractable [[randomized algorithm]] can approximate probabilistic inference to within an absolute error ''ɛ'' < 1/2 with confidence probability greater than 1/2.
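The notion of "approximating inference to within an absolute error ''ɛ''" can be made concrete with a toy example. The sketch below is illustrative only: the three-node chain A → B → C, its conditional probability values, and the function names are invented for this example, not taken from the cited papers. It estimates the marginal P(C=1) by forward (ancestral) sampling and compares the estimate against the exact value obtained by brute-force enumeration. Sampling happens to do well here because the network is tiny and none of its probabilities are extreme; the Dagum–Luby results say that no polynomial-time algorithm, deterministic or randomized, can guarantee absolute error below 1/2 on ''all'' networks.

```python
import random
from itertools import product

# Toy Bayesian network A -> B -> C with binary nodes.
# CPT entries are arbitrary illustrative numbers.
P_A = 0.3                      # P(A=1)
P_B = {0: 0.2, 1: 0.7}        # P(B=1 | A=a)
P_C = {0: 0.1, 1: 0.8}        # P(C=1 | B=b)

def exact_p_c1():
    """Exact P(C=1) by enumerating all 2^3 joint states."""
    total = 0.0
    for a, b, c in product([0, 1], repeat=3):
        pa = P_A if a else 1 - P_A
        pb = P_B[a] if b else 1 - P_B[a]
        pc = P_C[b] if c else 1 - P_C[b]
        if c == 1:
            total += pa * pb * pc
    return total

def sampled_p_c1(n_samples, rng):
    """Monte Carlo estimate of P(C=1) by forward sampling the chain."""
    hits = 0
    for _ in range(n_samples):
        a = rng.random() < P_A
        b = rng.random() < P_B[a]
        c = rng.random() < P_C[b]
        hits += c
    return hits / n_samples

exact = exact_p_c1()                            # 0.345 for these CPTs
est = sampled_p_c1(100_000, random.Random(0))
print(abs(est - exact))  # absolute error; small on this easy instance
```

With 100,000 samples the standard error of the estimate is about 0.0015, so the absolute error is far below 1/2 on this instance; the hardness results concern worst-case guarantees, not easy cases like this one.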
At about the same time, Roth proved that exact inference in Bayesian networks is in fact #P-complete (and thus as hard as counting the number of satisfying assignments of a conjunctive normal form formula (CNF)) and that approximate inference within a factor 2<sup>n<sup>1−ɛ</sup></sup> for every ɛ > 0, even for Bayesian networks with restricted architecture, is NP-hard.
In practical terms, these complexity results suggested that while Bayesian networks were rich representations for AI and machine learning applications, their use in large real-world applications would need to be tempered by either topological structural constraints, such as naïve Bayes networks, or by restrictions on the conditional probabilities. The bounded variance algorithm was the first provably fast approximation algorithm to efficiently approximate probabilistic inference in Bayesian networks with guarantees on the approximation error. This powerful algorithm required the minor restriction that the conditional probabilities of the Bayesian network be bounded away from zero and one by 1/p(n), where p(n) was any polynomial in the number of nodes in the network, n.
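As a sketch of the kind of restriction involved, the condition "bounded away from zero and one by 1/p(n)" can be checked directly on a network's conditional probability tables. The function name and the choice p(n) = n² below are hypothetical, chosen for illustration only:

```python
def satisfies_bounded_variance_condition(cpts, n_nodes, p=lambda n: n ** 2):
    """Check the local restriction described above: every conditional
    probability must lie in [1/p(n), 1 - 1/p(n)], i.e. be bounded away
    from 0 and 1 by 1/p(n), where p is a polynomial in the number of
    nodes n.  (p(n) = n**2 is an arbitrary illustrative choice.)
    """
    eps = 1.0 / p(n_nodes)
    return all(eps <= q <= 1.0 - eps for cpt in cpts for q in cpt)

# Hypothetical CPT entries for a 3-node network; each inner list holds
# the P(X=1 | parents) entries of one node's table.  With n = 3 and
# p(n) = 9, the allowed interval is [1/9, 8/9].
cpts_ok = [[0.3], [0.2, 0.7], [0.15, 0.8]]   # all within [1/9, 8/9]
cpts_bad = [[0.3], [0.2, 0.7], [0.0, 0.8]]   # 0.0 touches the boundary

print(satisfies_bounded_variance_condition(cpts_ok, 3))   # True
print(satisfies_bounded_variance_condition(cpts_bad, 3))  # False
```

A network failing the check is not necessarily hard to reason about; the bound is simply the sufficient condition under which the algorithm's error guarantees were proved.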