Changes

475 bytes added, 14:37, 13 October 2020 (Tue)
No edit summary
Line 273:
The term "artificial general intelligence" was used as early as 1997, by Mark Gubrud, in a discussion of the implications of fully automated military production and operations. The term was re-introduced and popularized by Shane Legg and Ben Goertzel around 2002. The research objective is much older: for example, Doug Lenat's Cyc project (which began in 1984) and Allen Newell's Soar project are regarded as within the scope of AGI. Pei Wang and Ben Goertzel described AGI research activity in 2006 as "producing publications and preliminary results". The first AGI summer school was organized in Xiamen, China in 2009 by Xiamen University's Artificial Brain Laboratory and OpenCog. The first university courses were given in 2010 and 2011 at Plovdiv University, Bulgaria, by Todor Arnaudov. MIT presented a course on AGI in 2018, organized by Lex Fridman and featuring a number of guest lecturers. As yet, however, most AI researchers have devoted little attention to AGI, with some claiming that intelligence is too complex to be completely replicated in the near term. Nevertheless, a small number of computer scientists are active in AGI research, and many of them contribute to a series of AGI conferences. The research is extremely diverse and often pioneering in nature. In the introduction to his book, Goertzel says that estimates of the time needed before a truly flexible AGI is built vary from 10 years to over a century, but the consensus in the AGI research community seems to be that the timeline discussed by Ray Kurzweil in The Singularity is Near (i.e., between 2015 and 2045) is plausible.
 
Line 327:
[[File:Estimations of Human Brain Emulation Required Performance.svg|thumb|right|400px|Estimates of how much processing power is needed to emulate a human brain at various levels (from Ray Kurzweil, and [[Anders Sandberg]] and [[Nick Bostrom]]), along with the fastest supercomputer from [[TOP500]] mapped by year. Note the logarithmic scale and exponential trendline, which assumes the computational capacity doubles every 1.1 years. Kurzweil believes that mind uploading will be possible at the level of neural simulation, while the Sandberg–Bostrom report is less certain about where [[consciousness]] arises.{{sfn|Sandberg|Boström|2008}}]] For low-level brain simulation, an extremely powerful computer would be required. The [[human brain]] has a huge number of [[synapses]]. Each of the 10<sup>11</sup> (one hundred billion) [[neurons]] has on average 7,000 synaptic connections (synapses) to other neurons. It has been estimated that the brain of a three-year-old child has about 10<sup>15</sup> synapses (1 quadrillion). This number declines with age, stabilizing by adulthood. Estimates vary for an adult, ranging from 10<sup>14</sup> to 5×10<sup>14</sup> synapses (100 to 500 trillion).{{sfn|Drachman|2005}} An estimate of the brain's processing power, based on a simple switch model for neuron activity, is around 10<sup>14</sup> (100 trillion) synaptic updates per second ([[SUPS]]).{{sfn|Russell|Norvig|2003}} In 1997, Kurzweil looked at various estimates for the hardware required to equal the human brain and adopted a figure of 10<sup>16</sup> computations per second (cps).<ref>In ''Mind Children'', {{Harvnb|Moravec|1988|page=61}} uses 10<sup>15</sup> cps.
More recently, in 1997, {{cite web|url=http://www.transhumanist.com/volume1/moravec.htm |title=Archived copy |accessdate=23 June 2006 |url-status=dead |archiveurl=https://web.archive.org/web/20060615031852/http://transhumanist.com/volume1/moravec.htm |archivedate=15 June 2006 }} Moravec argued for 10<sup>8</sup> MIPS, which would roughly correspond to 10<sup>14</sup> cps. Moravec talks in terms of MIPS, not "cps", which is a non-standard term Kurzweil introduced.</ref> (For comparison, if a "computation" were equivalent to one "[[FLOPS|floating point operation]]" – a measure used to rate current [[supercomputer]]s – then 10<sup>16</sup> "computations" would be equivalent to 10 [[Peta-|petaFLOPS]], [[FLOPS#Performance records|achieved in 2011]].) He used this figure to predict the necessary hardware would be available sometime between 2015 and 2025, if the exponential growth in computer power at the time of writing continued.
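The figures in this paragraph lend themselves to a quick sanity check. The sketch below recomputes the synapse count implied by the per-neuron average and projects the 1.1-year doubling trendline forward; the 1997 baseline of roughly 1.8 teraFLOPS (ASCI Red, the fastest TOP500 machine of that year) is an assumption added here for illustration, not a figure from the text.

```python
import math

# Back-of-envelope check of the figures quoted above.
NEURONS = 1e11                # ~10^11 neurons in the human brain
SYNAPSES_PER_NEURON = 7_000   # ~7,000 synaptic connections per neuron
KURZWEIL_CPS = 1e16           # hardware figure Kurzweil adopted in 1997
DOUBLING_YEARS = 1.1          # trendline: capacity doubles every 1.1 years

# Total synapse count implied by the per-neuron average: 7 x 10^14,
# the same order of magnitude as the quoted adult range of
# 10^14 to 5 x 10^14.
total_synapses = NEURONS * SYNAPSES_PER_NEURON

# Project the trendline forward from an assumed 1997 starting point of
# ~1.8 x 10^12 FLOPS to see when 10^16 "computations" per second
# would be reached.
start_year, start_flops = 1997, 1.8e12
doublings = math.log2(KURZWEIL_CPS / start_flops)
target_year = start_year + DOUBLING_YEARS * doublings  # ~2011
```

Consistently with the text, the projection lands near 2011, the year in which 10 petaFLOPS was first achieved on the TOP500 list.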
 