Significant ethical limitations may be placed on neuromorphic engineering due to public perception.<ref name=":27">{{Cite report|url=https://ai100.stanford.edu/sites/g/files/sbiybj9861/f/ai_100_report_0831fnl.pdf|title=Artificial Intelligence and Life in 2030|author=2015 Study Panel|date=September 2016|work=One Hundred Year Study on Artificial Intelligence (AI100)|publisher=Stanford University}}</ref> Special [[Eurobarometer]] 382: Public Attitudes Towards Robots, a survey conducted by the European Commission, found that 60% of [[European Union]] citizens wanted a ban on robots in the care of children, the elderly, or the disabled. Furthermore, 34% were in favor of a ban on robots in education, 27% in healthcare, and 20% in leisure. The European Commission classifies these areas as notably “human.” The report cites increased public concern over robots that are able to mimic or replicate human functions. Neuromorphic engineering, by definition, is designed to replicate the function of the human brain.<ref name=":1">{{Cite web|url=http://ec.europa.eu/commfrontoffice/publicopinion/archives/ebs/ebs_382_en.pdf|title=Special Eurobarometer 382: Public Attitudes Towards Robots|last=European Commission|date=September 2012|website=European Commission}}</ref>
 
The democratic concerns surrounding neuromorphic engineering are likely to become even more profound in the future. The European Commission found that EU citizens between the ages of 15 and 24 are more likely to think of robots as human-like (as opposed to instrument-like) than EU citizens over the age of 55. When presented with an image of a robot defined as human-like, 75% of EU citizens aged 15–24 said it corresponded with the idea they had of robots, while only 57% of EU citizens over the age of 55 responded the same way. The human-like nature of neuromorphic systems could therefore place them in the categories of robots that many EU citizens would like to see banned in the future.<ref name=":1" />
 
=== Personhood ===
 
 
As neuromorphic systems have become increasingly advanced, some scholars{{who|date=August 2021}} have advocated for granting [[personhood]] rights to these systems. If the brain is what grants humans their personhood, to what extent does a neuromorphic system have to mimic the human brain to be granted personhood rights? Critics of technology development in the [[Human Brain Project]], which aims to advance brain-inspired computing, have argued that advancement in neuromorphic computing could lead to machine consciousness or personhood.<ref name=":28">{{Cite journal|last=Aicardi|first=Christine|date=September 2018|title=Accompanying technology development in the Human Brain Project: From foresight to ethics management|journal=Futures|volume=102|pages=114–124|doi=10.1016/j.futures.2018.01.005|doi-access=free}}</ref> If these systems are to be treated as people, critics argue, then many tasks humans perform using neuromorphic systems, including the act of terminating them, may be morally impermissible, as these acts would violate their autonomy.<ref name=":29">{{Cite journal|last=Lim|first=Daniel|date=2014-06-01|title=Brain simulation and personhood: a concern with the Human Brain Project|journal=Ethics and Information Technology|language=en|volume=16|issue=2|pages=77–89|doi=10.1007/s10676-013-9330-5|s2cid=17415814|issn=1572-8439}}</ref>
 
The [[Joint Artificial Intelligence Center]] (JAIC), a branch of the U.S. military, is dedicated to the procurement and implementation of AI software and neuromorphic hardware for combat use. Specific applications include smart headsets/goggles and robots. JAIC intends to rely heavily on neuromorphic technology to connect "every fighter every shooter" within a network of neuromorphic-enabled units.
 
Skeptics have argued that there is no way to legally apply electronic personhood, the concept of personhood that would apply to neuromorphic technology. In a letter signed by 285 experts in law, robotics, medicine, and ethics opposing a European Commission proposal to recognize “smart robots” as legal persons, the authors write, “A legal status for a robot can’t derive from the [[Natural person|Natural Person]] model, since the robot would then hold [[human rights]], such as the right to dignity, the right to its integrity, the right to remuneration or the right to citizenship, thus directly confronting the Human rights. This would be in contradiction with the [[Charter of Fundamental Rights of the European Union]] and the [[Convention for the Protection of Human Rights and Fundamental Freedoms]].”<ref name=":30">{{Cite web|url=http://www.robotics-openletter.eu/|title=Robotics Openletter {{!}} Open letter to the European Commission|language=fr-FR|access-date=2019-05-10}}</ref>
 
    
=== Ownership and property rights ===
 