ANNs originally aimed to solve problems in the same way that a [https://en.wikipedia.org/wiki/Human_brain human brain] would. Over time, however, attention shifted to performing specific tasks, leading to deviations from [https://en.wikipedia.org/wiki/Biology biology]. ANNs have been used on a variety of tasks, including [https://en.wikipedia.org/wiki/Computer_vision computer vision], [https://en.wikipedia.org/wiki/Speech_recognition speech recognition], [https://en.wikipedia.org/wiki/Machine_translation machine translation], [https://en.wikipedia.org/wiki/Social_network social network] filtering, [https://en.wikipedia.org/wiki/General_game_playing playing board and video games] and [https://en.wikipedia.org/wiki/Medical_diagnosis medical diagnosis].
== History ==
A common criticism of neural networks, particularly in robotics, is that they require too much training for real-world operation.{{Citation needed|date=November 2014}} Potential solutions include randomly shuffling training examples, using a numerical optimization algorithm that does not take overly large steps when changing the network connections following an example, and grouping examples into so-called mini-batches. Improving the training efficiency and convergence capability has always been an ongoing research area for neural networks. For example, by introducing a recursive least squares algorithm for the [[cerebellar model articulation controller|CMAC]] neural network, the training process converges in a single step.<ref name="Qin1"/>
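The shuffling, small-step, and mini-batch ideas above can be sketched in a few lines. The following toy loop fits a linear model by mini-batch gradient descent; the data, learning rate, and batch size are all arbitrary illustration choices, not taken from any cited work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + 1 plus a little noise.
X = rng.normal(size=(256, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=256)

w, b = 0.0, 0.0     # parameters to learn
lr = 0.1            # a modest step size, as the text suggests
batch_size = 32

for epoch in range(20):
    order = rng.permutation(len(X))            # randomly shuffle examples
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]  # one mini-batch
        err = w * X[idx, 0] + b - y[idx]
        # Gradient of mean squared error with respect to w and b.
        w -= lr * np.mean(err * X[idx, 0])
        b -= lr * np.mean(err)

print(w, b)   # should end up near 2.0 and 1.0
```

Averaging the gradient over a mini-batch rather than a single example is what keeps individual steps from being "too large" in the direction of any one noisy example.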
===Theoretical issues===
No neural network has solved computationally difficult problems such as the [[eight queens puzzle]], the [[travelling salesman problem]], or the [[integer factorization|factoring]] of large integers.
A fundamental objection is that they do not reflect how real neurons function. Back propagation is a critical part of most artificial neural networks, although no such mechanism exists in biological neural networks.<ref>{{cite journal | last1 = Crick | first1 = Francis | year = 1989 | title = The recent excitement about neural networks | journal = Nature | volume = 337 | issue = 6203 | pages = 129–132 | doi = 10.1038/337129a0 | url = http://europepmc.org/abstract/med/2911347 | pmid=2911347| bibcode = 1989Natur.337..129C }}</ref> How information is coded by real neurons is not known. [[Sensory neuron|Sensor neurons]] fire [[action potential]]s more frequently with sensor activation and [[muscle cell]]s pull more strongly when their associated [[motor neuron]]s receive action potentials more frequently.<ref>{{cite journal | last1 = Adrian | first1 = Edward D. | year = 1926 | title = The impulses produced by sensory nerve endings | journal = The Journal of Physiology | volume = 61 | issue = 1 | pages = 49–72 | doi = 10.1113/jphysiol.1926.sp002273 | pmid = 16993776 | pmc = 1514809 | url = http://onlinelibrary.wiley.com/doi/10.1113/jphysiol.1926.sp002273/full }}</ref> Other than the case of relaying information from a sensor neuron to a motor neuron, almost nothing of the principles of how information is handled by biological neural networks is known.
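The rate-coding observation above (stronger sensory activation produces more frequent action potentials) can be illustrated with a toy Poisson-style spike generator. This is a deliberately simplified assumption for illustration only, not a model of how any real neuron is known to code information:

```python
import numpy as np

rng = np.random.default_rng(1)

def spike_count(stimulus, duration_ms=1000, max_rate_hz=100):
    """Toy rate coding: firing rate grows with stimulus intensity.

    `stimulus` lies in [0, 1]; each 1-ms bin spikes independently
    with probability rate/1000 (a hypothetical simplification).
    """
    rate_hz = max_rate_hz * stimulus       # rate-coding assumption
    p_per_ms = rate_hz / 1000.0            # per-millisecond spike probability
    return int(np.sum(rng.random(duration_ms) < p_per_ms))

weak = spike_count(0.2)     # weak sensor activation
strong = spike_count(0.8)   # strong sensor activation
print(weak, strong)         # the stronger stimulus yields more spikes
```

The point of the sketch is only the monotone relationship between stimulus strength and spike count; everything else about biological coding remains, as the text says, largely unknown.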
Although it is true that analyzing what has been learned by an artificial neural network is difficult, it is much easier to do so than to analyze what has been learned by a biological neural network. Furthermore, researchers involved in exploring learning algorithms for neural networks are gradually uncovering general principles that allow a learning machine to be successful. For example, local vs non-local learning and shallow vs deep architecture.<ref>{{cite web|url=http://www.iro.umontreal.ca/~lisa/publications2/index.php/publications/show/4|title=Scaling Learning Algorithms towards {AI} – LISA – Publications – Aigaion 2.0|publisher=}}</ref>
===Hybrid approaches===
Advocates of hybrid models (combining neural networks and symbolic approaches) claim that such a mixture can better capture the mechanisms of the human brain.
==Types==
There are many types of artificial neural networks. The simplest, static types have one or more static components, including the number of units, the number of layers, the unit weights and the [https://en.wikipedia.org/wiki/Topology topology]. Dynamic types allow one or more of these to change during the learning process. The latter are more complicated, but can shorten learning periods and produce better results. Some types allow/require learning to be "supervised" by the operator, while others operate independently. Some types operate purely in hardware, while others are purely software and run on general-purpose computers.
* [[Tensor product network]]
* [[Time delay neural network]] (TDNN)
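A "static" type in the sense described above can be sketched as follows: the layer count, units per layer, and topology are fixed up front, and only the weights would change during learning. The layer sizes and the tanh activation here are arbitrary illustration choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Static components: layers, units per layer, and topology are fixed.
layer_sizes = [4, 8, 3]    # input units, hidden units, output units
weights = [rng.normal(scale=0.5, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    """Feedforward pass through the fixed topology."""
    for W, b in zip(weights, biases):
        x = np.tanh(x @ W + b)    # tanh squashes each unit into (-1, 1)
    return x

out = forward(rng.normal(size=(5, 4)))   # a batch of 5 input vectors
print(out.shape)                         # (5, 3)
```

A dynamic type, by contrast, would allow `layer_sizes` or the connection pattern itself to change as learning proceeds.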
==References==
==Bibliography==
* {{Cite journal| author=Bhadeshia H. K. D. H. | year=1999 |title=Neural Networks in Materials Science | journal=ISIJ International | volume=39 |pages=966–979 | doi=10.2355/isijinternational.39.966 | url=http://www.msm.cam.ac.uk/phase-trans/abstracts/neural.review.pdf| issue=10}}
* {{Cite book|url=https://www.worldcat.org/oclc/33101074|title=Neural networks for pattern recognition|last=Bishop|first=Christopher M.|date=1995|publisher=Clarendon Press|isbn=0198538499|oclc=33101074 }}
* {{Cite book|url=https://www.worldcat.org/oclc/27145760|title=Neural networks for statistical modeling|last1=Smith |first1=Murray|date=1993|publisher=Van Nostrand Reinhold|isbn=0442013108|oclc=27145760}}
* {{Cite book|url=https://www.worldcat.org/oclc/27429729|title=Advanced methods in neural computing|last=Wasserman |first=Philip D.|year=1993|publisher=Van Nostrand Reinhold|isbn=0442004613|oclc=27429729}}
==External links==
*[https://nnplayground.com Interactive visualization of a neural network]
*[http://numerentur.org/redes-neuronales-artificiales/ Neural Networks (in Spanish)]
[[Category:Classification algorithms]]
[[Category:Computational neuroscience]]
[[Category:Market research]]
[[Category:Market segmentation]]
[[Category:Mathematical psychology]]
[[Category:Mathematical and quantitative methods (economics)]]
This article is translated from en.wikipedia.org under the CC 3.0 license.