== History ==

The concept of reservoir computing stems from the use of recursive connections within neural networks to create a complex dynamical system (Benjamin Schrauwen, David Verstraeten and Jan Van Campenhout, "An overview of reservoir computing: theory, applications and implementations", Proceedings of the European Symposium on Artificial Neural Networks ESANN 2007, pp. 471–482). It is a generalisation of earlier neural network architectures such as recurrent neural networks, liquid-state machines and echo-state networks. Reservoir computing also extends to physical systems that are not networks in the classical sense, but rather continuous systems in space and/or time: e.g. a literal "bucket of water" can serve as a reservoir that performs computations on inputs given as perturbations of the surface. The resulting complexity of such recurrent neural networks proved useful for solving a variety of problems, including language processing and dynamical-system modelling. However, training recurrent neural networks is challenging and computationally expensive. Reservoir computing avoids those training-related challenges by fixing the dynamics of the reservoir and training only the linear output layer.
A large variety of nonlinear dynamical systems can serve as a reservoir that performs computations. In recent years, semiconductor lasers have attracted considerable interest, as computation can be fast and energy-efficient compared with electrical components.
 
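The training scheme described above — a fixed, randomly connected nonlinear reservoir whose states are read out by a single trained linear layer — can be sketched with a minimal echo-state network. Everything here is illustrative: the sine-prediction task, the reservoir size, the spectral-radius scaling of 0.9, and the ridge-regression readout are typical but arbitrary choices, not a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.1 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

n_res = 100                                  # reservoir size (arbitrary choice)
W_in = rng.uniform(-0.5, 0.5, n_res)         # fixed random input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))     # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Drive the fixed reservoir and collect its states; the reservoir
# weights themselves are never trained.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * inputs[t])
    states[t] = x

# Discard an initial transient ("washout"), then train only the
# linear readout, here via ridge regression.
washout = 50
S, y = states[washout:], targets[washout:]
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

mse = np.mean((S @ W_out - y) ** 2)
print("training MSE:", mse)
```

Because only `W_out` is fitted, training reduces to a single linear least-squares solve over the collected states, which is what makes the approach cheap compared with backpropagation through time.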