{{Short description|A type of recurrent neural network with random and non-trainable internal structure}}

'''Reservoir computing''' is a framework for computation derived from [[recurrent neural network]] theory that maps input signals into higher dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir.<ref name=":4">{{Cite journal|last1=Tanaka|first1=Gouhei|last2=Yamane|first2=Toshiyuki|last3=Héroux|first3=Jean Benoit|last4=Nakane|first4=Ryosho|last5=Kanazawa|first5=Naoki|last6=Takeda|first6=Seiji|last7=Numata|first7=Hidetoshi|last8=Nakano|first8=Daiju|last9=Hirose|first9=Akira|title=Recent advances in physical reservoir computing: A review|journal=Neural Networks|volume=115|pages=100–123|doi=10.1016/j.neunet.2019.03.005|pmid=30981085|issn=0893-6080|year=2019|doi-access=free}}</ref> After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output.<ref name=":4" /> The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed.<ref name=":4" /> The second is that the computational power of naturally available systems, both classical and quantum mechanical, can be used to reduce the effective computational cost.<ref name=":6">{{Cite journal|last1=Röhm|first1=André|last2=Lüdge|first2=Kathy|date=2018-08-03|title=Multiplexed networks: reservoir computing with virtual and real nodes|journal=Journal of Physics Communications|volume=2|issue=8|pages=085007|bibcode=2018JPhCo...2h5007R|doi=10.1088/2399-6528/aad56d|issn=2399-6528|doi-access=free}}</ref>

== History ==
The concept of reservoir computing stems from the use of recursive connections within [[neural network]]s to create a complex dynamical system.<ref name=":0">[[Benjamin Schrauwen|Schrauwen, Benjamin]], [[David Verstraeten]], and [[Jan Van Campenhout]]. "An overview of reservoir computing: theory, applications, and implementations." Proceedings of the European Symposium on Artificial Neural Networks ESANN 2007, pp. 471–482.</ref> It is a generalisation of earlier neural network architectures such as recurrent neural networks, [[Liquid state machine|liquid-state machines]] and [[Echo state network|echo-state networks]]. Reservoir computing also extends to physical systems that are not networks in the classical sense, but rather continuous systems in space and/or time: e.g. a literal "bucket of water" can serve as a reservoir that performs computations on inputs given as perturbations of the surface.<ref name=":9">{{Cite book|last1=Fernando|first1=C.|last2=Sojakka|first2=Sampsa|title=Advances in Artificial Life|chapter=Pattern Recognition in a Bucket|date=2003|url=https://www.semanticscholar.org/paper/Pattern-Recognition-in-a-Bucket-Fernando-Sojakka/af342af4d0e674aef3bced5fd90875c6f2e04abc|series=Lecture Notes in Computer Science|volume=2801|pages=588–597|doi=10.1007/978-3-540-39432-7_63|isbn=978-3-540-20057-4|s2cid=15073928}}</ref> The resultant complexity of such recurrent neural networks was found to be useful in solving a variety of problems including language processing and dynamic system modeling.<ref name=":0" /> However, training of recurrent neural networks is challenging and computationally expensive.<ref name=":0" /> Reservoir computing reduces those training-related challenges by fixing the dynamics of the reservoir and only training the linear output layer.<ref name=":0" />

A large variety of nonlinear dynamical systems can serve as a reservoir that performs computations. In recent years semiconductor lasers have attracted considerable interest, as computation can be fast and energy efficient compared to electrical components.

Recent advances in both AI and quantum information theory have given rise to the concept of quantum neural networks.<ref name=":2" /> These hold promise in quantum information processing, which is challenging to classical networks, but can also find application in solving classical problems.<ref name=":2" /><ref name=":3" /> In 2018, a physical realization of a quantum reservoir computing architecture was demonstrated in the form of nuclear spins within a molecular solid.<ref name=":3" /> However, the nuclear spin experiments in<ref name=":3" /> did not demonstrate quantum reservoir computing per se, as they did not involve processing of sequential data. Rather, the data were vector inputs, which makes this more accurately a demonstration of a quantum implementation of a random kitchen sink<ref name="RB08" /> algorithm (also going by the name of extreme learning machines in some communities). In 2019, another possible implementation of quantum reservoir processors was proposed in the form of two-dimensional fermionic lattices.<ref name=":3" /> In 2020, realization of reservoir computing on gate-based quantum computers was proposed and demonstrated on cloud-based IBM superconducting near-term quantum computers.<ref name="JNY20" />

Reservoir computers have been used for [[Time series|time-series]] analysis purposes. In particular, some of their usages involve [[Chaos theory|chaotic]] [[Time series|time-series]] prediction,<ref name=":10">{{Cite journal|last1=Pathak|first1=Jaideep|last2=Hunt|first2=Brian|last3=Girvan|first3=Michelle|last4=Lu|first4=Zhixin|last5=Ott|first5=Edward|date=2018-01-12|title=Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach|journal=Physical Review Letters|volume=120|issue=2|pages=024102|doi=10.1103/PhysRevLett.120.024102|pmid=29376715|bibcode=2018PhRvL.120b4102P|doi-access=free}}</ref><ref name=":11">{{Cite journal|last1=Vlachas|first1=P.R.|last2=Pathak|first2=J.|last3=Hunt|first3=B.R.|last4=Sapsis|first4=T.P.|last5=Girvan|first5=M.|last6=Ott|first6=E.|last7=Koumoutsakos|first7=P.|date=2020-03-21|title=Backpropagation algorithms and Reservoir Computing in Recurrent Neural Networks for the forecasting of complex spatiotemporal dynamics|url=http://dx.doi.org/10.1016/j.neunet.2020.02.016|journal=Neural Networks|volume=126|pages=191–217|doi=10.1016/j.neunet.2020.02.016|pmid=32248008|issn=0893-6080|arxiv=1910.05266|s2cid=211146609}}</ref> separation of [[Chaos theory|chaotic]] signals,<ref name=":12">{{Cite journal|last1=Krishnagopal|first1=Sanjukta|last2=Girvan|first2=Michelle|last3=Ott|first3=Edward|last4=Hunt|first4=Brian R.|date=2020-02-01|title=Separation of chaotic signals by reservoir computing|url=https://aip.scitation.org/doi/10.1063/1.5132766|journal=Chaos: An Interdisciplinary Journal of Nonlinear Science|volume=30|issue=2|pages=023123|doi=10.1063/1.5132766|pmid=32113243|issn=1054-1500|arxiv=1910.10080|bibcode=2020Chaos..30b3123K|s2cid=204823815}}</ref> and link inference of [[Network theory|networks]] from their dynamics.<ref name=":13">{{Cite journal|last1=Banerjee|first1=Amitava|last2=Hart|first2=Joseph D.|last3=Roy|first3=Rajarshi|last4=Ott|first4=Edward|date=2021-07-20|title=Machine Learning Link Inference of Noisy Delay-Coupled Networks with Optoelectronic Experimental Tests|journal=Physical Review X|volume=11|issue=3|pages=031014|doi=10.1103/PhysRevX.11.031014|arxiv=2010.15289|bibcode=2021PhRvX..11c1014B|doi-access=free}}</ref>

== Classical reservoir computing ==
The 'reservoir' in reservoir computing is the internal structure of the computer, and must have two properties: it must be made up of individual, non-linear units, and it must be capable of storing information. The non-linearity describes the response of each unit to input, which is what allows reservoir computers to solve complex problems. Reservoirs are able to store information by connecting the units in recurrent loops, where the previous input affects the next response. The change in reaction due to the past allows the computers to be trained to complete specific tasks.<ref name=":1" />

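The recurrent update described above can be sketched as a small virtual reservoir in NumPy. The unit count, weight ranges, and spectral-radius scaling below are illustrative assumptions rather than values prescribed here; only the fixed random connections, the tanh non-linearity, and the recurrent loop reflect the scheme described in this section.

```python
# Minimal sketch of a virtual reservoir (echo-state style).
# Assumptions: 100 tanh units, uniform random weights, spectral
# radius scaled to 0.9 -- all illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n_units, n_inputs = 100, 1

# Fixed random connections: these are never trained; only the
# readout layer would be trained in reservoir computing.
W_in = rng.uniform(-0.5, 0.5, (n_units, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_units, n_units))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep dynamics stable

def step(state, u):
    """One update: the previous state feeds back in through the
    recurrent loop, so past inputs influence the current response."""
    return np.tanh(W @ state + W_in @ u)

state = np.zeros(n_units)
for t in range(10):
    u = np.array([np.sin(0.2 * t)])  # example scalar input signal
    state = step(state, u)
```

Because the previous state enters each update, the state at any time step depends on the whole input history, which is the information-storage property the text describes.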
Reservoirs can be virtual or physical.<ref name=":1" /> Virtual reservoirs are typically randomly generated and are designed like neural networks.<ref name=":1" /><ref name=":0" /> Virtual reservoirs can be designed to have non-linearity and recurrent loops, but, unlike neural networks, the connections between units are randomized and remain unchanged throughout computation.<ref name=":1" /> Physical reservoirs are possible because of the inherent non-linearity of certain natural systems. The interaction between ripples on the surface of water contains the nonlinear dynamics required in reservoir creation, and a pattern recognition RC was developed by first inputting ripples with electric motors, then recording and analyzing the ripples in the readout.<ref name=":4" />

=== Readout ===
The readout is a neural network layer that performs a linear transformation on the output of the reservoir.<ref name=":4" /> The weights of the readout layer are trained by analyzing the spatiotemporal patterns of the reservoir after excitation by known inputs, and by utilizing a training method such as linear regression or ridge regression.<ref name=":4" /> As its implementation depends on spatiotemporal reservoir patterns, the details of readout methods are tailored to each type of reservoir.<ref name=":4" /> For example, the readout for a reservoir computer using a container of liquid as its reservoir might entail observing spatiotemporal patterns on the surface of the liquid.<ref name=":4" />

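Training the readout with ridge regression, as mentioned above, can be sketched as follows. The reservoir states here are stand-in random values for illustration; in practice they would be collected from an actual reservoir driven by known inputs, and the sizes and regularization strength are illustrative assumptions.

```python
# Sketch of training a linear readout via ridge regression.
# X holds one reservoir state per row (stand-in random values here);
# y is the desired output (teacher signal).  Only W_out is trained;
# the reservoir itself stays fixed.
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_units = 200, 50

X = np.tanh(rng.standard_normal((n_steps, n_units)))  # collected states
y = rng.standard_normal(n_steps)                      # target outputs

# Ridge regression: W_out = (X^T X + beta I)^-1 X^T y
beta = 1e-6  # small regularization, an illustrative choice
W_out = np.linalg.solve(X.T @ X + beta * np.eye(n_units), X.T @ y)

# The trained readout is then just a linear transformation of the
# reservoir output, as described in the text.
y_hat = X @ W_out
```

The regularization term ''beta'' keeps the solve well-conditioned when reservoir states are strongly correlated, which is why ridge regression is often preferred over plain linear regression for this step.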
=== Types ===