A large variety of nonlinear dynamical systems can serve as a reservoir that performs computations. In recent years, semiconductor lasers have attracted considerable interest because computation with them can be fast and energy-efficient compared to electrical components.
Recent advances in both AI and quantum information theory have given rise to the concept of [[quantum neural networks]].<ref name=":2" /> These hold promise in quantum information processing, which is challenging for classical networks, but can also find application in solving classical problems.<ref name=":2">{{Cite journal|last1=Ghosh|first1=Sanjib|last2=Opala|first2=Andrzej|last3=Matuszewski|first3=Michał|last4=Paterek|first4=Tomasz|last5=Liew|first5=Timothy C. H.|date=December 2019|title=Quantum reservoir processing|arxiv=1811.10335|journal=NPJ Quantum Information|volume=5|issue=1|pages=35|doi=10.1038/s41534-019-0149-8|bibcode=2019npjQI...5...35G|s2cid=119197635|issn=2056-6387}}</ref><ref name=":3">{{cite arXiv|last1=Negoro|first1=Makoto|last2=Mitarai|first2=Kosuke|last3=Fujii|first3=Keisuke|last4=Nakajima|first4=Kohei|last5=Kitagawa|first5=Masahiro|date=2018-06-28|title=Machine learning with controllable quantum dynamics of a nuclear spin ensemble in a solid|class=quant-ph|eprint=1806.10910}}</ref> In 2018, a physical realization of a quantum reservoir computing architecture was demonstrated in the form of nuclear spins within a molecular solid.<ref name=":3" /> However, these nuclear spin experiments<ref name=":3" /> did not demonstrate quantum reservoir computing per se, as they did not involve processing of sequential data. Rather, the data were vector inputs, which makes this more accurately a demonstration of a quantum implementation of a [[random kitchen sink]]<ref name="RB08">{{cite journal|last1=Rahimi|first1=Ali|last2=Recht|first2=Benjamin|date=December 2008|title=Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in Learning|journal=NIPS'08: Proceedings of the 21st International Conference on Neural Information Processing Systems|url=http://papers.nips.cc/paper/3495-weighted-sums-of-random-kitchen-sinks-replacing-minimization-with-randomization-in-learning.pdf|pages=1313–1320}}</ref> algorithm (also known as an [[extreme learning machine]] in some communities). In 2019, another possible implementation of quantum reservoir processors was proposed in the form of two-dimensional fermionic lattices.<ref name=":2" /> In 2020, realization of reservoir computing on gate-based quantum computers was proposed and demonstrated on cloud-based IBM superconducting near-term quantum computers.<ref name="JNY20">{{cite journal|last1=Chen|first1=Jiayin|last2=Nurdin|first2=Hendra|last3=Yamamoto|first3=Naoki|title=Temporal Information Processing on Noisy Quantum Computers|journal=Physical Review Applied|volume=14|pages=024065|date=2020-08-24|issue=2|doi=10.1103/PhysRevApplied.14.024065|arxiv=2001.09498|bibcode=2020PhRvP..14b4065C|s2cid=210920543|url=https://doi.org/10.1103/PhysRevApplied.14.024065}}</ref>
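The distinction matters: a random kitchen sink maps a single input vector through fixed random nonlinear features and trains only a linear fit, with no recurrence and hence no memory of past inputs. The following is a minimal sketch of that idea using random Fourier-style features; all sizes, constants, and the toy target are illustrative assumptions, not taken from the cited papers.

<syntaxhighlight lang="python">
import numpy as np

# Random kitchen sinks sketch: fixed random nonlinear features of vector
# (not sequential) inputs, followed by a trained linear readout.
# Every constant below is an illustrative assumption.
rng = np.random.default_rng(0)
d, D, N = 5, 300, 1000                     # input dim, feature count, samples

X = rng.normal(size=(N, d))                # vector inputs, no temporal order
y = np.sin(X.sum(axis=1))                  # toy regression target

Omega = rng.normal(size=(d, D))            # fixed random projection
b = rng.uniform(0, 2 * np.pi, D)           # fixed random phases
Z = np.cos(X @ Omega + b)                  # random Fourier-style features

# Only the linear readout is fit; the random features never change.
w, *_ = np.linalg.lstsq(Z, y, rcond=None)
print("train MSE:", np.mean((Z @ w - y) ** 2))
</syntaxhighlight>

Because nothing here feeds back into the features, the model has no state: unlike a reservoir, it cannot carry information from one input to the next.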
Reservoir computers have been used for [[Time series|time-series]] analysis. In particular, they have been applied to [[Chaos theory|chaotic]] [[Time series|time-series]] prediction,<ref>{{Cite journal|last1=Pathak|first1=Jaideep|last2=Hunt|first2=Brian|last3=Girvan|first3=Michelle|last4=Lu|first4=Zhixin|last5=Ott|first5=Edward|date=2018-01-12|title=Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach|journal=Physical Review Letters|volume=120|issue=2|pages=024102|doi=10.1103/PhysRevLett.120.024102|pmid=29376715|bibcode=2018PhRvL.120b4102P|doi-access=free}}</ref><ref>{{Cite journal|last1=Vlachas|first1=P.R.|last2=Pathak|first2=J.|last3=Hunt|first3=B.R.|last4=Sapsis|first4=T.P.|last5=Girvan|first5=M.|last6=Ott|first6=E.|last7=Koumoutsakos|first7=P.|date=2020-03-21|title=Backpropagation algorithms and Reservoir Computing in Recurrent Neural Networks for the forecasting of complex spatiotemporal dynamics|url=http://dx.doi.org/10.1016/j.neunet.2020.02.016|journal=Neural Networks|volume=126|pages=191–217|doi=10.1016/j.neunet.2020.02.016|pmid=32248008|issn=0893-6080|arxiv=1910.05266|s2cid=211146609}}</ref> separation of [[Chaos theory|chaotic]] signals,<ref>{{Cite journal|last1=Krishnagopal|first1=Sanjukta|last2=Girvan|first2=Michelle|last3=Ott|first3=Edward|last4=Hunt|first4=Brian R.|date=2020-02-01|title=Separation of chaotic signals by reservoir computing|url=https://aip.scitation.org/doi/10.1063/1.5132766|journal=Chaos: An Interdisciplinary Journal of Nonlinear Science|volume=30|issue=2|pages=023123|doi=10.1063/1.5132766|pmid=32113243|issn=1054-1500|arxiv=1910.10080|bibcode=2020Chaos..30b3123K|s2cid=204823815}}</ref> and link inference of [[Network theory|networks]] from their dynamics.<ref>{{Cite journal|last1=Banerjee|first1=Amitava|last2=Hart|first2=Joseph D.|last3=Roy|first3=Rajarshi|last4=Ott|first4=Edward|date=2021-07-20|title=Machine Learning Link Inference of Noisy Delay-Coupled Networks with Optoelectronic Experimental Tests|journal=Physical Review X|volume=11|issue=3|pages=031014|doi=10.1103/PhysRevX.11.031014|arxiv=2010.15289|bibcode=2021PhRvX..11c1014B|doi-access=free}}</ref>
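As a toy illustration of the time-series use case, the sketch below drives a fixed random reservoir with a chaotic signal (the logistic map) and fits a linear readout for one-step-ahead prediction. The reservoir size, spectral-radius heuristic, and all other constants are illustrative assumptions rather than values from the cited studies.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
n = 200  # number of reservoir units (illustrative)

# Chaotic input: logistic map in its chaotic regime (r = 3.9).
u = np.empty(1000)
u[0] = 0.4
for t in range(999):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

# Fixed random weights; only the readout below is ever trained.
W_in = rng.uniform(-0.5, 0.5, n)
W = rng.uniform(-0.5, 0.5, (n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # keep spectral radius < 1 (heuristic)

# Run the reservoir over the series and record its states.
x = np.zeros(n)
states = []
for t in range(999):
    x = np.tanh(W @ x + W_in * u[t])
    states.append(x.copy())

# Discard a warm-up transient, then fit a one-step-ahead linear readout.
X, y = np.array(states[100:]), u[101:]
w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
print("train MSE:", np.mean((X @ w_out - y) ** 2))
</syntaxhighlight>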
== Classical reservoir computing ==
The 'reservoir' in reservoir computing is the internal structure of the computer, and it must have two properties: it must be made up of individual, non-linear units, and it must be capable of storing information. The non-linearity describes each unit's response to input, which is what allows reservoir computers to solve complex problems. Reservoirs store information by connecting the units in recurrent loops, in which the previous input affects the next response. Because past inputs change the reservoir's response, the computer can be trained to complete specific tasks.
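Both properties appear explicitly in the widely used echo state network formulation, given here as an illustrative sketch (the <math>\tanh</math> nonlinearity is one common choice, not the only one). The reservoir state <math>\mathbf{x}(t)</math> evolves as

:<math>\mathbf{x}(t+1) = \tanh\big(W\,\mathbf{x}(t) + W_\text{in}\,\mathbf{u}(t)\big),</math>

where <math>\mathbf{u}(t)</math> is the input, the fixed random matrix <math>W</math> implements the recurrent loops that store information, and the <math>\tanh</math> supplies the non-linearity; only a linear readout <math>\mathbf{y}(t) = W_\text{out}\,\mathbf{x}(t)</math> is trained.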
Reservoirs can be virtual or physical.<ref name=":1" /> Virtual reservoirs are typically randomly generated and are designed like neural networks.<ref name=":1" /><ref name=":0" /> Virtual reservoirs can be designed to have non-linearity and recurrent loops, but, unlike neural networks, the connections between units are randomized and remain unchanged throughout computation.<ref name=":1" /> Physical reservoirs are possible because of the inherent non-linearity of certain natural systems. For example, the interaction between ripples on the surface of water contains the nonlinear dynamics required for a reservoir, and a pattern-recognition RC was developed by first generating ripples with electric motors and then recording and analyzing the ripples in the readout.<ref name=":4" />
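In either case, the reservoir itself stays fixed and only the readout is fit to the recorded states. A minimal ridge-regression sketch of that step follows; the function name and regularization strength are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np

# Fit only the linear readout: map recorded reservoir states to targets.
# Works the same whether the states came from a simulated (virtual) reservoir
# or from measurements of a physical one. The ridge value is illustrative.
def train_readout(states, targets, ridge=1e-6):
    """Solve for W_out so that W_out @ x(t) approximates y(t),
    using Tikhonov (ridge) regularization for numerical stability."""
    X = np.asarray(states)                # shape (T, n_units): recorded states
    Y = np.asarray(targets)               # shape (T, n_outputs): desired outputs
    A = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ Y).T  # shape (n_outputs, n_units)
</syntaxhighlight>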