Recent advances in both AI and quantum information theory have given rise to the concept of quantum neural networks. These hold promise for quantum information processing, which is challenging for classical networks, but can also find application in solving classical problems. In 2018, a physical realization of a quantum reservoir computing architecture was demonstrated in the form of nuclear spins within a molecular solid. However, the nuclear spin experiments did not demonstrate quantum reservoir computing per se, as they did not involve processing of sequential data. Rather, the data were vector inputs, which makes this more accurately a demonstration of a quantum implementation of the random kitchen sinks algorithm (also known as extreme learning machines in some communities). In 2019, another possible implementation of quantum reservoir processors was proposed in the form of two-dimensional fermionic lattices. In 2020, realization of reservoir computing on gate-based quantum computers was proposed and demonstrated on cloud-based IBM superconducting near-term quantum computers.
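The distinction drawn above, between vector inputs and sequential data, is the key difference between random kitchen sinks and reservoir computing. A minimal classical sketch of the random kitchen sinks idea (fixed random features, with only the linear readout trained) might look as follows; the task, dimensions, and variable names here are illustrative, not taken from the experiments described:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: vector inputs (not sequences), scalar targets.
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X.sum(axis=1))

# Random kitchen sinks: pass inputs through fixed random nonlinear
# features; only the linear readout is trained (here by least squares).
D = 300                                    # number of random features
W = rng.normal(size=(3, D))                # fixed random projection
b = rng.uniform(0, 2 * np.pi, size=D)      # fixed random phases
Phi = np.cos(X @ W + b)                    # random Fourier features

w_out, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # trained readout

train_err = np.max(np.abs(Phi @ w_out - y))
print(f"max training error: {train_err:.2e}")
```

Nothing about the random projection depends on the order of the samples, which is why such a scheme, quantum or classical, does not by itself constitute reservoir computing.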
Reservoir computers have been used for [[Time series|time-series]] analysis. In particular, applications include [[Chaos theory|chaotic]] [[Time series|time-series]] prediction,<ref>{{Cite journal|last1=Pathak|first1=Jaideep|last2=Hunt|first2=Brian|last3=Girvan|first3=Michelle|last4=Lu|first4=Zhixin|last5=Ott|first5=Edward|date=2018-01-12|title=Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach|journal=Physical Review Letters|volume=120|issue=2|pages=024102|doi=10.1103/PhysRevLett.120.024102|pmid=29376715|bibcode=2018PhRvL.120b4102P|doi-access=free}}</ref><ref>{{Cite journal|last1=Vlachas|first1=P.R.|last2=Pathak|first2=J.|last3=Hunt|first3=B.R.|last4=Sapsis|first4=T.P.|last5=Girvan|first5=M.|last6=Ott|first6=E.|last7=Koumoutsakos|first7=P.|date=2020-03-21|title=Backpropagation algorithms and Reservoir Computing in Recurrent Neural Networks for the forecasting of complex spatiotemporal dynamics|url=http://dx.doi.org/10.1016/j.neunet.2020.02.016|journal=Neural Networks|volume=126|pages=191–217|doi=10.1016/j.neunet.2020.02.016|pmid=32248008|issn=0893-6080|arxiv=1910.05266|s2cid=211146609}}</ref> separation of [[Chaos theory|chaotic]] signals,<ref>{{Cite journal|last1=Krishnagopal|first1=Sanjukta|last2=Girvan|first2=Michelle|last3=Ott|first3=Edward|last4=Hunt|first4=Brian R.|date=2020-02-01|title=Separation of chaotic signals by reservoir computing|url=https://aip.scitation.org/doi/10.1063/1.5132766|journal=Chaos: An Interdisciplinary Journal of Nonlinear Science|volume=30|issue=2|pages=023123|doi=10.1063/1.5132766|pmid=32113243|issn=1054-1500|arxiv=1910.10080|bibcode=2020Chaos..30b3123K|s2cid=204823815}}</ref> and inference of the link structure of [[Network theory|networks]] from their dynamics.<ref>{{Cite journal|last1=Banerjee|first1=Amitava|last2=Hart|first2=Joseph D.|last3=Roy|first3=Rajarshi|last4=Ott|first4=Edward|date=2021-07-20|title=Machine Learning Link Inference of Noisy Delay-Coupled Networks with Optoelectronic Experimental Tests|journal=Physical Review X|volume=11|issue=3|pages=031014|doi=10.1103/PhysRevX.11.031014|arxiv=2010.15289|bibcode=2021PhRvX..11c1014B|doi-access=free}}</ref>
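The time-series use case above can be sketched with a minimal classical echo state network: a fixed random recurrent reservoir is driven by the input sequence, and only a linear readout over the reservoir states is trained. This is an illustrative toy (one-step-ahead prediction of a sine wave), not the setup of the cited studies; the network size, spectral radius, and regularization values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy sequential task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Reservoir: fixed random recurrent weights, rescaled to spectral
# radius 0.9 so the state has the fading-memory (echo state) property.
N = 100
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with the input sequence and collect its states.
states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * inputs[t])
    states[t] = x

# Train only the linear readout by ridge regression, discarding an
# initial washout period so transients do not affect the fit.
washout, ridge = 100, 1e-6
S, Y = states[washout:], targets[washout:]
w_out = np.linalg.solve(S.T @ S + ridge * np.eye(N), S.T @ Y)

err = np.max(np.abs(S @ w_out - Y))
print(f"max one-step prediction error: {err:.2e}")
```

Unlike the random kitchen sinks scheme, the reservoir state depends on the whole input history, which is what makes this architecture suited to temporal tasks such as chaotic time-series prediction.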