储备池计算 (Reservoir computing)
Revision as of 15:17, 24 April 2022
This entry was machine-translated by Caiyun Xiaoyi (1,698 characters translated) and has not yet been manually edited or proofread.
Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir.[1] After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output.[1] The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed.[1] The second is that the computational power of naturally available systems, both classical and quantum mechanical, can be used to reduce the effective computational cost.[2]
History
The concept of reservoir computing stems from the use of recursive connections within neural networks to create a complex dynamical system.[3] It is a generalisation of earlier neural network architectures such as recurrent neural networks, liquid-state machines and echo-state networks. Reservoir computing also extends to physical systems that are not networks in the classical sense, but rather continuous systems in space and/or time: e.g. a literal "bucket of water" can serve as a reservoir that performs computations on inputs given as perturbations of the surface.[4] The resultant complexity of such recurrent neural networks was found to be useful in solving a variety of problems including language processing and dynamic system modeling.[3] However, training of recurrent neural networks is challenging and computationally expensive.[3] Reservoir computing reduces those training-related challenges by fixing the dynamics of the reservoir and only training the linear output layer.[3]
A large variety of nonlinear dynamical systems can serve as a reservoir that performs computations. In recent years semiconductor lasers have attracted considerable interest as computation can be fast and energy efficient compared to electrical components.
Recent advances in both AI and quantum information theory have given rise to the concept of quantum neural networks.[5] These hold promise in quantum information processing, which is challenging for classical networks, but can also find application in solving classical problems.[5][6] In 2018, a physical realization of a quantum reservoir computing architecture was demonstrated in the form of nuclear spins within a molecular solid.[6] However, the nuclear spin experiments in [6] did not demonstrate quantum reservoir computing per se, as they did not involve processing of sequential data. Rather, the data were vector inputs, which makes this more accurately a demonstration of a quantum implementation of a random kitchen sink[7] algorithm (also known as extreme learning machines in some communities). In 2019, another possible implementation of quantum reservoir processors was proposed in the form of two-dimensional fermionic lattices.[6] In 2020, realization of reservoir computing on gate-based quantum computers was proposed and demonstrated on cloud-based IBM superconducting near-term quantum computers.[8]
Reservoir computers have been used for time-series analysis purposes. In particular, some of their usages involve chaotic time-series prediction,[9][10] separation of chaotic signals,[11] and link inference of networks from their dynamics.[12]
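As a concrete illustration of chaotic time-series prediction, the following sketch (not from any cited work; the logistic-map task and all sizes and scalings are hypothetical choices) drives a fixed random reservoir with a chaotic series and trains only a ridge-regression readout to predict the next value:

```python
import numpy as np

# Illustrative end-to-end sketch: a fixed random reservoir plus a trained
# linear readout predicting the next value of the chaotic logistic map
# u_{t+1} = 4 u_t (1 - u_t). All names and parameters are hypothetical.
rng = np.random.default_rng(42)
N, T, washout = 200, 2000, 100

u = np.empty(T + 1)
u[0] = 0.3
for t in range(T):
    u[t + 1] = 4.0 * u[t] * (1.0 - u[t])   # chaotic time series

W = rng.normal(size=(N, N))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state-style scaling
W_in = rng.uniform(-1.0, 1.0, size=N)

X = np.zeros((N, T))                       # record the reservoir states
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])       # fixed, untrained dynamics
    X[:, t] = x

Y = u[1:T + 1]                             # target: next value of the series
Xw, Yw = X[:, washout:], Y[washout:]       # discard the initial transient
W_out = Yw @ Xw.T @ np.linalg.inv(Xw @ Xw.T + 1e-6 * np.eye(N))  # ridge readout

mse = np.mean((W_out @ Xw - Yw) ** 2)      # fit error, well below signal variance
```

Only `W_out` is learned; the matrices `W` and `W_in` stay fixed throughout, which is what keeps training cheap compared to a fully trained recurrent network.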
Classical reservoir computing
Reservoir
The 'reservoir' in reservoir computing is the internal structure of the computer, and must have two properties: it must be made up of individual, non-linear units, and it must be capable of storing information. The non-linearity describes the response of each unit to input, which is what allows reservoir computers to solve complex problems. Reservoirs store information by connecting the units in recurrent loops, where the previous input affects the next response. This dependence of the response on past inputs allows the computers to be trained to complete specific tasks.[13]
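The two required properties, non-linear units and recurrent loops that store information, can be sketched in a few lines, assuming a standard echo-state-style update x_{t+1} = tanh(W x_t + W_in u_t) (the names and sizes here are illustrative, not from the article):

```python
import numpy as np

# Sketch of a reservoir's fixed dynamics (assumed echo-state-style form).
rng = np.random.default_rng(0)
N = 100                                    # number of non-linear units
W = rng.normal(size=(N, N))                # fixed recurrent weights (never trained)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1
W_in = rng.uniform(-0.5, 0.5, size=N)      # fixed input weights

def update(x, u):
    """One step of the dynamics: tanh supplies the non-linearity,
    the recurrent term W @ x supplies the memory of past inputs."""
    return np.tanh(W @ x + W_in * u)

x = np.zeros(N)
for u in np.sin(np.linspace(0, 8, 200)):   # drive with a toy input signal
    x = update(x, u)
```

Because the recurrent weights are fixed, the same `update` is reused for every input; only a readout trained on the recorded states would be adjusted.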
Reservoirs can be virtual or physical.[13] Virtual reservoirs are typically randomly generated and are designed like neural networks.[13][3] Virtual reservoirs can be designed to have non-linearity and recurrent loops, but, unlike neural networks, the connections between units are randomized and remain unchanged throughout computation.[13] Physical reservoirs are possible because of the inherent non-linearity of certain natural systems. The interaction between ripples on the surface of water contains the nonlinear dynamics required for reservoir creation, and a pattern-recognition reservoir computer was developed by first inputting ripples with electric motors and then recording and analyzing the ripples in the readout.[1]
Readout
The readout is a neural network layer that performs a linear transformation on the output of the reservoir.[1] The weights of the readout layer are trained by analyzing the spatiotemporal patterns of the reservoir after excitation by known inputs, and by utilizing a training method such as linear regression or ridge regression.[1] As its implementation depends on spatiotemporal reservoir patterns, the details of readout methods are tailored to each type of reservoir.[1] For example, the readout for a reservoir computer using a container of liquid as its reservoir might entail observing spatiotemporal patterns on the surface of the liquid.[1]
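A minimal sketch of such a training step, assuming ridge regression in its closed form W_out = Y Xᵀ (X Xᵀ + λI)⁻¹ (the "recorded reservoir states" are faked with random numbers here, and every name is illustrative rather than taken from a cited work):

```python
import numpy as np

# Hedged sketch of readout training via ridge regression.
rng = np.random.default_rng(1)
N, T = 50, 500
X = np.tanh(rng.normal(size=(N, T)))       # stand-in for recorded states, one column per step
true_map = rng.normal(size=(1, N))
Y = true_map @ X + 0.01 * rng.normal(size=(1, T))  # toy targets for known inputs

lam = 1e-6                                 # ridge regularisation strength
W_out = Y @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(N))  # trained readout weights

pred = W_out @ X                           # the readout itself is purely linear
mse = np.mean((pred - Y) ** 2)             # sits near the injected noise level
```

The complexity of the computation lives in how X was produced; the readout only has to solve a linear least-squares problem, which is why training is fast.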
Types
Context reverberation network
An early example of reservoir computing was the context reverberation network.[14] In this architecture, an input layer feeds into a high dimensional dynamical system which is read out by a trainable single-layer perceptron. Two kinds of dynamical system were described: a recurrent neural network with fixed random weights, and a continuous reaction–diffusion system inspired by Alan Turing’s model of morphogenesis. At the trainable layer, the perceptron associates current inputs with the signals that reverberate in the dynamical system; the latter were said to provide a dynamic "context" for the inputs. In the language of later work, the reaction–diffusion system served as the reservoir.
Echo state network
The Tree Echo State Network (TreeESN) model represents a generalization of the reservoir computing framework to tree structured data.[15]
Liquid-state machine
Chaotic Liquid State Machine
The liquid (i.e. reservoir) of a Chaotic Liquid State Machine (CLSM),[16][17] or chaotic reservoir, is made from chaotic spiking neurons that stabilize their activity by settling to a single hypothesis describing the trained inputs of the machine. This is in contrast to general types of reservoirs, which do not stabilize. The liquid stabilization occurs via synaptic plasticity and chaos control that govern the neural connections inside the liquid. CLSMs have shown promising results in learning sensitive time series data.[16][17]
Nonlinear transient computation
This type of information processing is most relevant when time-dependent input signals depart from the mechanism's internal dynamics.[18] These departures cause transients, or temporary alterations, which are represented in the device's output.[18]
Deep reservoir computing
The extension of the reservoir computing framework towards Deep Learning, with the introduction of Deep Reservoir Computing and of the Deep Echo State Network (DeepESN) model,[19][20][21][22] allows the development of efficiently trained models for hierarchical processing of temporal data, while also enabling investigation of the inherent role of layered composition in recurrent neural networks.
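A hedged sketch of the layered idea, assuming the common deep-reservoir arrangement in which each fixed reservoir's state serves as the input of the next layer (layer sizes and scalings are illustrative, not the published DeepESN settings):

```python
import numpy as np

# Minimal sketch of stacked fixed reservoirs; only a readout over the
# concatenated layer states would be trained. All names are illustrative.
rng = np.random.default_rng(7)

def make_layer(n_in, n_units, rho=0.9):
    """One fixed reservoir layer with spectral radius scaled to rho."""
    W = rng.normal(size=(n_units, n_units))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-0.5, 0.5, size=(n_units, n_in))
    return W, W_in

layers = [make_layer(1, 80), make_layer(80, 80), make_layer(80, 80)]
states = [np.zeros(80) for _ in layers]

def step(u):
    """Propagate one input sample up through the stack of fixed reservoirs."""
    inp = np.atleast_1d(u)
    for i, (W, W_in) in enumerate(layers):
        states[i] = np.tanh(W @ states[i] + W_in @ inp)
        inp = states[i]                    # layer i's state drives layer i+1
    return np.concatenate(states)          # a readout would see all layers

h = step(0.5)
```

Deeper layers see progressively slower, more abstract transformations of the input, which is the hierarchical effect the DeepESN line of work studies.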
Quantum reservoir computing
Quantum reservoir computing may use the nonlinear nature of quantum mechanical interactions or processes to form the characteristic nonlinear reservoirs,[5][6][23][8] but may also be done with linear reservoirs when the injection of the input into the reservoir creates the nonlinearity.[24] The marriage of machine learning and quantum devices is leading to the emergence of quantum neuromorphic computing as a new research area.[25]
Types
Gaussian states of interacting quantum harmonic oscillators
Gaussian states are a paradigmatic class of states of continuous variable quantum systems.[26] Although they can nowadays be created and manipulated in, e.g., state-of-the-art optical platforms,[27] and are naturally robust to decoherence, it is well known that they are not sufficient for, e.g., universal quantum computing, because transformations that preserve the Gaussian nature of a state are linear.[28] Normally, linear dynamics would not be sufficient for nontrivial reservoir computing either. It is nevertheless possible to harness such dynamics for reservoir computing purposes by considering a network of interacting quantum harmonic oscillators and injecting the input by periodic state resets of a subset of the oscillators. With a suitable choice of how the states of this subset of oscillators depend on the input, the observables of the rest of the oscillators can become nonlinear functions of the input suitable for reservoir computing; indeed, thanks to the properties of these functions, even universal reservoir computing becomes possible by combining the observables with a polynomial readout function.[24] In principle, such reservoir computers could be implemented with controlled multimode optical parametric processes;[29] however, efficient extraction of the output from the system is challenging, especially in the quantum regime, where measurement back-action must be taken into account.
2-D quantum dot lattices
In this architecture, randomized coupling between lattice sites grants the reservoir the “black box” property inherent to reservoir processors.[5] The reservoir is then excited by an incident optical field, which acts as the input. Readout occurs in the form of occupation numbers of lattice sites, which are naturally nonlinear functions of the input.[5]
Nuclear spins in a molecular solid
In this architecture, quantum mechanical coupling between spins of neighboring atoms within the molecular solid provides the non-linearity required to create the higher-dimensional computational space.[6] The reservoir is then excited by radiofrequency electromagnetic radiation tuned to the resonance frequencies of relevant nuclear spins.[6] Readout occurs by measuring the nuclear spin states.[6]
Reservoir computing on gate-based near-term superconducting quantum computers
The most prevalent model of quantum computing is the gate-based model, where quantum computation is performed by sequential applications of unitary quantum gates on the qubits of a quantum computer.[30] A theory for the implementation of reservoir computing on a gate-based quantum computer, with proof-of-principle demonstrations on a number of IBM superconducting noisy intermediate-scale quantum (NISQ) computers,[31] has been reported.[8]
See also
- Deep learning
- Extreme learning machines
References
- ↑ 1.0 1.1 1.2 1.3 1.4 1.5 1.6 1.7 Tanaka, Gouhei; Yamane, Toshiyuki; Héroux, Jean Benoit; Nakane, Ryosho; Kanazawa, Naoki; Takeda, Seiji; Numata, Hidetoshi; Nakano, Daiju; Hirose, Akira (2019). "Recent advances in physical reservoir computing: A review". Neural Networks. 115: 100–123. doi:10.1016/j.neunet.2019.03.005. ISSN 0893-6080. PMID 30981085.
- ↑ Röhm, André; Lüdge, Kathy (2018-08-03). "Multiplexed networks: reservoir computing with virtual and real nodes". Journal of Physics Communications. 2 (8): 085007. Bibcode:2018JPhCo...2h5007R. doi:10.1088/2399-6528/aad56d. ISSN 2399-6528.
- ↑ 3.0 3.1 3.2 3.3 3.4 Schrauwen, Benjamin, David Verstraeten, and Jan Van Campenhout. "An overview of reservoir computing: theory, applications, and implementations." Proceedings of the European Symposium on Artificial Neural Networks ESANN 2007, pp. 471–482.
- ↑ Fernando, C.; Sojakka, Sampsa (2003). "Pattern Recognition in a Bucket". Advances in Artificial Life. Lecture Notes in Computer Science. 2801. pp. 588–597. doi:10.1007/978-3-540-39432-7_63. ISBN 978-3-540-20057-4. https://www.semanticscholar.org/paper/Pattern-Recognition-in-a-Bucket-Fernando-Sojakka/af342af4d0e674aef3bced5fd90875c6f2e04abc.
- ↑ 5.0 5.1 5.2 5.3 5.4 Ghosh, Sanjib; Opala, Andrzej; Matuszewski, Michał; Paterek, Tomasz; Liew, Timothy C. H. (December 2019). "Quantum reservoir processing". NPJ Quantum Information. 5 (1): 35. arXiv:1811.10335. Bibcode:2019npjQI...5...35G. doi:10.1038/s41534-019-0149-8. ISSN 2056-6387. S2CID 119197635.
- ↑ 6.0 6.1 6.2 6.3 6.4 6.5 6.6 6.7 Negoro, Makoto; Mitarai, Kosuke; Fujii, Keisuke; Nakajima, Kohei; Kitagawa, Masahiro (2018-06-28). "Machine learning with controllable quantum dynamics of a nuclear spin ensemble in a solid". arXiv:1806.10910 [quant-ph].
- ↑ Rahimi, Ali; Recht, Benjamin (December 2008). "Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in Learning" (PDF). NIPS'08: Proceedings of the 21st International Conference on Neural Information Processing Systems: 1313–1320.
- ↑ 8.0 8.1 8.2 Chen, Jiayin; Nurdin, Hendra; Yamamoto, Naoki (2020-08-24). "Temporal Information Processing on Noisy Quantum Computers". Physical Review Applied. 14 (2): 024065. arXiv:2001.09498. Bibcode:2020PhRvP..14b4065C. doi:10.1103/PhysRevApplied.14.024065. S2CID 210920543.
- ↑ Pathak, Jaideep; Hunt, Brian; Girvan, Michelle; Lu, Zhixin; Ott, Edward (2018-01-12). "Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach". Physical Review Letters. 120 (2): 024102. Bibcode:2018PhRvL.120b4102P. doi:10.1103/PhysRevLett.120.024102. PMID 29376715.
- ↑ Vlachas, P.R.; Pathak, J.; Hunt, B.R.; Sapsis, T.P.; Girvan, M.; Ott, E.; Koumoutsakos, P. (2020-03-21). "Backpropagation algorithms and Reservoir Computing in Recurrent Neural Networks for the forecasting of complex spatiotemporal dynamics". Neural Networks. 126: 191–217. arXiv:1910.05266. doi:10.1016/j.neunet.2020.02.016. ISSN 0893-6080. PMID 32248008. S2CID 211146609.
- ↑ Krishnagopal, Sanjukta; Girvan, Michelle; Ott, Edward; Hunt, Brian R. (2020-02-01). "Separation of chaotic signals by reservoir computing". Chaos: An Interdisciplinary Journal of Nonlinear Science. 30 (2): 023123. arXiv:1910.10080. Bibcode:2020Chaos..30b3123K. doi:10.1063/1.5132766. ISSN 1054-1500. PMID 32113243. S2CID 204823815.
- ↑ Banerjee, Amitava; Hart, Joseph D.; Roy, Rajarshi; Ott, Edward (2021-07-20). "Machine Learning Link Inference of Noisy Delay-Coupled Networks with Optoelectronic Experimental Tests". Physical Review X. 11 (3): 031014. arXiv:2010.15289. Bibcode:2021PhRvX..11c1014B. doi:10.1103/PhysRevX.11.031014.
- ↑ 13.0 13.1 13.2 13.3 Soriano, Miguel C. (2017-02-06). "Viewpoint: Reservoir Computing Speeds Up". Physics (in English). 10. doi:10.1103/Physics.10.12.
- ↑ Kirby, Kevin. "Context dynamics in neural sequential learning." Proceedings of the Florida Artificial Intelligence Research Symposium FLAIRS (1991), 66–70.
- ↑ Gallicchio, Claudio; Micheli, Alessio (2013). "Tree Echo State Networks". Neurocomputing. 101: 319–337. doi:10.1016/j.neucom.2012.08.017. hdl:11568/158480.
- ↑ 16.0 16.1 Aoun, Mario Antoine; Boukadoum, Mounir (2014). "Learning algorithm and neurocomputing architecture for NDS Neurons". 2014 IEEE 13th International Conference on Cognitive Informatics and Cognitive Computing. IEEE: 126–132. doi:10.1109/icci-cc.2014.6921451. ISBN 978-1-4799-6081-1. S2CID 16026952.
- ↑ 17.0 17.1 Aoun, Mario Antoine; Boukadoum, Mounir (2015). "Chaotic Liquid State Machine". International Journal of Cognitive Informatics and Natural Intelligence. 9 (4): 1–20. doi:10.4018/ijcini.2015100101. ISSN 1557-3958.
- ↑ 18.0 18.1 Crook, Nigel (2007). "Nonlinear Transient Computation". Neurocomputing. 70 (7–9): 1167–1176. doi:10.1016/j.neucom.2006.10.148.
- ↑ Pedrelli, Luca (2019). Deep Reservoir Computing: A Novel Class of Deep Recurrent Neural Networks (PhD thesis). Università di Pisa.
- ↑ Gallicchio, Claudio; Micheli, Alessio; Pedrelli, Luca (2017-12-13). "Deep reservoir computing: A critical experimental analysis". Neurocomputing. 268: 87–99. doi:10.1016/j.neucom.2016.12.089. hdl:11568/851934.
- ↑ Gallicchio, Claudio; Micheli, Alessio (2017-05-05). "Echo State Property of Deep Reservoir Computing Networks". Cognitive Computation. 9 (3): 337–350. doi:10.1007/s12559-017-9461-9. hdl:11568/851932. ISSN 1866-9956. S2CID 1077549.
- ↑ Gallicchio, Claudio; Micheli, Alessio; Pedrelli, Luca (December 2018). "Design of deep echo state networks". Neural Networks. 108: 33–47. doi:10.1016/j.neunet.2018.08.002. hdl:11568/939082. ISSN 0893-6080. PMID 30138751. S2CID 52075702.
- ↑ Chen, Jiayin; Nurdin, Hendra (2019-05-15). "Learning nonlinear input–output maps with dissipative quantum systems". Quantum Information Processing. 18 (7): 198. arXiv:1901.01653. Bibcode:2019QuIP...18..198C. doi:10.1007/s11128-019-2311-9. S2CID 57573677.
- ↑ 24.0 24.1 Nokkala, Johannes; Martínez-Peña, Rodrigo; Giorgi, Gian Luca; Parigi, Valentina; Soriano, Miguel C.; Zambrini, Roberta (2021). "Gaussian states of continuous-variable quantum systems provide universal and versatile reservoir computing". Communications Physics. 4 (1): 53. arXiv:2006.04821. Bibcode:2021CmPhy...4...53N. doi:10.1038/s42005-021-00556-w. S2CID 234355683.
- ↑ Marković, Danijela; Grollier, Julie (2020-10-13). "Quantum Neuromorphic Computing". Applied Physics Letters. 117 (15): 150501. arXiv:2006.15111. Bibcode:2020ApPhL.117o0501M. doi:10.1063/5.0020014. S2CID 210920543.
- ↑ Ferraro, Alessandro; Olivares, Stefano; Paris, Matteo G. A. (2005-03-31). "Gaussian states in continuous variable quantum information". arXiv:quant-ph/0503237.
- ↑ Roslund, Jonathan; de Araújo, Renné Medeiros; Jiang, Shifeng; Fabre, Claude; Treps, Nicolas (2013-12-15). "Wavelength-multiplexed quantum networks with ultrafast frequency combs". Nature Photonics (in English). 8 (2): 109–112. arXiv:1307.1216. doi:10.1038/nphoton.2013.340. ISSN 1749-4893. S2CID 2328402.
- ↑ Bartlett, Stephen D.; Sanders, Barry C.; Braunstein, Samuel L.; Nemoto, Kae (2002-02-14). "Efficient Classical Simulation of Continuous Variable Quantum Information Processes". Physical Review Letters. 88 (9): 097904. arXiv:quant-ph/0109047. Bibcode:2002PhRvL..88i7904B. doi:10.1103/PhysRevLett.88.097904. PMID 11864057. S2CID 2161585.
- ↑ Nokkala, J.; Arzani, F.; Galve, F.; Zambrini, R.; Maniscalco, S.; Piilo, J.; Treps, N.; Parigi, V. (2018-05-09). "Reconfigurable optical implementation of quantum complex networks". New Journal of Physics (in English). 20 (5): 053024. arXiv:1708.08726. Bibcode:2018NJPh...20e3024N. doi:10.1088/1367-2630/aabc77. ISSN 1367-2630. S2CID 119091176.
- ↑ Nielsen, Michael; Chuang, Isaac (2010), Quantum Computation and Quantum Information (2 ed.), Cambridge University Press Cambridge
- ↑ Preskill, John. "Quantum Computing in the NISQ era and beyond." Quantum 2, 79 (2018).
Further reading
- Reservoir Computing using delay systems, Nature Communications 2011
- Optoelectronic Reservoir Computing, Scientific Reports February 2012
- Optoelectronic Reservoir Computing, Optics Express 2012
- All-optical Reservoir Computing, Nature Communications 2013
- Memristor Models for Machine learning, Neural Computation 2014 arxiv
Category:Artificial neural networks
This page was moved from wikipedia:en:Reservoir computing. Its edit history can be viewed at 储备池计算/edithistory