Line 153:
Line 153:
| :<math> \frac{d}{dt} \vec{W} = \alpha \vec{W}-\frac{1}{\beta} (I+\xi \Omega W)^{-1} \Omega \vec S </math> | | :<math> \frac{d}{dt} \vec{W} = \alpha \vec{W}-\frac{1}{\beta} (I+\xi \Omega W)^{-1} \Omega \vec S </math> |
| | | |
− | as a function of the properties of the physical memristive network and the external sources. In the equation above, <math>\alpha</math> is the "forgetting" time-scale constant, <math> \xi=r-1</math> and <math>r=\frac{R_\text{off}}{R_\text{on}}</math> is the ratio of the ''off'' and ''on'' values of the limit resistances of the memristors, <math> \vec S </math> is the vector of the sources of the circuit and <math>\Omega</math> is a projector onto the fundamental loops of the circuit. The constant <math>\beta</math> has the dimension of a voltage and is associated with the properties of the [[memristor]]; its physical origin is the charge mobility in the conductor. The diagonal matrix <math>W=\operatorname{diag}(\vec W)</math> and the vector <math>\vec W</math> are, respectively, the internal memory values of the memristors, with values between 0 and 1. This equation therefore requires extra constraints on the memory values in order to be reliable. | + | as a function of the properties of the physical memristive network and the external sources. In the equation above, <math>\alpha</math> is the "forgetting" time-scale constant, <math> \xi=r-1</math> and <math>r=\frac{R_\text{off}}{R_\text{on}}</math> is the ratio of the ''off'' and ''on'' values of the limit resistances of the memristors, <math> \vec S </math> is the vector of the sources of the circuit and <math>\Omega</math> is a projector onto the fundamental loops of the circuit. The constant <math>\beta</math> has the dimension of a voltage and is associated with the properties of the [[memristor]]; its physical origin is the charge mobility in the conductor. The diagonal matrix <math>W=\operatorname{diag}(\vec W)</math> and the vector <math>\vec W</math> are, respectively, the internal memory values of the memristors, with values between 0 and 1. This equation therefore requires extra constraints on the memory values in order to be reliable. |
| | | |
− | The Caravelli-Traversa-Di Ventra equation describes the network dynamics as a function of the properties of the physical memristive network and the external sources. In the equation above, <math>\alpha</math> is the "forgetting" time-scale constant, <math>\xi=r-1</math>, and <math>r=\frac{R_\text{off}}{R_\text{on}}</math> is the ratio of the ''off'' and ''on'' limit resistances of the memristors, <math>\vec S</math> is the vector of the sources of the circuit, and <math>\Omega</math> is a projector onto the fundamental loops of the circuit. The constant <math>\beta</math> has the dimension of a voltage and is associated with the properties of the memristor; its physical origin is the charge mobility in the conductor. The diagonal matrix <math>W=\operatorname{diag}(\vec W)</math> and the vector <nowiki><math>\vec W</math></nowiki> '''<font color="#32CD32">are the internal memory values of the memristors</font>''', with values between 0 and 1. This equation therefore requires adding extra constraints on the '''<font color="#32CD32">memory values</font>''' in order to be reliable. | + | The Caravelli-Traversa-Di Ventra equation describes the network dynamics as a function of the properties of the physical memristive network and the external sources. In the equation above, <math>\alpha</math> is the "forgetting" time-scale constant, <math>\xi=r-1</math>, and <math>r=\frac{R_\text{off}}{R_\text{on}}</math> is the ratio of the ''off'' and ''on'' limit resistances of the memristors, <math>\vec S</math> is the vector of the sources of the circuit, and <math>\Omega</math> is a projector onto the fundamental loops of the circuit. The constant <math>\beta</math> has the dimension of a voltage and is associated with the properties of the memristor; its physical origin is the charge mobility in the conductor. The diagonal matrix <math>W=\operatorname{diag}(\vec W)</math> and the vector <math>\vec W</math> '''<font color="#32CD32">are, respectively, the internal memory values of the memristors</font>''', with values between 0 and 1. This equation therefore requires adding extra constraints on the '''<font color="#32CD32">memory values</font>''' in order to be reliable. |
| | | |
| | | |
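The dynamics above lend themselves to a short numerical illustration. A minimal sketch, assuming a forward-Euler integrator, a toy projector <math>\Omega</math> built from random vectors, and placeholder values for <math>\alpha</math>, <math>\beta</math>, <math>r</math>, and the source vector <math>\vec S</math>; the memory values are clipped to [0, 1] at each step to enforce the extra constraints mentioned above:

<syntaxhighlight lang="python">
import numpy as np

def simulate_memristive_network(omega, s, alpha=0.1, beta=1.0, r=100.0,
                                w0=None, dt=1e-3, steps=2000):
    """Forward-Euler integration of
       dW/dt = alpha*W - (1/beta) * (I + xi*Omega*diag(W))^{-1} * Omega * S,
    clipping each memory value to [0, 1] after every step."""
    n = omega.shape[0]
    xi = r - 1.0                                  # xi = R_off / R_on - 1
    w = np.full(n, 0.5) if w0 is None else np.asarray(w0, dtype=float).copy()
    eye = np.eye(n)
    omega_s = omega @ s                           # Omega * S is constant in time
    for _ in range(steps):
        # Solve (I + xi * Omega * diag(w)) x = Omega * S rather than inverting.
        x = np.linalg.solve(eye + xi * omega @ np.diag(w), omega_s)
        w = w + dt * (alpha * w - x / beta)
        w = np.clip(w, 0.0, 1.0)                  # extra constraint: 0 <= w_i <= 1
    return w

# Toy circuit: Omega as an orthogonal projector onto a random 2-dimensional
# "loop space" of a 5-memristor network; S is a random source vector.
rng = np.random.default_rng(0)
a = rng.standard_normal((5, 2))
omega = a @ np.linalg.inv(a.T @ a) @ a.T          # satisfies Omega @ Omega == Omega
s = rng.standard_normal(5)
print(simulate_memristive_network(omega, s))
</syntaxhighlight>

Because the projector, the sources, and the parameter values here are arbitrary placeholders, the final memory values are only illustrative; the clipping step is what keeps the sketch consistent with the requirement that each component of <math>\vec W</math> stay between 0 and 1.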
Line 182:
Line 182:
| {{Portal bar|Electronics}} | | {{Portal bar|Electronics}} |
| | | |
− | == References== | + | ==References== |
| {{Reflist|40em}} | | {{Reflist|40em}} |
| | | |
Line 209:
Line 209:
| | | |
| | | |
− | * Telluride Neuromorphic Engineering Workshop | + | *Telluride Neuromorphic Engineering Workshop |
− | *CapoCaccia Cognitive Neuromorphic Engineering Workshop | + | * CapoCaccia Cognitive Neuromorphic Engineering Workshop |
− | *Institute of Neuromorphic Engineering | + | * Institute of Neuromorphic Engineering |
| *INE news site. | | *INE news site. |
| *Frontiers in Neuromorphic Engineering Journal | | *Frontiers in Neuromorphic Engineering Journal |
| *Computation and Neural Systems department at the California Institute of Technology. | | *Computation and Neural Systems department at the California Institute of Technology. |
| *Human Brain Project official site | | *Human Brain Project official site |
− | * Building a Silicon Brain: Computer chips based on biological neurons may help simulate larger and more-complex brain models. May 1, 2019. SANDEEP RAVINDRAN | + | *Building a Silicon Brain: Computer chips based on biological neurons may help simulate larger and more-complex brain models. May 1, 2019. SANDEEP RAVINDRAN |
| | | |
| =<nowiki>External links</nowiki>= | | =<nowiki>External links</nowiki>= |