{{Distinguish|Entropy and life#Negative entropy}}{{Redirect|Syntropy||Syntropy (software)}}
In [[information theory]] and [[statistics]], '''negentropy''' is used as a measure of distance to normality. The concept and phrase "'''negative entropy'''" was introduced by [[Erwin Schrödinger]] in his 1944 popular-science book ''[[What is Life? (Schrödinger)|What is Life?]]''<ref>Schrödinger, Erwin, ''What is Life – the Physical Aspect of the Living Cell'', Cambridge University Press, 1944</ref> Later, [[Léon Brillouin]] shortened the phrase to ''negentropy''.<ref>Brillouin, Leon: (1953) "Negentropy Principle of Information", ''J. of Applied Physics'', v. '''24(9)''', pp. 1152–1163</ref><ref>Léon Brillouin, ''La science et la théorie de l'information'', Masson, 1959</ref> In 1974, [[Albert Szent-Györgyi]] proposed replacing the term ''negentropy'' with ''syntropy''. That term may have originated in the 1940s with the Italian mathematician [[Luigi Fantappiè]], who tried to construct a unified theory of [[biology]] and [[physics]]. [[Buckminster Fuller]] tried to popularize this usage, but ''negentropy'' remains common.
In a note to ''[[What is Life?]]'' Schrödinger explained his use of this phrase.
{{cquote|... if I had been catering for them [physicists] alone I should have let the discussion turn on ''[[Thermodynamic free energy|free energy]]'' instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to ''[[energy]]'' for making the average reader alive to the contrast between the two things.}}
In 2009, Mahulikar & Herwig redefined negentropy of a dynamically ordered sub-system as the specific entropy deficit of the ordered sub-system relative to its surrounding chaos.<ref>Mahulikar, S.P. & Herwig, H.: (2009) "Exact thermodynamic principles for dynamic order existence and evolution in chaos", ''Chaos, Solitons & Fractals'', v. '''41(4)''', pp. 1939–1948</ref> Thus, negentropy has SI units of (J kg<sup>−1</sup> K<sup>−1</sup>) when defined based on specific entropy per unit mass, and (K<sup>−1</sup>) when defined based on specific entropy per unit energy. This definition enabled: ''i'') scale-invariant thermodynamic representation of dynamic order existence, ''ii'') formulation of physical principles exclusively for dynamic order existence and evolution, and ''iii'') mathematical interpretation of Schrödinger's negentropy debt.
==Information theory==
In [[information theory]] and [[statistics]], negentropy is used as a measure of distance to normality.<ref>Aapo Hyvärinen, [http://www.cis.hut.fi/aapo/papers/NCS99web/node32.html Survey on Independent Component Analysis, node32: Negentropy], Helsinki University of Technology Laboratory of Computer and Information Science</ref><ref>Aapo Hyvärinen and Erkki Oja, [http://www.cis.hut.fi/aapo/papers/IJCNN99_tutorialweb/node14.html Independent Component Analysis: A Tutorial, node14: Negentropy], Helsinki University of Technology Laboratory of Computer and Information Science</ref><ref>Ruye Wang, [http://fourier.eng.hmc.edu/e161/lectures/ica/node4.html Independent Component Analysis, node4: Measures of Non-Gaussianity]</ref> Out of all [[Distribution (mathematics)|distributions]] with a given mean and variance, the normal or [[Gaussian distribution]] is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant by any linear invertible change of coordinates, and vanishes [[if and only if]] the signal is Gaussian.

Negentropy is defined as
:<math>J(p_x) = S(\varphi_x) - S(p_x)\,</math>
where <math>S(\varphi_x)</math> is the [[differential entropy]] of the Gaussian density with the same [[mean]] and [[variance]] as <math>p_x</math> and <math>S(p_x)</math> is the differential entropy of <math>p_x</math>:
:<math>S(p_x) = - \int p_x(u) \log p_x(u) \, du</math>
Negentropy is used in [[statistics]] and [[signal processing]]. It is related to network [[Information entropy|entropy]], which is used in [[independent component analysis]].<ref>P. Comon, Independent Component Analysis – a new concept?, ''Signal Processing'', '''36''' 287–314, 1994.</ref><ref>Didier G. Leibovici and Christian Beckmann, [http://www.fmrib.ox.ac.uk/analysis/techrep/tr01dl1/tr01dl1/tr01dl1.html An introduction to Multiway Methods for Multi-Subject fMRI experiment], FMRIB Technical Report 2001, Oxford Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), Department of Clinical Neurology, University of Oxford, John Radcliffe Hospital, Headley Way, Headington, Oxford, UK.</ref>
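Because computing negentropy exactly requires the full density, ICA practice relies on moment-based approximations. The sketch below (an illustration, not part of the original article) implements the classic approximation J(x) ≈ E[x³]²/12 + kurt(x)²/48 for a standardized signal, in the form given in the Hyvärinen & Oja tutorial cited above:

```python
import numpy as np

def negentropy_moment_approx(x):
    """Moment-based negentropy approximation from the ICA literature:
    J(x) ~ E[x^3]^2 / 12 + kurt(x)^2 / 48, for x standardized to
    zero mean and unit variance (kurt = excess kurtosis)."""
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()            # standardize the signal
    skew_term = np.mean(x ** 3) ** 2 / 12.0
    kurt = np.mean(x ** 4) - 3.0            # excess kurtosis
    return skew_term + kurt ** 2 / 48.0

rng = np.random.default_rng(0)
print(negentropy_moment_approx(rng.normal(size=100_000)))   # ~0: Gaussian
print(negentropy_moment_approx(rng.uniform(size=100_000)))  # > 0: non-Gaussian
```

As expected from the theory above, the approximation is near zero for Gaussian samples and strictly positive for a non-Gaussian (here uniform) signal.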
The negentropy of a distribution is equal to the [[Kullback–Leibler divergence]] between <math>p_x</math> and a Gaussian distribution with the same mean and variance as <math>p_x</math> (see [[Differential entropy#Maximization in the normal distribution]] for a proof). In particular, it is always nonnegative.
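For simple densities both entropies in the definition are available in closed form. A minimal sketch (illustrative, not from the source): the negentropy of a uniform distribution, computed from the Gaussian differential entropy ½ ln(2πeσ²) and the uniform differential entropy ln(b−a). The result is positive and, consistent with the invariance property stated above, does not depend on the interval's scale:

```python
import math

def gaussian_diff_entropy(var):
    """Differential entropy S(phi) of a Gaussian with variance var, in nats."""
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

def uniform_negentropy(a, b):
    """Negentropy J(p) = S(phi) - S(p) for the uniform density on [a, b].

    The uniform density has variance (b - a)^2 / 12 and differential
    entropy log(b - a), both in closed form.
    """
    var = (b - a) ** 2 / 12.0
    return gaussian_diff_entropy(var) - math.log(b - a)

print(f"J(uniform[0, 1]) = {uniform_negentropy(0.0, 1.0):.4f} nats")  # ~0.176
```

Note that `uniform_negentropy(0, 1)` equals `uniform_negentropy(-3, 7)`: rescaling shifts both entropies by the same amount, so the difference is scale-invariant.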
==Correlation between statistical negentropy and Gibbs' free energy==
| | | |
− | [[File:Wykres Gibbsa.svg|275px|thumb|right|[[Willard Gibbs]]’ 1873 '''available energy''' ([[Thermodynamic free energy|free energy]]) graph, which shows a plane perpendicular to the axis of ''v'' ([[volume]]) and passing through point A, which represents the initial state of the body. MN is the section of the surface of [[dissipated energy]]. Qε and Qη are sections of the planes ''η'' = 0 and ''ε'' = 0, and therefore parallel to the axes of ε ([[internal energy]]) and η ([[entropy]]) respectively. AD and AE are the energy and entropy of the body in its initial state, AB and AC its ''available energy'' ([[Gibbs energy]]) and its ''capacity for entropy'' (the amount by which the entropy of the body can be increased without changing the energy of the body or increasing its volume) respectively.]]
| |
− |
| |
− | [[Willard Gibbs’ 1873 available energy (free energy) graph, which shows a plane perpendicular to the axis of v (volume) and passing through point A, which represents the initial state of the body. MN is the section of the surface of dissipated energy. Qε and Qη are sections of the planes η = 0 and ε = 0, and therefore parallel to the axes of ε (internal energy) and η (entropy) respectively. AD and AE are the energy and entropy of the body in its initial state, AB and AC its available energy (Gibbs energy) and its capacity for entropy (the amount by which the entropy of the body can be increased without changing the energy of the body or increasing its volume) respectively.]]
| |
| | | |
| [ Willard Gibbs 在1873年展示的可用能量(自由能)图以一个垂直于 v (体积)轴和通过点A的平面为例,A点表示物体的初始状态。MN 是耗散能曲面的交线。Qε 和 Qη 分别是平面''η'' = 0 和 ''ε'' = 0的交线,因此分别与 ε (内能)和 η (熵)轴平行。AD 和 AE 分别代表物体初始状态的能量和熵,AB 和 AC 分别代表物体的有效能(吉布斯能)和熵的容量(在不改变物体能量或增加物体体积的情况下物体可以增加的熵的量)。 | | [ Willard Gibbs 在1873年展示的可用能量(自由能)图以一个垂直于 v (体积)轴和通过点A的平面为例,A点表示物体的初始状态。MN 是耗散能曲面的交线。Qε 和 Qη 分别是平面''η'' = 0 和 ''ε'' = 0的交线,因此分别与 ε (内能)和 η (熵)轴平行。AD 和 AE 分别代表物体初始状态的能量和熵,AB 和 AC 分别代表物体的有效能(吉布斯能)和熵的容量(在不改变物体能量或增加物体体积的情况下物体可以增加的熵的量)。 |
| | | |
There is a physical quantity closely linked to [[Thermodynamic free energy|free energy]] ([[free enthalpy]]), with a unit of entropy and isomorphic to negentropy known in statistics and information theory. In 1873, [[Josiah Willard Gibbs|Willard Gibbs]] created a diagram illustrating the concept of free energy corresponding to [[free enthalpy]]. On the diagram one can see the quantity called [[capacity for entropy]]. This quantity is the amount of entropy that may be increased without changing an internal energy or increasing its volume.<ref>Willard Gibbs, [http://www.ufn.ru/ufn39/ufn39_4/Russian/r394b.pdf A Method of Geometrical Representation of the Thermodynamic Properties of Substances by Means of Surfaces], ''Transactions of the Connecticut Academy'', 382–404 (1873)</ref> In other words, it is a difference between maximum possible, under assumed conditions, entropy and its actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by [[François Jacques Dominique Massieu|Massieu]] for the [[isothermal process]]<ref>Massieu, M. F. (1869a). Sur les fonctions caractéristiques des divers fluides. ''C. R. Acad. Sci.'' LXIX:858–862.</ref><ref>Massieu, M. F. (1869b). Addition au precedent memoire sur les fonctions caractéristiques. ''C. R. Acad. Sci.'' LXIX:1057–1061.</ref><ref>Massieu, M. F. (1869), ''Compt. Rend.'' '''69''' (858): 1057.</ref> (both quantities differs just with a figure sign) and then [[Max Planck|Planck]] for the [[Isothermal process|isothermal]]-[[Isobaric process|isobaric]] process.<ref>Planck, M. (1945). ''Treatise on Thermodynamics''. Dover, New York.</ref> More recently, the Massieu–Planck [[thermodynamic potential]], known also as ''[[free entropy]]'', has been shown to play a great role in the so-called entropic formulation of [[statistical mechanics]],<ref>Antoni Planes, Eduard Vives, [http://www.ecm.ub.es/condensed/eduard/papers/massieu/node2.html Entropic Formulation of Statistical Mechanics], Entropic variables and Massieu–Planck functions 2000-10-24 Universitat de Barcelona</ref> applied among the others in molecular biology<ref>John A. Scheilman, [http://www.biophysj.org/cgi/reprint/73/6/2960.pdf Temperature, Stability, and the Hydrophobic Interaction], ''Biophysical Journal'' '''73''' (December 1997), 2960–2964, Institute of Molecular Biology, University of Oregon, Eugene, Oregon 97403 USA</ref> and thermodynamic non-equilibrium processes.<ref>Z. Hens and X. de Hemptinne, [https://arxiv.org/pdf/chao-dyn/9604008 Non-equilibrium Thermodynamics approach to Transport Processes in Gas Mixtures], Department of Chemistry, Catholic University of Leuven, Celestijnenlaan 200 F, B-3001 Heverlee, Belgium</ref>
:: <math>J = S_\max - S = -\Phi = -k \ln Z\,</math>
::where:
::<math>S</math> is [[entropy]]
::<math>J</math> is negentropy (Gibbs "capacity for entropy")
::<math>\Phi</math> is the [[Free entropy|Massieu potential]]
::<math>Z</math> is the [[Partition function (statistical mechanics)|partition function]]
::<math>k</math> is the [[Boltzmann constant]]
In particular, mathematically the negentropy (the negative entropy function, in physics interpreted as free entropy) is the [[convex conjugate]] of [[LogSumExp]] (in physics interpreted as the free energy).
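This conjugacy can be checked numerically. For the negative entropy f(q) = Σ q<sub>i</sub> log q<sub>i</sub> on the probability simplex, the conjugate value sup⟨x, q⟩ − f(q) is attained at q = softmax(x) and equals LogSumExp(x). A small sketch (illustrative, not from the source):

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x_i)))."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def softmax(xs):
    """Maximizer of <x, q> - sum q_i log q_i over the simplex."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    z = sum(e)
    return [v / z for v in e]

def conjugate_value(xs):
    """<x, q> - f(q) evaluated at q = softmax(x), where
    f(q) = sum q_i log q_i is the negative entropy."""
    q = softmax(xs)
    return sum(x * qi for x, qi in zip(xs, q)) - sum(qi * math.log(qi) for qi in q)

xs = [0.3, -1.2, 2.0]
print(conjugate_value(xs), logsumexp(xs))  # equal up to float rounding
```

The two printed values agree because, at q = softmax(x), each term q<sub>i</sub>(x<sub>i</sub> − log q<sub>i</sub>) reduces to q<sub>i</sub> · LogSumExp(x).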
第162行: |
第65行: |
==Brillouin's negentropy principle of information==
In 1953, [[Léon Brillouin]] derived a general equation<ref>Leon Brillouin, The negentropy principle of information, ''J. Applied Physics'' '''24''', 1152–1163 1953</ref> stating that the changing of an information bit value requires at least kT ln(2) energy. This is the same energy as the work [[Leó Szilárd]]'s engine produces in the idealistic case. In his book,<ref>Leon Brillouin, ''Science and Information theory'', Dover, 1956</ref> he further explored this problem concluding that any cause of this bit value change (measurement, decision about a yes/no question, erasure, display, etc.) will require the same amount of energy.
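The kT ln(2) bound is easy to evaluate numerically. A quick sketch (illustrative; the choice of 300 K room temperature is an assumption, not from the source):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # assumed room temperature, K

# Brillouin's minimum energy to change one bit of information
e_bit = K_B * T * math.log(2)
print(f"minimum energy per bit at {T} K: {e_bit:.3e} J")  # ~2.87e-21 J
```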
==See also==
This entry was translated by Jxzhou (talk) and initially reviewed by CecileLi.
==Notes==
{{Reflist}}
{{Wiktionary|negentropy}}
[[Category:Entropy and information]]
[[Category:Negative concepts]]
[[Category:Statistical deviation and dispersion]]
[[Category:Thermodynamic entropy]]
This page was moved from wikipedia:en:Negentropy. Its edit history can be viewed at 负熵/edithistory