# History of information theory

The decisive event which established the discipline of **information theory**, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the *Bell System Technical Journal* in July and October 1948.

In this revolutionary and groundbreaking paper, work which Shannon had substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that

- "The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."

With it came the ideas of

- the information entropy and redundancy of a source, and its relevance through the source coding theorem;
- the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
- the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; and of course
- the bit, a new way of seeing the most fundamental unit of information.
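
Two of these quantities can be evaluated directly. The sketch below (an illustrative calculation; the function names and example figures are our own, not Shannon's) computes the entropy of a biased binary source and the Shannon–Hartley capacity of a Gaussian channel:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def capacity(bandwidth_hz, snr):
    """Shannon–Hartley capacity of a Gaussian channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr)

# A source emitting "0" 90% of the time carries well under 1 bit per symbol,
# so by the source coding theorem it can be compressed below 1 bit/symbol.
h = entropy([0.9, 0.1])      # ~0.469 bits per symbol

# A 3 kHz channel at 30 dB SNR (power ratio 1000) supports roughly 29.9 kbit/s.
c = capacity(3000, 1000)
```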

## Before 1948

### Early telecommunications

Some of the oldest methods of telecommunications implicitly use many of the ideas that would later be quantified in information theory. Modern telegraphy, starting in the 1830s, used Morse code, in which more common letters (like "E", which is expressed as one "dot") are transmitted more quickly than less common letters (like "J", which is expressed by one "dot" followed by three "dashes"). The idea of encoding information in this manner is the cornerstone of lossless data compression. A hundred years later, frequency modulation illustrated that bandwidth can be considered merely another degree of freedom. The vocoder, now largely looked at as an audio engineering curiosity, was originally designed in 1939 to use less bandwidth than that of an original message, in much the same way that mobile phones now trade off voice quality with bandwidth.

### Quantitative ideas of information

The most direct antecedents of Shannon's work were two papers published in the 1920s by Harry Nyquist and Ralph Hartley, who were both still research leaders at Bell Labs when Shannon arrived in the early 1940s.

Nyquist's 1924 paper, "Certain Factors Affecting Telegraph Speed", is mostly concerned with some detailed engineering aspects of telegraph signals. But a more theoretical section discusses quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation

- [math]\displaystyle{ W = K \log m \, }[/math]

where *W* is the speed of transmission of intelligence, *m* is the number of different voltage levels to choose from at each time step, and *K* is a constant.^{[1]}
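
Taking the logarithm base 2 (so that *K* absorbs the units), the relation says each signalling step contributes log₂ *m* bits: doubling the number of levels adds a constant amount rather than doubling *W*. A quick illustrative check (the values are our own, not Nyquist's):

```python
import math

def line_speed(m, K=1.0):
    """Nyquist's relation W = K log m (here log base 2, K = 1)."""
    return K * math.log2(m)

# Going from 2 to 4 to 8 voltage levels adds one unit of "intelligence"
# per step each time -- logarithmic, not linear, growth in W.
speeds = [line_speed(m) for m in (2, 4, 8)]  # [1.0, 2.0, 3.0]
```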

Hartley's 1928 paper, called simply "Transmission of Information", went further by using the word *information* (in a technical sense), and making explicitly clear that information in this context was a measurable quantity, reflecting only the receiver's ability to distinguish that one sequence of symbols had been intended by the sender rather than any other—quite regardless of any associated meaning or other psychological or semantic aspect the symbols might represent. This amount of information he quantified as

Hartley's 1928 paper, called simply "Transmission of Information", went further by using the word information (in a technical sense), and making explicitly clear that information in this context was a measurable quantity, reflecting only the receiver's ability to distinguish that one sequence of symbols had been intended by the sender rather than any other—quite regardless of any associated meaning or other psychological or semantic aspect the symbols might represent. This amount of information he quantified as

哈特利在1928年发表的论文《信息的传递》(Transmission of Information)更进一步，使用了“信息”这个词(在技术意义上) ，并明确表明，在这种情况下，信息是一个可测量的量，只反映了接收者辨别一系列符号是发送者而不是其他人的能力——完全不考虑这些符号可能代表的任何相关意义或其他心理或语义方面。他将这些信息量化为

- [math]\displaystyle{ H = \log S^n \, }[/math]

- H = \log S^n \,

- h = log s ^ n,

where *S* was the number of possible symbols, and *n* the number of symbols in a transmission. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit or scale or measure of information. The Hartley information, *H*_{0}, is still used as a quantity for the logarithm of the total number of possibilities.^{[2]}
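
Since H = log *S*ⁿ = *n* log *S*, Hartley's information grows linearly with message length; with base-10 logarithms the unit is the decimal digit, i.e. the hartley. A short sketch (the names are our own):

```python
import math

def hartley_information(S, n, base=10):
    """Hartley's measure H = log(S**n) = n * log(S)."""
    return n * math.log(S, base)

# A 3-symbol decimal message (S = 10, n = 3) carries 3 hartleys, which is
# the same quantity as ~9.97 bits when the logarithm is taken base 2.
h_dec  = hartley_information(10, 3)           # 3.0 hartleys
h_bits = hartley_information(10, 3, base=2)   # ~9.97 bits
```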

A similar unit of log_{10} probability, the *ban*, and its derived unit the deciban (one tenth of a ban), were introduced by Alan Turing in 1940 as part of the statistical analysis of the breaking of the German second world war Enigma cyphers. The *decibannage* represented the reduction in (the logarithm of) the total number of possibilities (similar to the change in the Hartley information); and also the log-likelihood ratio (or change in the weight of evidence) that could be inferred for one hypothesis over another from a set of observations. The expected change in the weight of evidence is equivalent to what was later called the Kullback discrimination information.

But underlying this notion was still the idea of equal a-priori probabilities, rather than the information content of events of unequal probability; nor yet any underlying picture of questions regarding the communication of such varied outcomes.

### Entropy in statistical mechanics

One area where unequal probabilities were indeed well known was statistical mechanics, where Ludwig Boltzmann had, in the context of his H-theorem of 1872, first introduced the quantity

- [math]\displaystyle{ H = - \sum f_i \log f_i }[/math]

as a measure of the breadth of the spread of states available to a single particle in a gas of like particles, where *f* represented the relative frequency distribution of each possible state. Boltzmann argued mathematically that the effect of collisions between the particles would cause the *H*-function to inevitably increase from any initial configuration until equilibrium was reached; and further identified it as an underlying microscopic rationale for the macroscopic thermodynamic entropy of Clausius.

Boltzmann's definition was soon reworked by the American mathematical physicist J. Willard Gibbs into a general formula for statistical-mechanical entropy, no longer requiring identical and non-interacting particles, but instead based on the probability distribution *p_i* for the complete microstate *i* of the total system:

- [math]\displaystyle{ S = -k_\text{B} \sum p_i \ln p_i \, }[/math]

This (Gibbs) entropy, from statistical mechanics, can be found to directly correspond to Clausius's classical thermodynamic definition.
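
For instance, when all Ω accessible microstates are equally probable (*p_i* = 1/Ω), the Gibbs formula reduces to Boltzmann's expression for the entropy of a uniform ensemble:

- [math]\displaystyle{ S = -k_\text{B} \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega} = k_\text{B} \ln \Omega \, }[/math]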

Shannon himself was apparently not particularly aware of the close similarity between his new measure and earlier work in thermodynamics, but John von Neumann was. It is said that, when Shannon was deciding what to call his new measure and fearing the term 'information' was already over-used, von Neumann told him firmly: "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."

(Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored further in the article *Entropy in thermodynamics and information theory*).

## Development since 1948

The publication of Shannon's 1948 paper, "A Mathematical Theory of Communication", in the *Bell System Technical Journal* was the founding of information theory as we know it today. Many developments and applications of the theory have taken place since then, which have made many modern devices for data communication and storage such as CD-ROMs and mobile phones possible.

Notable later developments are listed in a timeline of information theory, including:

- The 1951 invention of Huffman encoding, a method of finding optimal prefix codes for lossless data compression.
- Irving S. Reed and David E. Muller proposing Reed–Muller codes in 1954.
- The 1960 proposal of Reed–Solomon codes.
- In 1966, Fumitada Itakura (Nagoya University) and Shuzo Saito (Nippon Telegraph and Telephone) develop linear predictive coding (LPC), a form of speech coding.^{[3]}
- In 1968, Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year.
- In 1972, Nasir Ahmed proposes the discrete cosine transform (DCT).^{[4]} It later becomes the most widely used lossy compression algorithm, and the basis for digital media compression standards from 1988 onwards, including H.26x (since H.261) and MPEG video coding standards,^{[5]} JPEG image compression,^{[6]} MP3 audio compression,^{[7]} and Advanced Audio Coding (AAC).^{[8]}
- In 1976, Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s.
- In 1977, Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77).
- In the early 1980s, Renuka P. Jindal at Bell Labs improves the noise performance of metal-oxide-semiconductor (MOS) devices, resolving issues that limited their receiver sensitivity and data rates. This leads to the wide adoption of MOS technology in laser lightwave systems and wireless terminal applications, enabling Edholm's law.^{[9]}
- In 1989, Phil Katz publishes the `.zip` format including DEFLATE (LZ77 + Huffman coding); later to become the most widely used archive container.
- In 1995, Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem.
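
The first item above is easy to demonstrate: a minimal Huffman-style construction (an illustrative sketch, not Huffman's original presentation) assigns shorter codewords to more frequent symbols by repeatedly merging the two least frequent subtrees:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build an optimal prefix code for the symbol frequencies in `text`."""
    # Each heap entry: (frequency, tiebreak id, {symbol: partial codeword}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merge the two least frequent subtrees, prefixing 0/1 to their codes.
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# More frequent symbols receive shorter codewords:
code = huffman_code("aaaabbc")
```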


## See also

- Timeline of information theory
- Claude Shannon
- Ralph Hartley
- H-theorem

## References

- ↑ "BSTJ 3: 2. April 1924: Certain Factors Affecting Telegraph Speed. (Nyquist, H.)". April 1924.
- ↑ "BSTJ 7: 3. July 1928: Transmission of Information. (Hartley, R.V.L.)". July 1928.
- ↑ Gray, Robert M. (2010). "A History of Realtime Digital Speech on Packet Networks: Part II of Linear Predictive Coding and the Internet Protocol" (PDF). *Found. Trends Signal Process*. **3** (4): 203–303. doi:10.1561/2000000036. ISSN 1932-8346.
- ↑ Ahmed, Nasir (January 1991). "How I Came Up With the Discrete Cosine Transform". *Digital Signal Processing*. **1** (1): 4–5. doi:10.1016/1051-2004(91)90086-Z.
- ↑ Ghanbari, Mohammed (2003). *Standard Codecs: Image Compression to Advanced Video Coding*. Institution of Engineering and Technology. pp. 1–2. ISBN 9780852967102. https://books.google.com/books?id=7XuU8T3ooOAC&pg=PA1.
- ↑ "T.81 – DIGITAL COMPRESSION AND CODING OF CONTINUOUS-TONE STILL IMAGES – REQUIREMENTS AND GUIDELINES" (PDF). CCITT. September 1992. Retrieved 12 July 2019.
- ↑ Guckert, John (Spring 2012). "The Use of FFT and MDCT in MP3 Audio Compression" (PDF). *University of Utah*. Retrieved 14 July 2019.
- ↑ Brandenburg, Karlheinz (1999). "MP3 and AAC Explained" (PDF). Archived (PDF) from the original on 2017-02-13.
- ↑ Jindal, Renuka P. (2009). "From millibits to terabits per second and beyond - Over 60 years of innovation". *2009 2nd International Workshop on Electron Devices and Semiconductor Technology*: 1–6. doi:10.1109/EDST.2009.5166093. ISBN 978-1-4244-3831-0. S2CID 25112828.

Category:Information theory

This page was moved from wikipedia:en:History of information theory. Its edit history can be viewed at 信息论发展史/edithistory