# History of information theory

The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.

In this revolutionary and groundbreaking paper, work which Shannon had substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that

"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."

With it came the ideas of

• the information entropy and redundancy of a source, and its relevance through the source coding theorem;
• the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
• the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel (see the short sketch after this list); and of course
• the bit - a new way of seeing the most fundamental unit of information.
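
Two of these quantities are easy to make concrete. Below is a minimal Python sketch (an illustration, not from Shannon's paper) of the entropy of a discrete source and the Shannon–Hartley capacity of a Gaussian channel; the telephone-channel figures in the comments are assumed example values.

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum p_i log2 p_i, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A biased binary source carries less than 1 bit per symbol:
print(shannon_entropy([0.9, 0.1]))           # ~0.469 bits/symbol
# A 3 kHz channel at 30 dB SNR (S/N = 1000), an assumed example:
print(shannon_hartley_capacity(3000, 1000))  # ~29,902 bits/s
```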

## Early telecommunications

Some of the oldest methods of telecommunications implicitly use many of the ideas that would later be quantified in information theory. Modern telegraphy, starting in the 1830s, used Morse code, in which more common letters (like "E", which is expressed as one "dot") are transmitted more quickly than less common letters (like "J", which is expressed as one "dot" followed by three "dashes"). The idea of encoding information in this manner is the cornerstone of lossless data compression. A hundred years later, frequency modulation illustrated that bandwidth can be considered merely another degree of freedom. The vocoder, now largely regarded as an audio engineering curiosity, was originally designed in 1939 to use less bandwidth than that of the original message, in much the same way that mobile phones now trade off voice quality against bandwidth.
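
To make the Morse idea concrete, here is a small Python sketch. The duration units follow the usual Morse timing convention (dot = 1 unit, dash = 3, intra-letter gap = 1), and the letter frequencies are rough illustrative values, not measured English statistics.

```python
# Variable-length coding in miniature: frequent letters get short codes.
# On-air duration in Morse timing units (dot = 1, dash = 3, gap = 1):
morse_units = {"E": 1, "T": 3, "A": 5, "J": 13}
fixed_units = {letter: 13 for letter in morse_units}  # fixed-length baseline

freq = {"E": 0.55, "T": 0.30, "A": 0.10, "J": 0.05}   # illustrative only

avg_morse = sum(freq[c] * morse_units[c] for c in freq)
avg_fixed = sum(freq[c] * fixed_units[c] for c in freq)
print(avg_morse, avg_fixed)  # ~2.6 vs 13.0 units per letter on average
```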

### Quantitative ideas of information

The most direct antecedents of Shannon's work were two papers published in the 1920s by Harry Nyquist and Ralph Hartley, who were both still research leaders at Bell Labs when Shannon arrived in the early 1940s.

Nyquist's 1924 paper, "Certain Factors Affecting Telegraph Speed", is mostly concerned with some detailed engineering aspects of telegraph signals. But a more theoretical section discusses quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation

$\displaystyle{ W = K \log m \, }$

where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant.
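
Read concretely (a worked illustration, not in Nyquist's paper), the logarithm says that each doubling of the number of levels adds the same fixed increment of speed rather than doubling it:

$\displaystyle{ W(2) : W(4) : W(8) = K \log 2 : K \log 4 : K \log 8 = 1 : 2 : 3 }$

so an eight-level line carries intelligence only three times as fast as a binary one, not four times.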

Hartley's 1928 paper, called simply "Transmission of Information", went further by using the word information (in a technical sense), and making explicitly clear that information in this context was a measurable quantity, reflecting only the receiver's ability to distinguish that one sequence of symbols had been intended by the sender rather than any other, quite regardless of any associated meaning or other psychological or semantic aspect the symbols might represent. This amount of information he quantified as

$\displaystyle{ H = \log S^n \, }$

where S was the number of possible symbols, and n the number of symbols in a transmission. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit or scale or measure of information. The Hartley information, H₀, is still used as a quantity for the logarithm of the total number of possibilities.
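
In modern terms Hartley's measure is simply H = n log S, with the base of the logarithm fixing the unit. A minimal Python sketch (illustrative, not Hartley's own notation):

```python
import math

def hartley_information(num_symbols, length, base=10):
    """H = log(S**n) = n * log(S); base 10 gives hartleys (decimal digits)."""
    return length * math.log(num_symbols, base)

# A 6-symbol message over a 10-symbol alphabet carries 6 hartleys,
# and the same message measured in base 2 comes out in bits:
print(hartley_information(10, 6))           # 6.0 hartleys
print(hartley_information(10, 6, base=2))   # ~19.93 bits
```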

A similar unit of log10 probability, the ban, and its derived unit the deciban (one tenth of a ban), were introduced by Alan Turing in 1940 as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers. The decibannage represented the reduction in (the logarithm of) the total number of possibilities (similar to the change in the Hartley information); and also the log-likelihood ratio (or change in the weight of evidence) that could be inferred for one hypothesis over another from a set of observations. The expected change in the weight of evidence is equivalent to what was later called the Kullback discrimination information.
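
A short Python sketch of the deciban as a weight of evidence (an illustration of the definition; the likelihood ratio of 3 is an assumed example value):

```python
import math

def decibans(likelihood_ratio):
    """Weight of evidence in decibans: 10 * log10(P(obs|H1) / P(obs|H2))."""
    return 10 * math.log10(likelihood_ratio)

# An observation three times as likely under H1 as under H2 contributes
# about +4.77 decibans of evidence for H1; independent observations
# simply add their decibannage:
print(decibans(3))                 # ~4.77
print(decibans(3) + decibans(3))   # ~9.54 for two such observations
```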

But underlying this notion was still the idea of equal a priori probabilities, rather than the information content of events of unequal probability; nor yet any underlying picture of questions regarding the communication of such varied outcomes.

### Entropy in statistical mechanics

One area where unequal probabilities were indeed well known was statistical mechanics, where Ludwig Boltzmann had, in the context of his H-theorem of 1872, first introduced the quantity

$\displaystyle{ H = - \sum f_i \log f_i }$

as a measure of the breadth of the spread of states available to a single particle in a gas of like particles, where fᵢ represented the relative frequency of each possible state i. Boltzmann argued mathematically that the effect of collisions between the particles would cause the H-function to inevitably increase from any initial configuration until equilibrium was reached; and further identified it as an underlying microscopic rationale for the macroscopic thermodynamic entropy of Clausius.

Boltzmann's definition was soon reworked by the American mathematical physicist J. Willard Gibbs into a general formula for statistical-mechanical entropy, no longer requiring identical and non-interacting particles, but instead based on the probability distribution pᵢ for the complete microstate i of the total system:

$\displaystyle{ S = -k_\text{B} \sum p_i \ln p_i \, }$

This (Gibbs) entropy, from statistical mechanics, can be found to correspond directly to Clausius's classical thermodynamic definition.
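
The correspondence with Shannon's later measure is a single constant factor: for the same distribution, S equals k_B ln 2 times the entropy in bits. A minimal Python check (the distribution is an assumed example):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum p_i ln p_i, in joules per kelvin."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

def shannon_entropy_bits(probs):
    """H = -sum p_i log2 p_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The two measures differ only by the factor k_B * ln 2:
p = [0.5, 0.25, 0.25]
print(gibbs_entropy(p))                             # ~1.44e-23 J/K
print(shannon_entropy_bits(p) * K_B * math.log(2))  # identical value
```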

Shannon himself was apparently not particularly aware of the close similarity between his new measure and earlier work in thermodynamics, but John von Neumann was. It is said that, when Shannon was deciding what to call his new measure and fearing the term 'information' was already over-used, von Neumann told him firmly: "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."

(Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored further in the article Entropy in thermodynamics and information theory.)

## Development since 1948

The publication of Shannon's 1948 paper, "A Mathematical Theory of Communication", in the Bell System Technical Journal was the founding of information theory as we know it today. Many developments and applications of the theory have taken place since then, which have made many modern devices for data communication and storage such as CD-ROMs and mobile phones possible.

Notable later developments are listed in a timeline of information theory, including:

• The 1951 invention of Huffman coding, a method of finding optimal prefix codes for lossless data compression (see the sketch after this list).
• Irving S. Reed and David E. Muller proposing Reed–Muller codes in 1954.
• The 1960 proposal of Reed–Solomon codes.
• In 1966, Fumitada Itakura (Nagoya University) and Shuzo Saito (Nippon Telegraph and Telephone) develop linear predictive coding (LPC), a form of speech coding.
• In 1968, Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year.
• In 1972, Nasir Ahmed proposes the discrete cosine transform (DCT). It later becomes the most widely used lossy compression algorithm, and the basis for digital media compression standards from 1988 onwards, including H.26x (since H.261) and MPEG video coding standards, JPEG image compression, MP3 audio compression, and Advanced Audio Coding (AAC).
• In 1976, Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s.
• In 1977, Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77).
• In the early 1980s, Renuka P. Jindal at Bell Labs improves the noise performance of metal-oxide-semiconductor (MOS) devices, resolving issues that limited their receiver sensitivity and data rates. This leads to the wide adoption of MOS technology in laser lightwave systems and wireless terminal applications, enabling Edholm's law.
• In 1989, Phil Katz publishes the .zip format including DEFLATE (LZ77 + Huffman coding); later to become the most widely used archive container.
• In 1995, Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem.
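
As a concrete companion to the 1951 entry above, here is a compact Python sketch of Huffman coding (a standard heap-based construction, not Huffman's original presentation):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build an optimal prefix code for the symbols of `text`."""
    # Each heap entry: [subtree weight, unique tiebreaker, code table].
    heap = [[count, i, {symbol: ""}]
            for i, (symbol, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # least frequent subtree
        hi = heapq.heappop(heap)   # next least frequent
        # Prepend one bit to every codeword in each merged subtree.
        for table, bit in ((lo[2], "0"), (hi[2], "1")):
            for symbol in table:
                table[symbol] = bit + table[symbol]
        heapq.heappush(heap, [lo[0] + hi[0], next_id, {**lo[2], **hi[2]}])
        next_id += 1
    return heap[0][2]

codes = huffman_code("this is an example of a huffman tree")
print(sorted(codes.items(), key=lambda kv: len(kv[1])))
```

Frequent symbols receive shorter codewords, which is exactly the Morse-code intuition of the early-telecommunications section made optimal.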
