Shannon–Hartley theorem
Task
Producer: ZQ  Next task: proofreading
In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free information per unit time that can be transmitted with a specified bandwidth in the presence of noise interference, assuming that the signal power is bounded and that the Gaussian noise process is characterized by a known power or power spectral density. The theorem is named after Claude Shannon and Ralph Hartley.
Statement of the theorem
The Shannon–Hartley theorem states the channel capacity [math]\displaystyle{ C }[/math], meaning the theoretical tightest upper bound on the information rate of data that can be communicated at an arbitrarily low error rate using an average received signal power [math]\displaystyle{ S }[/math] through an analog communication channel subject to additive white Gaussian noise (AWGN) of power [math]\displaystyle{ N }[/math]:
[math]\displaystyle{ C = B\log _{2}\left(1+\frac{S}{N}\right) }[/math]
where
- [math]\displaystyle{ C }[/math] is the channel capacity in bits per second, a theoretical upper bound on the net bit rate (information rate, sometimes denoted [math]\displaystyle{ I }[/math]) excluding error-correction codes;
- [math]\displaystyle{ B }[/math] is the bandwidth of the channel in hertz (passband bandwidth in the case of a bandpass signal);
- [math]\displaystyle{ S }[/math] is the average received signal power over the bandwidth (in the case of a carrier-modulated passband transmission, often denoted [math]\displaystyle{ C }[/math]), measured in watts (or volts squared);
- [math]\displaystyle{ N }[/math] is the average power of the noise and interference over the bandwidth, measured in watts (or volts squared);
- [math]\displaystyle{ S/N }[/math] is the signal-to-noise ratio (SNR) or carrier-to-noise ratio (CNR) of the communication signal to the noise and interference at the receiver, expressed as a linear power ratio, not in logarithmic decibels.
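As a minimal numeric sketch of the capacity formula above, the following assumes a hypothetical channel with a bandwidth of 3000 Hz and a linear SNR of 1000 (i.e. 30 dB); the values are illustrative, not from the source:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second.

    snr_linear must be a linear power ratio, not decibels.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3000 Hz channel with a 30 dB SNR (linear ratio 1000).
C = shannon_capacity(3000, 1000)
print(round(C))  # roughly 29.9 kbit/s
```

Note that doubling the bandwidth doubles the capacity, while doubling the SNR only adds about one extra bit per symbol, since the SNR enters through the logarithm.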
Historical development
During the late 1920s, Harry Nyquist and Ralph Hartley developed a handful of fundamental ideas related to the transmission of information, particularly in the context of the telegraph as a communications system. At the time, these concepts were powerful breakthroughs individually, but they were not part of a comprehensive theory. In the 1940s, Claude Shannon developed the concept of channel capacity, based in part on the ideas of Nyquist and Hartley, and then formulated a complete theory of information and its transmission.
Nyquist rate
In 1927, Nyquist determined that the number of independent pulses that could be put through a telegraph channel per unit time is limited to twice the bandwidth of the channel. In symbolic notation,
[math]\displaystyle{ f_{p} \leq 2B }[/math]
where [math]\displaystyle{ f_{p} }[/math] is the pulse frequency (in pulses per second) and [math]\displaystyle{ B }[/math] is the bandwidth (in hertz). The quantity [math]\displaystyle{ 2B }[/math] later came to be called the Nyquist rate, and transmitting at the limiting pulse rate of [math]\displaystyle{ 2B }[/math] pulses per second came to be called signalling at the Nyquist rate. Nyquist published his results in 1928 as part of his paper "Certain Topics in Telegraph Transmission Theory".
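The Nyquist limit itself is a one-line computation; the sketch below assumes a hypothetical telegraph channel of 3100 Hz bandwidth, a value chosen only for illustration:

```python
def nyquist_rate(bandwidth_hz: float) -> float:
    """Maximum number of independent pulses per second (2B)
    that can be put through a channel of the given bandwidth."""
    return 2 * bandwidth_hz

# Hypothetical 3100 Hz channel: at most 6200 independent pulses per second.
print(nyquist_rate(3100))
```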
Hartley's law
During 1928, Hartley formulated a way to quantify information and its line rate (also known as the data signalling rate, R bits per second).[1] This method, later known as Hartley's law, became an important precursor to Shannon's more sophisticated notion of channel capacity.
Hartley argued that the maximum number of distinguishable pulse levels that can be transmitted and received reliably over a communications channel is limited by the dynamic range of the signal amplitude and the precision with which the receiver can distinguish amplitude levels. Specifically, if the amplitude of the transmitted signal is restricted to the range [−A ... +A] volts, and the precision of the receiver is ±ΔV volts, then the maximum number of distinct pulse levels M is given by
[math]\displaystyle{ M = 1 + \frac{A}{\Delta V} }[/math]
By taking the information per pulse in bits/pulse to be the base-2 logarithm of the number of distinct messages M that could be sent, Hartley[2] constructed a measure of the line rate R as:
[math]\displaystyle{ R = f_{p}\log _{2}{M} }[/math]
where [math]\displaystyle{ f_{p} }[/math] is the pulse rate, also known as the symbol rate, in symbols/second or baud.
Hartley then combined the above quantification with Nyquist's observation that the number of independent pulses that could be put through a channel of bandwidth [math]\displaystyle{ B }[/math] hertz was [math]\displaystyle{ 2B }[/math] pulses per second, to arrive at his quantitative measure for the achievable line rate.
Hartley's law is sometimes quoted as just a proportionality between the analog bandwidth, [math]\displaystyle{ B }[/math], in hertz and what today is called the digital bandwidth, [math]\displaystyle{ R }[/math], in bit/s.[3] Other times it is quoted in this more quantitative form, as an achievable line rate of [math]\displaystyle{ R }[/math] bits per second:
[math]\displaystyle{ R \leq 2B\log _{2}{M} }[/math]
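The two pieces of Hartley's law can be chained in a short sketch: the number of distinguishable levels from the amplitude range and receiver precision, then the achievable line rate at the Nyquist pulse rate. The amplitude, precision, and bandwidth values below are hypothetical, chosen only to make the arithmetic concrete:

```python
import math

def pulse_levels(amplitude_v: float, precision_v: float) -> int:
    """Distinguishable pulse levels M = 1 + A / ΔV."""
    return 1 + int(amplitude_v / precision_v)

def hartley_rate(bandwidth_hz: float, levels: int) -> float:
    """Achievable line rate R = 2B * log2(M), in bits per second."""
    return 2 * bandwidth_hz * math.log2(levels)

# Hypothetical: ±1 V signal range, 0.25 V receiver precision, 3000 Hz bandwidth.
M = pulse_levels(1.0, 0.25)   # 5 distinguishable levels
R = hartley_rate(3000, M)     # 2 * 3000 * log2(5) ≈ 13.9 kbit/s
print(M, round(R))
```

As the next paragraph notes, in the presence of noise a designer would in practice pick a much more conservative M than this idealized count.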
Hartley did not work out exactly how the number M should depend on the noise statistics of the channel, or how the communication could be made reliable even when individual symbol pulses could not be reliably distinguished to M levels; with Gaussian noise statistics, system designers had to choose a very conservative value of [math]\displaystyle{ M }[/math] to achieve a low error rate.
The concept of an error-free capacity awaited Claude Shannon, who built on Hartley's observations about a logarithmic measure of information and Nyquist's observations about the effect of bandwidth limitations.
Hartley's rate result can be viewed as the capacity of an errorless M-ary channel of [math]\displaystyle{ 2B }[/math] symbols per second. Some authors refer to it as a capacity. But such an errorless channel is an idealization, and if M is chosen small enough to make the noisy channel nearly errorless, the result is necessarily less than the Shannon capacity of the noisy channel of bandwidth [math]\displaystyle{ B }[/math], which is the Hartley–Shannon result that followed later.
Noisy-channel coding theorem and capacity
Claude Shannon, in