Shannon–Hartley theorem

Revision as of 17:10, 22 April 2020 by ZQ (talk | contribs)

Task

Producer: ZQ — Next task: proofreading


In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. Assuming that the signal power is bounded and that the Gaussian noise process has a known power or power spectral density, the theorem establishes Shannon's channel capacity for such a communication link: a bound on the maximum amount of error-free information per unit time that can be transmitted with the specified bandwidth in the presence of the noise interference. The theorem is named after Claude Shannon and Ralph Hartley.


[math]\displaystyle{ C = B\log_{2}\left(1+\frac{S}{N}\right) }[/math]

where C is the channel capacity in bits per second, B is the bandwidth of the channel in hertz, S is the average received signal power, and N is the average noise power, so that S/N is the signal-to-noise ratio.
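The capacity formula can be evaluated directly. A minimal sketch in Python; the 3 kHz bandwidth and 30 dB signal-to-noise ratio below are assumed example values (roughly those of an analog telephone line), not figures from the text:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed example: a 3 kHz channel with a 30 dB signal-to-noise ratio.
snr_db = 30.0
snr_linear = 10 ** (snr_db / 10)  # 30 dB corresponds to S/N = 1000
capacity = shannon_capacity(3000.0, snr_linear)
print(capacity)  # roughly 3.0e4 bit/s
```

Note that the SNR must be converted from decibels to a linear power ratio before it is plugged into the logarithm; capacity grows only logarithmically with S/N but linearly with bandwidth B.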