===Noisy-channel coding theorem and capacity===
 
The development of information theory by '''[[克劳德香农]]''' (Claude Shannon) during World War II provided the next big step in understanding how much information could be reliably communicated through noisy channels. Building on Hartley's foundation, Shannon's noisy-channel coding theorem (1948) describes the maximum possible efficiency of error-correcting methods in the face of a given level of noise interference and data corruption.[5][6] The proof shows that a randomly constructed error-correcting code is essentially as good as the best possible code; the theorem is proved through the statistics of such random codes.
 
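The trade-off the theorem quantifies can be illustrated with a deliberately crude scheme. The sketch below is only an illustration and not Shannon's random-coding argument: it assumes a binary symmetric channel with crossover probability p and compares the bit-error rate of uncoded transmission against a 3-fold repetition code decoded by majority vote.

<syntaxhighlight lang="python">
import random

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def encode_repetition(bits, n=3):
    """Repeat each bit n times (a deliberately crude error-correcting code)."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(10000)]
for p in (0.01, 0.05, 0.10):
    uncoded = bsc(message, p)
    decoded = decode_repetition(bsc(encode_repetition(message), p))
    print("p=%.2f  uncoded BER=%.4f  3-repetition BER=%.4f" % (
        p,
        sum(a != b for a, b in zip(message, uncoded)) / len(message),
        sum(a != b for a, b in zip(message, decoded)) / len(message),
    ))
</syntaxhighlight>

The repetition code buys a lower error rate by tripling the transmission time; Shannon's result is that far better trade-offs exist whenever the transmission rate stays below the channel capacity.
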
Shannon's theorem shows how to compute a channel capacity from a statistical description of a channel, and establishes that, given a noisy channel with capacity C and information transmitted at a line rate R, if R < C there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small.
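
As a standard point of reference (the formulas below are textbook material rather than part of the quoted passage; X and Y denote the channel input and output, and p_X ranges over input distributions), the capacity of a discrete memoryless channel is the mutual information between input and output maximized over all input distributions, and for a binary symmetric channel with crossover probability p it has a closed form:

:<math>C = \max_{p_X} I(X;Y), \qquad C_{\text{BSC}} = 1 - H_{\text{b}}(p), \qquad H_{\text{b}}(p) = -p\log_2 p - (1-p)\log_2(1-p).</math>

A channel that flips one bit in ten (p = 0.1) thus has capacity of roughly 0.53 bits per channel use, and the theorem guarantees that any rate below this figure can be achieved with arbitrarily small error probability.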