This entry has been provisionally translated by Henry.
    
{{short description|Limit on data transfer rate}}
 
In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.
 
The Shannon limit or Shannon capacity of a communication channel refers to the maximum rate of error-free data that can theoretically be transferred over the channel if the link is subject to random data transmission errors, for a particular noise level. It was first described by Shannon (1948), and shortly afterward published in a book by Claude Elwood Shannon and Warren Weaver in 1949 entitled The Mathematical Theory of Communication. This founded the modern discipline of information theory.
 
== Overview ==
Stated by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption.  Shannon's theorem has wide-ranging applications in both communications and data storage.  This theorem is of foundational importance to the modern field of information theory. Shannon only gave an outline of the proof. The first rigorous proof for the discrete case is due to Amiel Feinstein in 1954.
 
The Shannon theorem states that, given a noisy channel with channel capacity <math>C</math> and information transmitted at a rate <math>R</math>, if <math>R < C</math> there exist codes that allow the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below a limiting rate, <math>C</math>.
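
A standard concrete illustration (the binary symmetric channel is introduced here only as an example; it is not mentioned in the paragraph above): a channel that flips each transmitted bit independently with probability <math>p</math> has capacity

:<math>C = 1 - H_2(p) = 1 + p\log_2 p + (1-p)\log_2(1-p),</math>

where <math>H_2</math> is the binary entropy function. For <math>p = 0.11</math> this gives <math>C \approx 0.5</math>, so there exist codes of any rate below one half bit per channel use whose error probability at the receiver is arbitrarily small.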
 
The converse is also important. If <math>R > C</math>, an arbitrarily small probability of error is not achievable. All codes will have a probability of error greater than a certain positive minimal level, and this level increases as the rate increases. So, information cannot be guaranteed to be transmitted reliably across a channel at rates beyond the channel capacity.  The theorem does not address the rare situation in which rate and capacity are equal.
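
One standard way to make the converse quantitative (a sketch based on Fano's inequality and equiprobable messages, not a claim taken from the text above): any block code of length <math>n</math> and rate <math>R</math> over a channel of capacity <math>C</math> has average error probability

:<math>P_e \ge 1 - \frac{C}{R} - \frac{1}{nR},</math>

which remains bounded away from zero as <math>n \to \infty</math> whenever <math>R > C</math>.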
 
The channel capacity <math>C</math> can be calculated from the physical properties of a channel; for a band-limited channel with Gaussian noise, it is given by the Shannon–Hartley theorem.
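
For such a band-limited additive white Gaussian noise channel with bandwidth <math>B</math> in hertz and signal-to-noise ratio <math>S/N</math> (a linear power ratio), the Shannon–Hartley theorem gives the capacity as

:<math>C = B \log_2\!\left(1 + \frac{S}{N}\right)</math>

bits per second.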
 
Simple schemes such as "send the message 3 times and use a best 2 out of 3 voting scheme if the copies differ" are inefficient error-correction methods, unable to asymptotically guarantee that a block of data can be communicated free of error. Advanced techniques such as Reed–Solomon codes and, more recently, low-density parity-check (LDPC) codes and turbo codes come much closer to reaching the theoretical Shannon limit, but at a cost of high computational complexity. Using these highly efficient codes, and with the computing power in today's digital signal processors, it is now possible to reach very close to the Shannon limit. In fact, it was shown that LDPC codes can reach within 0.0045&nbsp;dB of the Shannon limit (for binary additive white Gaussian noise (AWGN) channels, with very long block lengths).
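
A minimal sketch of why the triple-repetition scheme falls short, assuming a binary symmetric channel with bit-flip probability <math>p</math> (the channel model and the function name are illustrative assumptions, not taken from the text above):

<syntaxhighlight lang="python">
from math import comb

def majority_error_prob(p: float, n: int = 3) -> float:
    """Probability that majority voting over n independent copies of a bit
    decodes it incorrectly, when each copy is flipped with probability p."""
    wrong_votes_needed = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(wrong_votes_needed, n + 1))

# Sending every bit 3 times pins the rate at 1/3, yet the residual error
# probability only falls from p to roughly 3*p**2; it never tends to zero
# at this fixed rate, unlike the capacity-approaching codes mentioned above.
for p in (0.1, 0.01, 0.001):
    print(f"p = {p}: triple-repetition error ≈ {majority_error_prob(p):.2e}")
</syntaxhighlight>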
 
== Mathematical statement ==
     