Changes

175 bytes added, 15:11, 20 July 2020 (Monday)
No edit summary
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Information theory is used in information retrieval, intelligence gathering, gambling, and even in musical composition.
 
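The lossless/lossy distinction above is easy to demonstrate: a lossless code must reproduce its input exactly. A minimal sketch using Python's standard-library zlib module, which implements DEFLATE, the same algorithm used inside ZIP files (the variable names are mine, not from the article):

```python
import zlib

# DEFLATE is a lossless code: decompression recovers the input exactly.
text = b"abracadabra abracadabra abracadabra"
packed = zlib.compress(text, 9)

assert zlib.decompress(packed) == text  # exact round trip, no loss
print(len(text), "->", len(packed))     # repetitive input compresses
```

A lossy code (as in MP3 or JPEG) deliberately gives up this exact round-trip property in exchange for a smaller output.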
A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
 
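The coin/die comparison can be checked numerically. A small sketch computing Shannon entropy in bits (the function name `entropy` is my own, not from the article):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([1/6] * 6))   # fair die: log2(6) ≈ 2.585 bits
```

The die's six equally likely outcomes carry more uncertainty, hence more information per observation, than the coin's two.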
A third class of information theory codes are cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis. See the article ban (unit) for a historical application.
 
The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.
 
Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation W = K log m (recalling Boltzmann's constant), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor as a unit or scale or measure of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German second world war Enigma ciphers.
 
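Hartley's measure H = n log S, with the logarithm taken base 10, gives information in decimal digits (hartleys). A quick illustrative sketch (the function name is mine, chosen for this example):

```python
import math

def hartley(num_symbols, n_transmitted):
    """Hartley information H = n * log10(S), in decimal digits (hartleys)."""
    return n_transmitted * math.log10(num_symbols)

print(hartley(10, 4))  # 4 decimal digits carry exactly 4.0 hartleys
print(hartley(2, 10))  # 10 binary symbols: 10*log10(2) ≈ 3.01 hartleys
```

Choosing base 2 instead of base 10 would give the same quantity in bits, which is the convention Shannon's paper later made standard.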
 
Much of the mathematics behind information theory with events of different probabilities were developed for the field of [[thermodynamics]] by [[Ludwig Boltzmann]] and [[J. Willard Gibbs]].  Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by [[Rolf Landauer]] in the 1960s, are explored in ''[[Entropy in thermodynamics and information theory]]''.
 
 
In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that
 
 
   
:"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
 
With it came the ideas of
 
    
* the information entropy and [[redundancy (information theory)|redundancy]] of a source, and its relevance through the [[source coding theorem]];
 
         
* the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
 
* the [[bit]]—a new way of seeing the most fundamental unit of information.
 
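One of the ideas listed above, the channel capacity guaranteed achievable by the noisy-channel coding theorem, has a simple closed form for the binary symmetric channel. The formula C = 1 - H(p) is a standard textbook result, not derived in this excerpt; a minimal sketch:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel
    with crossover probability p: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: noiseless channel carries a full bit per use
print(bsc_capacity(0.5))  # 0.0: pure noise, no information gets through
print(bsc_capacity(0.1))  # ≈ 0.531 bits per channel use
```

The theorem promises that any rate below this capacity can be achieved with arbitrarily small error probability, the "perfect loss-free communication" mentioned in the list.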
     