Information theory studies the quantification, storage, and communication of information.  It was originally proposed by Claude Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper titled "A Mathematical Theory of Communication". Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.
 
'''<font color="#ff8000">信息论 Information Theory</font>'''研究的是信息的量化、存储与传播。信息论最初是由'''<font color="#ff8000">克劳德·香农 Claude Shannon</font>'''在1948年的一篇题为'''<font color="#ff8000">《一种通信的数学理论(A Mathematical Theory of Communication)》</font>'''的里程碑式论文中提出的,其目的是找到信号处理和通信操作(如数据压缩)的基本限制。信息论对于旅行者号深空探测任务的成功、光盘的发明、移动电话的可行性、互联网的发展、语言学和人类感知的研究、对黑洞的理解以及许多其他领域的研究都是至关重要的。
* the information entropy and [[redundancy (information theory)|redundancy]] of a source, and its relevance through the [[source coding theorem]];
 
* 信源的信息熵和冗余度,以及它们经由'''<font color="#ff8000">信源编码定理</font>'''所体现的意义;
    
* the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
 
* '''<font color="#ff8000">互信息 Mutual Information</font>'''与有噪信道的信道容量,包括'''<font color="#ff8000">有噪信道编码定理</font>'''所保证的完全无损通信的可能性;
* the practical result of the [[Shannon–Hartley law]] for the channel capacity of a [[Gaussian channel]]; as well as
* '''<font color="#ff8000">香农-哈特利定律 Shannon–Hartley law</font>'''应用于高斯信道信道容量的实际结果(参见本列表后的示例),以及
    
* the [[bit]]—a new way of seeing the most fundamental unit of information.
 
* '''<font color="#ff8000">比特 bit</font>'''——一种看待信息最基本单位的新方式。
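
作为对上面列表中'''香农-哈特利定律'''的一个补充说明(非原文内容,带宽与信噪比的数值只是演示用的假设),下面的 Python 片段按 <math>C = B \log_2(1 + S/N)</math> 粗略估算高斯信道的信道容量:

<syntaxhighlight lang="python">
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """高斯信道容量 C = B * log2(1 + S/N),单位:比特/秒。"""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# 演示:带宽 3 kHz、信噪比 30 dB(线性值约 1000)的信道
snr_linear = 10 ** (30.0 / 10)            # 将 dB 换算为线性信噪比
print(shannon_hartley_capacity(3000.0, snr_linear))   # 约 29900 bit/s
</syntaxhighlight>
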
基于每个用于通信的源符号的概率质量函数,'''<font color="#ff8000">香农熵 Shannon Entropy</font>'''(以比特为单位)由下式给出:
 
<math>H = - \sum_{i} p_i \log_2 (p_i)</math>
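
下面是一段简短的 Python 示意(非原文内容,概率分布只是演示用的假设),按上式由信源符号的概率质量函数计算以比特为单位的香农熵:

<syntaxhighlight lang="python">
import math

def shannon_entropy(probabilities):
    """H = -Σ p_i * log2(p_i),约定概率为 0 的项贡献为 0。"""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 两个等概率符号:1.0 比特
print(shannon_entropy([0.9, 0.1]))   # 有偏分布:约 0.469 比特,熵更低
</syntaxhighlight>
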
 
If one transmits 1000 bits (0s and 1s), and the value of each of these bits is known to the receiver (has a specific value with certainty) ahead of transmission, it is clear that no information is transmitted. If, however, each bit is independently equally likely to be 0 or 1, 1000 shannons of information (more often called bits) have been transmitted. Between these two extremes, information can be quantified as follows. If 𝕏 is the set of all messages {{math|{''x''<sub>1</sub>, ..., ''x''<sub>''n''</sub>}}} that {{math|''X''}} could be, and {{math|''p''(''x'')}} is the probability of some <math>x \in \mathbb X</math>, then the entropy, {{math|''H''}}, of {{math|''X''}} is defined:
 
如果一个人发送了1000比特(0和1),而接收者在发送之前就已经知道这串比特序列中每一位的值,那么显然这次通信没有传递任何信息(译注:如果你要告诉我一个我已经知道的消息,那么本次通信没有传递任何信息)。但是,如果每个比特独立且等可能地为0或1,则本次通信传输了1000香农的信息(通常称为“比特”)。在这两个极端之间,信息可以按以下方式量化。如果𝕏是{{math|''X''}}所有可能取值(消息)的集合{{math|{''x''<sub>1</sub>, ..., ''x''<sub>''n''</sub>}}},且{{math|''p''(''x'')}}是某个<math>x \in \mathbb X</math>的概率,那么{{math|''X''}}的熵{{math|''H''}}定义如下: <ref name = Reza>{{cite book | title = An Introduction to Information Theory | author = Fazlollah M. Reza | publisher = Dover Publications, Inc., New York | origyear = 1961| year = 1994 | isbn = 0-486-68210-2 | url = https://books.google.com/books?id=RtzpRAiX6OgC&pg=PA8&dq=intitle:%22An+Introduction+to+Information+Theory%22++%22entropy+of+a+simple+source%22}}</ref>
The joint entropy of two discrete random variables {{math|''X''}} and {{math|''Y''}} is merely the entropy of their pairing: {{math|(''X'', ''Y'')}}. This implies that if {{math|''X''}} and {{math|''Y''}} are independent, then their joint entropy is the sum of their individual entropies.
 
两个离散随机变量{{math|''X''}}和{{math|''Y''}}的'''<font color="#ff8000">联合熵 Joint Entropy</font>'''就是它们的配对{{math|(''X'', ''Y'')}}的熵。这意味着,若{{math|''X''}}和{{math|''Y''}}相互独立,则它们的联合熵是各自熵的总和。
The conditional entropy or conditional uncertainty of {{math|''X''}} given random variable {{math|''Y''}} (also called the equivocation of {{math|''X''}} about {{math|''Y''}}) is the average conditional entropy over {{math|''Y''}}:
 
在给定随机变量{{math|''Y''}}的条件下,{{math|''X''}}的'''<font color="#ff8000">条件熵 Conditional Entropy</font>'''(或条件不确定性,也称为{{math|''X''}}关于{{math|''Y''}}的含糊度)是在{{math|''Y''}}上取平均的条件熵: <ref name=Ash>{{cite book | title = Information Theory | author = Robert B. Ash | publisher = Dover Publications, Inc. | origyear = 1965| year = 1990 | isbn = 0-486-66521-6 | url = https://books.google.com/books?id=ngZhvUfF0UIC&pg=PA16&dq=intitle:information+intitle:theory+inauthor:ash+conditional+uncertainty}}</ref>
    
:<math> H(X|Y) = \mathbb E_Y [H(X|y)] = -\sum_{y \in Y} p(y) \sum_{x \in X} p(x|y) \log p(x|y) = -\sum_{x,y} p(x,y) \log p(x|y).</math>
 
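
下面用一段 Python 示意(非原文内容,联合分布 p(x, y) 的数值只是演示用的假设)说明如何从联合分布出发,按上式计算联合熵 H(X, Y) 和条件熵 H(X|Y):

<syntaxhighlight lang="python">
import math

# 演示用的联合分布 p(x, y):键为 (x, y),值为概率,总和为 1
p_xy = {(0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4}

# 边缘分布 p(y)
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# 联合熵 H(X, Y) = -Σ p(x,y) log2 p(x,y)
h_xy = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)

# 条件熵 H(X|Y) = -Σ p(x,y) log2 p(x|y),其中 p(x|y) = p(x,y) / p(y)
h_x_given_y = -sum(p * math.log2(p / p_y[y])
                   for (x, y), p in p_xy.items() if p > 0)

print(round(h_xy, 3), round(h_x_given_y, 3))   # 1.722 0.722,满足 H(X,Y) = H(Y) + H(X|Y)
</syntaxhighlight>
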
Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication where it can be used to maximize the amount of information shared between sent and received signals. The mutual information of {{math|''X''}} relative to {{math|''Y''}} is given by:
 
'''<font color="#ff8000">互信息 Mutual Information</font>'''度量的是通过观察一个随机变量所能获得的关于另一个随机变量的信息量。它在通信中十分重要,可以用来最大化发送信号与接收信号之间共享的信息量。{{math|''X''}}相对于{{math|''Y''}}的互信息由以下公式给出:
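
这里补充一段 Python 示意(非原文内容;互信息采用标准定义 <math>I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}</math>,联合分布数值只是演示用的假设):

<syntaxhighlight lang="python">
import math

def mutual_information(p_xy):
    """I(X;Y) = Σ p(x,y) * log2( p(x,y) / (p(x) * p(y)) ),单位:比特。"""
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

# 沿用条件熵示例中的联合分布:I(X;Y) = H(X) - H(X|Y) ≈ 1 - 0.722 ≈ 0.278 比特
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(round(mutual_information(p_xy), 3))   # 0.278
</syntaxhighlight>
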
The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution p(X), and an arbitrary probability distribution q(X). If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the number of average additional bits per datum necessary for compression.  It is thus defined
 
'''<font color="#ff8000">Kullback-Leibler 散度</font>'''(或信息散度、相对熵、信息增益)是比较两种分布的方法: “真实的”概率分布''p(X)''和任意概率分布''q(X)''。如果我们假设''q(X)''是某些数据背后的分布并据此压缩数据,而实际上''p(X)''才是正确的分布,那么 Kullback-Leibler 散度就是压缩时每个数据平均需要额外付出的比特数。因此它定义为:
    
:<math>D_{\mathrm{KL}}(p(X) \| q(X)) = \sum_{x \in X} -p(x) \log {q(x)} \, - \, \sum_{x \in X} -p(x) \log {p(x)} = \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)}.</math>
 
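
下面用一段 Python 示意(非原文内容,分布数值只是演示用的假设)按上式计算 Kullback-Leibler 散度,可以把结果理解为按 ''q(X)'' 设计编码、而数据实际服从 ''p(X)'' 时,平均每个符号多付出的比特数:

<syntaxhighlight lang="python">
import math

def kl_divergence(p, q):
    """D_KL(p || q) = Σ p(x) * log2( p(x) / q(x) ),单位:比特。
    约定 p(x) = 0 的项贡献为 0;若 p(x) > 0 而 q(x) = 0,散度为无穷大。"""
    total = 0.0
    for px, qx in zip(p, q):
        if px > 0:
            if qx == 0:
                return math.inf
            total += px * math.log2(px / qx)
    return total

p = [0.7, 0.2, 0.1]          # “真实”分布
q = [1 / 3, 1 / 3, 1 / 3]    # 压缩时错误假设的分布
print(round(kl_divergence(p, q), 3))   # 约 0.428:平均每个符号多花约 0.43 比特
</syntaxhighlight>
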
Other important information theoretic quantities include Rényi entropy (a generalization of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information.
 
信息论中其他重要的量包括'''<font color="#ff8000">瑞丽熵 Rényi Entropy</font>'''(一种熵的推广),微分熵(信息量推广到连续分布),以及条件互信息。
 
    
==编码理论==
 
      
{{Main|Coding theory}}
 
Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
 
'''<font color="#ff8000">编码理论 Coding Theory</font>'''是信息论最重要、最直接的应用之一,可以细分为'''<font color="#ff8000">信源编码理论 Source Coding Theory</font>'''和'''<font color="#ff8000">信道编码理论 Channel Coding Theory</font>'''。信息论利用对数据的统计描述,来量化描述这些数据所需的比特数,也就是信源的信息熵。
This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal. Network information theory refers to these multi-agent communication models.
 
信息传输定理,或者说“信源-信道分离定理”,证明了将编码理论划分为压缩和传输两部分是合理的,也证明了在许多情况下使用比特作为信息的''通用货币''是合理的。但这只在一个发送用户与一个接收用户通信的情况下才成立。在具有多个发送器(多址接入信道)、多个接收器(广播信道)或中转器(中继信道)的场景中,或者在更一般的网络中,先压缩再传输可能不再是最佳选择。[[网络信息论]]指的就是这些多主体通信模型。
Information rate is the average entropy per symbol.  For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is
 
'''<font color="#ff8000">信息速率 Information Rate</font>'''(熵率)是每个符号的平均熵。对于无记忆信源,信息速率仅表示每个符号的熵,而在平稳随机过程中,它是:
    
:<math>r = \lim_{n \to \infty} H(X_n|X_{n-1},X_{n-2},X_{n-3}, \ldots);</math>
 
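
作为补充,下面给出一段 Python 示意(非原文内容,转移概率只是演示用的假设):对平稳的马尔可夫信源,上式中的熵率等于按平稳分布加权的各状态条件熵。

<syntaxhighlight lang="python">
import math

# 演示用的两状态平稳马尔可夫信源,P[i][j] 是从状态 i 转移到状态 j 的概率
P = [[0.9, 0.1],
     [0.4, 0.6]]

# 该链的平稳分布 μ(满足 μP = μ),此处可解得 μ = (0.8, 0.2)
mu = [0.8, 0.2]

# 熵率 r = Σ_i μ_i * H(下一符号 | 当前状态 i) = -Σ_i μ_i Σ_j P[i][j] log2 P[i][j]
rate = -sum(mu[i] * p_ij * math.log2(p_ij)
            for i, row in enumerate(P)
            for p_ij in row if p_ij > 0)

print(round(rate, 3))   # 约 0.569 比特/符号,低于按边缘分布计算的单符号熵(约 0.722)
</syntaxhighlight>
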
Communications over a channel—such as an ethernet cable—is the primary motivation of information theory.  However, such channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality.
 
通过信道(例如以太网电缆)进行通信是信息论的主要动机。然而,这样的信道往往不能精确地重建信号;噪声、静默时段以及其他形式的信号损坏常常会降低信号质量。
::[[File:Binary erasure channel.svg]]
 
==在其他领域的应用==
===情报使用和安全应用===
    
Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the [[Ban (unit)|ban]], was used in the [[Ultra]] project, breaking the German [[Enigma machine]] code and hastening the [[Victory in Europe Day|end of World War II in Europe]].  Shannon himself defined an important concept now called the [[unicity distance]]. Based on the redundancy of the [[plaintext]], it attempts to give a minimum amount of [[ciphertext]] necessary to ensure unique decipherability.
 
信息论的概念可以应用于密码学和密码分析。[[Ultra]]项目就使用了图灵的信息单位[[Ban (unit)|ban]],破解了德国的恩尼格玛密码,加速了第二次世界大战欧洲战事的结束。香农自己定义了一个重要的概念,现在称为单一性距离([[unicity distance]]):它基于明文的冗余性,试图给出确保唯一可解密性所需的最少密文量。
Information theory leads us to believe it is much more difficult to keep secrets than it might first appear.  A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers.  The security of all such methods currently comes from the assumption that no known attack can break them in a practical amount of time.
 
信息论使我们认识到,保密比乍看起来要困难得多。穷举攻击可以破解基于非对称密钥算法,或基于最常用的对称密钥算法(有时称为秘密密钥算法,如分组密码)的系统。目前,所有这些方法的安全性都来自以下假设:在可行的时间内,没有已知的攻击能够破解它们。
 
Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks.  In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications.  In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.
 
信息论安全性指的是诸如一次性密码本之类不易受到这种暴力攻击的方法。在这种情况下,明文和密文之间(以密钥为条件)的正的条件互信息可以确保正确传输,而明文和密文之间的无条件互信息保持为零,从而实现绝对安全的通信。换句话说,窃听者即使获得了密文(但没有密钥),也无法改进其对明文的猜测。但是,就像在其他任何密码系统中一样,即便是信息论意义上安全的方法,也必须小心地正确使用;Venona 项目之所以能够破解苏联的一次性密码本,就是因为苏联不当地重复使用了密钥材料。
 
 
===伪随机数的生成===
      
[[Pseudorandom number generator]]s are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed [[cryptographically secure pseudorandom number generator]]s, but even they require [[random seed]]s external to the software to work as intended. These can be obtained via [[Extractor (mathematics)|extractors]], if done carefully. The measure of  sufficient randomness in extractors is [[min-entropy]], a value related to Shannon entropy through [[Rényi entropy]]; Rényi entropy is also used in evaluating randomness in cryptographic systems.  Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptography uses.
 
伪随机数生成器在计算机语言库和应用程序中被广泛使用。但它们几乎都不适合用于密码学,因为它们无法摆脱现代计算机设备和软件的确定性。一类改进的随机数生成器称为密码学安全的伪随机数生成器,但即便如此,它们也需要软件之外的随机种子才能按预期工作;如果处理得当,这些种子可以通过提取器获得。用来度量提取器中充分随机性的量是最小熵,该值通过[[瑞丽熵]]与香农熵相关联;瑞丽熵也被用于评估密码系统中的随机性。尽管这些度量彼此相关,它们之间的差别意味着:一个香农熵很高的随机变量并不一定适合在提取器中使用,因而也不一定适合密码学用途。
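
下面用一段 Python 示意(非原文内容,分布只是演示用的假设)说明上文提到的香农熵与最小熵的差别:一个分布的香农熵可以相当高,但只要存在某个概率很大的取值,它的最小熵仍然很低,因而不一定适合直接作为密码学随机源。

<syntaxhighlight lang="python">
import math

def shannon_entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

def min_entropy(p):
    """最小熵 H_min = -log2(max p(x)),只由最可能出现的取值决定。"""
    return -math.log2(max(p))

# 演示分布:一个取值占一半概率,其余 1023 个取值平分另一半
p = [0.5] + [0.5 / 1023] * 1023
print(round(shannon_entropy(p), 2))   # 约 6.0 比特
print(round(min_entropy(p), 2))       # 1.0 比特
</syntaxhighlight>
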
 
 
===地震勘探===
    
One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and [[digital signal processing]] offer a major improvement of resolution and image clarity over previous analog methods.<ref>{{cite journal|doi=10.1002/smj.4250020202 | volume=2 | issue=2 | title=The corporation and innovation | year=1981 | journal=Strategic Management Journal | pages=97–118 | last1 = Haggerty | first1 = Patrick E.}}</ref>
 
===符号学===

[[Semiotics|Semioticians]] [[:nl:Doede Nauta|Doede Nauta]] and [[Winfried Nöth]] both considered [[Charles Sanders Peirce]] as having created a theory of information in his works on semiotics.<ref name="Nauta 1972">{{cite book |ref=harv |last1=Nauta |first1=Doede |title=The Meaning of Information |date=1972 |publisher=Mouton |location=The Hague |isbn=9789027919960}}</ref>{{rp|171}}<ref name="Nöth 2012">{{cite journal |ref=harv |last1=Nöth |first1=Winfried |title=Charles S. Peirce's theory of information: a theory of the growth of symbols and of knowledge |journal=Cybernetics and Human Knowing |date=January 2012 |volume=19 |issue=1–2 |pages=137–161 |url=https://edisciplinas.usp.br/mod/resource/view.php?id=2311849}}</ref>{{rp|137}} Nauta defined semiotic information theory as the study of "the internal processes of coding, filtering, and information processing."<ref name="Nauta 1972"/>{{rp|91}}
 
符号学家[[:nl:Doede Nauta|Doede Nauta]]和[[Winfried Nöth]]都认为[[Charles Sanders Peirce]]在其符号学著作中创立了一种信息理论。<ref name="Nauta 1972">{{cite book |ref=harv |last1=Nauta |first1=Doede |title=The Meaning of Information |date=1972 |publisher=Mouton |location=The Hague |isbn=9789027919960}}</ref>{{rp|171}}<ref name="Nöth 2012">{{cite journal |ref=harv |last1=Nöth |first1=Winfried |title=Charles S. Peirce's theory of information: a theory of the growth of symbols and of knowledge |journal=Cybernetics and Human Knowing |date=January 2012 |volume=19 |issue=1–2 |pages=137–161 |url=https://edisciplinas.usp.br/mod/resource/view.php?id=2311849}}</ref>{{rp|137}} Nauta将符号学信息论定义为对“编码、过滤和信息处理的内部过程”的研究。<ref name="Nauta 1972"/>{{rp|91}}
 
Concepts from information theory such as redundancy and code control have been used by semioticians such as Umberto Eco and Ferruccio Rossi-Landi to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy such that only one message is decoded among a selection of competing ones.
 
信息论的概念(例如冗余和代码控制)已被Umberto Eco和Ferruccio Rossi-Landi等符号学家用来解释意识形态:把意识形态视为消息传输的一种形式,占统治地位的社会阶层通过使用具有高度冗余性的符号来发出其信息,使得在一组相互竞争的消息中只有一种会被解码出来。


===其他应用===

Information theory also has applications in [[Gambling and information theory]], [[black hole information paradox|black holes]], and [[bioinformatics]].
 
信息论在赌博、黑洞信息悖论和生物信息学中也有应用。
 
 
        第702行: 第666行:  
* [[Intelligence (information gathering)]]
 
 
* [[reflection seismology|Seismic exploration]]
 
* [[主动网络]]
 
* [[Quantum information science]]
 
 
* [[Source coding]]
 
* [[编码理论]]
 
* [[Variety (cybernetics)|Variety]]
 
 
* [[Hamming distance]]
 
* [[Ban (单位)]] —— 以10为底的对数信息量单位
 
* [[瑞丽熵]]
 
 
* [[子信息]]
 
* [[单一性距离]]
 
* [[种类 (控制论)|种类]]
 
 
* [[汉明距离]]
 