The field is at the intersection of mathematics, [[statistics]], computer science, physics, [[Neuroscience|neurobiology]], [[information engineering (field)|information engineering]], and electrical engineering. The theory has also found applications in other areas, including [[statistical inference]], [[natural language processing]], [[cryptography]], [[neurobiology]],<ref name="Spikes">{{cite book|title=Spikes: Exploring the Neural Code|author1=F. Rieke|author2=D. Warland|author3=R Ruyter van Steveninck|author4=W Bialek|publisher=The MIT press|year=1997|isbn=978-0262681087}}</ref> [[human vision]],<ref>{{Cite journal|last1=Delgado-Bonal|first1=Alfonso|last2=Martín-Torres|first2=Javier|date=2016-11-03|title=Human vision is determined based on information theory|journal=Scientific Reports|language=En|volume=6|issue=1|pages=36038|bibcode=2016NatSR...636038D|doi=10.1038/srep36038|issn=2045-2322|pmc=5093619|pmid=27808236}}</ref> the evolution<ref>{{cite journal|last1=Huelsenbeck|first1=J. P.|last2=Ronquist|first2=F.|last3=Nielsen|first3=R.|last4=Bollback|first4=J. P.|year=2001|title=Bayesian inference of phylogeny and its impact on evolutionary biology|journal=Science|volume=294|issue=5550|pages=2310–2314|bibcode=2001Sci...294.2310H|doi=10.1126/science.1065889|pmid=11743192|s2cid=2138288}}</ref> and function<ref>{{cite journal|last1=Allikmets|first1=Rando|last2=Wasserman|first2=Wyeth W.|last3=Hutchinson|first3=Amy|last4=Smallwood|first4=Philip|last5=Nathans|first5=Jeremy|last6=Rogan|first6=Peter K.|last7=Schneider|first7=Thomas D.|last8=Dean|first8=Michael|year=1998|title=Organization of the ABCR gene: analysis of promoter and splice junction sequences|url=http://alum.mit.edu/www/toms/|journal=Gene|volume=215|issue=1|pages=111–122|doi=10.1016/s0378-1119(98)00269-8|pmid=9666097}}</ref> of molecular codes ([[bioinformatics]]), [[model selection]] in statistics,<ref>Burnham, K. P. and Anderson D. R. (2002) ''Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach, Second Edition'' (Springer Science, New York) {{ISBN|978-0-387-95364-9}}.</ref> [[thermal physics]],<ref>{{cite journal|last1=Jaynes|first1=E. T.|year=1957|title=Information Theory and Statistical Mechanics|url=http://bayes.wustl.edu/|journal=Phys. Rev.|volume=106|issue=4|page=620|bibcode=1957PhRv..106..620J|doi=10.1103/physrev.106.620}}</ref> [[quantum computing]], linguistics, [[plagiarism detection]],<ref>{{cite journal|last1=Bennett|first1=Charles H.|last2=Li|first2=Ming|last3=Ma|first3=Bin|year=2003|title=Chain Letters and Evolutionary Histories|url=http://sciamdigital.com/index.cfm?fa=Products.ViewIssuePreview&ARTICLEID_CHAR=08B64096-0772-4904-9D48227D5C9FAC75|journal=Scientific American|volume=288|issue=6|pages=76–81|bibcode=2003SciAm.288f..76B|doi=10.1038/scientificamerican0603-76|pmid=12764940|access-date=2008-03-11|archive-url=https://web.archive.org/web/20071007041539/http://www.sciamdigital.com/index.cfm?fa=Products.ViewIssuePreview&ARTICLEID_CHAR=08B64096-0772-4904-9D48227D5C9FAC75|archive-date=2007-10-07|url-status=dead}}</ref> [[pattern recognition]], and [[anomaly detection]].<ref>{{Cite web|url=http://aicanderson2.home.comcast.net/~aicanderson2/home.pdf|title=Some background on why people in the empirical sciences may want to better understand the information-theoretic methods|author=David R. Anderson|date=November 1, 2003|archiveurl=https://web.archive.org/web/20110723045720/http://aicanderson2.home.comcast.net/~aicanderson2/home.pdf|archivedate=July 23, 2011|url-status=dead|accessdate=2010-06-23}}</ref> Important sub-fields of information theory include [[source coding]], [[algorithmic complexity theory]], [[algorithmic information theory]], [[information-theoretic security]], [[Grey system theory]], and measures of information.
If one transmits 1000 bits (0s and 1s), and the value of each of these bits is known to the receiver (has a specific value with certainty) ahead of transmission, it is clear that no information is transmitted. If, however, each bit is independently equally likely to be 0 or 1, 1000 shannons of information (more often called bits) have been transmitted. Between these two extremes, information can be quantified as follows. If 𝕏 is the set of all messages {{math|{{mset|''x''<sub>1</sub>, ..., ''x''<sub>''n''</sub>}}}} that {{math|''X''}} could be, and {{math|''p''(''x'')}} is the probability of some <math>x \in \mathbb X</math>, then the entropy, {{math|''H''}}, of {{math|''X''}} is defined:<ref name = Reza>{{cite book | title = An Introduction to Information Theory | author = Fazlollah M. Reza | publisher = Dover Publications, Inc., New York | origyear = 1961| year = 1994 | isbn = 0-486-68210-2 | url = https://books.google.com/books?id=RtzpRAiX6OgC&pg=PA8&dq=intitle:%22An+Introduction+to+Information+Theory%22++%22entropy+of+a+simple+source%22}}</ref>
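In standard notation, with <math>I(x)</math> the [[self-information]] of a message, the definition just introduced reads:

:<math>H(X) = \mathbb{E}_X[I(x)] = -\sum_{x \in \mathbb{X}} p(x) \log p(x)</math>

This recovers the two extremes above: a bit known in advance has probability 1 and contributes nothing, while a fair bit with <math>p(0) = p(1) = \tfrac{1}{2}</math> contributes one shannon, so 1000 independent fair bits carry 1000 shannons.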
===Conditional entropy (equivocation)===
The {{em|[[conditional entropy]]}} or ''conditional uncertainty'' of {{math|''X''}} given random variable {{math|''Y''}} (also called the ''equivocation'' of {{math|''X''}} about {{math|''Y''}}) is the average conditional entropy over {{math|''Y''}}:<ref name=Ash>{{cite book | title = Information Theory | author = Robert B. Ash | publisher = Dover Publications, Inc. | origyear = 1965| year = 1990 | isbn = 0-486-66521-6 | url = https://books.google.com/books?id=ngZhvUfF0UIC&pg=PA16&dq=intitle:information+intitle:theory+inauthor:ash+conditional+uncertainty}}</ref>
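In the usual notation this average reads:

:<math>H(X|Y) = \mathbb{E}_Y[H(X|y)] = -\sum_{y \in \mathbb{Y}} p(y) \sum_{x \in \mathbb{X}} p(x|y) \log p(x|y) = -\sum_{x,y} p(x,y) \log p(x|y)</math>

A basic property relating it to the joint entropy is <math>H(X|Y) = H(X,Y) - H(Y)</math>.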
* Data compression (source coding): There are two formulations for the compression problem (see the sketch after this list):
*[[lossless data compression]]: the data must be reconstructed exactly;
*[[lossy data compression]]: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of information theory is called ''[[rate–distortion theory]]''.
* Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.
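The division of labor in this list can be made concrete with a small experiment. The following sketch is illustrative rather than part of the article; the four-symbol source, its probabilities, and the choice of zlib's DEFLATE as the lossless coder are assumptions made for the example. It compares the entropy bound of the source coding theorem with what a real lossless coder achieves on an i.i.d. source:

<syntaxhighlight lang="python">
import math
import random
import zlib

# Draw 100,000 symbols i.i.d. from a skewed four-symbol source.
random.seed(0)
symbols, probs = [0, 1, 2, 3], [0.70, 0.15, 0.10, 0.05]
n = 100_000
data = bytes(random.choices(symbols, weights=probs, k=n))

# Source coding theorem: a lossless code needs >= H(X) bits/symbol on average.
entropy = -sum(p * math.log2(p) for p in probs)   # bits per symbol
bound_bytes = entropy * n / 8                     # lower bound for n symbols
actual_bytes = len(zlib.compress(data, 9))        # DEFLATE at maximum effort

print(f"H(X) = {entropy:.3f} bits/symbol")
print(f"entropy bound: {bound_bytes:,.0f} bytes; zlib output: {actual_bytes:,} bytes")
</syntaxhighlight>

The coder lands a little above the entropy bound, never below it; lossy compression, by contrast, may go below the entropy of the source because it only promises fidelity up to the chosen distortion level.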
that is, the limit of the joint entropy per symbol.  For stationary sources, these two expressions give the same result.<ref>{{cite book | title = Digital Compression for Multimedia: Principles and Standards | author = Jerry D. Gibson | publisher = Morgan Kaufmann | year = 1998 | url = https://books.google.com/books?id=aqQ2Ry6spu0C&pg=PA56&dq=entropy-rate+conditional#PPA57,M1 | isbn = 1-55860-369-7 }}</ref>
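For reference, the two expressions being compared are the conditional limit <math>\lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)</math> and the per-symbol limit of the joint entropy,

:<math>H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n).</math>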
====Capacity of particular channel models====
* A continuous-time analog communications channel subject to [[Gaussian noise]] — see [[Shannon–Hartley theorem]].
* A [[binary symmetric channel]] (BSC) with crossover probability ''p'' is a binary input, binary output channel that flips the input bit with probability ''p''. The BSC has a capacity of {{math|1 &minus; ''H''<sub>b</sub>(''p'')}} bits per channel use, where {{math|''H''<sub>b</sub>}} is the binary entropy function to the base-2 logarithm; this formula and the other capacities here are evaluated numerically in the sketch after this list.
* A [[binary erasure channel]] (BEC) with erasure probability ''p'' is a binary input, ternary output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is {{nowrap|1 &minus; ''p''}} bits per channel use.
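A short numerical sketch, illustrative and not part of the article (the function names are invented for this example), evaluates the three capacities given above:

<syntaxhighlight lang="python">
import math

def binary_entropy(p: float) -> float:
    """H_b(p): the binary entropy function with base-2 logarithm."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Binary symmetric channel with crossover probability p: 1 - H_b(p)."""
    return 1.0 - binary_entropy(p)

def bec_capacity(p: float) -> float:
    """Binary erasure channel with erasure probability p: 1 - p."""
    return 1.0 - p

def awgn_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon–Hartley theorem: C = B log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr)

for p in (0.0, 0.11, 0.5):
    print(f"p = {p:4}: BSC {bsc_capacity(p):.3f}  BEC {bec_capacity(p):.3f}  bits/use")
print(f"3 kHz channel at 30 dB SNR: {awgn_capacity(3000, 1000):,.0f} bit/s")
</syntaxhighlight>

The BSC capacity vanishes at ''p'' = 0.5, where the output is statistically independent of the input, while the BEC capacity falls off only linearly, since an erasure announces exactly where information was lost.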
===Semiotics===
[[Semiotics|Semioticians]] [[:nl:Doede Nauta|Doede Nauta]] and [[Winfried Nöth]] both considered [[Charles Sanders Peirce]] as having created a theory of information in his works on semiotics.<ref name="Nauta 1972">{{cite book |ref=harv |last1=Nauta |first1=Doede |title=The Meaning of Information |date=1972 |publisher=Mouton |location=The Hague |isbn=9789027919960}}</ref>{{rp|171}}<ref name="Nöth 2012">{{cite journal |ref=harv |last1=Nöth |first1=Winfried |title=Charles S. Peirce's theory of information: a theory of the growth of symbols and of knowledge |journal=Cybernetics and Human Knowing |date=January 2012 |volume=19 |issue=1–2 |pages=137–161 |url=https://edisciplinas.usp.br/mod/resource/view.php?id=2311849}}</ref>{{rp|137}} Nauta defined semiotic information theory as the study of "the internal processes of coding, filtering, and information processing."<ref name="Nauta 1972"/>{{rp|91}}
Concepts from information theory such as redundancy and code control have been used by semioticians such as [[Umberto Eco]] and [[:it:Ferruccio Rossi-Landi|Ferruccio Rossi-Landi]] to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy such that only one message is decoded among a selection of competing ones.<ref>Nöth, Winfried (1981). "[https://kobra.uni-kassel.de/bitstream/handle/123456789/2014122246977/semi_2004_002.pdf?sequence=1&isAllowed=y Semiotics of ideology]". ''Semiotica'', Issue 148.</ref>
{{Portal|Mathematics}}
* [[Algorithmic probability]]
* [[Bayesian inference]]
* [[Communication theory]]
* [[Constructor theory]] - a generalization of information theory that includes quantum information
* [[Inductive probability]]
* [[Info-metrics]]
* [[Minimum message length]]
* [[Minimum description length]]
* [[List of important publications in theoretical computer science#Information theory|List of important publications]]
* [[Philosophy of information]]
      
{{div col|colwidth=20em}}
* [[Active networking]]
* [[Cryptanalysis]]
* [[Cryptography]]
* [[Cybernetics]]
* [[Entropy in thermodynamics and information theory]]
* [[Gambling]]
* [[Intelligence (information gathering)]]
* [[reflection seismology|Seismic exploration]]
* [[Ralph Hartley|Hartley, R.V.L.]]
* [[History of information theory]]
* [[Claude Elwood Shannon|Shannon, C.E.]]
* [[Timeline of information theory]]
* [[Hubert Yockey|Yockey, H.P.]]
      
{{div col end}}

{{div col|colwidth=20em}}
* [[Coding theory]]
* [[Detection theory]]
* [[Estimation theory]]
* [[Fisher information]]
* [[Information algebra]]
* [[Information asymmetry]]
* [[Information field theory]]
* [[Information geometry]]
* [[Information theory and measure theory]]
* [[Kolmogorov complexity]]
* [[List of unsolved problems in information theory]]
* [[Logic of information]]
* [[Network coding]]
* [[Philosophy of information]]
* [[Quantum information science]]
* [[Source coding]]
{{div col end}}

{{div col|colwidth=20em}}
* [[Ban (unit)]]
* [[Channel capacity]]
* [[Communication channel]]
* [[Communication source]]
* [[Conditional entropy]]
* [[Covert channel]]
* [[Data compression]]
* Decoder
* [[Differential entropy]]
* [[Fungible information]]
* [[Information fluctuation complexity]]
* [[Information entropy]]
* [[Joint entropy]]
* [[Kullback–Leibler divergence]]
* [[Mutual information]]
* [[Pointwise mutual information]] (PMI)
* [[Receiver (information theory)]]
* [[Redundancy (information theory)|Redundancy]]
* [[Rényi entropy]]
* [[Self-information]]
* [[Unicity distance]]
* [[Variety (cybernetics)|Variety]]
* [[Hamming distance]]
       