As with its discrete analog, the units of differential entropy depend on the base of the [[logarithm]], which is usually 2 (i.e., the units are [[bit]]s). See [[logarithmic units]] for logarithms taken in different bases. Related concepts such as [[joint entropy|joint]], [[conditional entropy|conditional]] differential entropy, and [[Kullback–Leibler divergence|relative entropy]] are defined in a similar fashion. Unlike the discrete analog, the differential entropy has an offset that depends on the units used to measure <math>X</math>.<ref name="gibbs">{{cite book |last=Gibbs |first=Josiah Willard |authorlink=Josiah Willard Gibbs |title=[[Elementary Principles in Statistical Mechanics|Elementary Principles in Statistical Mechanics, developed with especial reference to the rational foundation of thermodynamics]] |year=1902 |publisher=Charles Scribner's Sons |location=New York}}</ref>{{rp|183–184}} For example, the differential entropy of a quantity measured in millimeters will be {{not a typo|log(1000)}} more than the same quantity measured in meters; a dimensionless quantity will have differential entropy of {{not a typo|log(1000)}} more than the same quantity divided by 1000.
 
--[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]])  [Review] Supplementary translation: As with its discrete analog, the units of differential entropy depend on the base of the logarithm, which is usually 2 (i.e., the units are bits; see logarithmic units for logarithms taken in different bases). Related concepts such as joint and conditional differential entropy and relative entropy are defined in a similar fashion. Unlike the discrete analog, differential entropy has an offset that depends on the units of measurement. For example, the differential entropy of a quantity measured in millimeters is log(1000) more than that of the same quantity measured in meters; a dimensionless quantity has differential entropy log(1000) more than the same quantity divided by 1000.
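The size of this offset follows from the change-of-variables formula for densities. If <math>Y = 1000X</math> denotes the same quantity expressed in millimeters rather than meters, then <math>f_Y(y) = f_X(y/1000)/1000</math>, and

:<math>h(Y) = -\int f_Y(y)\log f_Y(y)\,dy = -\int f_X(x)\log\frac{f_X(x)}{1000}\,dx = h(X) + \log(1000),</math>

so re-expressing the quantity in millimeters adds exactly {{not a typo|log(1000)}} to its differential entropy.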
    
One must take care in trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, the [[Uniform distribution (continuous)|uniform distribution]] <math>\mathcal{U}(0,1/2)</math> has ''negative'' differential entropy
 
--[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]])  [Review] Supplementary translation: One must take care when trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, the uniform distribution <math>\mathcal{U}(0,1/2)</math> has ''negative'' differential entropy
    
:<math>\int_0^\frac{1}{2} -2\log(2)\,dx=-\log(2)\,</math>.
 
Thus, differential entropy does not share all properties of discrete entropy.
 
--[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]])  [Review] Supplementary translation: Thus, differential entropy does not share all the properties of discrete entropy.
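The negative value above is easy to confirm numerically. Below is a minimal sketch (plain NumPy; the grid size is an arbitrary illustrative choice) that integrates <math>-f(x)\log f(x)</math> for the density <math>f(x)=2</math> of <math>\mathcal{U}(0,1/2)</math>:

<syntaxhighlight lang="python">
import numpy as np

# Density of the uniform distribution U(0, 1/2): f(x) = 2 on [0, 1/2].
n = 1_000_000
dx = 0.5 / n
x = (np.arange(n) + 0.5) * dx          # midpoints of n subintervals of [0, 1/2]
f = np.full_like(x, 2.0)

# Differential entropy h(X) = -∫ f(x) log f(x) dx, via a midpoint Riemann sum.
h = -np.sum(f * np.log(f)) * dx

print(h)            # ≈ -0.6931..., i.e. -log(2): a negative differential entropy
print(-np.log(2))   # exact value for comparison
</syntaxhighlight>

Because <math>f > 1</math> on its support, the integrand <math>-f\log f</math> is negative there, which is exactly what allows the continuous "entropy" to be negative; the discrete entropy of a probability mass function can never be.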
    
Note that the continuous [[mutual information]] <math>I(X;Y)</math> has the distinction of retaining its fundamental significance as a measure of discrete information since it is actually the limit of the discrete mutual information of ''partitions'' of <math>X</math> and <math>Y</math> as these partitions become finer and finer.  Thus it is invariant under non-linear [[homeomorphisms]] (continuous and uniquely invertible maps),<ref>{{cite journal | first = Alexander | last = Kraskov |author2=Stögbauer, Grassberger | year = 2004 | title = Estimating mutual information | journal = [[Physical Review E]] | volume = 69 | pages = 066138 | doi =10.1103/PhysRevE.69.066138|arxiv = cond-mat/0305641 |bibcode = 2004PhRvE..69f6138K }}</ref> including linear <ref name = Reza>{{ cite book | title = An Introduction to Information Theory | author = Fazlollah M. Reza | publisher = Dover Publications, Inc., New York | origyear = 1961| year = 1994 | isbn = 0-486-68210-2 | url = https://books.google.com/books?id=RtzpRAiX6OgC&pg=PA8&dq=intitle:%22An+Introduction+to+Information+Theory%22++%22entropy+of+a+simple+source%22&as_brr=0&ei=zP79Ro7UBovqoQK4g_nCCw&sig=j3lPgyYrC3-bvn1Td42TZgTzj0Q }}</ref> transformations of <math>X</math> and <math>Y</math>, and still represents the amount of discrete information that can be transmitted over a channel that admits a continuous space of values.
 
--[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]])  [Review] Supplementary translation: Note that the continuous mutual information <math>I(X;Y)</math> is distinctive in retaining its fundamental significance as a measure of discrete information, since it is in fact the limit of the discrete mutual information of ''partitions'' of <math>X</math> and <math>Y</math> as these partitions become finer and finer. It is therefore invariant under non-linear [[homeomorphisms]] (continuous and uniquely invertible maps), including linear transformations of <math>X</math> and <math>Y</math>, and it still represents the amount of discrete information that can be transmitted over a channel that admits a continuous space of values.
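This invariance can be illustrated numerically. The sketch below (plain NumPy; the estimator, bin count, sample size, and the particular monotone maps are illustrative choices, not taken from the cited references) discretizes each variable with equal-mass (quantile) bins, so the plug-in estimate of <math>I(X;Y)</math> is, by construction, unchanged by any strictly increasing transformation of either variable, mirroring the invariance of the true mutual information:

<syntaxhighlight lang="python">
import numpy as np

def mi_quantile_bins(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in nats after equal-mass (quantile) binning.

    The bins are defined by ranks, so any strictly increasing transformation
    of x or y leaves the bin assignments -- and hence the estimate -- unchanged.
    """
    edges_x = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])  # interior edges
    edges_y = np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1])
    qx = np.searchsorted(edges_x, x)   # bin index of each sample, 0 .. bins-1
    qy = np.searchsorted(edges_y, y)

    joint = np.zeros((bins, bins))
    np.add.at(joint, (qx, qy), 1)      # joint histogram of the bin indices
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)  # marginal of the X bins
    py = p.sum(axis=0, keepdims=True)  # marginal of the Y bins

    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px * py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=20_000)
y = x + rng.normal(scale=0.5, size=20_000)      # a correlated pair

print(mi_quantile_bins(x, y))                   # baseline estimate
print(mi_quantile_bins(np.exp(x), y**3 + y))    # monotone nonlinear maps of X and Y
# The two printed values coincide (up to negligible floating-point effects),
# illustrating invariance under invertible, order-preserving transformations.
</syntaxhighlight>

A fixed equal-width binning would not have this exact finite-sample property; the rank-based bins are what make the estimate itself, and not just the underlying quantity, invariant under such transformations.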
    
For the direct analogue of discrete entropy extended to the continuous space, see  [[limiting density of discrete points]].
 
--[[用户:CecileLi|CecileLi]]([[用户讨论:CecileLi|讨论]])  [Review] Supplementary translation: For the direct analogue of discrete entropy extended to the continuous space, see [[limiting density of discrete points]].
    
==Properties of differential entropy==
 