This entry is currently being translated by Henry.

Initial review by CecileLi.
    
{{Short description|Concept in information theory}}
 
<font color="#ff8000">Differential entropy</font> (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Shannon to extend the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions. Unfortunately, Shannon did not derive this formula; rather, he simply assumed it was the correct continuous analogue of discrete entropy, which it is not.
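For reference, the formula referred to here is the standard definition of differential entropy: for a random variable <math>X</math> with probability density function <math>f</math> whose support is a set <math>\mathcal{X}</math>,

:<math>h(X) = -\int_{\mathcal{X}} f(x)\,\log f(x)\,dx,</math>

in direct analogy with the discrete entropy <math>H(X) = -\sum_{i} p_i \log p_i</math>. Unlike its discrete counterpart, this quantity can be negative and is not invariant under a change of variables, which is why it is not the correct continuous analogue.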
     