In [[information theory]], '''joint [[entropy (information theory)|entropy]]''' is a measure of the uncertainty associated with a set of [[random variables|variables]].<ref name=korn>{{cite book |author1=Theresa M. Korn |author2=Granino Arthur Korn |title=Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review |publisher=Dover Publications |location=New York |isbn=0-486-41147-8}}</ref>
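
For concreteness, the standard Shannon form of this definition for two discrete random variables <math>X</math> and <math>Y</math> is sketched below (the formula itself does not appear in this excerpt; the alphabets <math>\mathcal{X}</math> and <math>\mathcal{Y}</math> are the usual notational assumption):

<math display="block">
H(X,Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x,y) \log_2 P(x,y)
</math>

with the usual convention that a term with <math>P(x,y)=0</math> contributes zero to the sum.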
 
In [[information theory]], '''joint [[entropy (information theory)|entropy]]''' is a measure of the uncertainty associated with a set of [[random variables|variables]].<ref name=korn>{{cite book |author1=Theresa M. Korn |author2=Korn, Granino Arthur |title=Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review |publisher=Dover Publications |location=New York |year= |isbn=0-486-41147-8 |oclc= |doi=}}</ref>
 +
 +
在信息论中,联合熵是对与一组变量相关的不确定性进行度量。