In [[information theory]], '''joint [[entropy (information theory)|entropy]]''' is a measure of the uncertainty associated with a set of [[random variables|variables]].<ref name=korn>{{cite book |author1=Theresa M. Korn |author2=Korn, Granino Arthur |title=Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review |publisher=Dover Publications |location=New York |year= |isbn=0-486-41147-8 |oclc= |doi=}}</ref>
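The definition above can be made concrete with a small sketch. Assuming a discrete joint distribution given as a dict mapping outcome pairs to probabilities, the joint entropy is <math>H(X,Y) = -\sum_{x,y} P(x,y)\log_2 P(x,y)</math>; the function name <code>joint_entropy</code> below is illustrative, not from any particular library:

```python
import math

def joint_entropy(joint_probs):
    """Joint entropy H(X, Y) = -sum over (x, y) of p(x, y) * log2 p(x, y).

    joint_probs: dict mapping outcome tuples (x, y) to probabilities.
    Zero-probability outcomes are skipped (0 * log 0 is taken as 0).
    """
    return -sum(p * math.log2(p) for p in joint_probs.values() if p > 0)

# Joint distribution of two independent fair coin flips:
# four equally likely outcomes, so H(X, Y) = log2(4) = 2 bits.
pxy = {("H", "H"): 0.25, ("H", "T"): 0.25,
       ("T", "H"): 0.25, ("T", "T"): 0.25}
print(joint_entropy(pxy))  # 2.0
```

For independent variables the joint entropy is the sum of the individual entropies (here 1 bit + 1 bit); dependence between the variables can only reduce it.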