[[Directed information]], <math>\operatorname{I}\left(X^n \to Y^n\right)</math>, measures the amount of information that flows from the process <math>X^n</math> to <math>Y^n</math>, where <math>X^n</math> denotes the vector <math>X_1, X_2, ..., X_n</math> and <math>Y^n</math> denotes <math>Y_1, Y_2, ..., Y_n</math>. The term ''directed information'' was coined by [[James Massey]] and is defined as
定向信息<math>\operatorname{I}\left(X^n \to Y^n\right)</math>测量从过程<math>X^n</math>流向<math>Y^n</math>的信息量,其中<math>X^n</math>表示矢量<math>X_1, X_2, ..., X_n</math>,<math>Y^n</math>表示<math>Y_1, Y_2, ..., Y_n</math>。定向信息这个术语是由 James Massey 创造的,它被定义为:
:<math>\operatorname{I}\left(X^n \to Y^n\right) = \sum_{i=1}^{n} \operatorname{I}\left(X^i; Y_i \mid Y^{i-1}\right)</math>
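The definition can be checked numerically with a small brute-force sketch (a hypothetical helper, not from any library): given the full joint pmf of the two sequences, it evaluates each causally conditioned term <math>\operatorname{I}\left(X^i; Y_i \mid Y^{i-1}\right)</math> directly from marginal distributions.

```python
import math

def directed_information(p, n):
    """Brute-force I(X^n -> Y^n) = sum_i I(X^i; Y_i | Y^{i-1}) in bits.

    p maps ((x_1,...,x_n), (y_1,...,y_n)) tuples to probabilities.
    """
    def marginal(key):
        # Sum the joint pmf over everything not captured by key(x, y).
        out = {}
        for (x, y), pr in p.items():
            k = key(x, y)
            out[k] = out.get(k, 0.0) + pr
        return out

    total = 0.0
    for i in range(1, n + 1):
        p_xi_yi   = marginal(lambda x, y: (x[:i], y[:i]))      # p(x^i, y^i)
        p_xi_yprev = marginal(lambda x, y: (x[:i], y[:i - 1])) # p(x^i, y^{i-1})
        p_yi      = marginal(lambda x, y: y[:i])               # p(y^i)
        p_yprev   = marginal(lambda x, y: y[:i - 1])           # p(y^{i-1})
        for (xi, yi), pr in p_xi_yi.items():
            if pr > 0.0:
                cond_num = pr / p_xi_yprev[(xi, yi[:-1])]  # p(y_i | x^i, y^{i-1})
                cond_den = p_yi[yi] / p_yprev[yi[:-1]]     # p(y_i | y^{i-1})
                total += pr * math.log2(cond_num / cond_den)
    return total

# Noiseless binary copy channel: X_i i.i.d. uniform, Y_i = X_i, so each
# of the four equally likely length-2 input sequences is echoed exactly.
copy2 = {((a, b), (a, b)): 0.25 for a in (0, 1) for b in (0, 1)}
print(directed_information(copy2, 2))  # -> 2.0 bits
```

With <math>n=1</math> the same routine reduces to the ordinary mutual information <math>\operatorname{I}(X_1; Y_1)</math>, and for independent sequences it returns zero, as expected.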
Note that if <math>n=1</math>, the directed information becomes the mutual information. Directed information has many applications in problems where [[causality]] plays an important role, such as the [[Channel capacity|capacity of a channel]] with feedback.<ref>{{cite conference|last1=Massey|first1=James|title=Causality, Feedback And Directed Information|date=1990|book-title=Proc. 1990 Intl. Symp. on Info. Th. and its Applications, Waikiki, Hawaii, Nov. 27-30, 1990|citeseerx=10.1.1.36.5688}}</ref><ref>{{cite journal|last1=Permuter|first1=Haim Henry|last2=Weissman|first2=Tsachy|last3=Goldsmith|first3=Andrea J.|title=Finite State Channels With Time-Invariant Deterministic Feedback|journal=IEEE Transactions on Information Theory|date=February 2009|volume=55|issue=2|pages=644–662|doi=10.1109/TIT.2008.2009849|arxiv=cs/0608070}}</ref>
注意,如果<math>n=1</math>,则定向信息成为互信息。有向信息在因果关系问题中有着广泛的应用,如反馈信道的容量问题。
=== 标准化变形 Normalized variants ===