Another problem is that functionals of a continuous-time process that depend on an uncountable number of points of the index set may fail to be measurable, so the probabilities of certain events may not be well defined. Separability ensures that infinite-dimensional distributions determine the properties of sample functions by requiring that sample functions are essentially determined by their values on a dense countable set of points in the index set. Furthermore, if a stochastic process is separable, then functionals of an uncountable number of points of the index set are measurable and their probabilities can be studied. Another approach is possible for a continuous-time stochastic process with an arbitrary metric space as its state space: to construct such a process, one assumes that its sample functions belong to some suitable function space, usually the Skorokhod space consisting of all right-continuous functions with left limits. This approach is now used more often than the separability assumption, but a stochastic process constructed in this way is automatically separable.
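
For a real-valued continuous-time process <math>X</math> with index set <math>T</math>, this requirement is usually made precise along the following lines (a sketch of the standard formulation rather than a full definition): there exist a countable dense set <math>U \subseteq T</math> and an event <math>\Omega_0</math> of probability zero such that, for every open set <math>G \subseteq T</math> and every closed set <math>F \subseteq \mathbb{R}</math>, the two events

:<math>\{ X_t \in F \text{ for all } t \in G \cap U \} \quad \text{and} \quad \{ X_t \in F \text{ for all } t \in G \}</math>

differ from each other at most on a subset of <math>\Omega_0</math>.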
 
==='''<font color="#ff8000">Markov processes and chains</font>'''===
{{Main|Markov process}}
 
Although less used, the separability assumption is considered more general because every stochastic process has a separable version. For example, separability is assumed when constructing and studying random fields, where the collection of random variables is now indexed by sets other than the real line such as <math>n</math>-dimensional Euclidean space.
 
Markov processes are stochastic processes, traditionally in [[Discrete time and continuous time|discrete or continuous time]], that have the Markov property, which means the next value of the Markov process depends on the current value, but it is conditionally independent of the previous values of the stochastic process. In other words, the behavior of the process in the future is stochastically independent of its behavior in the past, given the current state of the process.<ref name="Serfozo2009page2">{{cite book|author=Richard Serfozo|title=Basics of Applied Stochastic Processes|url=https://books.google.com/books?id=JBBRiuxTN0QC|year=2009|publisher=Springer Science & Business Media|isbn=978-3-540-89332-5|page=2}}</ref><ref name="Rozanov2012page58">{{cite book|author=Y.A. Rozanov|title=Markov Random Fields|url=https://books.google.com/books?id=wGUECAAAQBAJ|year=2012|publisher=Springer Science & Business Media|isbn=978-1-4613-8190-7|page=58}}</ref>
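
In the simplest discrete-time setting, for example, the Markov property can be written as the conditional-independence condition (a standard textbook formulation, stated here for illustration rather than taken from the cited pages)

:<math>\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n),</math>

so the conditional distribution of the next value depends on the past only through the current value.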
 
The Brownian motion process and the Poisson process (in one dimension) are both examples of Markov processes<ref name="Ross1996page235and358">{{cite book|author=Sheldon M. Ross|title=Stochastic processes|url=https://books.google.com/books?id=ImUPAQAAMAAJ|year=1996|publisher=Wiley|isbn=978-0-471-12062-9|pages=235, 358}}</ref> in continuous time, while [[random walk]]s on the integers and the [[gambler's ruin]] problem are examples of Markov processes in discrete time.<ref name="Florescu2014page373">{{cite book|author=Ionut Florescu|title=Probability and Stochastic Processes|url=https://books.google.com/books?id=Z5xEBQAAQBAJ&pg=PR22|year=2014|publisher=John Wiley & Sons|isbn=978-1-118-59320-2|pages=373, 374}}</ref><ref name="KarlinTaylor2012page49">{{cite book|author1=Samuel Karlin|author2=Howard E. Taylor|title=A First Course in Stochastic Processes|url=https://books.google.com/books?id=dSDxjX9nmmMC|year=2012|publisher=Academic Press|isbn=978-0-08-057041-9|page=49}}</ref>
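
To make the discrete-time examples concrete, the following Python sketch simulates the gambler's ruin problem as a random walk on the integers that is absorbed at <math>0</math> or at a target fortune; the function name and the particular start, target and number of runs are illustrative choices, not taken from the cited sources.

<syntaxhighlight lang="python">
import random

def gamblers_ruin(start, target, p, rng):
    """Simulate one gambler's-ruin path: a walk on {0, ..., target} that moves
    +1 with probability p and -1 otherwise, absorbed at 0 and at target."""
    fortune = start
    while 0 < fortune < target:
        # Markov property: the next fortune depends only on the current one.
        fortune += 1 if rng.random() < p else -1
    return fortune

# Estimate the ruin probability for a fair game (p = 1/2) and compare it with
# the classical closed-form answer (target - start) / target.
rng = random.Random(0)
runs = 10_000
ruined = sum(gamblers_ruin(start=3, target=10, p=0.5, rng=rng) == 0 for _ in range(runs))
print(f"simulated ruin probability: {ruined / runs:.3f} (exact: {(10 - 3) / 10:.3f})")
</syntaxhighlight>

For a fair game the simulated ruin frequency should be close to the classical value <math>(N - i)/N</math> for a starting fortune <math>i</math> and target <math>N</math>, which the last line uses as a sanity check.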
 
{{columns-list|colwidth=30em|
 
A Markov chain is a type of Markov process that has either discrete [[state space]] or discrete index set (often representing time), but the precise definition of a Markov chain varies.<ref name="Asmussen2003page7">{{cite book|url=https://books.google.com/books?id=BeYaTxesKy0C|title=Applied Probability and Queues|year=2003|publisher=Springer Science & Business Media|isbn=978-0-387-00211-8|page=7|author=Søren Asmussen}}</ref> For example, it is common to define a Markov chain as a Markov process in either [[Continuous and discrete variables|discrete or continuous time]] with a countable state space (thus regardless of the nature of time),<ref name="Parzen1999page188">{{cite book|url=https://books.google.com/books?id=0mB2CQAAQBAJ|title=Stochastic Processes|year=2015|publisher=Courier Dover Publications|isbn=978-0-486-79688-8|page=188|author=Emanuel Parzen}}</ref><ref name="KarlinTaylor2012page29">{{cite book|url=https://books.google.com/books?id=dSDxjX9nmmMC|title=A First Course in Stochastic Processes|year=2012|publisher=Academic Press|isbn=978-0-08-057041-9|pages=29, 30|author1=Samuel Karlin|author2=Howard E. Taylor}}</ref><ref name="Lamperti1977chap6">{{cite book|url=https://books.google.com/books?id=Pd4cvgAACAAJ|title=Stochastic processes: a survey of the mathematical theory|publisher=Springer-Verlag|year=1977|isbn=978-3-540-90275-1|pages=106–121|author=John Lamperti}}</ref><ref name="Ross1996page174and231">{{cite book|url=https://books.google.com/books?id=ImUPAQAAMAAJ|title=Stochastic processes|publisher=Wiley|year=1996|isbn=978-0-471-12062-9|pages=174, 231|author=Sheldon M. Ross}}</ref> but it has been also common to define a Markov chain as having discrete time in either countable or continuous state space (thus regardless of the state space).<ref name="Asmussen2003page7" /> It has been argued that the first definition of a Markov chain, where it has discrete time, now tends to be used, despite the second definition having been used by researchers like [[Joseph Doob]] and [[Kai Lai Chung]].<ref name="MeynTweedie2009">{{cite book|author1=Sean Meyn|author2=Richard L. Tweedie|title=Markov Chains and Stochastic Stability|url=https://books.google.com/books?id=Md7RnYEPkJwC|year=2009|publisher=Cambridge University Press|isbn=978-0-521-73182-9|page=19}}</ref>
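
As a minimal illustration of the first convention (discrete time with a countable state space), the sketch below drives a three-state chain with a one-step transition matrix; the state names, the matrix <code>P</code> and the helper <code>simulate</code> are hypothetical and chosen only for illustration, not taken from the cited sources.

<syntaxhighlight lang="python">
import random

# Hypothetical three-state chain: each row gives the distribution of the next
# state conditional on the current state, so each row sums to 1.
P = {
    "sunny":  {"sunny": 0.7, "cloudy": 0.2, "rainy": 0.1},
    "cloudy": {"sunny": 0.3, "cloudy": 0.4, "rainy": 0.3},
    "rainy":  {"sunny": 0.2, "cloudy": 0.4, "rainy": 0.4},
}

def simulate(chain, start, steps, rng):
    """Sample a path of the chain: the next state is drawn from the row of the
    current state only (the Markov property), never from the older history."""
    path = [start]
    for _ in range(steps):
        row = chain[path[-1]]
        path.append(rng.choices(list(row), weights=list(row.values()))[0])
    return path

print(simulate(P, "sunny", steps=10, rng=random.Random(0)))
</syntaxhighlight>

A dictionary of dictionaries is used here only to keep the countable state space explicit and readable; an integer-indexed matrix is the more common representation in practice.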
 
       