
{{Probability distribution
| name = Poisson Distribution
| type = mass
| pdf_image = [[File:poisson pmf.svg|325px]]
| pdf_caption = The horizontal axis is the index ''k'', the number of occurrences. ''λ'' is the expected rate of occurrences. The vertical axis is the probability of ''k'' occurrences given ''λ''. The function is defined only at integer values of ''k''; the connecting lines are only guides for the eye.
| cdf_image = [[File:poisson cdf.svg|325px]]
| cdf_caption = The horizontal axis is the index ''k'', the number of occurrences. The CDF is discontinuous at the integers of ''k'' and flat everywhere else because a variable that is Poisson distributed takes on only integer values.
| notation = <math>\operatorname{Pois}(\lambda)</math>
| parameters = <math>\lambda\in (0, \infty) </math> (rate)
| support = <math>k \in \mathbb{N}_0</math> ([[Natural numbers]] starting from 0)
| pdf = <math>\frac{\lambda^k e^{-\lambda}}{k!}</math>
| cdf = <math>\frac{\Gamma(\lfloor k+1\rfloor, \lambda)}{\lfloor k\rfloor !}</math>, or <math>e^{-\lambda} \sum_{i=0}^{\lfloor k\rfloor} \frac{\lambda^i}{i!}\ </math>, or <math>Q(\lfloor k+1\rfloor,\lambda)</math>
(for <math>k\ge 0</math>, where <math>\Gamma(x, y)</math> is the [[upper incomplete gamma function]], <math>\lfloor k\rfloor</math> is the [[floor function]], and Q is the [[regularized gamma function]])
| mean = <math>\lambda</math>
| median = <math>\approx\lfloor\lambda+1/3-0.02/\lambda\rfloor</math>
| mode = <math>\lceil\lambda\rceil - 1, \lfloor\lambda\rfloor</math>
| variance = <math>\lambda</math>
| skewness = <math>\lambda^{-1/2}</math>
| kurtosis = <math>\lambda^{-1}</math>
| entropy = <math>\lambda[1 - \log(\lambda)] + e^{-\lambda}\sum_{k=0}^\infty \frac{\lambda^k\log(k!)}{k!}</math>
(for large <math>\lambda</math>)
<math>\frac{1}{2}\log(2 \pi e \lambda) - \frac{1}{12 \lambda} - \frac{1}{24 \lambda^2} -{}</math><br><math>\qquad \frac{19}{360 \lambda^3} + O\left(\frac{1}{\lambda^4}\right)</math><!--formula split with \qquad indent-->
| pgf = <math>\exp[\lambda(z - 1)]</math>
| mgf = <math>\exp[\lambda (e^{t} - 1)]</math>
| char = <math>\exp[\lambda (e^{it} - 1)]</math>
| fisher = <math>\frac{1}{\lambda}</math>
}}



In [[probability theory]] and [[statistics]], the '''Poisson distribution''' ({{IPAc-en|'|p|w|ɑː|s|ɒ|n}}; {{IPA-fr|pwasɔ̃}}), named after [[France|French]] mathematician [[Siméon Denis Poisson]], is a [[discrete probability distribution]] that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and [[Statistical independence|independently]] of the time since the last event.{{r|Haight1967}} The Poisson distribution can also be used for the number of events in other specified intervals such as distance, area or volume.




For instance, an individual keeping track of the amount of mail they receive each day may notice that they receive an average number of 4 letters per day. If receiving any particular piece of mail does not affect the arrival times of future pieces of mail, i.e., if pieces of mail from a wide range of sources arrive independently of one another, then a reasonable assumption is that the number of pieces of mail received in a day obeys a Poisson distribution.{{r|Brooks2007}} Other examples that may follow a Poisson distribution include the number of phone calls received by a call center per hour and the number of decay events per second from a radioactive source.




== Definitions ==



===Probability mass function===



The Poisson distribution is popular for modeling the ''number of times an event occurs in an interval of time or space''.




A discrete [[random variable]] ''X'' is said to have a Poisson distribution with parameter ''λ''&nbsp;>&nbsp;0, if, for ''k''&nbsp;=&nbsp;0,&nbsp;1,&nbsp;2,&nbsp;..., the [[probability mass function]] of ''X'' is given by:{{r|Yates2014|p=60}}


:<math>\!f(k; \lambda)= \Pr(X = k)= \frac{\lambda^k e^{-\lambda}}{k!},</math>


where


* ''e'' is [[e (mathematical constant)|Euler's number]] (''e'' = 2.71828...)

* ''k''! is the [[factorial]] of ''k''.



The positive [[real number]] ''λ'' is equal to the [[expected value]] of ''X'' and also to its [[variance]]<ref>For the proof, see :


[http://www.proofwiki.org/wiki/Expectation_of_Poisson_Distribution Proof wiki: expectation] and [http://www.proofwiki.org/wiki/Variance_of_Poisson_Distribution Proof wiki: variance]</ref>


:<math>\lambda=\operatorname{E}(X)=\operatorname{Var}(X).</math>
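
As a quick numerical illustration (a minimal Python sketch using only the standard library; the helper name <code>poisson_pmf</code> is ours, not from any particular package), the probability mass function can be evaluated directly to check that the probabilities sum to 1 and that the mean and variance both come out equal to ''λ'':

<syntaxhighlight lang="python">
import math

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Pois(lam): lam**k * exp(-lam) / k!"""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam = 4.0
ks = range(100)  # the tail beyond k = 100 is negligible for lam = 4 (and keeps k! within float range)
total = sum(poisson_pmf(k, lam) for k in ks)                 # ~ 1.0
mean = sum(k * poisson_pmf(k, lam) for k in ks)              # ~ 4.0 (= lam)
var = sum((k - mean)**2 * poisson_pmf(k, lam) for k in ks)   # ~ 4.0 (= lam)
print(total, mean, var)
</syntaxhighlight>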




The Poisson distribution can be applied to systems with a [[large number of rare events|large number of possible events, each of which is rare]]. The number of such events that occur during a fixed time interval is, under the right circumstances, a random number with a Poisson distribution.




===Example===

The Poisson distribution may be useful to model events such as


* The number of meteorites greater than 1 meter diameter that strike Earth in a year

* The number of patients arriving in an emergency room between 10 and 11 pm

* The number of laser photons hitting a detector in a particular time interval



===Assumptions and validity===

The Poisson distribution is an appropriate model if the following assumptions are true:{{r|Koehrsen2019}}


* {{mvar|k}} is the number of times an event occurs in an interval and {{mvar|k}} can take values 0, 1, 2, ....

* The occurrence of one event does not affect the probability that a second event will occur. That is, events occur independently.

* The average rate at which events occur is independent of any occurrences. For simplicity, this is usually assumed to be constant, but may in practice vary with time.

* Two events cannot occur at exactly the same instant; instead, at each very small sub-interval exactly one event either occurs or does not occur.



If these conditions are true, then {{mvar|k}} is a Poisson random variable, and the distribution of {{mvar|k}} is a Poisson distribution.




The Poisson distribution is also the [[Limit (mathematics)|limit]] of a [[binomial distribution]], for which the probability of success for each trial equals {{mvar|λ}} divided by the number of trials, as the number of trials approaches infinity (see [[Poisson distribution#Related distributions|Related distributions]]).




===Probability of events for a Poisson distribution===

An event can occur 0, 1, 2, ... times in an interval. The average number of events in an interval is designated <math> \lambda </math> (lambda). <math> \lambda </math> is the event rate, also called the rate parameter. The probability of observing {{mvar|k}} events in an interval is given by the equation




:<math>P(k \text{ events in interval}) = \frac{\lambda^k e^{-\lambda}}{k!}</math>




where


* <math> \lambda </math> is the average number of events per interval

* ''e'' is the number 2.71828... ([[e (mathematical constant)|Euler's number]]) the base of the natural logarithms

* {{mvar|k}} takes values 0, 1, 2, ...

* {{mvar|k}}! = {{mvar|k}} × ({{mvar|k}} − 1) × ({{mvar|k}} − 2) × ... × 2 × 1 is the [[factorial]] of {{mvar|k}}.

This equation is the [[probability mass function]] (PMF) for a Poisson distribution.




This equation can be adapted if, instead of the average number of events <math> \lambda </math>, we are given a time rate <math> r </math> for the events to happen. Then <math> \lambda = r t </math> (with <math> r</math> in units of 1/time), and




: <math>P(k \text{ events in interval } t) = \frac{(r t)^k e^{-r t}}{k!}</math>
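
For example (a small Python sketch with arbitrary illustrative numbers), with a rate of ''r'' = 0.5 events per unit time observed over ''t'' = 10 time units, so that ''λ'' = ''rt'' = 5:

<syntaxhighlight lang="python">
import math

r, t = 0.5, 10.0                # rate per unit time and length of the observation window (arbitrary values)
k = 3
lam = r * t                     # lambda = r t = 5 expected events in the window
p = lam**k * math.exp(-lam) / math.factorial(k)
print(p)                        # P(3 events in an interval of length 10) ~ 0.14
</syntaxhighlight>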




==== Examples of probability for Poisson distributions ====



{{col-begin|gap=5em}}

{{col-break}}

On a particular river, overflow floods occur once every 100 years on average. Calculate the probability of {{mvar|k}} = 0, 1, 2, 3, 4, 5, or 6 overflow floods in a 100-year interval, assuming the Poisson model is appropriate.




Because the average event rate is one overflow flood per 100 years, ''λ'' = 1




: <math> P(k \text{ overflow floods in 100 years}) = \frac{\lambda^k e^{-\lambda}}{k!} = \frac{1^k e^{-1}}{k!}</math>




: <math> P(k = 0 \text{ overflow floods in 100 years}) = \frac{1^0 e^{-1}}{0!} = \frac{e^{-1}}{1} \approx 0.368 </math>




: <math> P(k = 1 \text{ overflow flood in 100 years}) = \frac{1^1 e^{-1}}{1!} = \frac{e^{-1}}{1} \approx 0.368 </math>




: <math> P(k = 2 \text{ overflow floods in 100 years}) = \frac{1^2 e^{-1}}{2!} = \frac{e^{-1}}{2} \approx 0.184 </math>




{{col-break}}

The table below gives the probability for 0 to 6 overflow floods in a 100-year period.




{| class="wikitable"
|-
! {{mvar|k}} !! ''P''({{mvar|k}} overflow floods in 100 years)
|-
| 0|| 0.368
|-
| 1|| 0.368
|-
| 2|| 0.184
|-
| 3|| 0.061
|-
| 4|| 0.015
|-
| 5|| 0.003
|-
| 6|| 0.0005
|}

{{col-end}}
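
The column of probabilities in the table above can be reproduced with a few lines of Python (a sketch using only the standard library):

<syntaxhighlight lang="python">
import math

lam = 1.0                       # one overflow flood per 100 years on average
for k in range(7):
    p = lam**k * math.exp(-lam) / math.factorial(k)
    print(k, round(p, 4))       # 0.3679, 0.3679, 0.1839, 0.0613, 0.0153, 0.0031, 0.0005
</syntaxhighlight>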



{{col-begin|gap=5em}}

{{col-break}}

Ugarte and colleagues report that the average number of goals in a World Cup soccer match is approximately 2.5 and the Poisson model is appropriate.{{r|Ugarte2016}}


Because the average event rate is 2.5 goals per match, ''λ'' = 2.5.




: <math> P(k \text{ goals in a match}) = \frac{2.5^k e^{-2.5}}{k!}</math>




: <math> P(k = 0 \text{ goals in a match}) = \frac{2.5^0 e^{-2.5}}{0!} = \frac{e^{-2.5}}{1} \approx 0.082 </math>




: <math> P(k = 1 \text{ goal in a match}) = \frac{2.5^1 e^{-2.5}}{1!} = \frac{2.5 e^{-2.5}}{1} \approx 0.205 </math>




: <math> P(k = 2 \text{ goals in a match}) = \frac{2.5^2 e^{-2.5}}{2!} = \frac{6.25 e^{-2.5}}{2} \approx 0.257 </math>




{{col-break}}

The table below gives the probability for 0 to 7 goals in a match.




{| class="wikitable"
|-
! {{mvar|k}} !! ''P''({{mvar|k}} goals in a World Cup soccer match)
|-
| 0|| 0.082
|-
| 1|| 0.205
|-
| 2|| 0.257
|-
| 3|| 0.213
|-
| 4|| 0.133
|-
| 5|| 0.067
|-
| 6|| 0.028
|-
| 7|| 0.010
|}

{{col-end}}



====Once in an interval events: The special case of ''λ'' = 1 and ''k'' = 0 ====



Suppose that astronomers estimate that large meteorites (above a certain size) hit the earth on average once every 100 years (''λ'' = 1 event per 100 years), and that the number of meteorite hits follows a Poisson distribution. What is the probability of {{mvar|k}} = 0 meteorite hits in the next 100 years?




: <math> P(k = \text{0 meteorites hit in next 100 years}) = \frac{1^0 e^{-1}}{0!} = \frac{1}{e} \approx 0.37 </math>




Under these assumptions, the probability that no large meteorites hit the earth in the next 100 years is roughly 0.37. The remaining 1&nbsp;−&nbsp;0.37&nbsp;= 0.63 is the probability of 1, 2, 3, or more large meteorite hits in the next 100 years.


In an example above, an overflow flood occurred once every 100 years (''λ''&nbsp;=&nbsp;1). The probability of no overflow floods in 100 years was roughly 0.37, by the same calculation.




In general, if an event occurs on average once per interval (''λ''&nbsp;=&nbsp;1), and the events follow a Poisson distribution, then {{nowrap|1=''P''(0 events in next interval)&nbsp;= 0.37}}. In addition, ''P''(exactly one event in next interval) = 0.37, as shown in the table for overflow floods.




=== Examples that violate the Poisson assumptions ===



The number of students who arrive at the [[Student center|student union]] per minute will likely not follow a Poisson distribution, because the rate is not constant (low rate during class time, high rate between class times) and the arrivals of individual students are not independent (students tend to come in groups).




The number of magnitude 5 earthquakes per year in a country may not follow a Poisson distribution if one large earthquake increases the probability of aftershocks of similar magnitude.




Examples in which at least one event is guaranteed are not Poisson distributed, but may be modeled using a [[Zero-truncated Poisson distribution]].




Count distributions in which the number of intervals with zero events is higher than predicted by a Poisson model may be modeled using a [[Zero-inflated model]].




== Properties ==



=== Descriptive statistics ===

* The [[expected value]] and [[variance]] of a Poisson-distributed random variable are both equal to λ.

* The [[coefficient of variation]] is <math>\textstyle \lambda^{-1/2}</math>, while the [[index of dispersion]] is 1.{{r|Johnson2005|p=163}}

* The [[mean absolute deviation]] about the mean is{{r|Johnson2005|p=163}}

::<math>\operatorname{E}[|X-\lambda|]= \frac{2 \lambda^{\lfloor\lambda\rfloor + 1} e^{-\lambda}}{\lfloor\lambda\rfloor!}.</math>


* The [[mode (statistics)|mode]] of a Poisson-distributed random variable with non-integer λ is equal to <math>\scriptstyle\lfloor \lambda \rfloor</math>, which is the largest integer less than or equal to&nbsp;''λ''. This is also written as [[floor function|floor]](λ). When λ is a positive integer, the modes are ''λ'' and ''λ''&nbsp;−&nbsp;1.

* All of the [[cumulant]]s of the Poisson distribution are equal to the expected value&nbsp;''λ''. The ''n''th [[factorial moment]] of the Poisson distribution is ''λ''<sup>''n''</sup>.

* The [[expected value]] of a [[Poisson process]] is sometimes decomposed into the product of ''intensity'' and ''exposure'' (or more generally expressed as the integral of an "intensity function" over time or space, sometimes described as “exposure”).{{r|Helske2017}}



=== Median ===



Bounds for the median (<math>\nu</math>) of the distribution are known and are [[Mathematical jargon#sharp|sharp]]:{{r|Choi1994}}




: <math> \lambda - \ln 2 \le \nu < \lambda + \frac{1}{3}. </math>
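
These bounds can be checked numerically (a Python sketch using only the standard library; the helper <code>poisson_median</code> simply scans the cumulative distribution function, and the test values of ''λ'' are arbitrary):

<syntaxhighlight lang="python">
import math

def poisson_median(lam):
    """Smallest k with P(X <= k) >= 1/2, found by accumulating the PMF."""
    k, cdf = 0, math.exp(-lam)
    while cdf < 0.5:
        k += 1
        cdf += lam**k * math.exp(-lam) / math.factorial(k)
    return k

for lam in (0.7, 3.0, 12.5):
    nu = poisson_median(lam)
    print(lam, nu, lam - math.log(2) <= nu < lam + 1/3)   # the bound holds in each case
</syntaxhighlight>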




=== Higher moments ===

* The higher [[moment (mathematics)|moments]] ''m''<sub>''k''</sub> of the Poisson distribution about the origin are [[Touchard polynomials]] in λ:

:: <math> m_k = \sum_{i=0}^k \lambda^i \left\{\begin{matrix} k \\ i \end{matrix}\right\},</math>




: where the {braces} denote [[Stirling numbers of the second kind]].{{r|Riordan1937}}{{r|Haight1967|p=6}} The coefficients of the polynomials have a [[combinatorics|combinatorial]] meaning. In fact, when the expected value of the Poisson distribution is 1, then [[Dobinski's formula]] says that the ''n''th moment equals the number of [[partition of a set|partitions of a set]] of size ''n''.




For the non-centered moments we define <math>B=k/\lambda</math>, then{{r|Jagadeesan2017}}


::<math>
E[X^k]^{1/k} \le C\cdot
\begin{cases}
k/B & \text{if}\quad B < e \\
k/\log B & \text{if}\quad B\ge e
\end{cases}
</math>

where <math>C</math> is some absolute constant greater than 0.




=== Sums of Poisson-distributed random variables ===

: If <math>X_i \sim \operatorname{Pois}(\lambda_i)</math> for <math>i=1,\dotsc,n</math> are [[statistical independence|independent]], then <math>\sum_{i=1}^n X_i \sim \operatorname{Pois}\left(\sum_{i=1}^n \lambda_i\right)</math>.{{r|Lehmann1986|p=65}} A converse is [[Raikov's theorem]], which says that if the sum of two independent random variables is Poisson-distributed, then so are each of those two independent random variables.{{r|Raikov1937}}{{r|vonMises1964|p=}}
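
The first statement can be verified directly for small values by convolving two Poisson probability mass functions (a Python sketch using only the standard library; the parameter values are arbitrary):

<syntaxhighlight lang="python">
import math

def pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

lam1, lam2 = 1.5, 2.5
for k in range(6):
    conv = sum(pmf(i, lam1) * pmf(k - i, lam2) for i in range(k + 1))  # P(X_1 + X_2 = k) by convolution
    print(k, round(conv, 6), round(pmf(k, lam1 + lam2), 6))            # agrees with Pois(lam1 + lam2)
</syntaxhighlight>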




=== Other properties ===

* The Poisson distributions are [[Infinite divisibility (probability)|infinitely divisible]] probability distributions.{{r|Laha1979|p=233}}{{r|Johnson2005|p=164}}

* The directed [[Kullback–Leibler divergence]] of <math>\operatorname{Pois}(\lambda_0)</math> from <math>\operatorname{Pois}(\lambda)</math> is given by



:: <math>\operatorname{D}_{\text{KL}}(\lambda\mid\lambda_0) = \lambda_0 - \lambda + \lambda \log \frac{\lambda}{\lambda_0}.</math>


* Bounds for the tail probabilities of a Poisson random variable <math> X \sim \operatorname{Pois}(\lambda)</math> can be derived using a [[Chernoff bound]] argument.{{r|Mitzenmacher2005|p=97-98}}

:: <math> P(X \geq x) \leq \frac{(e \lambda)^x e^{-\lambda}}{x^x}, \text{ for } x > \lambda</math>,




:: <math> P(X \leq x) \leq \frac{(e \lambda)^x e^{-\lambda} }{x^x}, \text{ for } x < \lambda.</math>




* The upper tail probability can be tightened (by a factor of at least two) as follows:{{r|Short2013}}

:: <math> P(X \geq x) \leq \frac{e^{-\operatorname{D}_{\text{KL}}(x\mid\lambda)}}{\max{(2, \sqrt{4\pi\operatorname{D}_{\text{KL}}(x\mid\lambda)}})}, \text{ for } x > \lambda,</math>




: where <math>\operatorname{D}_{\text{KL}}(x\mid\lambda)</math> is the directed Kullback–Leibler divergence, as described above.




* Inequalities that relate the distribution function of a Poisson random variable <math> X \sim \operatorname{Pois}(\lambda)</math> to the [[Standard normal distribution]] function <math> \Phi(x) </math> are as follows:{{r|Short2013}}



:: <math> \Phi\left(\operatorname{sign}(k-\lambda)\sqrt{2\operatorname{D}_{\text{KL}}(k\mid\lambda)}\right) < P(X \leq k) < \Phi\left(\operatorname{sign}(k-\lambda+1)\sqrt{2\operatorname{D}_{\text{KL}}(k+1\mid\lambda)}\right), \text{ for } k > 0,</math>




: where <math>\operatorname{D}_{\text{KL}}(k\mid\lambda)</math> is again the directed Kullback–Leibler divergence.
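
As a numerical sanity check of the simple Chernoff-type upper-tail bound given above (a Python sketch using only the standard library; the truncation point of the tail sum is chosen so that the neglected mass is negligible):

<syntaxhighlight lang="python">
import math

def pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

lam = 5.0
for x in (8, 12, 16):                                     # values above the mean
    exact = sum(pmf(k, lam) for k in range(x, 60))        # P(X >= x), truncated far into the tail
    bound = (math.e * lam)**x * math.exp(-lam) / x**x     # (e*lam)^x * e^{-lam} / x^x
    print(x, exact, bound)                                # the bound dominates the exact tail in each case
</syntaxhighlight>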




=== Poisson races ===



Let <math>X \sim \operatorname{Pois}(\lambda)</math> and <math>Y \sim \operatorname{Pois}(\mu)</math> be independent random variables, with <math> \lambda < \mu </math>, then we have that




:<math>
\frac{e^{-(\sqrt{\mu} -\sqrt{\lambda})^2 }}{(\lambda + \mu)^2} - \frac{e^{-(\lambda + \mu)}}{2\sqrt{\lambda \mu}} - \frac{e^{-(\lambda + \mu)}}{4\lambda \mu} \leq P(X - Y \geq 0) \leq e^{- (\sqrt{\mu} -\sqrt{\lambda})^2}
</math>



The upper bound is proved using a standard Chernoff bound.




The lower bound can be proved by noting that <math> P(X-Y\geq0\mid X+Y=i)</math> is the probability that <math>Z \geq \frac{i}{2}</math>, where <math>Z \sim \operatorname{Bin}\left(i, \frac{\lambda}{\lambda+\mu}\right)</math>, which is bounded below by <math> \frac{1}{(i+1)^2} e^{\left(-iD\left(0.5 \| \frac{\lambda}{\lambda+\mu}\right)\right)} </math>, where <math>D</math> is [[Kullback–Leibler divergence|relative entropy]] (See the entry on [[Binomial distribution#Tail Bounds|bounds on tails of binomial distributions]] for details). Further noting that <math> X+Y \sim \operatorname{Pois}(\lambda+\mu)</math>, and computing a lower bound on the unconditional probability gives the result. More details can be found in the appendix of Kamath ''et al.''.{{r|Kamath2015}}




== Related distributions ==

===General===

* If <math>X_1 \sim \mathrm{Pois}(\lambda_1)\,</math> and <math>X_2 \sim \mathrm{Pois}(\lambda_2)\,</math> are independent, then the difference <math> Y = X_1 - X_2</math> follows a [[Skellam distribution]].

* If <math>X_1 \sim \mathrm{Pois}(\lambda_1)\,</math> and <math>X_2 \sim \mathrm{Pois}(\lambda_2)\,</math> are independent, then the distribution of <math>X_1</math> conditional on <math>X_1+X_2</math> is a [[binomial distribution]].

:Specifically, if <math>X_1+X_2=k</math>, then <math>\!X_1\sim \mathrm{Binom}(k, \lambda_1/(\lambda_1+\lambda_2))</math>.


:More generally, if ''X''<sub>1</sub>, ''X''<sub>2</sub>,..., ''X''<sub>''n''</sub> are independent Poisson random variables with parameters ''λ''<sub>1</sub>, ''λ''<sub>2</sub>,..., ''λ''<sub>''n''</sub> then


:: given <math>\sum_{j=1}^n X_j=k,</math> <math>X_i \sim \mathrm{Binom}\left(k, \frac{\lambda_i}{\sum_{j=1}^n\lambda_j}\right)</math>. In fact, <math>\{X_i\} \sim \mathrm{Multinom}\left(k, \left\{\frac{\lambda_i}{\sum_{j=1}^n\lambda_j}\right\}\right)</math>.


* If <math>X \sim \mathrm{Pois}(\lambda)\,</math> and the distribution of <math>Y</math>, conditional on ''X''&nbsp;=&nbsp;''k'', is a [[binomial distribution]], <math>Y \mid (X = k) \sim \mathrm{Binom}(k, p)</math>, then the distribution of Y follows a Poisson distribution <math>Y \sim \mathrm{Pois}(\lambda \cdot p)\,</math>. In fact, if <math>\{Y_i\} </math>, conditional on X = k, follows a multinomial distribution, <math>\{Y_i\} \mid (X = k) \sim \mathrm{Multinom}\left(k, p_i\right)</math>, then each <math>Y_i</math> follows an independent Poisson distribution <math>Y_i \sim \mathrm{Pois}(\lambda \cdot p_i), \rho(Y_i, Y_j) = 0</math>.

* The Poisson distribution can be derived as a limiting case to the binomial distribution as the number of trials goes to infinity and the [[expected value|expected]] number of successes remains fixed — see [[#law of rare events|law of rare events]] below. Therefore, it can be used as an approximation of the binomial distribution if ''n'' is sufficiently large and ''p'' is sufficiently small. There is a rule of thumb stating that the Poisson distribution is a good approximation of the binomial distribution if n is at least 20 and ''p'' is smaller than or equal to 0.05, and an excellent approximation if ''n''&nbsp;≥&nbsp;100 and ''np''&nbsp;≤&nbsp;10.{{r|NIST2006}}

:: <math>F_\mathrm{Binomial}(k;n, p) \approx F_\mathrm{Poisson}(k;\lambda=np)\,</math>


* The Poisson distribution is a [[special case]] of the discrete compound Poisson distribution (or stuttering Poisson distribution) with only a parameter.{{r|Zhang2013|Zhang2016}} The discrete compound Poisson distribution can be deduced from the limiting distribution of univariate multinomial distribution. It is also a [[compound Poisson distribution#Special cases|special case]] of a [[compound Poisson distribution]].

* For sufficiently large values of λ, (say λ>1000), the [[normal distribution]] with mean λ and variance λ (standard deviation <math>\sqrt{\lambda}</math>) is an excellent approximation to the Poisson distribution. If λ is greater than about 10, then the normal distribution is a good approximation if an appropriate [[continuity correction]] is performed, i.e., if P(''X''&nbsp;≤&nbsp;''x''), where ''x'' is a non-negative integer, is replaced by P(''X''&nbsp;≤&nbsp;''x''&nbsp;+&nbsp;0.5).

:: <math>F_\mathrm{Poisson}(x;\lambda) \approx F_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda)\,</math>


* [[Variance-stabilizing transformation]]: If <math>X \sim \mathrm{Pois}(\lambda)\,</math>, then

::<math>Y = 2 \sqrt{X} \approx \mathcal{N}(2\sqrt{\lambda};1)</math>,{{r|Johnson2005|p=168}}


:and




::<math>Y = \sqrt{X} \approx \mathcal{N}(\sqrt{\lambda};1/4)</math>.{{r|McCullagh1989|p=196}}


:Under this transformation, the convergence to normality (as <math>\lambda</math> increases) is far faster than the untransformed variable.{{Citation needed|date=May 2012}} Other, slightly more complicated, variance stabilizing transformations are available,{{r|Johnson2005|p=168}} one of which is [[Anscombe transform]].{{r|Anscombe1948}} See [[Data transformation (statistics)]] for more general uses of transformations.


* If for every ''t''&nbsp;>&nbsp;0 the number of arrivals in the time interval [0,&nbsp;''t''] follows the Poisson distribution with mean ''λt'', then the sequence of inter-arrival times are independent and identically distributed [[exponential distribution|exponential]] random variables having mean&nbsp;1/''λ''.{{r|Ross2010|p=317–319}}

* The [[cumulative distribution function]]s of the Poisson and [[chi-squared distribution]]s are related in the following ways:{{r|Johnson2005|p=167}}

::<math>F_\text{Poisson}(k;\lambda) = 1-F_{\chi^2}(2\lambda;2(k+1)) \quad\quad \text{ integer } k,</math>


:and{{r|Johnson2005|p=158}}




::<math>\Pr(X=k)=F_{\chi^2}(2\lambda;2(k+1)) -F_{\chi^2}(2\lambda;2k).</math>
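
As a numerical check of the rule of thumb for the binomial approximation above (a Python sketch using only the standard library): with ''n'' = 100 and ''p'' = 0.05, so that ''np'' = 5, the binomial probabilities and their Poisson approximation are close:

<syntaxhighlight lang="python">
import math

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 100, 0.05                  # n >= 100 and np <= 10, so the approximation should be good
lam = n * p
for k in (0, 2, 5, 8, 12):
    print(k, round(binom_pmf(k, n, p), 5), round(poisson_pmf(k, lam), 5))
</syntaxhighlight>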



=== Poisson Approximation ===



Assume <math>X_1\sim\operatorname{Pois}(\lambda_1), X_2\sim\operatorname{Pois}(\lambda_2), \dots, X_n\sim\operatorname{Pois}(\lambda_n)</math> where <math>\lambda_1 + \lambda_2 + \dots + \lambda_n=1</math>, then<ref>{{Cite web | url=https://newonlinecourses.science.psu.edu/stat504/node/48/ | title=1.7.7 – Relationship between the Multinomial and Poisson &#124; STAT 504}}</ref> <math>(X_1, X_2, \dots, X_n)</math> is [[Multinomial distribution|multinomially distributed]]


<math>(X_1, X_2, \dots, X_n) \sim \operatorname{Mult}(N, \lambda_1, \lambda_2, \dots, \lambda_n)</math> conditioned on <math>N = X_1 + X_2 + \dots X_n</math>.




This means{{r|Mitzenmacher2005|p=101-102}}, among other things, that for any nonnegative function <math>f(x_1,x_2,\dots,x_n)</math>,


if <math>(Y_1, Y_2, \dots, Y_n)\sim\operatorname{Mult}(m, \mathbf{p})</math> is multinomially distributed, then


:<math>
\operatorname{E}[f(Y_1, Y_2, \dots, Y_n)] \le e\sqrt{m}\operatorname{E}[f(X_1, X_2, \dots, X_n)]
</math>

where <math>(X_1, X_2, \dots, X_n)\sim\operatorname{Pois}(\mathbf{p})</math>.




The factor of <math>e\sqrt{m}</math> can be removed if <math>f</math> is further assumed to be monotonically increasing or decreasing.




=== Bivariate Poisson distribution ===

This distribution has been extended to the [[bivariate]] case.{{r|Loukas1986}} The [[generating function]] for this distribution is


: <math> g( u, v ) = \exp[ ( \theta_1 - \theta_{12} )( u - 1 ) + ( \theta_2 - \theta_{12} )(v - 1) + \theta_{12} ( uv - 1 ) ] </math>




with




: <math> \theta_1, \theta_2 > \theta_{ 12 } > 0 \, </math>




The marginal distributions are Poisson(''θ''<sub>1</sub>) and Poisson(''θ''<sub>2</sub>) and the correlation coefficient is limited to the range


: <math> 0 \le \rho \le \min\left\{ \frac{ \theta_1 }{ \theta_2 }, \frac{ \theta_2 }{ \theta_1 } \right\}</math>




A simple way to generate a bivariate Poisson distribution <math>X_1,X_2</math> is to take three independent Poisson distributions <math>Y_1,Y_2,Y_3</math> with means <math>\lambda_1,\lambda_2,\lambda_3</math> and then set <math>X_1 = Y_1 + Y_3,X_2 = Y_2 + Y_3</math>. The probability function of the bivariate Poisson distribution is


: <math>
\begin{align}
& \Pr(X_1=k_1,X_2=k_2) \\
= {} & \exp\left(-\lambda_1-\lambda_2-\lambda_3\right) \frac{\lambda_1^{k_1}}{k_1!} \frac{\lambda_2^{k_2}}{k_2!} \sum_{k=0}^{\min(k_1,k_2)} \binom{k_1}{k} \binom{k_2}{k} k! \left( \frac{\lambda_3}{\lambda_1\lambda_2}\right)^k
\end{align}
</math>
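
A simulation sketch of this construction (Python, standard library only; the helper <code>poisson_sample</code> uses Knuth's multiplication method and is adequate only for small ''λ''): the sample means should be close to ''λ''<sub>1</sub>&nbsp;+&nbsp;''λ''<sub>3</sub> and ''λ''<sub>2</sub>&nbsp;+&nbsp;''λ''<sub>3</sub>, and the sample covariance close to the shared component ''λ''<sub>3</sub>.

<syntaxhighlight lang="python">
import math, random

def poisson_sample(lam):
    """Knuth's method: count uniform draws until their running product drops below e^{-lam}."""
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

random.seed(0)
lam1, lam2, lam3 = 1.0, 2.0, 0.5
pairs = []
for _ in range(100_000):
    y1, y2, y3 = poisson_sample(lam1), poisson_sample(lam2), poisson_sample(lam3)
    pairs.append((y1 + y3, y2 + y3))                       # (X_1, X_2) as in the construction above

n = len(pairs)
m1 = sum(x1 for x1, _ in pairs) / n                        # ~ lam1 + lam3 = 1.5
m2 = sum(x2 for _, x2 in pairs) / n                        # ~ lam2 + lam3 = 2.5
cov = sum((x1 - m1) * (x2 - m2) for x1, x2 in pairs) / n   # ~ lam3 = 0.5 (the shared component)
print(m1, m2, cov)
</syntaxhighlight>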



===Free Poisson distribution===

The free Poisson distribution<ref>Free Random Variables by D. Voiculescu, K. Dykema, A. Nica, CRM Monograph Series, American Mathematical Society, Providence RI, 1992</ref> with jump size <math>\alpha</math> and rate <math>\lambda</math> arises in [[free probability]] theory as the limit of repeated [[free convolution]]




: <math>
\left( \left(1-\frac{\lambda}{N}\right)\delta_0 + \frac{\lambda}{N}\delta_\alpha\right)^{\boxplus N}</math>



as ''N''&nbsp;→&nbsp;∞.




In other words, let <math>X_N</math> be random variables so that <math>X_N</math> has value <math>\alpha</math> with probability <math>\frac{\lambda}{N}</math> and value 0 with the remaining probability. Assume also that the family <math>X_1,X_2,\ldots</math> are [[free independence|freely independent]]. Then the limit as <math>N\to\infty</math> of the law of <math>X_1+\cdots +X_N</math>


is given by the Free Poisson law with parameters <math>\lambda,\alpha</math>.




This definition is analogous to one of the ways in which the classical Poisson distribution is obtained from a (classical) Poisson process.




The measure associated to the free Poisson law is given by<ref>James A. Mingo, Roland Speicher: Free Probability and Random Matrices. Fields Institute Monographs, Vol. 35, Springer, New York, 2017.</ref>




:<math>\mu=\begin{cases} (1-\lambda) \delta_0 + \lambda \nu,& \text{if } 0\leq \lambda \leq 1 \\
\nu, & \text{if }\lambda >1,
\end{cases}
</math>



where




: <math>\nu = \frac{1}{2\pi\alpha t}\sqrt{4\lambda \alpha^2 - ( t - \alpha (1+\lambda))^2} \, dt</math>




and has support <math>[\alpha (1-\sqrt{\lambda})^2,\alpha (1+\sqrt{\lambda})^2]</math>.




This law also arises in [[random matrix]] theory as the [[Marchenko–Pastur law]]. Its [[free cumulants]] are equal to <math>\kappa_n=\lambda\alpha^n</math>.




====Some transforms of this law====

We give values of some important transforms of the free Poisson law; the computation can be found, e.g., in the book ''Lectures on the Combinatorics of Free Probability'' by A. Nica and R. Speicher<ref>Lectures on the Combinatorics of Free Probability by A. Nica and R. Speicher, pp. 203–204, Cambridge Univ. Press 2006</ref>




The [[R-transform]] of the free Poisson law is given by




: <math>R(z)=\frac{\lambda \alpha}{1-\alpha z}. </math>




The [[Cauchy transform]] (which is the negative of the [[Stieltjes transformation]]) is given by




: <math>
G(z) = \frac{ z + \alpha - \lambda \alpha - \sqrt{ (z-\alpha (1+\lambda))^2 - 4 \lambda \alpha^2}}{2\alpha z}
</math>



The [[S-transform]] is given by




: <math>
S(z) = \frac{1}{z+\lambda}
</math>



in the case that <math>\alpha=1</math>.
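
The random-matrix connection can be checked numerically. The sketch below is illustrative only (it assumes NumPy is available; the sizes <code>p</code>, <code>n</code> and the seed are arbitrary): for a ''p''&nbsp;×&nbsp;''n'' matrix ''X'' with i.i.d. unit-variance entries, the eigenvalues of <math>\tfrac{1}{n}X^{\mathsf T}X</math> approximately follow the free Poisson (Marchenko–Pastur) law with <math>\lambda = p/n</math> and <math>\alpha = 1</math>, so the atom at 0, the support endpoints and the mean <math>\kappa_1=\lambda\alpha</math> given above can be compared against the empirical spectrum.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sizes only; here lambda = p/n = 0.5 < 1 and alpha = 1 (unit-variance entries).
rng = np.random.default_rng(0)
p, n = 500, 1000
lam = p / n

X = rng.standard_normal((p, n))
M = X.T @ X / n                      # n x n Wishart-type matrix
eig = np.linalg.eigvalsh(M)

support = ((1 - np.sqrt(lam))**2, (1 + np.sqrt(lam))**2)
nonzero = eig[eig > 1e-8]
print("fraction of (near-)zero eigenvalues:", np.mean(eig <= 1e-8), " expected 1 - lambda =", 1 - lam)
print("empirical mean:", eig.mean(), " expected kappa_1 = lambda * alpha =", lam)
print("nonzero eigenvalues in:", (nonzero.min(), nonzero.max()), " limiting support:", support)
</syntaxhighlight>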




== Statistical Inference ==

{{See also|Poisson regression}}



=== Parameter estimation ===

Given a sample of ''n'' measured values <math> k_i \in \{0,1,...\}</math>, for ''i''&nbsp;=&nbsp;1,&nbsp;...,&nbsp;''n'', we wish to estimate the value of the parameter ''λ'' of the Poisson population from which the sample was drawn. The [[maximum likelihood]] estimate is <ref>{{cite web |last=Paszek|first=Ewa |title=Maximum Likelihood Estimation – Examples |url = http://cnx.org/content/m13500/latest/?collection=col10343/latest}}</ref>




: <math>\widehat{\lambda}_\mathrm{MLE}=\frac{1}{n}\sum_{i=1}^n k_i. \!</math>




Since each observation has expectation λ, so does the sample mean. Therefore, the maximum likelihood estimate is an [[unbiased estimator]] of λ. It is also an efficient estimator since its variance achieves the [[Cramér–Rao lower bound]] (CRLB).{{Citation needed|date=April 2012}} Hence it is [[Minimum-variance unbiased estimator|minimum-variance unbiased]]. It can also be proven that the sum (and hence the sample mean, as it is a one-to-one function of the sum) is a complete and sufficient statistic for λ.



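As a quick illustration (a minimal sketch, not part of the derivation below; it assumes NumPy and SciPy are installed, and the true rate and sample size are arbitrary), the estimate is just the sample mean of the observed counts:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
lam_true = 3.7                       # arbitrary "true" rate, for illustration only
k = rng.poisson(lam_true, size=1000)

lam_mle = k.mean()                   # the maximum likelihood estimate is the sample mean
print("MLE:", lam_mle)
print("log-likelihood at the MLE:", stats.poisson.logpmf(k, lam_mle).sum())
</syntaxhighlight>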

To prove sufficiency we may use the [[Sufficient statistic|factorization theorem]]. Consider partitioning the probability mass function of the joint Poisson distribution for the sample into two parts: one that depends solely on the sample <math>\mathbf{x}</math> (called <math>h(\mathbf{x})</math>) and one that depends on the parameter <math>\lambda</math> and the sample <math>\mathbf{x}</math> only through the function <math>T(\mathbf{x})</math>. Then <math>T(\mathbf{x})</math> is a sufficient statistic for <math>\lambda</math>.




: <math> P(\mathbf{x})=\prod_{i=1}^n\frac{\lambda^{x_i} e^{-\lambda}}{x_i!}=\frac{1}{\prod_{i=1}^n x_i!} \times \lambda^{\sum_{i=1}^n x_i}e^{-n\lambda} </math>




The first term, <math>h(\mathbf{x})</math>, depends only on <math>\mathbf{x}</math>. The second term, <math>g(T(\mathbf{x})|\lambda)</math>, depends on the sample only through <math>T(\mathbf{x})=\sum_{i=1}^nx_i</math>. Thus, <math>T(\mathbf{x})</math> is sufficient.




To find the parameter λ that maximizes the probability function for the Poisson population, we can use the logarithm of the likelihood function:




: <math> \begin{align} \ell(\lambda) & = \ln \prod_{i=1}^n f(k_i \mid \lambda) \\ & = \sum_{i=1}^n \ln\!\left(\frac{e^{-\lambda}\lambda^{k_i}}{k_i!}\right) \\ & = -n\lambda + \left(\sum_{i=1}^n k_i\right) \ln(\lambda) - \sum_{i=1}^n \ln(k_i!). \end{align} </math>




We take the derivative of ''<math>\ell</math>'' with respect to ''λ'' and compare it to zero:




: <math>\frac{\mathrm{d}}{\mathrm{d}\lambda} \ell(\lambda) = 0 \iff -n + \left(\sum_{i=1}^n k_i\right) \frac{1}{\lambda} = 0. \!</math>




Solving for ''λ'' gives a stationary point.




: <math> \lambda = \frac{\sum_{i=1}^n k_i}{n}</math>




So ''λ'' is the average of the ''k''<sub>''i''</sub> values. Obtaining the sign of the second derivative of <math>\ell</math> at the stationary point will determine what kind of extreme value ''λ'' is.




: <math>\frac{\partial^2 \ell}{\partial \lambda^2} = -\lambda^{-2}\sum_{i=1}^n k_i </math>




Evaluating the second derivative ''at the stationary point'' gives:




: <math>\frac{\partial^2 \ell}{\partial \lambda^2} = - \frac{n^2}{\sum_{i=1}^n k_i} </math>




which is the negative of ''n'' times the reciprocal of the average of the k<sub>i</sub>. This expression is negative when the average is positive. If this is satisfied, then the stationary point maximizes the probability function.




For [[Completeness (statistics)|completeness]], a family of distributions is said to be complete if and only if <math> E(g(T)) = 0</math> implies that <math>P_\lambda(g(T) = 0) = 1</math> for all <math>\lambda</math>. If the individual <math>X_i</math> are iid <math>\mathrm{Po}(\lambda)</math>, then <math>T(\mathbf{x})=\sum_{i=1}^nX_i\sim \mathrm{Po}(n\lambda)</math>. Knowing the distribution we want to investigate, it is easy to see that the statistic is complete.




: <math>E(g(T))=\sum_{t=0}^\infty g(t)\frac{(n\lambda)^te^{-n\lambda}}{t!}=0</math>




For this equality to hold, <math>g(t)</math> must be 0. This follows from the fact that none of the other terms will be 0 for all <math>t</math> in the sum and for all possible values of <math>\lambda</math>. Hence, <math> E(g(T)) = 0</math> for all <math>\lambda</math> implies that <math>P_\lambda(g(T) = 0) = 1</math>, and the statistic has been shown to be complete.




=== Confidence interval ===

The [[confidence interval]] for the mean of a Poisson distribution can be expressed using the relationship between the cumulative distribution functions of the Poisson and [[chi-squared distribution]]s. The chi-squared distribution is itself closely related to the [[gamma distribution]], and this leads to an alternative expression. Given an observation ''k'' from a Poisson distribution with mean ''μ'', a confidence interval for ''μ'' with confidence level {{math|1 – α}} is




:<math>\tfrac 12\chi^{2}(\alpha/2; 2k) \le \mu \le \tfrac 12 \chi^{2}(1-\alpha/2; 2k+2), </math>




or equivalently,




:<math>F^{-1}(\alpha/2; k,1) \le \mu \le F^{-1}(1-\alpha/2; k+1,1),</math>




where <math>\chi^{2}(p;n)</math> is the [[quantile function]] (corresponding to a lower tail area ''p'') of the chi-squared distribution with ''n'' degrees of freedom and <math>F^{-1}(p;n,1)</math> is the quantile function of a [[gamma distribution]] with shape parameter n and scale parameter 1.{{r|Johnson2005|p=176-178|Garwood1936}} This interval is '[[exact statistics|exact]]' in the sense that its [[coverage probability]] is never less than the nominal {{math|1 – α}}.




When quantiles of the gamma distribution are not available, an accurate approximation to this exact interval has been proposed (based on the [[Chi-square distribution#Asymptotic properties|Wilson–Hilferty transformation]]):{{r|Breslow1987}}


:<math>k \left( 1 - \frac{1}{9k} - \frac{z_{\alpha/2}}{3\sqrt{k}}\right)^3 \le \mu \le (k+1) \left( 1 - \frac{1}{9(k+1)} + \frac{z_{\alpha/2}}{3\sqrt{k+1}}\right)^3, </math>


where <math>z_{\alpha/2}</math> denotes the [[standard normal deviate]] with upper tail area {{math|α / 2}}.




For application of these formulae in the same context as above (given a sample of ''n'' measured values ''k''<sub>''i''</sub> each drawn from a Poisson distribution with mean ''λ''), one would set




:<math>k=\sum_{i=1}^n k_i ,\!</math>


calculate an interval for ''μ''&nbsp;=&nbsp;''nλ'', and then derive the interval for ''λ''.



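For illustration, the following sketch (assuming SciPy is available; the helper name, the observed counts and the confidence level are arbitrary) evaluates the exact chi-squared interval above, and then the corresponding interval for ''λ'' when ''k'' is the sum of ''n'' observations:

<syntaxhighlight lang="python">
from scipy.stats import chi2

def poisson_exact_ci(k, alpha=0.05):
    """Exact confidence interval for the Poisson mean, given a single observed count k."""
    lower = 0.5 * chi2.ppf(alpha / 2, 2 * k) if k > 0 else 0.0
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * k + 2)
    return lower, upper

# One observation k = 12: 95% interval for the mean mu.
print(poisson_exact_ci(12))

# n observations: set k to the sum of the counts, get an interval for mu = n*lambda,
# then divide by n to obtain the interval for lambda.
n, k_sum = 25, 112                   # illustrative numbers
lo, hi = poisson_exact_ci(k_sum)
print(lo / n, hi / n)
</syntaxhighlight>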

=== Bayesian inference ===

In [[Bayesian inference]], the [[conjugate prior]] for the rate parameter ''λ'' of the Poisson distribution is the [[gamma distribution]].{{r|Fink1976}} Let




:<math>\lambda \sim \mathrm{Gamma}(\alpha, \beta) \!</math>




denote that ''λ'' is distributed according to the gamma [[probability density function|density]] ''g'' parameterized in terms of a [[shape parameter]] ''α'' and an inverse [[scale parameter]] ''β'':




:<math> g(\lambda \mid \alpha,\beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \; \lambda^{\alpha-1} \; e^{-\beta\,\lambda} \qquad \text{ for } \lambda>0 \,\!.</math>




Then, given the same sample of ''n'' measured values ''k''<sub>''i''</sub> [[Poisson distribution#Maximum likelihood|as before]], and a prior of Gamma(''α'', ''β''), the posterior distribution is




:<math>\lambda \sim \mathrm{Gamma}\left(\alpha + \sum_{i=1}^n k_i, \beta + n\right). \!</math>




The posterior mean E[''λ''] approaches the maximum likelihood estimate <math>\widehat{\lambda}_\mathrm{MLE}</math> in the limit as <math>\alpha\to 0,\ \beta\to 0</math>, which follows immediately from the general expression of the mean of the [[gamma distribution]].




The [[posterior predictive distribution]] for a single additional observation is a [[negative binomial distribution]],{{r|Gelman2003|p=53}} sometimes called a gamma–Poisson distribution.



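A brief numerical sketch of this update (not taken from the cited sources; it assumes NumPy and SciPy, and the prior hyperparameters and data are arbitrary) computes the Gamma posterior and, for a single future count, the corresponding negative binomial posterior predictive, written in SciPy's <code>nbinom</code> parameterization with ''r''&nbsp;=&nbsp;α&nbsp;+&nbsp;Σ''k''<sub>''i''</sub> and ''p''&nbsp;=&nbsp;(β&nbsp;+&nbsp;''n'')/(β&nbsp;+&nbsp;''n''&nbsp;+&nbsp;1):

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(1)
k = rng.poisson(4.2, size=50)        # observed counts (illustrative data)

alpha0, beta0 = 2.0, 1.0             # Gamma(shape, rate) prior, arbitrary choice
alpha_post = alpha0 + k.sum()
beta_post = beta0 + len(k)
print("posterior mean:", alpha_post / beta_post)   # tends to the MLE k.mean() as alpha0, beta0 -> 0

# Posterior predictive for one additional observation (gamma-Poisson / negative binomial).
predictive = nbinom(alpha_post, beta_post / (beta_post + 1.0))
print("P(next count = 3):", predictive.pmf(3))
</syntaxhighlight>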

=== Simultaneous estimation of multiple Poisson means ===

Suppose <math>X_1, X_2, \dots, X_p</math> is a set of independent random variables from a set of <math>p</math> Poisson distributions, each with a parameter <math>\lambda_i</math>, <math>i=1,\dots,p</math>, and we would like to estimate these parameters. Then, Clevenson and Zidek show that under the normalized squared error loss <math>L(\lambda,{\hat \lambda})=\sum_{i=1}^p \lambda_i^{-1} ({\hat \lambda}_i-\lambda_i)^2</math>, when <math>p>1</math>, then, similar as in [[Stein's example]] for the Normal means, the MLE estimator <math>{\hat \lambda}_i = X_i</math> is [[Admissible decision rule|inadmissible]]. {{r|Clevenson1975}}




In this case, a family of [[minimax estimator]]s is given for any <math>0 < c \leq 2(p-1)</math> and <math>b \geq (p-2+p^{-1})</math> as{{r|Berger1985}}


:<math>{\hat \lambda}_i = \left(1 - \frac{c}{b + \sum_{i=1}^p X_i}\right) X_i, \qquad i=1,\dots,p.</math>



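As an illustration only (assuming NumPy; the helper name and the particular constants ''c'' and ''b'' below are just one admissible choice within the stated ranges), the shrinkage estimator can be applied to a vector of counts as follows:

<syntaxhighlight lang="python">
import numpy as np

def clevenson_zidek(x, c=None, b=None):
    """Shrink a vector of independent Poisson counts (one count per unknown mean)."""
    x = np.asarray(x, dtype=float)
    p = x.size
    if c is None:
        c = p - 1.0                  # any 0 < c <= 2(p - 1) is admissible here
    if b is None:
        b = p - 2.0 + 1.0 / p        # any b >= p - 2 + 1/p is admissible here
    return (1.0 - c / (b + x.sum())) * x

rng = np.random.default_rng(7)
lam = rng.uniform(0.5, 5.0, size=10) # unknown means (illustrative)
x = rng.poisson(lam)
print("MLE:     ", x)
print("shrunken:", clevenson_zidek(x))
</syntaxhighlight>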

== Occurrence and applications ==

{{More citations needed|date=December 2019}}



Applications of the Poisson distribution can be found in many fields including:{{r|Rasch1963}}


* [[Telecommunication]] example: telephone calls arriving in a system.

* [[Astronomy]] example: photons arriving at a telescope.

* [[Chemistry]] example: the [[molar mass distribution]] of a [[living polymerization]].{{r|Flory1940}}

* [[Biology]] example: the number of mutations on a strand of [[DNA]] per unit length.

* [[Management]] example: customers arriving at a counter or call centre.

* [[Finance and insurance]] example: number of losses or claims occurring in a given period of time.

* [[Earthquake seismology]] example: an asymptotic Poisson model of seismic risk for large earthquakes.{{r|Lomnitz1994|p=}}

* [[Radioactivity]] example: number of decays in a given time interval in a radioactive sample.

* [[Optics]] example: the number of photons emitted in a single laser pulse. This is a major vulnerability to most [[Quantum key distribution]] protocols known as Photon Number Splitting (PNS).



The Poisson distribution arises in connection with Poisson processes. It applies to various phenomena of discrete properties (that is, those that may happen 0, 1, 2, 3, ... times during a given period of time or in a given area) whenever the probability of the phenomenon happening is constant in time or [[space]]. Examples of events that may be modelled as a Poisson distribution include:


<!--
Surely there are enough examples in this list now! However, not enough with citations
-->

* The number of soldiers killed by horse-kicks each year in each corps in the [[Prussia]]n cavalry. This example was used in a book by [[Ladislaus Bortkiewicz]] (1868–1931).{{r|vonBortkiewitsch1898|p=23-25}}

* The number of yeast cells used when brewing [[Guinness]] beer. This example was used by [[William Sealy Gosset]] (1876–1937).{{r|Student1907}}{{r|Boland1984}}

* The number of phone calls arriving at a [[call centre]] within a minute. This example was described by [[Agner Krarup Erlang|A.K. Erlang]] (1878–1929).{{r|Erlang1909}}

* Internet traffic.

* The number of goals in sports involving two competing teams.{{r|Hornby2014}}

* The number of deaths per year in a given age group.

* The number of jumps in a stock price in a given time interval.

* Under an assumption of [[Poisson process#Homogeneous|homogeneity]], the number of times a [[web server]] is accessed per minute.

* The number of [[mutation]]s in a given stretch of [[DNA]] after a certain amount of radiation.

* The proportion of [[cells (biology)|cells]] that will be infected at a given [[multiplicity of infection]].

* The number of bacteria in a certain amount of liquid.{{r|Koyama2016}}

* The arrival of [[photons]] on a pixel circuit at a given illumination and over a given time period.

* The targeting of [[V-1 flying bomb]]s on London during World War II investigated by R. D. Clarke in 1946.{{r|Clarke1946}}

[[Patrick X. Gallagher|Gallagher]] showed in 1976 that the counts of [[prime number]]s in short intervals obey a Poisson distribution{{r|Gallagher1976}} provided a certain version of the unproved [[Second Hardy–Littlewood conjecture|prime r-tuple conjecture of Hardy-Littlewood]]{{r|Hardy1923}} is true.




{{Anchor|law of rare events}}



=== Law of rare events ===

{{main|Poisson limit theorem}}

[[File:Binomial versus poisson.svg|right|upright=1.5|thumb |Comparison of the Poisson distribution (black lines) and the [[binomial distribution]] with ''n''&nbsp;=&nbsp;10 (red circles), ''n''&nbsp;=&nbsp;20 (blue circles), ''n''&nbsp;=&nbsp;1000 (green circles). All distributions have a mean of&nbsp;5. The horizontal axis shows the number of events&nbsp;''k''. As ''n'' gets larger, the Poisson distribution becomes an increasingly better approximation for the binomial distribution with the same mean.]]




The rate of an event is related to the probability of an event occurring in some small subinterval (of time, space or otherwise). In the case of the Poisson distribution, one assumes that there exists a small enough subinterval for which the probability of an event occurring twice is "negligible". With this assumption one can derive the Poisson distribution from the Binomial one, given only the information of expected number of total events in the whole interval. Let this total number be <math>\lambda</math>. Divide the whole interval into <math>n</math> subintervals <math>I_1,\dots,I_n</math> of equal size, such that <math>n</math> > <math>\lambda</math> (since we are interested in only very small portions of the interval this assumption is meaningful). This means that the expected number of events in an interval <math>I_i</math> for each <math>i</math> is equal to <math>\lambda/n</math>. Now we assume that the occurrence of an event in the whole interval can be seen as a [[Bernoulli trial]], where the <math>i^{th}</math> trial corresponds to looking whether an event happens at the subinterval <math>I_i</math> with probability <math>\lambda/n</math>. The expected number of total events in <math>n</math> such trials would be <math>\lambda</math>, the expected number of total events in the whole interval. Hence for each subdivision of the interval we have approximated the occurrence of the event as a Bernoulli process of the form <math>\textrm{B}(n,\lambda/n)</math>. As we have noted before we want to consider only very small subintervals. Therefore, we take the limit as <math>n</math> goes to infinity.


In this case the binomial distribution converges to what is known as the Poisson distribution by the [[Poisson limit theorem]].



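This convergence is easy to see numerically. The following sketch (illustrative only; it assumes SciPy and reuses the values from the figure above, mean 5 and ''n''&nbsp;=&nbsp;10, 20, 1000) compares the Binomial(''n'',&nbsp;λ/''n'') and Poisson(λ) probability mass functions:

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import binom, poisson

lam = 5.0
ks = np.arange(0, 16)
for n in (10, 20, 1000):
    max_abs_diff = np.max(np.abs(binom.pmf(ks, n, lam / n) - poisson.pmf(ks, lam)))
    print(f"n = {n:5d}: max |Binomial - Poisson| over k = 0..15 is {max_abs_diff:.5f}")
</syntaxhighlight>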

In several of the above examples—such as, the number of mutations in a given sequence of DNA—the events being counted are actually the outcomes of discrete trials, and would more precisely be modelled using the [[binomial distribution]], that is




:<math>X \sim \textrm{B}(n,p). \,</math>




In such cases ''n'' is very large and ''p'' is very small (and so the expectation ''np'' is of intermediate magnitude). Then the distribution may be approximated by the less cumbersome Poisson distribution{{Citation needed|date=April 2012}}




:<math>X \sim \textrm{Pois}(np). \,</math>




This approximation is sometimes known as the ''law of rare events'',{{r|Cameron1998|p=5}} since each of the ''n'' individual [[Bernoulli distribution|Bernoulli events]] rarely occurs. The name may be misleading because the total count of success events in a Poisson process need not be rare if the parameter ''np'' is not small. For example, the number of telephone calls to a busy switchboard in one hour follows a Poisson distribution with the events appearing frequent to the operator, but they are rare from the point of view of the average member of the population, who is very unlikely to make a call to that switchboard in that hour.




The word ''law'' is sometimes used as a synonym of [[probability distribution]], and ''convergence in law'' means ''convergence in distribution''. Accordingly, the Poisson distribution is sometimes called the "law of small numbers" because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. ''The Law of Small Numbers'' is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898.{{r|vonBortkiewitsch1898}}{{r|Edgeworth1913}}




=== Poisson point process ===



{{Main|Poisson point process}}



The Poisson distribution arises as the number of points of a [[Poisson point process]] located in some finite region. More specifically, if ''D'' is some region space, for example Euclidean space '''R'''<sup>''d''</sup>, for which |''D''|, the area, volume or, more generally, the Lebesgue measure of the region is finite, and if {{nowrap|''N''(''D'')}} denotes the number of points in ''D'', then




: <math> P(N(D)=k)=\frac{(\lambda|D|)^k e^{-\lambda|D|}}{k!} .</math>



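This property also gives a direct way to simulate a homogeneous Poisson point process on a finite region: draw the number of points from a Poisson distribution with mean λ|''D''| and place them uniformly in ''D''. A minimal sketch (illustrative only; it assumes NumPy, and the rectangular region and intensity are arbitrary):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)
intensity = 2.5                      # lambda, expected points per unit area (arbitrary)
width, height = 4.0, 3.0             # D is a width x height rectangle, so |D| = 12
area = width * height

n_points = rng.poisson(intensity * area)                           # N(D) ~ Pois(lambda |D|)
points = rng.uniform([0, 0], [width, height], size=(n_points, 2))  # uniform locations given N(D)
print(n_points, "points; expected number:", intensity * area)
</syntaxhighlight>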

=== Poisson regression and negative binomial regression ===



[[Poisson regression]] and negative binomial regression are useful for analyses where the dependent (response) variable is the count (0,&nbsp;1,&nbsp;2,&nbsp;...) of the number of events or occurrences in an interval.




=== Other applications in science ===



In a Poisson process, the number of observed occurrences fluctuates about its mean ''λ'' with a [[standard deviation]] <math>\sigma_k =\sqrt{\lambda}</math>. These fluctuations are denoted as ''Poisson noise'' or (particularly in electronics) as ''[[shot noise]]''.




The correlation of the mean and standard deviation in counting independent discrete occurrences is useful scientifically. By monitoring how the fluctuations vary with the mean signal, one can estimate the contribution of a single occurrence, ''even if that contribution is too small to be detected directly''. For example, the charge ''e'' on an electron can be estimated by correlating the magnitude of an [[electric current]] with its [[shot noise]]. If ''N'' electrons pass a point in a given time ''t'' on the average, the [[mean]] [[Electric current|current]] is <math>I=eN/t</math>; since the current fluctuations should be of the order <math>\sigma_I=e\sqrt{N}/t</math> (i.e., the standard deviation of the [[Poisson process]]), the charge <math>e</math> can be estimated from the ratio <math>t\sigma_I^2/I</math>.{{Citation needed|date=April 2012}}



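A toy simulation (illustrative only; it assumes NumPy, and the electron rate, counting time and number of intervals are arbitrary) shows how the charge is recovered from the ratio <math>t\sigma_I^2/I</math>:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(11)
e = 1.602e-19          # electron charge in coulombs
rate = 1e12            # electrons per second reaching the detector (arbitrary)
t = 1e-6               # length of each counting interval, in seconds
n_intervals = 100_000

counts = rng.poisson(rate * t, size=n_intervals)    # electrons counted in each interval
current = e * counts / t                            # current measured in each interval

e_estimate = t * current.var() / current.mean()     # t * sigma_I^2 / I
print(e_estimate)                                   # close to 1.602e-19
</syntaxhighlight>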

An everyday example is the graininess that appears as photographs are enlarged; the graininess is due to Poisson fluctuations in the number of reduced [[silver]] grains, not to the individual grains themselves. By [[Correlation|correlating]] the graininess with the degree of enlargement, one can estimate the contribution of an individual grain (which is otherwise too small to be seen unaided).{{Citation needed|date=April 2012}} Many other molecular applications of Poisson noise have been developed, e.g., estimating the number density of [[receptor (biochemistry)|receptor]] molecules in a [[cell membrane]].


: <math>
\Pr(N_t=k) = f(k;\lambda t) = \frac{(\lambda t)^k e^{-\lambda t}}{k!}.</math>



In [[Causal Set]] theory the discrete elements of spacetime follow a Poisson distribution in the volume.




==Computational methods==



The Poisson distribution poses two different tasks for dedicated software libraries: ''Evaluating'' the distribution <math>P(k;\lambda)</math>, and ''drawing random numbers'' according to that distribution.




=== Evaluating the Poisson distribution ===

Computing <math>P(k;\lambda)</math> for given <math>k</math> and <math>\lambda</math> is a trivial task that can be accomplished by using the standard definition of <math>P(k;\lambda)</math> in terms of exponential, power, and factorial functions. However, the conventional definition of the Poisson distribution contains two terms that can easily overflow on computers: λ<sup>''k''</sup> and ''k''!. The fraction of λ<sup>''k''</sup> to ''k''! can also produce a rounding error that is very large compared to ''e''<sup>−λ</sup>, and therefore give an erroneous result. For numerical stability the Poisson probability mass function should therefore be evaluated as


:<math>\!f(k; \lambda)= \exp \left[ k\ln \lambda - \lambda - \ln \Gamma (k+1) \right],</math>


which is mathematically equivalent but numerically stable. The natural logarithm of the [[Gamma function]] can be obtained using the <code>lgamma</code> function in the [[C (programming language)|C]] standard library (C99 version) or [[R (programming language)|R]], the <code>gammaln</code> function in [[MATLAB]] or [[SciPy]], or the <code>log_gamma</code> function in [[Fortran]] 2008 and later.



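A small sketch of this log-space evaluation (illustrative only; it assumes the Python standard library and, for the comparison line, SciPy) agrees with a library implementation even where λ<sup>''k''</sup> on its own would overflow a double:

<syntaxhighlight lang="python">
import math
from scipy.stats import poisson

def poisson_pmf(k, lam):
    # exp(k*ln(lambda) - lambda - ln Gamma(k+1)): stable even when lambda**k or k! overflows
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

k, lam = 900, 1000.0
print(poisson_pmf(k, lam))           # the naive lam**k / (e**lam * k!) route overflows here
print(poisson.pmf(k, lam))           # SciPy reference value, for comparison
</syntaxhighlight>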

Some computing languages provide built-in functions to evaluate the Poisson distribution, namely


* [[R (programming language)|R]]: function <code>dpois(x, lambda)</code>;

* [[Microsoft Excel|Excel]]: function <code>POISSON( x, mean, cumulative)</code>, with a flag to specify the cumulative distribution;

* [[Mathematica]]: univariate Poisson distribution as <code>PoissonDistribution[<math>\lambda</math>]</code>,<ref name="WLPoissonRefPage">{{cite web |url = http://reference.wolfram.com/language/ref/PoissonDistribution.html |title = Wolfram Language: PoissonDistribution reference page |website = wolfram.com |access-date = 2016-04-08 }}</ref> bivariate Poisson distribution as <code>MultivariatePoissonDistribution[<math>\theta_{12}</math>,{ <math>\theta_1 - \theta_{12}</math>, <math>\theta_2 - \theta_{12}</math>}]</code>,.<ref name="WLMvPoissonRefPage">{{cite web |url = http://reference.wolfram.com/language/ref/MultivariatePoissonDistribution.html |title = Wolfram Language: MultivariatePoissonDistribution reference page |website = wolfram.com |access-date = 2016-04-08 }}</ref>



=== Random drawing from the Poisson distribution ===

The less trivial task is to draw random integers from the Poisson distribution with given <math>\lambda</math>.




Solutions are provided by:


* [[R (programming language)|R]]: function <code>rpois(n, lambda)</code>;

* [[GNU Scientific Library]] (GSL): function [https://www.gnu.org/software/gsl/doc/html/randist.html#the-poisson-distribution gsl_ran_poisson]



=== Generating Poisson-distributed random variables ===



A simple algorithm to generate random Poisson-distributed numbers ([[pseudo-random number sampling]]) has been given by [[Donald Knuth|Knuth]]:{{r|Knuth1997|p=137-138}}




 '''algorithm''' ''poisson random number (Knuth)'':
     '''init''':
         '''Let''' L ← ''e''<sup>−λ</sup>, k ← 0 and p ← 1.
     '''do''':
         k ← k + 1.
         Generate uniform random number u in [0,1] and '''let''' p ← p × u.
     '''while''' p > L.
     '''return''' k − 1.



The complexity is linear in the returned value ''k'', which is λ on average. There are many other algorithms to improve this. Some are given in Ahrens & Dieter, see {{slink||References}} below.



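A direct transcription of this procedure (a sketch for illustration; it assumes only the Python standard library, and the helper name, rate and sample size are arbitrary) shows the method in practice; as noted above, its cost grows linearly with λ, so it is only suited to moderate rates:

<syntaxhighlight lang="python">
import math
import random

def knuth_poisson(lam, rng=random):
    """Poisson-distributed integer via Knuth's product-of-uniforms method; O(lam) per draw."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()            # uniform u in [0, 1)
        if p <= L:                   # i.e. the pseudocode's "while p > L" loop has ended
            return k - 1

samples = [knuth_poisson(4.0) for _ in range(100_000)]
print(sum(samples) / len(samples))   # should be close to 4.0
</syntaxhighlight>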

For large values of λ, the value of L = ''e''<sup>−λ</sup> may be so small that it is hard to represent. This can be solved by a change to the algorithm which uses an additional parameter STEP such that ''e''<sup>−STEP</sup> does not underflow: {{Citation needed|reason=Original source is missing|date=March 2019}}




 '''algorithm''' ''poisson random number (Junhao, based on Knuth)'':
     '''init''':
         '''Let''' λLeft ← λ, k ← 0 and p ← 1.
     '''do''':
         k ← k + 1.
         Generate uniform random number u in (0,1) and '''let''' p ← p × u.
         '''while''' p < 1 and λLeft > 0:
             '''if''' λLeft > STEP:
                 p ← p × ''e''<sup>STEP</sup>
                 λLeft ← λLeft − STEP
             '''else''':
                 p ← p × ''e''<sup>λLeft</sup>
                 λLeft ← 0
     '''while''' p > 1.
     '''return''' k − 1.



The choice of STEP depends on the threshold of overflow. For double precision floating point format, the threshold is near ''e''<sup>700</sup>, so 500 is a safe value for ''STEP''.




Other solutions for large values of λ include [[rejection sampling]] and using Gaussian approximation.




[[Inverse transform sampling]] is simple and efficient for small values of λ, and requires only one uniform random number ''u'' per sample. Cumulative probabilities are examined in turn until one exceeds ''u''.




 '''algorithm''' ''Poisson generator based upon the inversion by sequential search'':{{r|Devroye1986|p=505}}
     '''init''':
         '''Let''' x ← 0, p ← ''e''<sup>−λ</sup>, s ← p.
         Generate uniform random number u in [0,1].
     '''while''' u > s '''do''':
         x ← x + 1.
         p ← p × λ / x.
         s ← s + p.
     '''return''' x.


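The sequential-search inversion is equally short in code (a sketch, assuming only the Python standard library; the helper name and the rate used are arbitrary). Like the method above, it is intended for small λ, since the expected number of loop iterations is λ:

<syntaxhighlight lang="python">
import math
import random

def poisson_by_inversion(lam, rng=random):
    """Sequential search: accumulate the pmf until the running CDF exceeds one uniform draw."""
    x, p = 0, math.exp(-lam)
    s = p
    u = rng.random()
    while u > s:
        x += 1
        p *= lam / x                 # recurrence P(X = x) = P(X = x - 1) * lam / x
        s += p
    return x

samples = [poisson_by_inversion(2.5) for _ in range(100_000)]
print(sum(samples) / len(samples))   # should be close to 2.5
</syntaxhighlight>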

== History ==

The distribution was first introduced by [[Siméon Denis Poisson]] (1781–1840) and published together with his probability theory in his work ''Recherches sur la probabilité des jugements en matière criminelle et en matière civile''(1837).{{r|Poisson1837|p=205-207}} The work theorized about the number of wrongful convictions in a given country by focusing on certain [[random variable]]s ''N'' that count, among other things, the number of discrete occurrences (sometimes called "events" or "arrivals") that take place during a [[time]]-interval of given length. The result had already been given in 1711 by [[Abraham de Moivre]] in ''De Mensura Sortis seu; de Probabilitate Eventuum in Ludis a Casu Fortuito Pendentibus'' .{{r|deMoivre1711|p=219}}{{r|deMoivre1718|p=14-15}}{{r|deMoivre1721|p=193}}{{r|Johnson2005|p=157}} This makes it an example of [[Stigler's law]] and it has prompted some authors to argue that the Poisson distribution should bear the name of de Moivre.{{r|Stigler1982|Hald1984}}




In 1860, [[Simon Newcomb]] fitted the Poisson distribution to the number of stars found in a unit of space.{{r|Newcomb1860}}


A further practical application of this distribution was made by [[Ladislaus Bortkiewicz]] in 1898 when he was given the task of investigating the number of soldiers in the Prussian army killed accidentally by horse kicks;{{r|vonBortkiewitsch1898|p=23-25}} this experiment introduced the Poisson distribution to the field of [[reliability engineering]].




== See also ==

{{div col |colwidth = 18em }}

* [[Compound Poisson distribution]]

* [[Conway–Maxwell–Poisson distribution]]

* [[Erlang distribution]]

* [[Hermite distribution]]

* [[Index of dispersion]]

* [[Negative binomial distribution]]

* [[Poisson clumping]]

* [[Poisson point process]]

* [[Poisson regression]]

* [[Poisson sampling]]

* [[Poisson wavelet]]

* [[Queueing theory]]

* [[Renewal theory]]

* [[Robbins lemma]]

* [[Skellam distribution]]

* [[Tweedie distribution]]

* [[Zero-inflated model]]

* [[Zero-truncated Poisson distribution]]

{{div col end}}



== References ==

=== Citations ===

{{Reflist

{{Reflist

{通货再膨胀

|refs =

|refs =

2012年10月15日

<ref name=Haight1967>{{citation |last1=Haight |first1=Frank A. |title=Handbook of the Poisson Distribution |publisher=John Wiley & Sons |location=New York, NY, USA |year=1967 |isbn=978-0-471-33932-8}}</ref>

<ref name=Brooks2007>{{citation |last=Brooks |first=E. Bruce |title=Statistics &#124; The Poisson Distribution |series=Warring States Project |publisher=Umass.edu |date=2007-08-24 |accessdate=2014-04-18 |url=https://www.umass.edu/wsp/resources/reference/poisson/}}</ref>

<ref name=Poisson1837>{{citation |last1=Poisson |first1=Siméon D. |title=Probabilité des jugements en matière criminelle et en matière civile, précédées des règles générales du calcul des probabilitiés |trans-title=Research on the Probability of Judgments in Criminal and Civil Matters |language=French |publisher=Bachelier |location=Paris, France |year=1837 |url=https://gallica.bnf.fr/ark:/12148/bpt6k110193z/f218.image}}</ref>

<ref name=Koehrsen2019>{{citation |last1=Koehrsen |first1=William |title=The Poisson Distribution and Poisson Process Explained |publisher=Towards Data Science |date=2019-01-20 |url=https://towardsdatascience.com/the-poisson-distribution-and-poisson-process-explained-4e2cb17d459 |accessdate=2019-09-19}}</ref>

<ref name="Ugarte2016">{{Citation |last1=Ugarte |first1=Maria Dolores |last2=Militino |first2=Ana F. |last3=Arnholt |first3=Alan T. |title=Probability and Statistics with R |edition=Second |year=2016 |publisher=CRC Press |location=Boca Raton, FL, USA |isbn=978-1-4665-0439-4}}</ref>

<ref name=deMoivre1711>{{citation |last1=de Moivre |first1=Abraham |title=De mensura sortis, seu, de probabilitate eventuum in ludis a casu fortuito pendentibus |trans-title=On the Measurement of Chance, or, on the Probability of Events in Games Depending Upon Fortuitous Chance |language=Latin |journal=[[Philosophical Transactions of the Royal Society]] |year=1711 |volume=27 |issue=329 |pages=213–264 |doi=10.1098/rstl.1710.0018 |doi-access=free}}</ref>

<ref name=deMoivre1718>{{citation |last1=de Moivre |first1=Abraham |year=1718 |title=The Doctrine of Chances: Or, A Method of Calculating the Probability of Events in Play |publisher=W. Pearson |location=London, Great Britain |url=https://books.google.com/books?id=3EPac6QpbuMC&pg=PA14}}</ref>

<ref name=deMoivre1721>{{citation |last1=de Moivre |first1=Abraham |year=1721 |chapter=Of the Laws of Chance |title=The Philosophical Transactions from the Year MDCC (where Mr. Lowthorp Ends) to the Year MDCCXX. Abridg'd, and Dispos'd Under General Heads |volume=Vol. I |language=Latin |editor-last1=Motte |editor-first1=Benjamin |publisher=R. Wilkin, R. Robinson, S. Ballard, W. and J. Innys, and J. Osborn |location=London, Great Britain |pages=190–219}}</ref>

<ref name="Johnson2005">{{citation |last1=Johnson |first1=Norman L. |last2=Kemp |first2=Adrienne W. |last3=Kotz |first3=Samuel |chapter=Poisson Distribution |year=2005 |pages=156–207 |title=Univariate Discrete Distributions |edition=3rd |publisher=John Wiley & Sons, Inc. |location=New York, NY, USA |isbn=978-0-471-27246-5 |doi=10.1002/0471715816}}</ref>

<ref name=Helske2017>{{cite arxiv |last=Helske |first=Jouni |date=2017 |title=KFAS: Exponential family state space models in R |eprint=1612.01907 |class=stat.CO}}</ref>

<ref name=Stigler1982>{{citation |last1=Stigler |first1=Stephen M. |title=Poisson on the Poisson Distribution |journal=Statistics & Probability Letters |year=1982 |volume=1 |issue=1 |pages=33–35 |doi=10.1016/0167-7152(82)90010-4}}</ref>

<ref name=Hald1984>{{Citation |last1=Hald |first1=Anders |last2=de Moivre |first2=Abraham |last3=McClintock |first3=Bruce |title=A. de Moivre: 'De Mensura Sortis' or 'On the Measurement of Chance' |journal=International Statistical Review / Revue Internationale de Statistique |year=1984 |volume=52 |issue=3 |pages=229–262 |doi=10.2307/1403045 |jstor=1403045}}</ref>

<ref name=Newcomb1860>{{citation |last1=Newcomb |first1=Simon |year=1860 |title=Notes on the theory of probabilities |journal=The Mathematical Monthly |volume=2 |issue=4 |pages=134–140 |url=https://babel.hathitrust.org/cgi/pt?id=nyp.33433069075590&seq=150}}</ref>

<ref name=vonBortkiewitsch1898>{{citation |last1=von Bortkiewitsch |first1=Ladislaus <!-- This is the spelling of the name as it appears in the book, the Polish version would be Vladislav Bortkevič--> |title=Das Gesetz der kleinen Zahlen |trans-title=The law of small numbers |language=German |publisher=B.&nbsp;G.&nbsp;Teubner |location=Leipzig, Germany |year=1898 |page=On [https://digibus.ub.uni-stuttgart.de/viewer/object/1543508614348/13 page 1], Bortkiewicz presents the Poisson distribution. On [https://digibus.ub.uni-stuttgart.de/viewer/object/1543508614348/35 pages 23–25], Bortkiewitsch presents his analysis of "4. Beispiel: Die durch Schlag eines Pferdes im preußischen Heere Getöteten." (4. Example: Those killed in the Prussian army by a horse's kick.)}}</ref>

<ref name=Lehmann1986>{{citation |last1=Lehmann |first1=Erich Leo |title=Testing Statistical Hypotheses |publisher=Springer Verlag |location=New York, NY, USA |edition=second |year=1986 |isbn=978-0-387-94919-2}}</ref>

<ref name=Yates2014>{{citation |last1=Yates |first1=Roy D. |last2=Goodman |first2=David J. |title=Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers |year=2014 |publisher=Wiley |location=Hoboken, USA |edition=2nd |isbn=978-0-471-45259-1}}</ref>

<ref name=Choi1994>{{citation |last1=Choi |first1=Kwok P. |title=On the medians of gamma distributions and an equation of Ramanujan |journal=Proceedings of the American Mathematical Society |year=1994 |volume=121 |issue=1 |pages=245–251 |doi=10.2307/2160389 |jstor=2160389 |doi-access=free}}</ref>

<ref name=Riordan1937>{{citation |last=Riordan |first=John |journal=Annals of Mathematical Statistics |title=Moment Recurrence Relations for Binomial, Poisson and Hypergeometric Frequency Distributions |year=1937 |volume=8 |issue=2 |pages=103–111 |jstor=2957598 |doi=10.1214/aoms/1177732430 |url=https://projecteuclid.org/download/pdf_1/euclid.aoms/1177732430 |format=PDF |doi-access=free}}</ref>

<ref name=Jagadeesan2017>{{cite arxiv |last1=Jagadeesan |first1=Meena |title=Simple analysis of sparse, sign-consistent JL |year=2017 |eprint=1708.02966 |class=cs.DS}}</ref>

<ref name=Raikov1937>{{citation |last=Raikov |first=Dmitry |title=On the decomposition of Poisson laws |journal=Comptes Rendus de l'Académie des Sciences de l'URSS |year=1937 |volume=14 |pages=9–11}}</ref>

<ref name=vonMises1964>{{citation |last=von Mises |first=Richard |title=Mathematical Theory of Probability and Statistics |year=1964 |publisher=Academic Press |location=New York, NY, USA |isbn=978-1-4832-3213-3 |doi=10.1016/C2013-0-12460-9}}</ref>

<ref name=Kamath2015>{{citation |last1=Kamath |first1=Govinda M. |last2=Şaşoğlu |first2=Eren |last3=Tse |first3=David |contribution=Optimal Haplotype Assembly from High-Throughput Mate-Pair Reads |title=2015 IEEE International Symposium on Information Theory (ISIT), 14–19 June |year=2015 |pages=914–918 |doi=10.1109/ISIT.2015.7282588 |location=Hong Kong, China |arxiv=1502.01975}}</ref>

<ref name=Laha1979>{{citation |last1=Laha |first1=Radha G. |last2=Rohatgi |first2=Vijay K. |title=Probability Theory |publisher=John Wiley & Sons |location=New York, NY, USA |isbn=978-0-471-03262-5 |year=1979}}</ref>

<ref name=Mitzenmacher2005>{{citation |last1=Mitzenmacher |first1=Michael |author-link1=Michael Mitzenmacher |last2=Upfal |first2=Eli |author-link2=Eli Upfal |title=Probability and Computing: Randomized Algorithms and Probabilistic Analysis |publisher=Cambridge University Press |location=Cambridge, UK |isbn=978-0-521-83540-4 |year=2005}}</ref>

<ref name=Short2013>{{citation |last1=Short |first1=Michael |title=Improved Inequalities for the Poisson and Binomial Distribution and Upper Tail Quantile Functions |journal=ISRN Probability and Statistics |year=2013 |volume=2013 |page=412958 |doi=10.1155/2013/412958 |doi-access=free}}</ref>

<ref name=NIST2006>{{citation |last1=Prins |first1=Jack |chapter=6.3.3.1. Counts Control Charts |title=e-Handbook of Statistical Methods |chapter-url=http://www.itl.nist.gov/div898/handbook/pmc/section3/pmc331.htm |date=2012 |publisher=NIST/SEMATECH |accessdate=2019-09-20}}</ref>

<ref name=Zhang2013>{{citation |last1=Zhang |first1=Huiming |last2=Liu |first2=Yunxiao |last3=Li |first3=Bo |title=Notes on discrete compound Poisson model with applications to risk theory |journal=Insurance: Mathematics and Economics |year=2014 |volume=59 |pages=325–336 |doi=10.1016/j.insmatheco.2014.09.012}}</ref>

<ref name=Zhang2016>{{citation |last1=Zhang |first1=Huiming |last2=Li |first2=Bo |title=Characterizations of discrete compound Poisson distributions |journal=Communications in Statistics - Theory and Methods |year=2016 |volume=45 |issue=22 |pages=6789–6802 |doi=10.1080/03610926.2014.901375}}</ref>

<ref name=McCullagh1989>{{citation |last1=McCullagh |first1=Peter |authorlink1=Peter McCullagh |last2=Nelder |first2=John |authorlink2=John Nelder |title=Generalized Linear Models |publisher=Chapman and Hall |location=London, UK |series=Monographs on Statistics and Applied Probability |year=1989 |volume=37 |isbn=978-0-412-31760-6}}</ref>

<ref name=Anscombe1948>{{citation |last=Anscombe |first=Francis J. |author-link=Frank Anscombe |title=The transformation of Poisson, binomial and negative binomial data |journal=Biometrika |year=1948 |volume=35 |issue=3–4 |pages=246–254 |jstor=2332343 |doi=10.1093/biomet/35.3-4.246}}</ref>

<ref name=Ross2010>{{citation |last1=Ross |first1=Sheldon M. |title=Introduction to Probability Models |edition=tenth |publisher=Academic Press |location=Boston, MA, USA |year=2010 |isbn=978-0-12-375686-2}}</ref>

<ref name=Rasch1963>{{Citation |last1=Rasch |first1=Georg |contribution=The Poisson Process as a Model for a Diversity of Behavioural Phenomena |title=17th International Congress of Psychology |publisher=American Psychological Association |place=Washington, DC, USA, August 20th – 26th, 1963 |year=1963 |volume=2 |doi=10.1037/e685262012-108 <!-- doi leads to a paywall --> |chapter-url=http://www.rasch.org/memo1963.pdf}}</ref>

<ref name=Flory1940>{{citation |last1=Flory |first1=Paul J. |title=Molecular Size Distribution in Ethylene Oxide Polymers |journal=Journal of the American Chemical Society |year=1940 |volume=62 |issue=6 |pages=1561–1565 |doi=10.1021/ja01863a066}}</ref>

<ref name=Lomnitz1994>{{citation |last1=Lomnitz |first1=Cinna |title=Fundamentals of Earthquake Prediction |year=1994 |publisher=John Wiley & Sons |location=New York |oclc=647404423 |isbn=0-471-57419-8}}</ref>

<ref name=Student1907>{{citation |last=Student |title=On the Error of Counting with a Haemacytometer |journal=Biometrika |year=1907 |volume=5 |issue=3 |pages=351–360 |jstor=2331633 |doi=10.2307/2331633 |url=https://zenodo.org/record/1620891}}</ref>

<ref name=Boland1984>{{citation |last=Boland |first=Philip J. |title=A Biographical Glimpse of William Sealy Gosset |journal=The American Statistician |year=1984 |volume=38 |issue=3 |pages=179–183 |jstor=2683648 |doi=10.1080/00031305.1984.10483195}}</ref>

<ref name=Erlang1909>{{citation |last1=Erlang |first1=Agner K. |title=Sandsynlighedsregning og Telefonsamtaler |trans-title=Probability Calculation and Telephone Conversations |language=Danish |journal=Nyt Tidsskrift for Matematik |year=1909 |volume=20 |issue=B |pages=33–39 |jstor=24528622}}</ref>

<ref name=Hornby2014>{{citation |last=Hornby |first=Dave |title=Football Prediction Model: Poisson Distribution |publisher=Sports Betting Online |url=http://www.sportsbettingonline.net/strategy/football-prediction-model-poisson-distribution |year=2014 |accessdate=2014-09-19}}</ref>

<ref name=Koyama2016>{{Citation |last1=Koyama |first1=Kento |last2=Hokunan |first2=Hidekazu |last3=Hasegawa |first3=Mayumi |last4=Kawamura |first4=Shuso |last5=Koseki |first5=Shigenobu |title=Do bacterial cell numbers follow a theoretical Poisson distribution? Comparison of experimentally obtained numbers of single cells with random number generation via computer simulation |journal=Food Microbiology |year=2016 |volume=60 |pages=49–53 |doi=10.1016/j.fm.2016.05.019 |pmid=27554145}}</ref>

<ref name=Clarke1946>{{citation |last1=Clarke |first1=R. D. |title=An application of the Poisson distribution |journal=Journal of the Institute of Actuaries |year=1946 |volume=72 |issue=3 |page=481 |doi=10.1017/S0020268100035435 |url=https://www.actuaries.org.uk/system/files/documents/pdf/0481.pdf |doi-access=free}}</ref>

<ref name=Gallagher1976>{{Citation |last=Gallagher |first=Patrick X. |title=On the distribution of primes in short intervals |journal=Mathematika |year=1976 |volume=23 |issue=1 |pages=4–9 |doi=10.1112/s0025579300016442 |doi-access=free}}</ref>

<ref name=Hardy1923>{{Citation |last1=Hardy |first1=Godfrey H. |author-link1=G. H. Hardy |last2=Littlewood |first2=John E. |author-link2=John Edensor Littlewood |title=On some problems of "partitio numerorum" III: On the expression of a number as a sum of primes |journal=Acta Mathematica |year=1923 |volume=44 |pages=1–70 |doi=10.1007/BF02403921 |doi-access=free}}</ref>

<ref name=Cameron1998>{{citation |last1=Cameron |first1=A. Colin |last2=Trivedi |first2=Pravin K. |title=Regression Analysis of Count Data |year=1998 |publisher=Cambridge University Press |location=Cambridge, UK |isbn=978-0-521-63567-7 |url=https://books.google.com/books?id=SKUXe_PjtRMC&pg=PA5}}</ref>

<ref name=Edgeworth1913>{{Citation |last=Edgeworth |first=Francis Y. |authorlink=Francis Ysidro Edgeworth |title=On the use of the theory of probabilities in statistics relating to society |journal=[[Journal of the Royal Statistical Society]] |year=1913 |volume=76 |issue=2 |pages=165–193 |jstor=2340091 |doi=10.2307/2340091}}</ref>

<ref name=Knuth1997>{{citation |last=Knuth |first=Donald Ervin |title=Seminumerical Algorithms |publisher=[[Addison Wesley]] |series=[[The Art of Computer Programming]] |volume=2 |edition=3rd |year=1997 |isbn=978-0-201-89684-8}}</ref>

<ref name="Devroye1986">

<ref name="Devroye1986">

< ref name ="devroye1986">

{{citation

{{citation

{ citation

|last=Devroye |first=Luc |author-link=Luc Devroye

|last=Devroye |first=Luc |author-link=Luc Devroye

最后 = Devroye | first = Luc | author-link = Luc Devroye

|chapter=Discrete Univariate Distributions

|chapter=Discrete Univariate Distributions

| chapter = 离散单变量分布

|chapterurl=http://luc.devroye.org/chapter_ten.pdf

|chapterurl=http://luc.devroye.org/chapter_ten.pdf

Http://luc.devroye.org/chapter_ten.pdf

|title=Non-Uniform Random Variate Generation

|title=Non-Uniform Random Variate Generation

| title = 非均匀随机变量生成

|year=1986

|year=1986

1986年

|publisher=Springer-Verlag

|publisher=Springer-Verlag

| publisher = Springer-Verlag

|place=New York, NJ, USA

|place=New York, NJ, USA

地点: 纽约,新泽西,美国

|pages=485–553

|pages=485–553

| 页数 = 485-553

|url=http://luc.devroye.org/rnbookindex.html

|url=http://luc.devroye.org/rnbookindex.html

Http://luc.devroye.org/rnbookindex.html

|doi=10.1007/978-1-4613-8643-8_10

|doi=10.1007/978-1-4613-8643-8_10

| doi = 10.1007/978-1-4613-8643-810

|isbn=978-1-4613-8645-2 }}</ref>

|isbn=978-1-4613-8645-2 }}</ref>

978-1-4613-8645-2}} </ref >

<ref name="Garwood1936">

<ref name="Garwood1936">

< ref name ="garwood1936">

{{citation

{{citation

{ citation

|last=Garwood |first=Frank

|last=Garwood |first=Frank

| last = Garwood | first = Frank

|title=Fiducial Limits for the Poisson Distribution

|title=Fiducial Limits for the Poisson Distribution

泊松分佈的信托限制

|journal=Biometrika

|journal=Biometrika

| journal = Biometrika

|year=1936 |volume=28 |issue=3/4 |pages=437–442

|year=1936 |volume=28 |issue=3/4 |pages=437–442

1936 | volume = 28 | issue = 3/4 | pages = 437-442

|doi=10.1093/biomet/28.3-4.437

|doi=10.1093/biomet/28.3-4.437

| doi = 10.1093/biomet/28.3-4.437

|jstor=2333958

|jstor=2333958

2333958

}}</ref><ref name=Breslow1987>{{citation

}}</ref><ref name=Breslow1987>{{citation

} </ref > < ref name = breslow1987 > { citation

|last1=Breslow

|last1=Breslow

1 = Breslow

|first1=Norman E.

|first1=Norman E.

1 = Norman e.

|authorlink1=Norman Breslow

|authorlink1=Norman Breslow

1 = Norman Breslow

|last2=Day

|last2=Day

2 = Day

|first2=Nick E.

|first2=Nick E.

2 = Nick e.

|authorlink2=Nick Day

|authorlink2=Nick Day

2 = Nick Day

|title=Statistical Methods in Cancer Research: Volume 2—The Design and Analysis of Cohort Studies

|title=Statistical Methods in Cancer Research: Volume 2—The Design and Analysis of Cohort Studies

癌症研究的统计方法: 第2卷ー队列研究的设计与分析

|year=1987

|year=1987

1987年

|publisher=[[International Agency for Research on Cancer]]

|publisher=International Agency for Research on Cancer

国际癌症研究机构

|location=Lyon, France

|location=Lyon, France

| 地点: 法国里昂

|url=http://www.iarc.fr/en/publications/pdfs-online/stat/sp82/index.php

|url=http://www.iarc.fr/en/publications/pdfs-online/stat/sp82/index.php

Http://www.iarc.fr/en/publications/pdfs-online/stat/sp82/index.php

|isbn=978-92-832-0182-3

|isbn=978-92-832-0182-3

| isbn = 978-92-832-0182-3

|access-date=2012-03-11

|access-date=2012-03-11

| access-date = 2012-03-11

|archive-url=https://web.archive.org/web/20180808161401/http://www.iarc.fr/en/publications/pdfs-online/stat/sp82/index.php

|archive-url=https://web.archive.org/web/20180808161401/http://www.iarc.fr/en/publications/pdfs-online/stat/sp82/index.php

| archive-url = https://web.archive.org/web/20180808161401/http://www.iarc.fr/en/publications/pdfs-online/stat/sp82/index.php

|archive-date=2018-08-08

|archive-date=2018-08-08

| archive-date = 2018-08-08

|url-status=dead

|url-status=dead

地位 = 死亡

}}</ref>

}}</ref>

} </ref >

<ref name=Fink1976>{{citation |last=Fink |first=Daniel |title=A Compendium of Conjugate Priors |year=1997}}</ref>

<ref name=Gelman2003>{{citation |last1=Gelman |last2=Carlin |first2=John B. |last3=Stern |first3=Hal S. |last4=Rubin |first4=Donald B. |title=Bayesian Data Analysis |year=2003 |edition=2nd |publisher=Chapman & Hall/CRC |location=Boca Raton, FL, USA |isbn=1-58488-388-X}}</ref>

<ref name=Clevenson1975>{{citation |last1=Clevenson |first1=M. Lawrence |last2=Zidek |first2=James V. |title=Simultaneous Estimation of the Means of Independent Poisson Laws |journal=Journal of the American Statistical Association |year=1975 |volume=70 |issue=351 |pages=698–705 |doi=10.1080/01621459.1975.10482497 |jstor=2285958}}</ref>

<ref name=Berger1985>{{citation |last=Berger |first=James O. |title=Statistical Decision Theory and Bayesian Analysis |series=Springer Series in Statistics |year=1985 |edition=2nd |publisher=Springer-Verlag |location=New York, NY, USA |isbn=978-0-387-96098-2 |doi=10.1007/978-1-4757-4286-2}}</ref>

<ref name=Loukas1986>{{citation |last1=Loukas |first1=Sotirios |last2=Kemp |first2=C. David |title=The Index of Dispersion Test for the Bivariate Poisson Distribution |journal=Biometrics |year=1986 |volume=42 |issue=4 |pages=941–948 |doi=10.2307/2530708 |jstor=2530708}}</ref>

}}



=== Sources ===

{{refbegin}}

* {{citation |last1=Ahrens |first1=Joachim H. |last2=Dieter |first2=Ulrich |title=Computer Methods for Sampling from Gamma, Beta, Poisson and Binomial Distributions |journal=Computing |year=1974 |volume=12 |issue=3 |pages=223–246 |doi=10.1007/BF02293108}}

* {{citation |last1=Ahrens |first1=Joachim H. |last2=Dieter |first2=Ulrich |title=Computer Generation of Poisson Deviates |journal=ACM Transactions on Mathematical Software |year=1982 |volume=8 |issue=2 |pages=163–179 |doi=10.1145/355993.355997}}

* {{citation |last1=Evans |first1=Ronald J. |last2=Boersma |first2=J. |last3=Blachman |first3=N. M. |last4=Jagers |first4=A. A. |title=The Entropy of a Poisson Distribution: Problem 87-6 |journal=SIAM Review |year=1988 |volume=30 |issue=2 |pages=314–317 |doi=10.1137/1030059 |url=https://research.tue.nl/nl/publications/solution-to-problem-876--the-entropy-of-a-poisson-distribution(94cf6dd2-b35e-41c8-9da7-6ec69ca391a0).html}}

{{refend}}



{{-}}

{{ProbDistributions|discrete-infinite}}



{{Authority control}}



[[Category:Poisson distribution| ]]

[[Category:Articles with example pseudocode]]

[[Category:Conjugate prior distributions]]

[[Category:Factorial and binomial topics]]

[[Category:Infinitely divisible probability distributions]]

<noinclude>

<small>This page was moved from [[wikipedia:en:Poisson distribution]]. Its edit history can be viewed at [[泊松分布/edithistory]]</small></noinclude>

[[Category:待整理页面]]