This entry was translated by Jie and reviewed by Lincent.

{{Short description|Non-parametric statistical test between two distributions}}

[[文件:KS_Example.png|缩略图|右|Illustration of the Kolmogorov–Smirnov statistic. The red line is a cumulative distribution function, the blue line is an empirical distribution function, and the black arrow is the K–S statistic.]]
In [[statistics]], the '''Kolmogorov–Smirnov test''' ('''K–S test''' or '''KS test''') is a [[nonparametric statistics|nonparametric test]] of the equality of continuous (or discontinuous, see [[#Discrete and mixed null distribution|Section 2.2]]), one-dimensional [[probability distribution]]s that can be used to compare a [[random sample|sample]] with a reference probability distribution (one-sample K–S test), or to compare two samples (two-sample K–S test). It is named after [[Andrey Kolmogorov]] and [[Nikolai Smirnov (mathematician)|Nikolai Smirnov]].

The Kolmogorov–Smirnov statistic quantifies a [[metric (mathematics)|distance]] between the [[empirical distribution function]] of the sample and the [[cumulative distribution function]] of the reference distribution, or between the empirical distribution functions of two samples. The [[null distribution]] of this statistic is calculated under the [[null hypothesis]] that the sample is drawn from the reference distribution (in the one-sample case) or that the samples are drawn from the same distribution (in the two-sample case). In the one-sample case, the distribution considered under the null hypothesis may be continuous (see [[#Kolmogorov distribution|Section 2]]), purely discrete or mixed (see [[#Discrete and mixed null distribution|Section 2.2]]). In the two-sample case (see [[#Two-sample Kolmogorov–Smirnov test|Section 3]]), the distribution considered under the null hypothesis is a continuous distribution but is otherwise unrestricted.

The two-sample K–S test is one of the most useful and general nonparametric methods for comparing two samples, as it is sensitive to differences in both location and shape of the empirical cumulative distribution functions of the two samples.

The Kolmogorov–Smirnov test can be modified to serve as a [[goodness of fit]] test. In the special case of testing for [[Normal distribution|normality]] of the distribution, samples are standardized and compared with a standard normal distribution. This is equivalent to setting the mean and variance of the reference distribution equal to the sample estimates, and it is known that using these to define the specific reference distribution changes the null distribution of the test statistic (see [[#Test with estimated parameters|Test with estimated parameters]]). Various studies have found that, even in this corrected form, the test is less powerful for testing normality than the [[Shapiro–Wilk test]] or [[Anderson–Darling test]]. However, these other tests have their own disadvantages. For instance, the Shapiro–Wilk test is known not to work well in samples with many identical values.
== Kolmogorov–Smirnov statistic ==

The [[empirical distribution function]] ''F''<sub>''n''</sub> for ''n'' [[Independent and identically distributed random variables|independent and identically distributed]] (i.i.d.) ordered observations ''X<sub>i</sub>'' is defined as

:<math>F_n(x)={1 \over n}\sum_{i=1}^n I_{[-\infty,x]}(X_i)</math>

where <math>I_{[-\infty,x]}(X_i)</math> is the indicator function, equal to 1 if <math>X_i \le x</math> and equal to 0 otherwise.
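For illustration, the definition of ''F''<sub>''n''</sub> can be evaluated directly in a few lines of Python. This is only a minimal sketch assuming NumPy is available; the observation values are arbitrary toy numbers and the helper name <code>ecdf</code> is ad hoc, not a library function.

<syntaxhighlight lang="python">
import numpy as np

def ecdf(sample, x):
    """Empirical distribution function F_n(x) = (number of X_i <= x) / n."""
    sample = np.asarray(sample)
    return np.count_nonzero(sample <= x) / sample.size

# Hypothetical i.i.d. observations (toy values)
observations = [0.2, 1.5, -0.3, 0.9, 2.1]
print(ecdf(observations, 1.0))  # fraction of observations <= 1.0 -> 0.6
</syntaxhighlight>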
The Kolmogorov–Smirnov statistic for a given cumulative distribution function ''F''(''x'') is

:<math>D_n= \sup_x |F_n(x)-F(x)|</math>

where <math>\sup_x</math> is the supremum of the set of distances. By the Glivenko–Cantelli theorem, if the sample comes from distribution ''F''(''x''), then ''D''<sub>''n''</sub> converges to 0 almost surely in the limit when ''n'' goes to infinity. Kolmogorov strengthened this result by effectively providing the rate of this convergence (see Kolmogorov distribution). Donsker's theorem provides a yet stronger result.

In practice, the statistic requires a relatively large number of data points (in comparison to other goodness-of-fit criteria such as the Anderson–Darling test statistic) to properly reject the null hypothesis.
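For a continuous reference distribution the supremum is attained at, or just before, an order statistic, so ''D''<sub>''n''</sub> can be computed from the sorted sample. The sketch below is only an illustration with simulated data; it assumes NumPy and SciPy, and the agreement with <code>scipy.stats.kstest</code> is shown as a sanity check.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic D_n = sup_x |F_n(x) - F(x)|.

    For a continuous reference CDF the supremum is attained at or just before
    an order statistic, giving the usual maximum over D+ and D-.
    """
    x = np.sort(np.asarray(sample))
    n = x.size
    cdf_vals = cdf(x)
    d_plus = np.max(np.arange(1, n + 1) / n - cdf_vals)
    d_minus = np.max(cdf_vals - np.arange(0, n) / n)
    return max(d_plus, d_minus)

rng = np.random.default_rng(0)
sample = rng.normal(size=200)                       # simulated toy data
d_n = ks_statistic(sample, stats.norm.cdf)
print(d_n, stats.kstest(sample, "norm").statistic)  # the two values agree
</syntaxhighlight>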
== Kolmogorov distribution ==

The Kolmogorov distribution is the distribution of the [[random variable]]

:<math>K=\sup_{t\in[0,1]}|B(t)|</math>

where ''B''(''t'') is the [[Brownian bridge]]. The cumulative distribution function of ''K'' is given by

:<math>\operatorname{Pr}(K\leq x)=1-2\sum_{k=1}^\infty (-1)^{k-1} e^{-2k^2 x^2}=\frac{\sqrt{2\pi}}{x}\sum_{k=1}^\infty e^{-(2k-1)^2\pi^2/(8x^2)},</math>

which can also be expressed by the Jacobi theta function <math>\vartheta_{01}(z=0;\tau=2ix^2/\pi)</math>. Both the form of the Kolmogorov–Smirnov test statistic and its asymptotic distribution under the null hypothesis were published by Andrey Kolmogorov, while a table of the distribution was published by Nikolai Smirnov. Recurrence relations for the distribution of the test statistic in finite samples are available.

Under the null hypothesis that the sample comes from the hypothesized distribution ''F''(''x''),

:<math>\sqrt{n}D_n\xrightarrow{n\to\infty}\sup_t |B(F(t))|</math>

[[convergence of random variables|in distribution]], where ''B''(''t'') is the [[Brownian bridge]].

If ''F'' is continuous then under the null hypothesis <math>\sqrt{n}D_n</math> converges to the Kolmogorov distribution, which does not depend on ''F''. This result may also be known as the Kolmogorov theorem. The accuracy of this limit as an approximation to the exact cdf of <math>K</math> when <math>n</math> is finite is not very impressive: even when <math>n=1000</math>, the corresponding maximum error is about <math>0.9\%</math>; this error increases to <math>2.6\%</math> when <math>n=100</math> and to a totally unacceptable <math>7\%</math> when <math>n=10</math>. However, a very simple expedient of replacing <math>x</math> by

:<math>x+\frac{1}{6\sqrt{n}}+ \frac{x-1}{4n}</math>

in the argument of the Jacobi theta function reduces these errors to <math>0.003\%</math>, <math>0.027\%</math>, and <math>0.27\%</math> respectively; such accuracy would be usually considered more than adequate for all practical applications.
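As an illustration of the series above and of the correction of the argument just described, the following Python sketch (assuming NumPy; truncating at 100 terms is an arbitrary but ample choice) evaluates the asymptotic cdf and its corrected finite-<math>n</math> version. The function names are ad hoc.

<syntaxhighlight lang="python">
import numpy as np

def kolmogorov_cdf(x, terms=100):
    """Asymptotic Pr(K <= x) via the alternating series given above."""
    if x <= 0:
        return 0.0
    k = np.arange(1, terms + 1)
    return 1 - 2 * np.sum((-1) ** (k - 1) * np.exp(-2 * k**2 * x**2))

def corrected_cdf_of_sqrt_n_dn(x, n):
    """Approximate Pr(sqrt(n) * D_n <= x) using the corrected argument
    x + 1/(6*sqrt(n)) + (x - 1)/(4*n) described in the text."""
    return kolmogorov_cdf(x + 1 / (6 * np.sqrt(n)) + (x - 1) / (4 * n))

print(kolmogorov_cdf(1.358))                # about 0.95
print(corrected_cdf_of_sqrt_n_dn(1.358, 10))
</syntaxhighlight>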
The ''goodness-of-fit'' test or the Kolmogorov–Smirnov test can be constructed by using the critical values of the Kolmogorov distribution. This test is asymptotically valid when <math>n \to\infty</math>. It rejects the null hypothesis at level <math>\alpha</math> if

:<math>\sqrt{n}D_{n}>K_{\alpha},\,</math>

where ''K''<sub>''α''</sub> is found from

:<math>\operatorname{Pr}(K\leq K_\alpha)=1-\alpha.\,</math>

The asymptotic [[statistical power|power]] of this test is 1.

Fast and accurate algorithms to compute the cdf <math>\operatorname{Pr}(D_n \leq x)</math> or its complement for arbitrary <math>n</math> and <math>x</math> are available from:

* Simard R. and L'Ecuyer P., "Computing the Two-Sided Kolmogorov–Smirnov Distribution" (Journal of Statistical Software, 2011) and Moscovich A. and Nadler B., "Fast Calculation of Boundary Crossing Probabilities for Poisson Processes" (Statistics and Probability Letters, 2017) for continuous null distributions, with code in C and Java to be found in the former.
* For a purely discrete, mixed or continuous null distribution, the KSgeneral package of the R project for statistical computing, which for a given sample also computes the KS test statistic and its p-value; an alternative C++ implementation is also available.
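The decision rule above (reject when <math>\sqrt{n}D_n</math> exceeds <math>K_\alpha</math>) can be sketched as follows, using SciPy's <code>scipy.special.kolmogi</code>, the inverse of the asymptotic Kolmogorov survival function <code>scipy.special.kolmogorov</code>. The sample is simulated and the helper name is ad hoc.

<syntaxhighlight lang="python">
import numpy as np
from scipy import special, stats

def ks_gof_test(sample, cdf, alpha=0.05):
    """Reject H0 at level alpha if sqrt(n) * D_n > K_alpha,
    where Pr(K <= K_alpha) = 1 - alpha."""
    n = len(sample)
    d_n = stats.kstest(sample, cdf).statistic
    k_alpha = special.kolmogi(alpha)          # Pr(K > K_alpha) = alpha
    return np.sqrt(n) * d_n, k_alpha, np.sqrt(n) * d_n > k_alpha

rng = np.random.default_rng(1)
stat, crit, reject = ks_gof_test(rng.uniform(size=500), "norm")
print(stat, crit, reject)   # uniform data should be rejected as normal
</syntaxhighlight>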
=== Test with estimated parameters ===

If either the form or the parameters of ''F''(''x'') are determined from the data ''X''<sub>''i''</sub>, the critical values determined in this way are invalid. In such cases, [[Monte Carlo method|Monte Carlo]] or other methods may be required, but tables have been prepared for some cases. Details for the required modifications to the test statistic and for the critical values for the [[normal distribution]] and the [[exponential distribution]] have been published, and later publications also include the [[Gumbel distribution]]. The [[Lilliefors test]] represents a special case of this for the normal distribution. The logarithm transformation may help to overcome cases where the Kolmogorov test data does not seem to fit the assumption that it came from the normal distribution.

Using estimated parameters, the question arises which estimation method should be used. Usually this would be the maximum likelihood method, but e.g. for the normal distribution MLE has a large bias error on sigma. Using a moment fit or KS minimization instead has a large impact on the critical values, and also some impact on test power. If we need to decide for Student-T data with df = 2 via the KS test whether the data could be normal or not, then an ML estimate based on H<sub>0</sub> (the data are normal, so using the standard deviation for scale) would give a much larger KS distance than a fit with minimum KS. In this case we should reject H<sub>0</sub>, which is often the case with MLE, because the sample standard deviation might be very large for T-2 data, but with KS minimization we may still get a KS too low to reject H<sub>0</sub>. In the Student-T case, a modified KS test with KS estimate instead of MLE makes the KS test indeed slightly worse. However, in other cases, such a modified KS test leads to slightly better test power.
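As noted above, Monte Carlo simulation is one way to obtain valid critical values when the parameters of ''F''(''x'') are estimated from the data. The following Lilliefors-style sketch for the normal case (assuming NumPy and SciPy; the settings and the sample are hypothetical) re-estimates the parameters inside every simulated sample, so that the simulated statistics follow the correct null distribution.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

def ks_normal_estimated_params(sample, n_sim=2000, seed=0):
    """Monte Carlo p-value for a KS test of normality with mean and sigma
    estimated from the data (Lilliefors-type procedure)."""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample)
    n = sample.size

    def stat(x):
        mu, sigma = x.mean(), x.std(ddof=1)
        return stats.kstest(x, "norm", args=(mu, sigma)).statistic

    d_obs = stat(sample)
    # Location-scale invariance: simulating standard normals and re-estimating
    # the parameters each time reproduces the null distribution of the statistic.
    d_sim = np.array([stat(rng.normal(size=n)) for _ in range(n_sim)])
    return d_obs, np.mean(d_sim >= d_obs)   # statistic and Monte Carlo p-value

d, p = ks_normal_estimated_params(np.random.default_rng(2).standard_t(df=2, size=100))
print(d, p)
</syntaxhighlight>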
=== Discrete and mixed null distribution ===

Under the assumption that <math>F(x)</math> is non-decreasing and right-continuous, with a countable (possibly infinite) number of jumps, the KS test statistic can be expressed as:

:<math>D_n= \sup_x |F_n(x)-F(x)| = \sup_{0 \leq t \leq 1} |F_n(F^{-1}(t)) - F(F^{-1}(t))|. </math>

From the right-continuity of <math>F(x)</math>, it follows that <math>F(F^{-1}(t)) \geq t</math> and <math>F^{-1}(F(x)) \leq x </math> and hence, the distribution of <math>D_{n}</math> depends on the null distribution <math>F(x)</math>, i.e., it is no longer distribution-free as in the continuous case. Therefore, a fast and accurate method has been developed to compute the exact and asymptotic distribution of <math>D_{n}</math> when <math>F(x)</math> is purely discrete or mixed, implemented in C++ and in the KSgeneral package of the [[R (programming language)|R language]]. The functions <code>disc_ks_test()</code>, <code>mixed_ks_test()</code> and <code>cont_ks_test()</code> also compute the KS test statistic and p-values for purely discrete, mixed or continuous null distributions and arbitrary sample sizes. The KS test and its p-values for discrete null distributions and small sample sizes are also computed as part of the dgof package of the R language. Major statistical packages, among which [[SAS (software)|SAS]] <code>PROC NPAR1WAY</code> and [[Stata]] <code>ksmirnov</code>, implement the KS test under the assumption that <math>F(x)</math> is continuous, which is more conservative if the null distribution is actually not continuous. For details, see:
1. "Note on the Kolmogorov Statistic in the Discrete Case"
2. "A Comparison of the Pearson Chi-Square and Kolmogorov Goodness-of-Fit Tests with Respect to Validity"
3. "Bounded Probability Properties of Kolmogorov–Smirnov and Similar Statistics for Discrete Data"
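For a purely discrete null distribution the supremum only needs to be checked at the jump points and immediately below them, and, as explained above, the resulting statistic is no longer distribution-free, so its p-value is obtained here by simulation. The sketch below assumes NumPy; the support and probabilities are invented for illustration, and dedicated packages such as KSgeneral compute exact p-values instead.

<syntaxhighlight lang="python">
import numpy as np

def ks_stat_discrete(sample, support, probs):
    """D_n = sup_x |F_n(x) - F(x)| for a discrete null on sorted support points."""
    sample = np.sort(np.asarray(sample))
    n = sample.size
    cdf = np.cumsum(probs)
    d, prev = 0.0, 0.0
    for x, f_x in zip(support, cdf):
        fn_at = np.searchsorted(sample, x, side="right") / n    # F_n(x)
        fn_below = np.searchsorted(sample, x, side="left") / n  # F_n(x-)
        d = max(d, abs(fn_at - f_x), abs(fn_below - prev))
        prev = f_x
    return d

rng = np.random.default_rng(3)
support, probs = np.array([0, 1, 2, 3]), np.array([0.4, 0.3, 0.2, 0.1])  # hypothetical
data = rng.choice(support, size=50, p=probs)
d_obs = ks_stat_discrete(data, support, probs)
sims = [ks_stat_discrete(rng.choice(support, size=50, p=probs), support, probs)
        for _ in range(2000)]
print(d_obs, np.mean(np.array(sims) >= d_obs))  # statistic and simulated p-value
</syntaxhighlight>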
== Two-sample Kolmogorov–Smirnov test ==

[[文件:KS2 Example.png|缩略图|右|Illustration of the two-sample Kolmogorov–Smirnov statistic. The red and blue lines each correspond to an empirical distribution function, and the black arrow is the two-sample KS statistic.]]

The Kolmogorov–Smirnov test may also be used to test whether two underlying one-dimensional probability distributions differ. In this case, the Kolmogorov–Smirnov statistic is

:<math>D_{n,m}=\sup_x |F_{1,n}(x)-F_{2,m}(x)|,</math>

where <math>F_{1,n}</math> and <math>F_{2,m}</math> are the [[empirical distribution function]]s of the first and the second sample respectively, and <math>\sup</math> is the [[Infimum and supremum|supremum function]].

For large samples, the null hypothesis is rejected at level <math>\alpha</math> if

:<math>D_{n,m}>c(\alpha)\sqrt{\frac{n + m}{n\cdot m}},</math>

where <math>n</math> and <math>m</math> are the sizes of the first and second sample respectively. The value of <math>c({\alpha})</math> is given in the table below for the most common levels of <math>\alpha</math>:
{| class="wikitable"
|-
! <math>\alpha</math>
! 0.20 !! 0.15 !! 0.10 !! 0.05 !! 0.025 !! 0.01 !! 0.005 !! 0.001
|-
! <math>c(\alpha)</math>
| 1.073 || 1.138 || 1.224 || 1.358 || 1.48 || 1.628 || 1.731 || 1.949
|}

and in general by
:<math>c\left(\alpha\right)=\sqrt{-\ln\left(\tfrac{\alpha}{2}\right)\cdot \tfrac{1}{2}},</math>

so that the condition reads

:<math>D_{n,m}>\frac{1}{\sqrt{n}}\cdot\sqrt{-\ln\left(\tfrac{\alpha}{2}\right)\cdot \tfrac{1 + \tfrac{n}{m}}{2}}.</math>

Here, again, the larger the sample sizes, the more sensitive the minimal bound: for a given ratio of sample sizes (e.g. <math>m=n</math>), the minimal bound scales with the size of either of the samples according to its inverse square root.

Note that the two-sample test checks whether the two data samples come from the same distribution. This does not specify what that common distribution is (e.g. whether it is normal or not normal). Again, tables of critical values have been published. A shortcoming of the Kolmogorov–Smirnov test is that it is not very powerful because it is devised to be sensitive against all possible types of differences between two distribution functions. Marozzi ("Some Notes on the Location-Scale Cucconi Test", Journal of Nonparametric Statistics, 2009) and Marozzi ("Nonparametric Simultaneous Tests for Location and Scale Testing: a Comparison of Several Methods", Communications in Statistics – Simulation and Computation, 2013) showed evidence that the Cucconi test, originally proposed for simultaneously comparing location and scale, is much more powerful than the Kolmogorov–Smirnov test when comparing two distribution functions.
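A sketch of the two-sample statistic and of the large-sample rejection rule above, with <code>scipy.stats.ks_2samp</code> used as a cross-check; it assumes NumPy and SciPy, and the two samples are simulated for illustration.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

def ks_2sample_statistic(x, y):
    """D_{n,m} = sup_t |F_{1,n}(t) - F_{2,m}(t)|, evaluated at the pooled points."""
    x, y = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    pooled = np.concatenate([x, y])
    cdf_x = np.searchsorted(x, pooled, side="right") / x.size
    cdf_y = np.searchsorted(y, pooled, side="right") / y.size
    return np.max(np.abs(cdf_x - cdf_y))

rng = np.random.default_rng(4)
a = rng.normal(0.0, 1.0, size=300)
b = rng.normal(0.5, 1.0, size=200)          # shifted location, toy example
d = ks_2sample_statistic(a, b)
c_alpha = np.sqrt(-np.log(0.05 / 2) * 0.5)  # c(alpha) for alpha = 0.05 (about 1.358)
threshold = c_alpha * np.sqrt((a.size + b.size) / (a.size * b.size))
print(d, threshold, d > threshold, stats.ks_2samp(a, b).statistic)
</syntaxhighlight>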
== Setting confidence limits for the shape of a distribution function ==

{{main article|Dvoretzky–Kiefer–Wolfowitz inequality}}

While the Kolmogorov–Smirnov test is usually used to test whether a given ''F''(''x'') is the underlying probability distribution of ''F''<sub>''n''</sub>(''x''), the procedure may be inverted to give confidence limits on ''F''(''x'') itself. If one chooses a critical value of the test statistic ''D''<sub>''α''</sub> such that P(''D''<sub>''n''</sub> > ''D''<sub>''α''</sub>) = ''α'', then a band of width ±''D''<sub>''α''</sub> around ''F''<sub>''n''</sub>(''x'') will entirely contain ''F''(''x'') with probability 1 − ''α''.
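The inverted procedure can be sketched as follows, using the asymptotic approximation ''D''<sub>''α''</sub> ≈ ''K''<sub>''α''</sub>/√''n'' obtained from <code>scipy.special.kolmogi</code>; the sample is simulated and the function name is ad hoc. The Dvoretzky–Kiefer–Wolfowitz inequality linked above gives a similar finite-sample band with half-width <math>\sqrt{\ln(2/\alpha)/(2n)}</math>.

<syntaxhighlight lang="python">
import numpy as np
from scipy import special

def ks_confidence_band(sample, alpha=0.05):
    """Return x-grid, lower and upper (1 - alpha) confidence band for the true CDF,
    using the asymptotic critical value D_alpha ~ kolmogi(alpha) / sqrt(n)."""
    x = np.sort(np.asarray(sample))
    n = x.size
    ecdf = np.arange(1, n + 1) / n
    d_alpha = special.kolmogi(alpha) / np.sqrt(n)
    lower = np.clip(ecdf - d_alpha, 0.0, 1.0)
    upper = np.clip(ecdf + d_alpha, 0.0, 1.0)
    return x, lower, upper

x, lo, hi = ks_confidence_band(np.random.default_rng(5).normal(size=100))
print(lo[:3], hi[:3])
</syntaxhighlight>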
== The Kolmogorov–Smirnov statistic in more than one dimension ==

A distribution-free multivariate Kolmogorov–Smirnov goodness-of-fit test has been proposed by Justel, Peña and Zamar (1997). The test uses a statistic which is built using Rosenblatt's transformation, and an algorithm is developed to compute it in the bivariate case. An approximate test that can be easily computed in any dimension is also presented.

The Kolmogorov–Smirnov test statistic needs to be modified if a similar test is to be applied to [[multivariate statistics|multivariate data]]. This is not straightforward because the maximum difference between two joint [[cumulative distribution function]]s is not generally the same as the maximum difference of any of the complementary distribution functions. Thus the maximum difference will differ depending on which of <math>\Pr(x < X \land y < Y)</math> or <math>\Pr(X < x \land Y > y)</math> or any of the other two possible arrangements is used. One might require that the result of the test used should not depend on which choice is made.

One approach to generalizing the Kolmogorov–Smirnov statistic to higher dimensions which meets the above concern is to compare the cdfs of the two samples with all possible orderings, and take the largest of the set of resulting K–S statistics. In ''d'' dimensions, there are 2<sup>''d''</sup>−1 such orderings. One such variation is due to Peacock (see also Gosset for a 3D version) and another to Fasano and Franceschini (see Lopes et al. for a comparison and computational details). Critical values for the test statistic can be obtained by simulations, but depend on the dependence structure in the joint distribution.

In one dimension, the Kolmogorov–Smirnov statistic is identical to the so-called star discrepancy D, so another native KS extension to higher dimensions would be simply to use D also for higher dimensions. Unfortunately, the star discrepancy is hard to calculate in high dimensions.
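As a rough illustration of the idea of comparing all orderings, the following simplified sketch computes a two-sample statistic in two dimensions in the spirit of Fasano and Franceschini, by comparing the empirical probabilities of the four quadrants anchored at each observed point. It is only an illustration under that simplification (and, as noted above, critical values would still have to come from simulation); it assumes NumPy and the samples are simulated.

<syntaxhighlight lang="python">
import numpy as np

def ks_2d_2samp(a, b):
    """Maximum, over sample points and quadrant orientations, of the difference
    in empirical quadrant probabilities (a simplified 2-D two-sample statistic)."""
    def quadrant_fractions(points, x0, y0):
        x, y = points[:, 0], points[:, 1]
        return np.array([
            np.mean((x <= x0) & (y <= y0)),
            np.mean((x <= x0) & (y > y0)),
            np.mean((x > x0) & (y <= y0)),
            np.mean((x > x0) & (y > y0)),
        ])

    d = 0.0
    for x0, y0 in np.vstack([a, b]):
        d = max(d, np.max(np.abs(quadrant_fractions(a, x0, y0)
                                 - quadrant_fractions(b, x0, y0))))
    return d

rng = np.random.default_rng(6)
a = rng.normal(size=(200, 2))
b = rng.normal(loc=[0.3, 0.0], size=(150, 2))   # toy samples
print(ks_2d_2samp(a, b))
</syntaxhighlight>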
== Implementations ==

The Kolmogorov–Smirnov test (one-sample or two-sample, verifying the equality of distributions) is implemented in many software programs:

* [[Mathematica]] has [https://reference.wolfram.com/language/ref/KolmogorovSmirnovTest.html KolmogorovSmirnovTest].
* [[MATLAB]] has [https://de.mathworks.com/help/stats/kstest.html kstest] in its Statistics Toolbox.
* The [[R (programming language)|R]] package "KSgeneral"<ref name=KSgeneral/> computes the KS test statistic and its p-values under arbitrary, possibly discrete, mixed or continuous null distributions.
* [[R (programming language)|R]]'s statistics base-package implements the test as [https://stat.ethz.ch/R-manual/R-patched/library/stats/html/ks.test.html ks.test {stats}] in its "stats" package.
* [[SAS (software)|SAS]] implements the test in its PROC NPAR1WAY procedure.
* [[Python (programming language)|Python]] has an implementation of this test provided by [[SciPy]]<ref>{{cite web |url= https://docs.scipy.org/doc/scipy-0.14.0/reference/generated/scipy.stats.kstest.html |title=scipy.stats.kstest |work=SciPy SciPy v0.14.0 Reference Guide |publisher=The Scipy community |access-date= 18 June 2019}}</ref> in its statistical functions module (scipy.stats).
* [[SYSTAT (statistics)|SYSTAT]] (SPSS Inc., Chicago, IL).
* [[Java (programming language)|Java]] has an implementation of this test provided by [[Apache Commons]].<ref>{{cite web |url=https://commons.apache.org/proper/commons-math/javadocs/api-3.5/org/apache/commons/math3/stat/inference/KolmogorovSmirnovTest.html |title=KolmogorovSmirnovTest |access-date= 18 June 2019}}</ref>
* [[KNIME]] has a node implementing this test based on the above Java implementation.<ref>{{cite web |url=https://www.knime.com/whats-new-in-knime-37#new-statistics-nodes |title=New statistics nodes |access-date= 25 June 2020}}</ref>
* [[StatsDirect]] (StatsDirect Ltd, Manchester, UK) implements [https://www.statsdirect.com/help/nonparametric_methods/smirnov.htm all common variants].
* [[Stata]] (Stata Corporation, College Station, TX) implements the test in its ksmirnov (Kolmogorov–Smirnov equality-of-distributions test) command.<ref>{{cite web |url=https://www.stata.com/manuals15/rksmirnov.pdf |title=ksmirnov — Kolmogorov–Smirnov equality-of-distributions test |access-date= 18 June 2019}}</ref>
* [[PSPP]] implements the test in its [https://www.gnu.org/software/pspp/manual/html_node/KOLMOGOROV_002dSMIRNOV.html KOLMOGOROV-SMIRNOV (or using the K-S shortcut)] function.
* [[Microsoft Excel|Excel]] runs the test as KSCRIT and KSPROB.
== See also ==

*[[Lepage test]]
*[[Cucconi test]]
*[[Kuiper's test]]
*[[Shapiro–Wilk test]]
*[[Anderson–Darling test]]
*[[Cramér–von Mises criterion|Cramér–von Mises test]]

== References ==

{{Reflist|30em}}
== Further reading ==

* Daniel, Wayne W. (1990). "Kolmogorov–Smirnov one-sample test". Applied Nonparametric Statistics (2nd ed.). Boston: PWS-Kent. pp. 319–330. ISBN 978-0-534-91976-4.
* Corder, G. W.; Foreman, D. I. (2014). Nonparametric Statistics: A Step-by-Step Approach. Wiley. ISBN 978-1118840313.
* Stephens, M. A. (1979). "Test of fit for the logistic distribution based on the empirical distribution function". Biometrika. 66 (3): 591–595. doi:10.1093/biomet/66.3.591.

== External links ==

* "Kolmogorov–Smirnov test", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
* Short introduction
* KS test explanation
* JavaScript implementation of one- and two-sided tests
* Online calculator with the KS test
* Open-source C++ code to compute the Kolmogorov distribution and perform the KS test
* Paper on Evaluating Kolmogorov's Distribution; contains a C implementation. This is the method used in Matlab.
* Paper on Computing the Two-Sided Kolmogorov–Smirnov Distribution; computing the cdf of the KS statistic in C or Java.
* Paper "powerlaw: A Python Package for Analysis of Heavy-Tailed Distributions"; Jeff Alstott, Ed Bullmore, Dietmar Plenz. Among others, it also performs the Kolmogorov–Smirnov test. Source code and installers of the powerlaw package are available at PyPI.