scipy.stats.kstat

scipy.stats.kstat(data, n=2)
Return the nth k-statistic (1 <= n <= 4 so far).

The nth k-statistic k_n is the unique symmetric unbiased estimator of the nth cumulant kappa_n.

Parameters:
    data : array_like
        Input array. Note that n-D input gets flattened.
    n : int, {1, 2, 3, 4}, optional
        Default is 2.

Returns:
    kstat : float
        The nth k-statistic.

Notes

For a sample of size n, the first few k-statistics are given by

\[
\begin{aligned}
k_1 &= \mu, \\
k_2 &= \frac{n}{n-1} m_2, \\
k_3 &= \frac{n^2}{(n-1)(n-2)} m_3, \\
k_4 &= \frac{n^2 \left[(n+1)\, m_4 - 3(n-1)\, m_2^2\right]}{(n-1)(n-2)(n-3)},
\end{aligned}
\]

where \(\mu\) is the sample mean, \(m_2\) is the sample variance, and \(m_i\) is the i-th sample central moment. A short sketch that checks these formulas numerically follows the Examples below.

References

http://mathworld.wolfram.com/k-Statistic.html
http://mathworld.wolfram.com/Cumulant.html

Examples

>>> import numpy as np
>>> from scipy import stats
>>> rndm = np.random.RandomState(1234)

As sample size increases, the n-th moment and the n-th k-statistic converge to the same number (although they are not identical). In the case of the normal distribution, they converge to zero.

>>> for n in [2, 3, 4, 5, 6, 7]:
...     x = rndm.normal(size=10**n)
...     m, k = stats.moment(x, 3), stats.kstat(x, 3)
...     print("%.3g %.3g %.3g" % (m, k, m - k))
-0.631 -0.651 0.0194
0.0282 0.0283 -8.49e-05
-0.0454 -0.0454 1.36e-05
7.53e-05 7.53e-05 -2.26e-09
0.00166 0.00166 -4.99e-09
-2.88e-06 -2.88e-06 8.63e-13
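As an informal check of the formulas in the Notes (this sketch is not part of the original documentation), the snippet below recomputes \(k_2\) and \(k_3\) directly from the biased sample central moments and compares them with kstat. The sample size and variable names (m2, m3) are chosen for illustration only.

>>> import numpy as np
>>> from scipy import stats
>>> rng = np.random.RandomState(1234)
>>> x = rng.normal(size=1000)
>>> n_obs = x.size
>>> m2 = np.mean((x - x.mean())**2)   # biased second central moment
>>> m3 = np.mean((x - x.mean())**3)   # biased third central moment
>>> bool(np.isclose(n_obs / (n_obs - 1) * m2, stats.kstat(x, 2)))
True
>>> bool(np.isclose(n_obs**2 / ((n_obs - 1) * (n_obs - 2)) * m3, stats.kstat(x, 3)))
True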
