In probability theory, Lindeberg's condition is a sufficient condition (and under certain conditions also a necessary condition) for the central limit theorem (CLT) to hold for a sequence of independent random variables.[1][2][3] Unlike the classical CLT, which requires that the random variables in question have finite variance and be both independent and identically distributed, Lindeberg's CLT only requires that they have finite variance, satisfy Lindeberg's condition, and be independent. It is named after the Finnish mathematician Jarl Waldemar Lindeberg.[4]
Statement
Let $(\Omega ,{\mathcal {F}},\mathbb {P} )$ be a probability space, and $X_{k}:\Omega \to \mathbb {R} ,\ k\in \mathbb {N} ,$ be independent random variables defined on that space. Assume the expected values $\mathbb {E} [X_{k}]=\mu _{k}$ and variances $\operatorname {Var} [X_{k}]=\sigma _{k}^{2}$ exist and are finite. Also let

$$s_{n}^{2}:=\sum _{k=1}^{n}\sigma _{k}^{2}.$$
If this sequence of independent random variables $X_{k}$ satisfies Lindeberg's condition:

$$\lim _{n\to \infty }{\frac {1}{s_{n}^{2}}}\sum _{k=1}^{n}\mathbb {E} \left[(X_{k}-\mu _{k})^{2}\cdot \mathbf {1} _{\{|X_{k}-\mu _{k}|>\varepsilon s_{n}\}}\right]=0$$

for all $\varepsilon >0$, where $\mathbf {1} _{\{\cdots \}}$ is the indicator function, then the central limit theorem holds, i.e. the random variables

$$Z_{n}:={\frac {\sum _{k=1}^{n}(X_{k}-\mu _{k})}{s_{n}}}$$

converge in distribution to a standard normal random variable as $n\to \infty .$
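As a quick illustration (a sketch added here, not part of the original article), the Lindeberg sum can be computed in closed form for bounded variables. The hypothetical function below does so for independent $X_{k}\sim \mathrm{Uniform}[-c_{k},c_{k}]$ (so $\mu_{k}=0$, $\sigma_{k}^{2}=c_{k}^{2}/3$) and shows the sum dropping to zero as $n$ grows:

```python
import math

def lindeberg_sum(c, eps):
    """Exact Lindeberg sum (1/s_n^2) * sum_k E[X_k^2 * 1{|X_k| > eps*s_n}]
    for independent X_k ~ Uniform[-c_k, c_k] (hypothetical helper)."""
    s2 = sum(ck ** 2 / 3 for ck in c)   # s_n^2 = sum of variances c_k^2 / 3
    t = eps * math.sqrt(s2)             # truncation level eps * s_n
    total = 0.0
    for ck in c:
        if t < ck:
            # E[X^2 * 1{|X| > t}] for X ~ Uniform[-c, c]:
            # 2 * integral_t^c x^2/(2c) dx = (c^3 - t^3) / (3c);
            # the term is exactly 0 once t >= c.
            total += (ck ** 3 - t ** 3) / (3 * ck)
    return total / s2

# For iid bounded variables (c_k = 1), s_n grows like sqrt(n/3), so the
# truncation level eventually exceeds the bound and the sum hits 0.
for n in (10, 100, 1000, 10000):
    print(n, lindeberg_sum([1.0] * n, eps=0.1))
```

Note that for bounded variables the sum does not merely tend to zero but vanishes exactly once $\varepsilon s_{n}$ exceeds every bound $c_{k}$.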
Lindeberg's condition is sufficient, but not in general necessary (i.e. the converse implication does not hold in general).
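A standard illustration of this failure of necessity (an example added here for concreteness, not from the original text) takes independent Gaussians with geometrically growing variances, so the last summand never becomes negligible:

```latex
% Let X_k ~ N(0, 2^k) be independent. Then
s_n^2 = \sum_{k=1}^{n} 2^k = 2^{n+1} - 2,
\qquad
Z_n = \frac{1}{s_n}\sum_{k=1}^{n} X_k \sim N(0,1)\ \text{for every } n,
% so the CLT conclusion holds trivially (a sum of independent
% Gaussians is exactly Gaussian). However,
\frac{\sigma_n^2}{s_n^2} = \frac{2^n}{2^{n+1}-2} \longrightarrow \frac{1}{2},
% and for small \varepsilon > 0 the k = n term of the Lindeberg sum
% tends to \tfrac{1}{2}\,\mathbb{E}\!\left[\zeta^2\,
% \mathbf{1}_{\{|\zeta| > \varepsilon\sqrt{2}\}}\right] > 0 with
% \zeta \sim N(0,1), so Lindeberg's condition fails.
```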
However, if the sequence of independent random variables in question satisfies
$$\max _{k=1,\ldots ,n}{\frac {\sigma _{k}^{2}}{s_{n}^{2}}}\to 0,\quad {\text{as }}n\to \infty ,$$
then Lindeberg's condition is both sufficient and necessary, i.e. it holds if and only if the conclusion of the central limit theorem holds.
Remarks
Feller's theorem
Feller's theorem can be used as an alternative method to prove that Lindeberg's condition holds.[5] Letting $S_{n}:=\sum _{k=1}^{n}X_{k}$ and for simplicity $\mathbb {E} [X_{k}]=0$, the theorem states

- if $\forall \varepsilon >0$, $\lim _{n\to \infty }\max _{1\leq k\leq n}P(|X_{k}|>\varepsilon s_{n})=0$ and ${\frac {S_{n}}{s_{n}}}$ converges weakly to a standard normal distribution as $n\to \infty ,$ then $X_{k}$ satisfies Lindeberg's condition.
This theorem can be used to disprove that the central limit theorem holds for $X_{k}$ by proof by contradiction: one shows that Lindeberg's condition fails for $X_{k}$.
Interpretation
Because the Lindeberg condition implies $\max _{k=1,\ldots ,n}{\frac {\sigma _{k}^{2}}{s_{n}^{2}}}\to 0$ as $n\to \infty$, it guarantees that the contribution of any individual random variable $X_{k}$ ($1\leq k\leq n$) to the variance $s_{n}^{2}$ is arbitrarily small for sufficiently large values of $n$.
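This implication follows from splitting each variance at the truncation level $\varepsilon s_{n}$ (a short derivation added here for completeness):

```latex
\sigma_k^2
 = \mathbb{E}\left[(X_k-\mu_k)^2\,\mathbf{1}_{\{|X_k-\mu_k|\le\varepsilon s_n\}}\right]
 + \mathbb{E}\left[(X_k-\mu_k)^2\,\mathbf{1}_{\{|X_k-\mu_k|>\varepsilon s_n\}}\right]
 \le \varepsilon^2 s_n^2
 + \sum_{j=1}^{n}\mathbb{E}\left[(X_j-\mu_j)^2\,\mathbf{1}_{\{|X_j-\mu_j|>\varepsilon s_n\}}\right].

% Dividing by s_n^2 and taking the maximum over k, the Lindeberg
% condition kills the second term, so
% \limsup_{n} \max_{k} \sigma_k^2 / s_n^2 \le \varepsilon^2
% for every \varepsilon > 0, forcing the maximum to 0.
```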
Example
Consider the following informative example which satisfies the Lindeberg condition. Let $\xi _{i},\ i\in \mathbb {N} ,$ be a sequence of zero-mean, variance-1 iid random variables and $a_{1},\ldots ,a_{n}$ a non-random sequence satisfying:

$$\max _{i\leq n}{\frac {|a_{i}|}{\|a\|_{2}}}\to 0,$$

where $\|a\|_{2}^{2}=\sum _{i=1}^{n}a_{i}^{2}$.
Now, define the normalized elements of the linear combination:

$$X_{n,i}={\frac {a_{i}\xi _{i}}{\|a\|_{2}}},$$
which satisfies the Lindeberg condition: since $|X_{n,i}|>\varepsilon$ forces $|\xi _{i}|>\varepsilon \|a\|_{2}/|a_{i}|\geq \varepsilon \|a\|_{2}/\max _{i\leq n}|a_{i}|$,

$$\sum _{i=1}^{n}\mathbb {E} \left[\left|X_{n,i}\right|^{2}\mathbf {1} \left(|X_{n,i}|>\varepsilon \right)\right]\leq \sum _{i=1}^{n}\mathbb {E} \left[\left|X_{n,i}\right|^{2}\mathbf {1} \left(|\xi _{i}|>\varepsilon {\frac {\|a\|_{2}}{\max _{i\leq n}|a_{i}|}}\right)\right]=\mathbb {E} \left[\left|\xi _{1}\right|^{2}\mathbf {1} \left(|\xi _{1}|>\varepsilon {\frac {\|a\|_{2}}{\max _{i\leq n}|a_{i}|}}\right)\right],$$

but $\mathbb {E} [\xi _{1}^{2}]$ is finite, so by the dominated convergence theorem and the condition on the $a_{i}$, this goes to 0 for every $\varepsilon >0$.
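A Monte Carlo sketch of this example (illustrative code added here, not from the original article): take $\xi _{i}\sim \mathrm{Uniform}[-{\sqrt {3}},{\sqrt {3}}]$ (zero mean, unit variance) and the hypothetical weight choice $a_{i}={\sqrt {i}}$, which satisfies $\max _{i}|a_{i}|/\|a\|_{2}\to 0$; the simulated $Z_{n}$ should then look approximately standard normal.

```python
import math
import random

n = 50
a = [math.sqrt(i) for i in range(1, n + 1)]           # non-random weights a_i = sqrt(i)
norm = math.sqrt(sum(ai * ai for ai in a))            # ||a||_2

def sample_Z(rng):
    """One draw of Z_n = sum_i a_i * xi_i / ||a||_2 with
    xi_i ~ Uniform[-sqrt(3), sqrt(3)] (zero mean, unit variance)."""
    xi = [rng.uniform(-math.sqrt(3), math.sqrt(3)) for _ in range(n)]
    return sum(ai * x for ai, x in zip(a, xi)) / norm

rng = random.Random(0)
N = 20000
zs = [sample_Z(rng) for _ in range(N)]

mean = sum(zs) / N
var = sum((z - mean) ** 2 for z in zs) / N
within_1sd = sum(abs(z) <= 1 for z in zs) / N         # N(0,1) gives about 0.683

print(mean, var, within_1sd)   # roughly 0, 1, 0.68
```

The three printed statistics approximate the mean, variance, and one-sigma mass of a standard normal, consistent with the convergence in distribution claimed above.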
References
- ^ Billingsley, P. (1986). Probability and Measure (2nd ed.). Wiley. p. 369. ISBN 0-471-80478-9.
- ^ Ash, R. B. (2000). Probability and Measure Theory (2nd ed.). p. 307. ISBN 0-12-065202-1.
- ^ Resnick, S. I. (1999). A Probability Path. p. 314.
- ^ Lindeberg, J. W. (1922). "Eine neue Herleitung des Exponentialgesetzes in der Wahrscheinlichkeitsrechnung". Mathematische Zeitschrift. 15 (1): 211–225. doi:10.1007/BF01494395. S2CID 119730242.
- ^ Athreya, K. B.; Lahiri, S. N. (2006). Measure Theory and Probability Theory. Springer. p. 348. ISBN 0-387-32903-X.