Theorem (Classical Central Limit Theorem)

Let \(X_1,X_2,X_3,\ldots\) be iid random variables from some (possibly unknown) probability distribution with mean 0 and finite variance \(\sigma^2>0\). Then \[ \frac{X_1+X_2+\cdots+X_n}{\sqrt{n}}\stackrel{d}{\rightarrow} Y,\qquad Y\sim \mathcal{N}(0,\sigma^2). \]
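
As a quick illustration (a standard corollary, not part of the statement above), take \(X_i=B_i-p\) with \(B_i\stackrel{iid}{\sim}\text{Bernoulli}(p)\), so that \(E[X_i]=0\) and \(\sigma^2=p(1-p)\). The theorem then gives the normal approximation to the binomial (de Moivre–Laplace): \[ \frac{\sum_{i=1}^{n}B_i-np}{\sqrt{n}}\stackrel{d}{\rightarrow}\mathcal{N}\big(0,\,p(1-p)\big). \]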


Theorem (Lindeberg-Feller Central Limit Theorem)

Suppose that we have a triangular array of random variables \[\begin{align*} &X_{1,1},\ldots,X_{1,m_1}\\ &X_{2,1},X_{2,2},\ldots,X_{2,m_2}\\ &\qquad\qquad\vdots\\ &X_{n,1},X_{n,2},X_{n,3},\ldots,X_{n,m_n}, \end{align*}\] and suppose that

  1. within each row, the random variables are independent,
  2. all random variables have mean 0,
  3. \(\text{Var}\big(\sum_{j=1}^{m_n}X_{nj}\big)=\sum_{j=1}^{m_n}\text{Var}(X_{nj})=1\),
  4. (Lindeberg’s condition) for every \(\epsilon >0\), \(\sum_{j=1}^{m_n}E\big[X_{nj}^2I_{\{|X_{nj}|>\epsilon\}}\big]\rightarrow 0\) as \(n\rightarrow \infty\).

Then, \(\sum_{j=1}^{m_n}X_{nj}\stackrel{d}{\rightarrow} \mathcal{N}(0,1)\) as \(n\rightarrow \infty\).
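
As a sketch of how this specializes (not part of the theorem statement), take \(m_n=n\) and \(X_{nj}=X_j/(\sigma\sqrt{n})\) for iid \(X_1,X_2,\ldots\) with mean 0 and variance \(\sigma^2\). Each row consists of independent mean-0 variables with \(\sum_{j=1}^{n}\text{Var}(X_{nj})=n\cdot\frac{\sigma^2}{\sigma^2 n}=1\), and Lindeberg’s condition reads \[ \sum_{j=1}^{n}E\big[X_{nj}^2I_{\{|X_{nj}|>\epsilon\}}\big]=\frac{1}{\sigma^2}E\big[X_1^2I_{\{|X_1|>\epsilon\sigma\sqrt{n}\}}\big]\rightarrow 0, \] which holds by dominated convergence since \(E[X_1^2]<\infty\). The conclusion \(\sum_{j=1}^{n}X_{nj}=\frac{X_1+\cdots+X_n}{\sigma\sqrt{n}}\stackrel{d}{\rightarrow}\mathcal{N}(0,1)\) recovers the classical CLT above.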




Theorem (Lyapounov’s Central Limit Theorem)

Suppose that we have a triangular array of random variables \[\begin{align*} &X_{1,1},\ldots,X_{1,m_1}\\ &X_{2,1},X_{2,2},\ldots,X_{2,m_2}\\ &\qquad\qquad\vdots\\ &X_{n,1},X_{n,2},X_{n,3},\ldots,X_{n,m_n}, \end{align*}\] and suppose that

  1. within each row, the random variables are independent,
  2. all random variables have mean 0,
  3. \(\text{Var}\big(\sum_{j=1}^{m_n}X_{nj}\big)=\sum_{j=1}^{m_n}\text{Var}(X_{nj})=1\),
  4. (Lyapounov’s condition) for some \(\delta>0\), \(\sum_{j=1}^{m_n}E\big[|X_{nj}|^{2+\delta}\big]\rightarrow 0\) as \(n\rightarrow \infty\) (a stronger moment condition that implies Lindeberg’s condition).

Then, \(\sum_{j=1}^{m_n}X_{nj}\stackrel{d}{\rightarrow} \mathcal{N}(0,1)\) as \(n\rightarrow \infty\).
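
As a sketch of how the condition is checked (not part of the theorem statement), take again \(m_n=n\) and \(X_{nj}=X_j/(\sigma\sqrt{n})\) for iid \(X_1,X_2,\ldots\) with mean 0, variance \(\sigma^2\), and \(E[|X_1|^{2+\delta}]<\infty\) for some \(\delta>0\). Then \[ \sum_{j=1}^{n}E\big[|X_{nj}|^{2+\delta}\big]=n\cdot\frac{E[|X_1|^{2+\delta}]}{\sigma^{2+\delta}\,n^{1+\delta/2}}=\frac{E[|X_1|^{2+\delta}]}{\sigma^{2+\delta}\,n^{\delta/2}}\rightarrow 0, \] so Lyapounov’s condition holds and the classical CLT follows whenever some moment of order strictly greater than 2 is finite.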



