The \(L^2\) condition and the little "o" variance condition

Theorem (Chebyshev Weak Law of Large Numbers)

Let \(S_n=\sum_{i=1}^n X_i, n\ge 1\), where \(X_1,X_2,\ldots\) are \(L^2\) random variables (not necessarily i.i.d.). If \(b_n,n\ge 1\), are positive constants satisfying \(\text{Var}(S_n)=o(b_n^2)\), then \[ \frac{S_n-E(S_n)}{b_n}\stackrel{L^2}\rightarrow 0 \mbox{ }\mbox{ }\mbox{ and consequently }\mbox{ }\mbox{ }\frac{S_n-E(S_n)}{b_n}\stackrel{\text{Pr}}\rightarrow 0. \]

\[ E\left[\left(\frac{S_n-E(S_n)}{b_n}-0\right)^2\right]= \frac{\text{Var}(S_n)}{b_n^2}\rightarrow 0 \mbox{ }\mbox{ }\mbox{ as }n\rightarrow \infty\\ \implies \frac{S_n-E(S_n)}{b_n}\stackrel{L^2}\rightarrow 0. \]
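As a quick numerical sanity check, here is a minimal sketch with a setup of my own choosing: independent normals with \(\text{Var}(X_i)=\sqrt i\), so that \(\text{Var}(S_n)\sim \tfrac{2}{3}n^{3/2}=o(n^2)\) and the theorem applies with \(b_n=n\).

```python
import numpy as np

# Minimal numerical sketch of the Chebyshev WLLN (illustration only, not the proof).
# Setup (my own choice): independent X_i ~ Normal(mean = i, variance = sqrt(i)), so
# Var(S_n) = sum_i sqrt(i) ~ (2/3) n^{3/2}.  With b_n = n this is o(b_n^2), and the
# theorem predicts (S_n - E S_n)/b_n -> 0 in L^2 (hence in probability).
rng = np.random.default_rng(0)
reps = 200  # Monte Carlo repetitions used to estimate the second moment

for n in (10**3, 10**4, 10**5):
    i = np.arange(1, n + 1)
    ESn = i.sum()                                    # E(S_n) = 1 + 2 + ... + n
    z2 = 0.0
    for _ in range(reps):
        Sn = rng.normal(loc=i, scale=i**0.25).sum()  # scale^2 = sqrt(i) = Var(X_i)
        z2 += ((Sn - ESn) / n) ** 2                  # b_n = n
    print(n, z2 / reps)  # empirical E[((S_n - E S_n)/n)^2], should decrease toward 0
```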




uncorrelated, \(L^2\), and uniformly bounded variance conditions

Corollary (\(L^2\) Weak Law)

Let \(X_1,X_2,\ldots\) be uncorrelated \(L^2\) random variables with \(E(X_n)=\mu\) and \(\text{Var}(X_n)\le C<\infty\) for all \(n\ge 1\). Then \[ \frac{S_n}{n}\stackrel{L^2}\rightarrow \mu \mbox{ }\mbox{ }\mbox{ and consequently }\mbox{ }\mbox{ }\frac{S_n}{n}\stackrel{\text{Pr}}\rightarrow \mu. \]

Since the \(X_i\) are uncorrelated, \(\text{Var}(S_n)=\sum_{i=1}^n\text{Var}(X_i)\le Cn=o(n^2)\), and \(E(S_n)=n\mu\). Applying the Chebyshev WLLN with \(b_n=n\), \[ \frac{S_n-E(S_n)}{n}\stackrel{L^2}\rightarrow 0\implies \frac{S_n}{n}\stackrel{L^2}\rightarrow \mu\implies \frac{S_n}{n}\stackrel{\text{Pr}}\rightarrow \mu. \]
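A small simulation (my own setup; the terms below are independent, which is stronger than the uncorrelatedness the corollary needs) illustrating \(S_n/n\stackrel{\text{Pr}}\rightarrow \mu\):

```python
import numpy as np

# Quick check of the L^2 weak law (illustration only). Half the terms are
# Uniform(0, 2) (mean 1, variance 1/3) and half are Exponential(1) (mean 1,
# variance 1), so all terms share mu = 1 and Var(X_n) <= C = 1 uniformly.
rng = np.random.default_rng(1)

for n in (10**2, 10**4, 10**6):
    u = rng.uniform(0.0, 2.0, size=(n + 1) // 2)  # Uniform(0, 2) terms
    e = rng.exponential(1.0, size=n // 2)         # Exponential(1) terms
    Sn = u.sum() + e.sum()
    print(n, Sn / n)                              # should settle near mu = 1
```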


Theorem (Chebyshev WLLN for Random Arrays)

Suppose \(X_{n,i}\), \(1\le i\le m_n, n\ge 1\), are \(L^2\) random variables (defined on the same probability space), and let \(S_n=\sum_{i=1}^{m_n}X_{n,i}\), \(n \ge 1\). If for some sequence of positive constants \((b_n)\), \[ \frac{\text{Var}(S_n)}{b_n^2}\rightarrow 0 \mbox{ }\mbox{ }\mbox{ as }n\rightarrow \infty, \] then \[ \frac{S_n-E(S_n)}{b_n}\stackrel{L^2}\rightarrow 0\mbox{ }\mbox{ }\mbox{ and consequently}\mbox{ }\mbox{ }\frac{S_n-E(S_n)}{b_n}\stackrel{\text{Pr}}\rightarrow 0. \]





Example (Coupon Collecting)

Suppose we draw, with replacement, from a set of \(n\) distinct coupons, and let \(S_{n,m}\) be the number of draws needed to collect \(m\) distinct coupons out of the \(n\) \((\)so \(0\le m\le n)\). Then

\[ X_{n,i}=S_{n,i}-S_{n,i-1}\sim \text{Geometric}\left(p=\frac{n-(i-1)}{n}=1-\frac{i-1}{n}\right), \mbox{ }\mbox{ }\mbox{ }1\le i\le n, \] i.e., the number of draws needed to obtain the \(i\)-th distinct coupon is the number of draws up to the \(i\)-th distinct coupon minus the number of draws up to the \((i-1)\)-th, and this difference follows a geometric distribution.

Recall that a geometric distribution with success probability \(p\in(0,1]\) has mean \(\frac{1}{p}\) and variance \(\frac{1-p}{p^2}\le \frac{1}{p^2}\).

The number of draws needed to collect the complete set of \(n\) coupons is therefore \[ S_n=\sum_{i=1}^n X_{n,i}. \] Hence \[ E(S_n)=\sum_{i=1}^nE(X_{n,i})=\sum_{i=1}^n\frac{n}{n-(i-1)}=n\sum_{m=1}^n m^{-1}\implies \frac{E(S_n)}{n}=\sum_{m=1}^n m^{-1}, \] and, since the \(X_{n,i}\) are independent, \[ \text{Var}(S_n)=\sum_{i=1}^n\text{Var}(X_{n,i})\le \sum_{i=1}^n\left(\frac{n}{n-(i-1)}\right)^2=n^2\sum_{m=1}^n m^{-2}\le n^2\sum_{m=1}^\infty m^{-2}. \]

Now apply the theorem above with \(m_n=n\) and \(b_n=n\log n\): \[ \frac{\text{Var}(S_n)}{b_n^2}\le \frac{n^2\sum_{m=1}^\infty m^{-2}}{(n\log n)^2}= \frac{\sum_{m=1}^\infty m^{-2}}{(\log n)^2}\rightarrow 0 \mbox{ }\mbox{ }\mbox{ }\mbox{ as }n\rightarrow \infty, \] so by the Chebyshev WLLN \[ \frac{S_n-E(S_n)}{b_n}=\frac{S_n-n\sum_{m=1}^n m^{-1}}{n \log n}\stackrel{\text{Pr}}\rightarrow 0. \] Consequently, \[ \frac{S_n}{n\log n}= \frac{S_n-n\sum_{m=1}^n m^{-1}}{n \log n}+ \frac{n\sum_{m=1}^n m^{-1}}{n \log n}\stackrel{\text{Pr}}\rightarrow 0+1=1. \] For example, with \(n=500\), we can predict the number of draws needed for a complete set to be about \(500\log 500\approx 3107\).
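A short simulation of this example (a sketch; the helper `draws_until_complete` below is my own):

```python
import numpy as np

# Simulation sketch of the coupon-collecting example. `draws_until_complete` draws
# coupon labels uniformly with replacement until all n types have appeared and
# returns the number of draws S_n.
rng = np.random.default_rng(2)

def draws_until_complete(n, rng):
    seen = np.zeros(n, dtype=bool)
    distinct, draws = 0, 0
    while distinct < n:
        c = rng.integers(n)       # uniformly random coupon label in {0, ..., n-1}
        draws += 1
        if not seen[c]:
            seen[c] = True
            distinct += 1
    return draws

for n in (100, 500, 2000):
    ratios = [draws_until_complete(n, rng) / (n * np.log(n)) for _ in range(20)]
    # S_n / (n log n) -> 1 in probability, but only slowly: the harmonic-sum
    # correction sum_{m<=n} 1/m - log n vanishes only on a log scale.
    print(n, round(float(np.mean(ratios)), 3))
```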




Independence

Theorem (A Weak Law for Triangular Arrays)

For each \(n\ge 1\), suppose that \(X_{n,i},1\le i\le m_n\) are independent random variables, and let \(S_n=\sum_{i=1}^{m_n} X_{n,i}\). Suppose further that \(0<b_n\rightarrow \infty\) and define \[ X_{n,i}^*=X_{n,i} I_{\{|X_{n,i}|\le b_n\}},\mbox{ }\mbox{ }\mbox{ and }\mbox{ }\mbox{ } a_n=\sum_{i=1}^{m_n}E(X_{n,i}^*),\mbox{ }\mbox{ }n\ge 1. \] If both

  1. \(\sum_{i=1}^{m_n}P(|X_{n,i}|>b_n)\rightarrow 0\) as \(n\rightarrow \infty\), and

  2. \(\frac{1}{b_n^2}\sum_{i=1}^{m_n}E({X_{n,i}^*}^2)\rightarrow 0\) as \(n\rightarrow \infty\),

then \[ \frac{S_n-a_n}{b_n}\stackrel{\text{Pr}}{\rightarrow}0\mbox{ }\mbox{ }\mbox{ }\mbox{ as }n\rightarrow \infty. \]
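A hedged illustration with an example of my own: Pareto variables with tail index \(3/2\) have infinite variance, so none of the \(L^2\) results above apply, yet conditions 1 and 2 hold with \(m_n=n\) and \(b_n=n\), and \(a_n\) can be computed in closed form.

```python
import numpy as np

# Illustration of the triangular-array weak law (my own example, not from the text):
# i.i.d. rows X_{n,i} ~ Pareto with P(X > x) = x^{-3/2} for x >= 1, so the variance
# is infinite.  With m_n = n and b_n = n:
#   (1) n P(X > n) = n^{-1/2} -> 0,
#   (2) (1/n^2) * n * E[X^2 1{X <= n}] = 3(sqrt(n) - 1)/n -> 0,
# and a_n = n E[X 1{X <= n}] = 3n - 3 sqrt(n) in closed form, so (S_n - a_n)/n -> 0
# in probability; equivalently S_n/n -> 3 = E(X).
rng = np.random.default_rng(3)
alpha = 1.5

for n in (10**3, 10**5, 10**7):
    X = rng.pareto(alpha, size=n) + 1.0  # classical Pareto(3/2): P(X > x) = x^{-3/2}
    an = 3 * n - 3 * np.sqrt(n)          # a_n for this tail, computed analytically
    print(n, (X.sum() - an) / n)         # should be small and shrink (slowly) with n
```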





The i.i.d. condition (weak laws usually do not require i.i.d., but Feller's WLLN does)

Theorem (Feller’s Weak Law of Large Numbers)

Let \(X_1,X_2,\ldots\) be i.i.d. random variables with \[ nP(|X_1|>n)\rightarrow 0 \mbox{ }\mbox{ }\mbox{ as }n\rightarrow \infty. \] Let \(\mu_n=E(X_1 I_{\{|X_1|\le n\}})\). Then, \[ \frac{S_n}{n}-\mu_n \stackrel{\text{Pr}}\rightarrow 0. \]
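A sketch with a distribution of my own choosing, where the theorem applies even though \(E|X_1|=\infty\): symmetric tails \(P(|X_1|>x)=1/(x\log x)\) give \(nP(|X_1|>n)=1/\log n\rightarrow 0\) and \(\mu_n=0\), so \(S_n/n\stackrel{\text{Pr}}\rightarrow 0\) (the sampler uses scipy's Lambert \(W\) to invert the tail).

```python
import numpy as np
from scipy.special import lambertw

# Sketch of Feller's WLLN on an example of my own choosing: X symmetric with tail
# P(|X| > x) = 1/(x log x) for x >= x0 (where x0 log x0 = 1).  Then E|X| = infinity,
# yet n P(|X_1| > n) = 1/log(n) -> 0 and mu_n = 0 by symmetry, so S_n/n -> 0 in
# probability.  Inverse-transform sampling: |X| log|X| = 1/U gives |X| = t/W(t)
# with t = 1/U and W the Lambert W function.
rng = np.random.default_rng(4)

def sample(n, rng):
    u = 1.0 - rng.uniform(size=n)              # Uniform on (0, 1]
    t = 1.0 / u
    mag = t / lambertw(t).real                 # |X| = t / W(t) >= x0
    return rng.choice([-1.0, 1.0], size=n) * mag

for n in (10**3, 10**5, 10**7):
    # Convergence is genuinely slow here (the natural scale is about 1/sqrt(log n)),
    # so expect visible fluctuations around 0 rather than tiny values.
    print(n, sample(n, rng).sum() / n)
```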




Summary
  1. Chebyshev WLLN: if the \(X_n\) are \(L^2\) and \(\text{Var}(S_n)=o(b_n^2)\), then \[ \frac{S_n-E(S_n)}{b_n}\stackrel{\text{Pr}}\rightarrow 0. \]


  2. \(L^2\) WLLN: if the \(X_n\) are uncorrelated \(L^2\) random variables with common mean \(\mu\) and uniformly bounded variances, then \[ \frac{S_n}{n}\stackrel{\text{Pr}}\rightarrow \mu. \]

  3. WLLN for Triangular Arrays: suppose the \(X_{n,i}\) are independent and set \(X_{n,i}^*=X_{n,i}I_{\{|X_{n,i}|\le b_n\}}\). If

    • \(\sum_{i=1}^{m_n}P(|X_{n,i}|>b_n)\rightarrow 0\) and

    • \(\frac{1}{b_n^2}\sum_{i=1}^{m_n}E({X_{n,i}^*}^2)\rightarrow 0\), then

    \[\frac{S_n-a_n}{b_n}\stackrel{\text{Pr}}{\rightarrow}0, \mbox{ }\mbox{ }\mbox{ where }a_n=\sum_{i=1}^{m_n}E(X_{n,i}^*).\]


  4. Feller WLLN: suppose the \(X_n\) are i.i.d. with \(nP(|X_1|>n)\rightarrow 0\), and let \(\mu_n=E(X_1 I_{\{|X_1|\le n\}})\). Then

\[ \frac{S_n}{n}-\mu_n \stackrel{\text{Pr}}\rightarrow 0. \]



