Important (remember the conditions carefully)

Theorem(Kolmogorov’s Maximal Inequality)

Suppose that \(X_1,\ldots, X_n\) are independent random variables with mean 0 and finite variances, and let \(S_j=\sum_{i=1}^j X_i\), \(j\ge 1\). Then for any \(\alpha>0\), \[ P\left(\max_{1\le j\le n} |S_j|\ge \alpha \right)\le \frac{1}{\alpha^2}\text{Var}(S_n). \]
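
A quick numerical sanity check (a minimal sketch assuming NumPy; the Uniform\((-1,1)\) increments and the constants `n`, `alpha`, `n_trials` are illustrative choices, not part of the theorem):

```python
import numpy as np

# Minimal Monte Carlo sketch of Kolmogorov's maximal inequality.
# Estimate P(max_{1<=j<=n} |S_j| >= alpha) and compare with Var(S_n)/alpha^2.
# (Illustrative constants; NumPy assumed.)
rng = np.random.default_rng(0)
n, alpha, n_trials = 100, 15.0, 100_000

# X_i ~ Uniform(-1, 1): independent, mean 0, Var(X_i) = 1/3, so Var(S_n) = n/3.
X = rng.uniform(-1.0, 1.0, size=(n_trials, n))
S = np.cumsum(X, axis=1)                               # partial sums S_1, ..., S_n
lhs = np.mean(np.max(np.abs(S), axis=1) >= alpha)      # empirical P(max_j |S_j| >= alpha)
rhs = (n / 3.0) / alpha**2                             # Kolmogorov bound Var(S_n)/alpha^2

print(f"P(max|S_j| >= {alpha}) ~= {lhs:.4f}  <=  bound {rhs:.4f}")
```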



Important (remember the conditions carefully)

Theorem(Etemadi’s maximal inequality)

If \(X_1,\ldots,X_n\) are independent random variables, then for any \(\alpha>0\), \[ P(\max_{1\le j\le n}|S_j|\ge 3\alpha)\le 3\max_{1\le j\le n}P(|S_j|\ge \alpha) \]
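
Since no moment assumptions are required, the bound applies even to heavy-tailed increments where Kolmogorov's inequality does not. A minimal simulation sketch (NumPy assumed; the Cauchy increments and the constants are illustrative choices):

```python
import numpy as np

# Sketch of Etemadi's inequality with standard Cauchy increments
# (no mean, no variance), where Kolmogorov's inequality does not apply.
# Compare P(max_j |S_j| >= 3*alpha) with 3 * max_j P(|S_j| >= alpha).
rng = np.random.default_rng(1)
n, alpha, n_trials = 50, 200.0, 100_000

X = rng.standard_cauchy(size=(n_trials, n))
S = np.cumsum(X, axis=1)                               # partial sums S_1, ..., S_n

lhs = np.mean(np.max(np.abs(S), axis=1) >= 3 * alpha)  # P(max_j |S_j| >= 3*alpha)
rhs = 3 * np.max(np.mean(np.abs(S) >= alpha, axis=0))  # 3 * max_j P(|S_j| >= alpha)

print(f"P(max|S_j| >= {3 * alpha}) ~= {lhs:.4f}  <=  {rhs:.4f}")
```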





매우 중요하다.

Theorem(Kolmogorov’s Convergence Criterion)

If \(X_1, X_2,\ldots\) are independent mean 0 random variables with \(\sum_{n=1}^\infty \text{Var}(X_n)<\infty\), then \(\sum_{n=1}^\infty X_n\) converges a.s. and in \(L^2\). Moreover, \(E(\sum_{n=1}^\infty X_n)=0\) and \(\text{Var}(\sum_{n=1}^\infty X_n)= \sum_{n=1}^\infty \text{Var}(X_n)\).
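
For instance, take \(X_n=\varepsilon_n/n\) with i.i.d. signs \(P(\varepsilon_n=\pm 1)=\frac12\). Then \[ \sum_{n=1}^\infty \text{Var}(X_n)=\sum_{n=1}^\infty \frac{1}{n^2}<\infty, \] so the random harmonic series \(\sum_{n=1}^\infty \varepsilon_n/n\) converges a.s. and in \(L^2\), even though \(\sum_{n=1}^\infty 1/n\) diverges.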





Theorem(Lévy’s Theorem)

If \(X_1,X_2,\ldots\) are independent random variables, then as \(n\rightarrow \infty\),

\(S_n\rightarrow S_\infty\) a.s. (for some random variable \(S_\infty\)) \(\iff S_n\stackrel{\text{Pr}}\rightarrow S_\infty\).
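
For example, this gives one route to the a.s. part of Kolmogorov’s Convergence Criterion above: \[ \sum_{n=1}^\infty \text{Var}(X_n)<\infty \implies S_n\rightarrow S_\infty \text{ in } L^2 \implies S_n\stackrel{\text{Pr}}\rightarrow S_\infty \implies S_n\rightarrow S_\infty \text{ a.s.}, \] where the last implication is exactly Lévy’s theorem.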




Corollary

If \(X_1,X_2,\ldots\) are independent, mean 0, uniformly bounded random variables, then

\(\sum_{n=1}^\infty X_n\) converges a.s. \(\iff\) \(\sum_{n=1}^\infty \text{Var}(X_n)<\infty\).
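
For instance, with \(X_n=\varepsilon_n/n^{p}\) and fair signs \(\varepsilon_n=\pm 1\): the \(X_n\) are independent, mean 0, and bounded by 1, with \(\text{Var}(X_n)=n^{-2p}\), so \[ \sum_{n=1}^\infty \frac{\varepsilon_n}{n^{p}}\ \text{converges a.s.} \iff \sum_{n=1}^\infty \frac{1}{n^{2p}}<\infty \iff p>\frac12. \]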




Lemma

If \(X_1,X_2,\ldots\) and \(X_1^*,X_2^*,\ldots\) are sequences of random variables with identical finite-dimensional distributions, i.e., with \[ (X_1,\ldots,X_n)\sim (X_1^*,\ldots,X_n^*)\quad \forall\, n\ge 1, \] then \(X_n\) converges a.s. \(\iff\) \(X_n^*\) converges a.s.
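
One way to see this: by the Cauchy criterion, \[ \{X_n \text{ converges}\}=\bigcap_{k\ge 1}\bigcup_{N\ge 1}\bigcap_{m,n\ge N}\left\{|X_m-X_n|\le \tfrac{1}{k}\right\}, \] and the probability of this event is a monotone limit of probabilities of events involving only finitely many coordinates, hence is determined by the finite-dimensional distributions.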




Lemma

For independent, uniformly bounded random variables \(X_n,n\ge 1\),

if \(\sum_{n=1}^\infty X_n\) converges a.s., then \(\sum_{n=1}^\infty E(X_n)\) converges.
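
For example, take \(X_n=\frac{1}{n}+\frac{\varepsilon_n}{n}\) with fair signs \(\varepsilon_n=\pm 1\): the \(X_n\) are independent and uniformly bounded, and \(\sum_{n=1}^\infty E(X_n)=\sum_{n=1}^\infty \frac{1}{n}=\infty\), so by the contrapositive \(\sum_{n=1}^\infty X_n\) cannot converge a.s. (indeed \(\sum_n \varepsilon_n/n\) converges a.s. while \(\sum_n 1/n\) diverges).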





Summary

  1. Kolmogorov’s Convergence Criterion: when the \(X_n\) are independent with mean 0 and \(\sum_{n=1}^\infty \text{Var}(X_n)<\infty\),

    • \(\sum_{n=1}^\infty X_n\) converges a.s. (one can think of it as: the tail of the series has vanishing variance, which forces a.s. convergence).

    • It converges in \(L^2\) as well (perhaps an unsurprising consequence of the sum of variances being finite).

    • \(E(\sum_{n=1}^\infty X_n)=0\) and \(\text{Var}(\sum_{n=1}^\infty X_n)= \sum_{n=1}^\infty \text{Var}(X_n)\).


  2. Lévy’s Theorem: when the \(X_n\) are independent, a.s. convergence and convergence in probability of \(S_n\) are equivalent (used very frequently).

    • \(S_n \rightarrow S_\infty\) a.s. \(\iff S_n\stackrel{\text{Pr}}\rightarrow S_\infty\)


  3. \(\{X_n\}\) independent, mean 0, uniformly bounded: \(\sum_{n=1}^\infty X_n\) converges a.s.\(\iff\) \(\sum_{n=1}^\infty\text{Var}(X_n)<\infty.\)

    • By Kolmogorov’s Convergence Criterion alone only \(\Longleftarrow\) holds; with the added uniform boundedness condition the two become equivalent.


  4. \(\{X_n\}\) independent, uniformly bounded: \(\sum_{n=1}^\infty X_n\) converges a.s.\(\implies\sum_{n=1}^\infty E(X_n)\) converges.


