Definition

The \(\sigma\)-field generated by a random variable \(X\), denoted \(\sigma(X)\), is the smallest \(\sigma\)-field with respect to which \(X\) is measurable (as a mapping into \((\mathbb{R},\mathcal{R})\)).
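For intuition, when \(\Omega\) is finite and \(X\) takes finitely many values, \(\sigma(X)\) can be enumerated explicitly as \(\{X^{-1}(B):B\subseteq \mathrm{range}(X)\}\). A minimal Python sketch (the four-point \(\Omega\) and the map \(X\) are illustrative assumptions):

```python
from itertools import combinations

# Illustrative finite sample space and a random variable X on it.
omega = ["a", "b", "c", "d"]
X = {"a": 0, "b": 0, "c": 1, "d": 2}  # X maps a,b -> 0, c -> 1, d -> 2

values = set(X.values())

def preimage(B):
    """X^{-1}(B) = {omega : X(omega) in B}."""
    return frozenset(w for w in omega if X[w] in B)

# sigma(X) = { X^{-1}(B) : B subset of range(X) } -- the smallest
# sigma-field making X measurable.
subsets_of_range = [set(c) for r in range(len(values) + 1)
                    for c in combinations(values, r)]
sigma_X = {preimage(B) for B in subsets_of_range}

# Here sigma(X) has 2^3 = 8 members: every union of the atoms
# {a,b}, {c}, {d}, together with the empty set and Omega itself.
print(len(sigma_X))  # -> 8
```

Note that \(\{a\}\) alone is not in \(\sigma(X)\): since \(X(a)=X(b)\), no preimage can separate \(a\) from \(b\).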


Similarly, the \(\sigma\)-field generated by a random vector \(X=(X_1,\ldots,X_k)\), again denoted \(\sigma(X)\), is the smallest \(\sigma\)-field with respect to which \(X\) is measurable (as a mapping into \((\mathbb{R}^k,\mathcal{R}^k)\)).


Finally, the \(\sigma\)-field generated by an arbitrary collection of random variables \(\{X_t,t\in T\}\) (defined on a common probability space \((\Omega, \mathcal{F},P)\)) is the smallest \(\sigma\)-field with respect to which all \(X_t,t\in T\) are measurable. This \(\sigma\)-field is denoted \(\sigma(X_t,t\in T)\).



Theorem

Let \(X=(X_1,\ldots,X_k)\) be a random vector. Then

  1. \(\sigma(X)=\sigma(X_1,\ldots,X_k)=\{ X^{-1}(H):H\in \mathcal{R}^k\}\).

  2. A random variable \(Y\) is \(\sigma(X)\)-measurable if and only if \(Y=f(X)\) for some Borel measurable function \(f:\mathbb{R}^k\rightarrow \mathbb{R}\).

    • \(Y=f(X)\) is the composition of the measurable mappings \(X\) and \(f\). For example, for \(k\ge 2\), \(Y=X_1+X_2\) is \(\sigma(X)\)-measurable, with \(f(x_1,\ldots,x_k)=x_1+x_2\). Note: \[ (\Omega, \sigma(X))\stackrel{X}{\longrightarrow}(\mathbb{R}^k,\mathcal{R}^k)\stackrel{f}{\longrightarrow}(\mathbb{R},\mathcal{R}). \]



Proposition

For a random variable \(X\),

\[ \sigma(X)=\sigma \left(\left\{\{\omega:X(\omega)\le x\}: x\in \mathbb{R} \right\} \right)=\sigma\left(\left\{X^{-1}((-\infty,x]): x\in \mathbb{R} \right\} \right). \]

This is important: since the half-lines \((-\infty,x]\) generate the Borel \(\sigma\)-field \(\mathcal{R}\), the events \(\{X\le x\}\) already generate all of \(\sigma(X)\).

Definition

Random variables (random vectors) \(X_1,\ldots,X_k\) are independent if the \(\sigma\)-fields \(\sigma(X_1),\ldots,\sigma(X_k)\) are independent, or equivalently, if \[ P(X_1\in H_1,\ldots,X_k\in H_k)= P(X_1\in H_1)\cdots P(X_k\in H_k) \mbox{ }\mbox{ }\mbox{ for all }H_1,\ldots,H_k\in \mathcal{R}^1. \]



Theorem

Random variables \(X_1,\ldots,X_k\) are independent if and only if \[ \mu=\mu_1\times\cdots\times\mu_k \mbox{ }\mbox{ }\mbox{ (product measure)}, \] where \(\mu\) is the distribution of \(X=(X_1,\ldots,X_k)\) and \(\mu_i\) is the distribution of \(X_i\), or equivalently, \[ F(x)=F_1(x_1)\cdots F_k(x_k)\mbox{ }\mbox{ }\mbox{ for all } x=(x_1,\ldots,x_k)\in \mathbb{R}^k, \] where \(F\) is the joint distribution function and \(F_i\) are the marginal distribution functions.
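The CDF factorization can be checked empirically by Monte Carlo. A minimal sketch for \(k=2\), assuming an illustrative pair of independent draws (a standard normal and a uniform) from Python's standard library:

```python
import random

random.seed(0)
N = 200_000

# Independent samples: X ~ N(0,1), Y ~ Uniform(0,1) (illustrative choices).
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [random.random() for _ in range(N)]

def F_joint(x, y):
    """Empirical joint CDF F(x, y) = P(X <= x, Y <= y)."""
    return sum(1 for a, b in zip(xs, ys) if a <= x and b <= y) / N

def F1(x):
    """Empirical marginal CDF of X."""
    return sum(1 for a in xs if a <= x) / N

def F2(y):
    """Empirical marginal CDF of Y."""
    return sum(1 for b in ys if b <= y) / N

# Independence <=> F(x, y) = F1(x) F2(y) at every point;
# check the factorization at a few sample points.
for x, y in [(-0.5, 0.3), (0.0, 0.5), (1.0, 0.9)]:
    assert abs(F_joint(x, y) - F1(x) * F2(y)) < 0.01
print("empirical CDF factorizes")
```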

Theorem

If \(X_1,\ldots,X_k\) are independent random variables and \(g_1,\ldots,g_k\) are Borel measurable functions, then \(g_1(X_1),\ldots,g_k(X_k)\) are independent random variables.
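A quick numerical illustration: applying the Borel functions \(g_1(x)=x^2\) and \(g_2(x)=\sin x\) (illustrative choices) to independent samples, the joint probabilities of the transformed variables still factorize:

```python
import math
import random

random.seed(1)
N = 100_000

# Independent inputs X1, X2 ~ N(0,1) (illustrative choice).
x1 = [random.gauss(0.0, 1.0) for _ in range(N)]
x2 = [random.gauss(0.0, 1.0) for _ in range(N)]

# Borel functions of independent inputs stay independent:
# U = g1(X1) = X1^2, V = g2(X2) = sin(X2).
u = [a * a for a in x1]
v = [math.sin(b) for b in x2]

def p(pred_u, pred_v):
    """Empirical P(U satisfies pred_u, V satisfies pred_v)."""
    return sum(1 for a, b in zip(u, v) if pred_u(a) and pred_v(b)) / N

# Check P(U in H1, V in H2) ~= P(U in H1) P(V in H2) on a sample event.
lhs = p(lambda a: a <= 1.0, lambda b: b <= 0.0)
rhs = (p(lambda a: a <= 1.0, lambda b: True)
       * p(lambda a: True, lambda b: b <= 0.0))
assert abs(lhs - rhs) < 0.01
```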




Theorem

If \(X\) and \(Y\) are independent random variables, either both nonnegative or both integrable, then \[ E(XY)=E(X)E(Y). \] * The point of this theorem is not only that the expectation splits: remarkably, it also shows that if \(X,Y \in L^1\) and \(X,Y\) are independent, then \(XY\in L^1\).
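The identity is easy to sanity-check by Monte Carlo; the distributions \(N(2,1)\) and \(\mathrm{Uniform}(0,1)\) below are illustrative assumptions:

```python
import random

random.seed(2)
N = 200_000

# Independent X ~ N(2,1) and Y ~ Uniform(0,1) (illustrative choices).
xs = [random.gauss(2.0, 1.0) for _ in range(N)]
ys = [random.random() for _ in range(N)]

E_X = sum(xs) / N                              # ~ 2.0
E_Y = sum(ys) / N                              # ~ 0.5
E_XY = sum(x * y for x, y in zip(xs, ys)) / N  # ~ 1.0

# E(XY) = E(X) E(Y) up to Monte Carlo error.
assert abs(E_XY - E_X * E_Y) < 0.02
```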



This is important.

Theorem

Suppose that \(X\) and \(Y\) are independent random vectors (\(k\) and \(m\) dimensional, respectively) with respective distributions \(P_X\) and \(P_Y\). Let \(g:\mathbb{R}^{k+m}\rightarrow \mathbb{R}\) be a Borel measurable function, and let \(A\in \mathcal{R}^m\). If either \(g\) is nonnegative or \(g(X,Y)\) is integrable, then \[ E[g(X,Y)I_A(Y)]=\int_A E[g(X,y)]dP_Y(y). \] * Proof: Since \(X\) and \(Y\) are independent, by the fourth theorem of this chapter the joint distribution of \((X,Y)\) is the product of the individual distributions (product measure). Hence, by the change of variable theorem and Fubini's theorem,

\[\begin{eqnarray*} E[g(X,Y)I_A(Y)] &=& \int_{\Omega}g(X(\omega),Y(\omega))I_{A}(Y(\omega))\mbox{ }dP(\omega) \\ &=&\int_{\mathbb{R}^{k+m}}g(x,y)I_A(y)\mbox{ }d(P_X \times P_Y)(x,y)\\ &=&\int_{\mathbb{R}^{m}}\int_{\mathbb{R}^{k}}g(x,y)I_A(y)\mbox{ }d P_X(x)\mbox{ }dP_Y(y)\\ &=&\int_{\mathbb{R}^{m}}I_A(y)\left[\int_{\mathbb{R}^{k}}g(x,y)\mbox{ }d P_X(x)\right]dP_Y(y)\\ &=&\int_{A}E\left[g(X,y)\right]dP_Y(y). \end{eqnarray*}\]
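The theorem can also be sanity-checked numerically. A sketch, assuming the illustrative choices \(X\sim N(0,1)\), \(Y\sim \mathrm{Uniform}(0,1)\), \(g(x,y)=x^2+y\), and \(A=[0,0.5]\), for which the right-hand side evaluates in closed form to \(\int_0^{1/2}(1+y)\,dy = 0.625\):

```python
import random

random.seed(3)
N = 400_000

# Independent X ~ N(0,1), Y ~ Uniform(0,1); g(x, y) = x^2 + y, A = [0, 0.5].
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [random.random() for _ in range(N)]

# Left side: E[g(X, Y) 1_A(Y)] by Monte Carlo.
lhs = sum(x * x + y for x, y in zip(xs, ys) if y <= 0.5) / N

# Right side: E[g(X, y)] = E[X^2] + y = 1 + y, and P_Y is Lebesgue
# measure on [0, 1], so int_0^{0.5} (1 + y) dy = 0.5 + 0.125 = 0.625.
rhs = 0.625

assert abs(lhs - rhs) < 0.02
```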


