MLE example: this is very, very important.

Theorem (Asymptotic Normality of MLE)

Suppose that \(z_1,\ldots,z_n\) are i.i.d. random variables with p.d.f. \(f(\cdot;\theta_0)\). Let \(J=E(\{d\log f(z;\theta)/d\theta\}_{\theta=\theta_0}\{d\log f(z;\theta)/d\theta'\}_{\theta=\theta_0})\) be the information matrix, and let \(H=E(\{d^2\log f(z;\theta)/d\theta d\theta'\}_{\theta=\theta_0})\) be the expected Hessian matrix. Then, for the MLE \(\hat\theta\) of \(\theta_0\), \[ \sqrt{n}(\hat\theta-\theta_0)\stackrel{d}\rightarrow N(0,J^{-1}). \]
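As a quick illustration of these objects (the exponential model here is only an example, not part of the theorem), take \(f(z;\theta)=\theta e^{-\theta z}\) with scalar \(\theta>0\). Then \[ \frac{d\log f(z;\theta)}{d\theta}=\frac{1}{\theta}-z,\qquad \frac{d^2\log f(z;\theta)}{d\theta^2}=-\frac{1}{\theta^2}, \] so \(J=E\{(1/\theta_0-z)^2\}=\operatorname{Var}(z)=1/\theta_0^2\), \(H=-1/\theta_0^2\), and the MLE \(\hat\theta=1/\bar z\) satisfies \(\sqrt{n}(\hat\theta-\theta_0)\stackrel{d}\rightarrow N(0,\theta_0^2)\).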

To sketch the proof, expand the first-order condition \(n^{-1}\sum_{i=1}^n \log' f(z_i;\hat\theta)=0\) around \(\theta_0\) by the mean value theorem: \[ \sqrt{n}(\hat\theta-\theta_0)=-\Big\{n^{-1}\sum_{i=1}^n \log'' f(z_i;\bar\theta)\Big\}^{-1}n^{-1/2}\sum_{i=1}^n \log' f(z_i;\theta_0), \] where \(\bar\theta\) lies between \(\hat\theta\) and \(\theta_0\). Now, note that \(\bar\theta\stackrel{P}\rightarrow \theta_0\), because \(\hat\theta\stackrel{P}\rightarrow \theta_0\). Thus \(\{n^{-1}\sum_{i=1}^n \log'' f(z_i;\bar\theta)\}^{-1}\stackrel{P}\rightarrow \{E(d^2\log f(z;\theta_0)/d\theta d\theta')\}^{-1}=H^{-1}\) by a uniform law of large numbers and the continuous mapping theorem.

Also, note that \(n^{-1/2}\sum_{i=1}^n \log' f(z_i;\theta_0)\stackrel{d}\rightarrow N(0,J)\) by the CLT, since the score has mean zero and variance \(J\) at \(\theta_0\). By Slutsky's theorem, we have \[ \sqrt{n}(\hat\theta-\theta_0)\stackrel{d}\rightarrow -H^{-1}N(0,J)\equiv N(0,H^{-1}JH^{-1}) \equiv N(0,J^{-1}), \] where the last equality follows from the information matrix equality \(H=-J\).
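To see the theorem numerically, here is a minimal simulation sketch for the exponential example above (the model, the constants, and the use of NumPy are assumptions for illustration, not part of the notes); it compares the Monte Carlo variance of \(\sqrt{n}(\hat\theta-\theta_0)\) with \(J^{-1}=\theta_0^2\).

```python
import numpy as np

rng = np.random.default_rng(0)

theta0 = 2.0          # true rate of Exponential(theta0), density theta*exp(-theta*z)
n, reps = 500, 2000   # sample size and number of Monte Carlo replications

# MLE of the exponential rate: theta_hat = 1 / sample mean
z = rng.exponential(scale=1.0 / theta0, size=(reps, n))
theta_hat = 1.0 / z.mean(axis=1)

# Theorem: sqrt(n) * (theta_hat - theta0) is approximately N(0, J^{-1}), J^{-1} = theta0^2
scaled = np.sqrt(n) * (theta_hat - theta0)
print("Monte Carlo variance:", scaled.var())
print("J^{-1} = theta0^2   :", theta0 ** 2)
```

With \(\theta_0=2\) the reported variance should be close to \(4\), and a histogram of `scaled` would look approximately normal.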

Theorem (The Basic Results)

Suppose that \(\hat\theta\) satisfies eq. (1) in 1.1 and that \(\hat\theta\stackrel{P}\rightarrow \theta_0\). Also, suppose that

  1. \(\theta_0\in \text{interior}(\Theta)\),
  2. \(\hat Q_n(\theta)\) is twice continuously differentiable in a neighborhood \(\mathcal{N}\) of \(\theta_0\),
  3. \(\sqrt{n}\,d\hat Q_n(\theta_0)/d\theta \stackrel{d}\rightarrow N(0,\Sigma)\),
  4. There is \(H(\theta)\) that is continuous at \(\theta_0\) and \(\sup_{\theta\in \mathcal{N}}||d^2\hat Q_n(\theta)/d\theta d\theta' -H(\theta)||\stackrel{P}\rightarrow 0\),
  5. \(H=H(\theta_0)\) is nonsingular.

Then \(\sqrt{n}(\hat\theta-\theta_0)\stackrel{d}\rightarrow N(0,H^{-1}\Sigma H^{-1}).\)
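As a sketch of how the sandwich variance \(H^{-1}\Sigma H^{-1}\) is estimated in practice, the following assumes least squares with heteroskedastic errors as the extremum estimator, i.e. \(\hat Q_n(\theta)=n^{-1}\sum_i (y_i-x_i'\theta)^2\); this concrete model (and the NumPy code) is an added assumption, not part of the theorem.

```python
import numpy as np

rng = np.random.default_rng(1)

# Least squares as an extremum estimator with heteroskedastic errors,
# so that Sigma differs from H and the sandwich form matters.
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta0 = np.array([1.0, -0.5])
u = rng.normal(size=n) * (0.5 + np.abs(X[:, 1]))   # error variance depends on x
y = X @ beta0 + u

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)       # minimizer of Q_n
resid = y - X @ beta_hat

# Here H(theta0) = 2*E[x x'] and Sigma = 4*E[u^2 x x']; the factors of 2
# cancel in H^{-1} Sigma H^{-1}, so we drop them and use sample analogues.
H_hat = X.T @ X / n
Sigma_hat = (X * resid[:, None]).T @ (X * resid[:, None]) / n
avar = np.linalg.inv(H_hat) @ Sigma_hat @ np.linalg.inv(H_hat)

print("estimated sandwich variance of sqrt(n)*(beta_hat - beta0):")
print(avar)
```

Dividing `avar` by \(n\) gives the usual heteroskedasticity-robust variance estimate for \(\hat\theta\) itself.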