Definition

Let \(W\) be a subspace of \(V\). The orthogonal complement of \(W\) is \[ W^\perp:=\{ v \in V: v\perp w \mbox{ for all } w \in W\} \]   
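For example, in \(V=\mathbb{R}^3\) with the standard inner product, if \(W=\text{span}\{(1,0,0)'\}\), then \[ W^\perp=\{(x,y,z)'\in\mathbb{R}^3 : x=0\}=\text{span}\{(0,1,0)',(0,0,1)'\}. \]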

Theorem

Let \(W\) be a subspace of \(V\) and let \(W^\perp\) be its orthogonal complement. Then

  1. Every \(x\in V\) can be written uniquely as \(x=x_0+x_1\), with \(x_0\in W\) and \(x_1\in W^\perp\).

  2. \(\text{dim}(V)=\text{dim}(W)+\text{dim}(W^\perp)\).
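
As a quick illustration of part 1, take \(V=\mathbb{R}^2\) with the standard inner product, \(W=\text{span}\{w\}\) with \(w=(1,1)'\), and \(x=(3,1)'\). Projecting \(x\) onto \(W\) gives \[ x_0=\frac{(x,w)}{(w,w)}\,w=\frac{4}{2}(1,1)'=(2,2)', \qquad x_1=x-x_0=(1,-1)', \] and indeed \(x_1\perp w\). Part 2 also checks out: \(\text{dim}(W)+\text{dim}(W^\perp)=1+1=2=\text{dim}(V)\).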

  

Theorem

Let \(A\) be an \(n\times n\) matrix. Then, \(\text{dim}(C(A))+\text{dim}(N(A))=n.\)
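For instance, \[ A=\begin{pmatrix}1&2\\2&4\end{pmatrix} \] has \(C(A)=\text{span}\{(1,2)'\}\) and \(N(A)=\text{span}\{(2,-1)'\}\), so \(\text{dim}(C(A))+\text{dim}(N(A))=1+1=2=n\).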

  

Definition

Let \(A\) be an \(n\times n\) matrix. The scalar \(\lambda\) is called an eigenvalue of \(A\) if \(A-\lambda I\) is singular, i.e., there exists a vector \(v\ne 0\) such that \(Av=\lambda v\).

Any such vector is called an eigenvector corresponding to \(\lambda\).
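
For example, with \[ A=\begin{pmatrix}2&1\\1&2\end{pmatrix}, \] the matrix \(A-\lambda I\) is singular exactly when \(\det(A-\lambda I)=(2-\lambda)^2-1=0\), which gives \(\lambda_1=3\) and \(\lambda_2=1\), with eigenvectors \(v_1=(1,1)'\) and \(v_2=(1,-1)'\).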

  

Definition (Adjoints)

Consider a general inner product on \(\mathbb{R}^n\). For any operator or matrix \(A\) mapping \(\mathbb{R}^n\) to \(\mathbb{R}^n\), there exists another operator or matrix \(A^*\), called the adjoint, that satisfies \[ (Ax,y)=(x,A^*y) \hspace{3mm} \mbox{ for all } x,y\in \mathbb{R}^n. \]

An operator \(A\) is self-adjoint if it is equal to its adjoint, i.e., \(A=A^*\), so that \((Ax,y)=(x,Ay)\) for all \(x,y\in\mathbb{R}^n\).
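
Under the standard inner product \((x,y)=x'y\), the adjoint is just the transpose: \((Ax,y)=(Ax)'y=x'A'y=(x,A'y)\), so \(A^*=A'\), and the self-adjoint matrices are exactly the symmetric ones. More generally, if \((x,y)=x'My\) for a symmetric positive definite matrix \(M\), the same computation shows \(A^*=M^{-1}A'M\).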

  

Lemma

If \(\lambda_1\) and \(\lambda_2\) are distinct eigenvalues of the symmetric matrix \(A\), and \(v_1\) and \(v_2\) are corresponding eigenvectors, then \(v_1 \perp v_2\).
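
The proof is a one-line use of self-adjointness: \[ \lambda_1(v_1,v_2)=(Av_1,v_2)=(v_1,Av_2)=\lambda_2(v_1,v_2), \] so \((\lambda_1-\lambda_2)(v_1,v_2)=0\), and since \(\lambda_1\ne\lambda_2\), we must have \((v_1,v_2)=0\). In the example above, \(v_1=(1,1)'\) and \(v_2=(1,-1)'\) are indeed orthogonal.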

  

Lemma

If nonzero vectors \(v_1,\ldots,v_k\) are pairwise orthogonal, then they are linearly independent. In particular, the eigenvectors in the previous lemma are linearly independent.
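
To see this, suppose \(c_1v_1+\cdots+c_kv_k=0\). Taking the inner product of both sides with \(v_j\) kills every term except the \(j^{th}\): \[ 0=\Big(\sum_{i=1}^k c_iv_i,\,v_j\Big)=c_j(v_j,v_j), \] and since \(v_j\ne 0\), every \(c_j=0\).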

  

Lemma

If \(A\) is symmetric, then \(C(A)\) and \(N(A)\) are orthogonal complements of each other.
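
Indeed, if \(v\in N(A)\), then for any \(Ax\in C(A)\), \[ (Ax,v)=(x,A'v)=(x,Av)=(x,0)=0, \] so \(C(A)\perp N(A)\); combined with \(\text{dim}(C(A))+\text{dim}(N(A))=n\) from the theorem above, each subspace is the orthogonal complement of the other. The matrix \(A=\begin{pmatrix}1&2\\2&4\end{pmatrix}\) from the earlier example shows this concretely: \((1,2)'\perp(2,-1)'\).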

  

Very important

Theorem (Spectral Decomposition)

Let \(A\) be a symmetric \(n\times n\) matrix. Then, we can write \[ A=PDP', \] where \(D=\text{diag}(\lambda_1,\lambda_2,\ldots,\lambda_n)\) and \(P\) is orthogonal.

The \(\lambda_i\) are the eigenvalues of \(A\), and the \(i^{th}\) column of \(P\) is an eigenvector corresponding to \(\lambda_i\).
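
Continuing the example above with \(A=\begin{pmatrix}2&1\\1&2\end{pmatrix}\): normalizing the eigenvectors \((1,1)'\) and \((1,-1)'\) gives \[ P=\frac{1}{\sqrt{2}}\begin{pmatrix}1&1\\1&-1\end{pmatrix}, \qquad D=\begin{pmatrix}3&0\\0&1\end{pmatrix}, \] and a direct computation confirms \(P'P=I\) and \(PDP'=A\).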