Overview & SVD

Spectral Decomposition

for a symmetric matrix $A\ (p \times p)$:

$$A = \Gamma \Lambda \Gamma'$$

where the $i$-th column $\pmb\gamma_i$ of $\Gamma$ is an evec of $A$ and $\Lambda = \text{diag}(\lambda_1, \dots, \lambda_p)$. Therefore $\Gamma$ is an orthogonal matrix.

let $A$ be a symmetric $p \times p$ matrix of rank $r$, $r \le p$. Then there exists an orthogonal matrix $\Gamma$, which means $\Gamma'\Gamma = \Gamma\Gamma' = I_p$, and

$$A = \Gamma \Lambda \Gamma'$$

here, by letting $\lambda_i$ be the $i$-th ev, $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p$, then

$$\Lambda = \text{diag}(\lambda_1, \dots, \lambda_p)$$

then $\Gamma' A \Gamma = \Lambda$, and since $\text{rank}(A) = r$, exactly $r$ of the $\lambda_i$ are non-zero.

let $\pmb\gamma_i$ be the $i$-th column vector of $\Gamma$. then

$$A \pmb\gamma_i = \lambda_i \pmb\gamma_i$$

thus

$$A = \sum_{i=1}^p \lambda_i \pmb\gamma_i \pmb\gamma_i'$$
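As a quick numerical check of the decomposition (a sketch using NumPy's `np.linalg.eigh`; the matrix `A` here is an arbitrary example, not from the text):

```python
import numpy as np

# np.linalg.eigh returns eigenvalues (ascending) and orthonormal
# eigenvectors of a symmetric matrix: the spectral decomposition A = G L G'.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
lam, G = np.linalg.eigh(A)          # lam: eigenvalues, G: columns are evecs

# G is orthogonal: G'G = I
assert np.allclose(G.T @ G, np.eye(2))

# Reconstruction: A = sum_i lam_i * g_i g_i'
A_rebuilt = sum(l * np.outer(g, g) for l, g in zip(lam, G.T))
assert np.allclose(A_rebuilt, A)
```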

remark

let $\Gamma$ be an orthogonal matrix; therefore $\Gamma'\Gamma = \Gamma\Gamma' = I$, $\Gamma^{-1} = \Gamma'$, and $|\Gamma| = \pm 1$.

let $A$ be a symmetric matrix with full rank. then by SVD,

$$A^{-1} = \Gamma \Lambda^{-1} \Gamma' = \sum_{i=1}^p \frac{1}{\lambda_i} \pmb\gamma_i \pmb\gamma_i'$$


in particular, a covariance matrix $\Sigma$ (symmetric, positive semi-definite) can be written as

$$\Sigma = \Gamma \Lambda \Gamma', \quad \lambda_1 \ge \cdots \ge \lambda_p \ge 0$$
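The full-rank inverse above can be verified numerically; a minimal sketch, where `A` is an arbitrary positive definite example and the inverse is rebuilt from its spectral decomposition:

```python
import numpy as np

# Inverse of a full-rank symmetric matrix via its spectral decomposition:
# A^{-1} = G diag(1/lam) G'. The matrix A below is an arbitrary example.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
lam, G = np.linalg.eigh(A)
A_inv = G @ np.diag(1.0 / lam) @ G.T
```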

Singular Value Decomposition: General Version

decomposition of an arbitrary $n \times p$ matrix $A$ with rank $r$, $r \le \min(n, p)$:

$$A = \Gamma \Lambda \Delta'$$

with $\Gamma\ (n \times r)$, $\Lambda = \text{diag}(\lambda_1, \dots, \lambda_r)$, $\Delta\ (p \times r)$. here the $\lambda_i^2$ are the non-zero ev of $A'A$ (equivalently, of $AA'$).

$\Gamma$ and $\Delta$ consist of the corresponding evec of $AA'$ and $A'A$, respectively. Therefore both $\Gamma$ and $\Delta$ are column orthogonal, i.e. $\Gamma'\Gamma = \Delta'\Delta = I_r$.

thus

\begin{alignat}{2} A &= \Gamma \Lambda \Delta' &&= \sum_{i=1}^r \lambda_i \pmb\gamma_i \pmb\delta_i' \\ A'A &= \Delta \Lambda^2 \Delta' &&= \sum_{i=1}^r \lambda_i^2 \pmb\delta_i \pmb\delta_i' \\ AA' &= \Gamma \Lambda^2 \Gamma' &&= \sum_{i=1}^r \lambda_i^2 \pmb\gamma_i \pmb\gamma_i' \\ \pmb\gamma_k' A &= \pmb\gamma_k' \Gamma \Lambda \Delta' &&= \lambda_k \pmb\delta_k' \\ A \pmb\delta_k &= \Gamma \Lambda \Delta' \pmb\delta_k &&= \lambda_k \pmb\gamma_k \end{alignat}

and for symmetric $A$ (so that $\Delta = \Gamma_1$):

\begin{alignat}{2} A &= \Gamma_1 \Lambda_1 \Gamma_1' &&= \sum_{i=1}^r \lambda_i \pmb\gamma_i \pmb\gamma_i' \\ A'A &= \Gamma_1 \Lambda_1^2 \Gamma_1' &&= \sum_{i=1}^r \lambda_i^2 \pmb\gamma_i \pmb\gamma_i' \\ AA' &= \Gamma_1 \Lambda_1^2 \Gamma_1' &&= \sum_{i=1}^r \lambda_i^2 \pmb\gamma_i \pmb\gamma_i' \\ \pmb\gamma_k' A &= \lambda_k \pmb\gamma_k' \pmb\gamma_k \pmb\gamma_k' &&= \lambda_k \pmb\gamma_k' \\ A \pmb\gamma_k &= \lambda_k \pmb\gamma_k \pmb\gamma_k' \pmb\gamma_k &&= \lambda_k \pmb\gamma_k \end{alignat}
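The identities above can be verified with NumPy's thin SVD (`np.linalg.svd` with `full_matrices=False`); the random matrix is an arbitrary example:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))            # arbitrary 5x3 example matrix

# Thin SVD: A = G @ diag(lam) @ D.T, with column-orthogonal G (5xr), D (3xr)
G, lam, Dt = np.linalg.svd(A, full_matrices=False)
D = Dt.T

# A'A = D diag(lam^2) D'   and   AA' = G diag(lam^2) G'
assert np.allclose(A.T @ A, D @ np.diag(lam**2) @ D.T)
assert np.allclose(A @ A.T, G @ np.diag(lam**2) @ G.T)

# gamma_k' A = lam_k delta_k'   and   A delta_k = lam_k gamma_k  (k = 0)
assert np.allclose(G[:, 0] @ A, lam[0] * D[:, 0])
assert np.allclose(A @ D[:, 0], lam[0] * G[:, 0])
```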

therefore, the generalized inverse (g-inverse) of $A$ will be

$$A^- = \Delta \Lambda^{-1} \Gamma', \quad \text{satisfying } A A^- A = A$$
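A minimal sketch of building this g-inverse from the thin SVD of a rank-deficient matrix (the construction of the example `A` is mine, chosen so its rank is 2):

```python
import numpy as np

rng = np.random.default_rng(1)
# Rank-deficient 4x3 example: third column duplicates the first
B = rng.standard_normal((4, 2))
A = np.column_stack([B[:, 0], B[:, 1], B[:, 0]])

# Thin SVD, keeping only the non-zero singular values (rank r = 2)
G, lam, Dt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(lam > 1e-10))
Gr, lr, Dr = G[:, :r], lam[:r], Dt[:r, :].T

# g-inverse A^- = D diag(1/lam) G'; it satisfies A A^- A = A
A_ginv = Dr @ np.diag(1.0 / lr) @ Gr.T
```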

Singular Value Decomposition: Another Version

an arbitrary $n \times p$ matrix $A$ with rank $r$ can also be decomposed as

$$A = \Gamma \Lambda \Delta'$$

where now $\Gamma\ (n \times n)$ and $\Delta\ (p \times p)$ are square and $\Lambda\ (n \times p)$ carries $\lambda_1, \dots, \lambda_r$ on its leading diagonal. here the $\lambda_i^2$ are the non-zero ev of $A'A$.

$\Gamma$ and $\Delta$ consist of the corresponding evec of $AA'$ and $A'A$, respectively. Therefore both $\Gamma$ and $\Delta$ are orthogonal, i.e. $\Gamma'\Gamma = I_n$ and $\Delta'\Delta = I_p$.
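To see how this version differs from the thin one, a sketch with NumPy's full SVD (`full_matrices=True`), where the orthogonal factors are square and the singular values sit in a rectangular block:

```python
import numpy as np

A = np.arange(12, dtype=float).reshape(4, 3)     # 4x3 example, rank 2

# Full SVD: square orthogonal factors G (4x4) and D' (3x3)
G, lam, Dt = np.linalg.svd(A, full_matrices=True)
assert G.shape == (4, 4) and Dt.shape == (3, 3)

# Rebuild A by embedding the singular values in a 4x3 block Lambda
L = np.zeros((4, 3))
np.fill_diagonal(L, lam)
assert np.allclose(G @ L @ Dt, A)
```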

Quadratic Forms

for a symmetric matrix $A\ (p \times p)$ and a vector $\pmb x \in \mathbb{R}^p$:

$$Q(\pmb x) = \pmb x' A \pmb x = \sum_{i=1}^p \sum_{j=1}^p a_{ij} x_i x_j$$

if the corresponding quadratic form satisfies $Q(\pmb x) > 0$ for all $\pmb x \ne \pmb 0$ ($Q(\pmb x) \ge 0$ for all $\pmb x$), $A$ is called positive definite (semi-definite). This is written $A > 0$ ($A \ge 0$).
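A small helper illustrating the definition via eigenvalues (the helper name `is_positive_definite` and both example matrices are mine, not from the text):

```python
import numpy as np

def is_positive_definite(A, tol=1e-12):
    """Symmetric A is positive definite iff all its eigenvalues are > 0."""
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

A_pd  = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3
A_psd = np.array([[1.0, 1.0], [1.0, 1.0]])   # eigenvalues 0 and 2
```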

propositions

if $Q(\pmb x) = \pmb x' A \pmb x$ is the quadratic form corresponding to $A$, then $A > 0$ iff $\lambda_i > 0$ for all $i$, where $\lambda_i$ is an ev of $A$.

for $\pmb x$ with $\pmb x' \pmb x = 1$, the maximum of $\pmb x' A \pmb x$ is given by the largest ev $\lambda_1$ of $A$.

the vector $\pmb x$ which maximizes (minimizes) $\pmb x' A \pmb x$ is the corresponding evec of $A$ for the largest (smallest) ev of $A$.

more generally, for the ev of $A$, $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p$,

$$\lambda_p \le \frac{\pmb x' A \pmb x}{\pmb x' \pmb x} \le \lambda_1$$

if $B > 0$, then

$$\max_{\pmb x} \frac{\pmb x' A \pmb x}{\pmb x' B \pmb x} = \lambda_1(B^{-1}A),$$

the largest ev of $B^{-1}A$, attained at the corresponding evec.
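A numerical sketch of the generalized maximum, assuming the $B^{-1}A$ form above (both example matrices are arbitrary, with `B` diagonal and positive definite):

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.array([[2.0, 0.0], [0.0, 1.0]])        # B > 0

# Largest eigenvalue of B^{-1} A bounds the ratio x'Ax / x'Bx
evals, evecs = np.linalg.eig(np.linalg.inv(B) @ A)
k = np.argmax(evals.real)
lam_max, x = evals.real[k], evecs[:, k].real

# The bound is attained at the corresponding eigenvector
ratio = (x @ A @ x) / (x @ B @ x)
assert np.isclose(ratio, lam_max)

# Random directions never exceed the bound
rng = np.random.default_rng(2)
for _ in range(100):
    v = rng.standard_normal(2)
    assert (v @ A @ v) / (v @ B @ v) <= lam_max + 1e-9
```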

Partitioned Matrices

let $A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}$ and assume $A_{11}$ is nonsingular. then

$$|A| = |A_{11}| \, |A_{22} - A_{21} A_{11}^{-1} A_{12}|$$
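A numerical check of the partitioned determinant identity, with an arbitrary random matrix and block sizes (3 and 2) of my choosing:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))
A11, A12 = A[:3, :3], A[:3, 3:]
A21, A22 = A[3:, :3], A[3:, 3:]

# |A| = |A11| * |A22 - A21 A11^{-1} A12|  (Schur complement of A11)
schur = A22 - A21 @ np.linalg.inv(A11) @ A12
lhs = np.linalg.det(A)
rhs = np.linalg.det(A11) * np.linalg.det(schur)
```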

for $A\ (n \times p)$ and $B\ (p \times n)$, the non-zero ev of $AB$ and $BA$ are the same and have the same multiplicity. if $\pmb x$ is an evec of $AB$ for an ev $\lambda \ne 0$, then $\pmb y = B \pmb x$ is an evec of $BA$.
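A quick check with an arbitrary random pair (here $AB$ is $4 \times 4$ with two structural zero eigenvalues, $BA$ is $2 \times 2$):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 2))
B = rng.standard_normal((2, 4))

# Non-zero eigenvalues of AB (4x4) and BA (2x2) coincide
ev_AB = np.linalg.eigvals(A @ B)
ev_BA = np.linalg.eigvals(B @ A)
nonzero_AB = np.sort_complex(ev_AB[np.abs(ev_AB) > 1e-10])
assert np.allclose(nonzero_AB, np.sort_complex(ev_BA))
```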

for $p$-vectors $\pmb a$ and $\pmb b$, if $A = \pmb a \pmb b'$, then the non-zero ev of $A$, if it exists, equals $\pmb b' \pmb a$ with evec $\pmb a$.
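The rank-one case is easy to verify directly, since $(\pmb a \pmb b')\pmb a = \pmb a (\pmb b' \pmb a) = (\pmb b' \pmb a)\pmb a$; the vectors below are arbitrary examples:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 0.0, 1.0])

# A = a b' has one non-zero ev equal to b'a, with evec a
A = np.outer(a, b)
lam = b @ a                       # = 2 + 0 + 3 = 5
assert np.allclose(A @ a, lam * a)
```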

Geometrical Aspects

let the columns $\pmb x_1, \dots, \pmb x_p$ of $X$ be non-zero and mutually orthogonal. In that case, $X$ has rank $p$, and $X'X$ is a diagonal matrix with $\pmb x_i' \pmb x_i$ in the $i$-th diagonal position.
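A tiny concrete example (the matrix `X` is mine; its columns are orthogonal by construction):

```python
import numpy as np

# Columns (1,1,2) and (1,-1,0) are orthogonal: 1 - 1 + 0 = 0,
# so X'X is diagonal with x_i'x_i on the diagonal.
X = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [2.0,  0.0]])
XtX = X.T @ X
assert np.allclose(XtX, np.diag([6.0, 2.0]))
assert np.linalg.matrix_rank(X) == 2
```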

let’s consider bivariate data $(x_i, y_i)$, $i = 1, \dots, n$, and let $\pmb d_x = \pmb x - \bar x \pmb 1_n$ and $\pmb d_y = \pmb y - \bar y \pmb 1_n$. then the correlation b/w $\pmb x$ and $\pmb y$ is

$$r_{xy} = \cos(\theta)$$

where $\theta$ is the angle b/w the deviation vectors $\pmb d_x$ and $\pmb d_y$.
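This can be checked numerically by comparing `np.corrcoef` with the cosine of the angle between the deviation vectors (the data vectors are arbitrary examples):

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 5.0, 4.0])

# Deviation vectors (data minus mean)
dx, dy = x - x.mean(), y - y.mean()

# Correlation equals the cosine of the angle between dx and dy
r = np.corrcoef(x, y)[0, 1]
cos_theta = (dx @ dy) / (np.linalg.norm(dx) * np.linalg.norm(dy))
assert np.isclose(r, cos_theta)
```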

For two dimensions, the rotation can be expressed as:

\begin{alignat}{3} \pmb y &= \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} &&= \begin{pmatrix} \cos(\theta) & \sin(\theta) \\ -\sin(\theta) & \cos(\theta) \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} &&= \Gamma \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \\ &= \Gamma \pmb x \tag{clockwise rotation} \\ \\ \pmb y & &&= \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} &&= \Gamma' \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \\ &= \Gamma' \pmb x \tag{counter-clockwise rotation} \end{alignat}
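The clockwise rotation matrix can be sketched and sanity-checked as follows (the helper name `clockwise_rotation` is mine):

```python
import numpy as np

def clockwise_rotation(theta):
    """Gamma = [[cos, sin], [-sin, cos]]: rotates a point clockwise by theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

# Gamma is orthogonal, so Gamma' undoes the rotation (counter-clockwise)
G = clockwise_rotation(0.7)
assert np.allclose(G @ G.T, np.eye(2))

# e_1 rotated clockwise by 90 degrees lands on -e_2
y = clockwise_rotation(np.pi / 2) @ np.array([1.0, 0.0])
assert np.allclose(y, [0.0, -1.0])
```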

Column, Row and Null Space

for a matrix $A\ (n \times p)$:

  • column space: $C(A) = \{A \pmb x : \pmb x \in \mathbb{R}^p\} \subseteq \mathbb{R}^n$, with $\dim C(A) = \text{rank}(A)$
  • row space: $C(A')$
  • null space: $N(A) = \{\pmb x \in \mathbb{R}^p : A \pmb x = \pmb 0\}$, with $\dim N(A) = p - \text{rank}(A)$

Spaces by Singular Value Decomposition: General-version,

for a matrix $A = \Gamma \Lambda \Delta'$ with $\text{rank}(A) = r$:

$$C(A) = C(\Gamma), \quad C(A') = C(\Delta), \quad N(A) = C(\Delta)^\perp$$

  • note: for a matrix $A'A\ (p \times p)$, $\text{rank}(A'A) = \text{rank}(A)$

$A'A$ has full rank (is nonsingular) if and only if $A$ has full column rank ($A$ has $p$ linearly independent columns).
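A quick sketch of this fact with two hand-picked examples, one with full column rank and one without:

```python
import numpy as np

# A (3x2) has full column rank, so A'A (2x2) is nonsingular
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
AtA = A.T @ A
assert np.linalg.matrix_rank(A) == 2
assert not np.isclose(np.linalg.det(AtA), 0.0)

# Dropping to column rank 1 (second column = 2 * first) makes A'A singular
A1 = np.array([[1.0, 2.0],
               [2.0, 4.0],
               [3.0, 6.0]])
assert np.isclose(np.linalg.det(A1.T @ A1), 0.0)
```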