Eigen Decomposition
Eigendecomposition is a technique that factorizes a square matrix into a set of eigenvectors and eigenvalues. It is widely used in machine learning; for example, the PCA algorithm relies on eigendecomposition.
What is an Eigenvector?
Before we look at the eigendecomposition technique, we should first understand the properties of an eigenvector. An eigenvector is a vector that, when multiplied by a matrix, yields a scalar multiple of itself. In other words, an eigenvector $v$ of a matrix $A$ is a nonzero vector such that:
$$ A\cdot v = \lambda \cdot v$$
where $v$ is the eigenvector and $\lambda$ is the eigenvalue.
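This defining property is easy to check numerically. A minimal sketch (the matrix and vector here are illustrative choices, not from the example below):

```python
import numpy as np

# v = (1, 0) is an eigenvector of this matrix with eigenvalue 2,
# since the first column of A is 2 * v
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])
lam = 2.0

# check the defining property A . v = lambda . v
print(np.allclose(A @ v, lam * v))  # True
```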
Example:
Suppose we have a matrix $A$ and a vector $v$:
$$ A = \begin{pmatrix} 4 & 0 & 1 \\ 2 & -2 & 3 \\ 7 & 5 & 0 \end{pmatrix}$$
$$v = \begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix}$$
then:
$$A \cdot v = \begin{pmatrix} 4 & 0 & 1 \\ 2 & -2 & 3 \\ 7 & 5 & 0 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix}$$
$$ A \cdot v = \begin{pmatrix} 4\cdot 1 + 0\cdot 1 + 1\cdot 2 \\ 2\cdot 1 - 2\cdot 1 + 3\cdot 2 \\ 7\cdot 1 + 5\cdot 1 + 0\cdot 2 \end{pmatrix} = \begin{pmatrix} 6 \\ 6 \\ 12 \end{pmatrix} = 6v$$
In the above case, $v$ is an eigenvector of $A$ and $6$ is the corresponding eigenvalue.
An $n \times n$ square matrix can have up to $n$ linearly independent eigenvectors, each with its own eigenvalue. This means a $3 \times 3$ matrix can have $3$ eigenvectors and $3$ eigenvalues. This property is what makes the decomposition possible.
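We can see this with NumPy, which returns one eigenvalue per dimension and the eigenvectors as columns (a sketch; the diagonal matrix here is an illustrative choice):

```python
import numpy as np

# a 3x3 matrix yields 3 eigenvalues and 3 eigenvectors;
# for a diagonal matrix the eigenvalues are just the diagonal entries
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])
vals, vecs = np.linalg.eig(A)

print(vals.shape)   # (3,)  -- one eigenvalue per dimension
print(vecs.shape)   # (3, 3) -- one eigenvector per column
```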
We can rewrite the above property in matrix form as:
$$A \cdot Q = Q \cdot \Lambda $$
where $Q$ is a matrix whose columns are the eigenvectors of $A$, and $\Lambda$ is a diagonal matrix whose entries are the corresponding eigenvalues.
Right-multiplying both sides by $Q^{-1}$ (which exists when the eigenvectors are linearly independent), we can rewrite the equation as:
$$A \cdot Q \cdot Q^{-1} = Q \cdot \Lambda \cdot Q^{-1}$$
$$A \cdot I = Q \cdot \Lambda \cdot Q^{-1}$$
Thus:
$$A = Q \cdot \Lambda \cdot Q^{-1}$$
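One reason this factorization is useful (an aside, not from the derivation above): powers of $A$ reduce to powers of the diagonal matrix, since $A^k = Q \cdot \Lambda^k \cdot Q^{-1}$, and raising a diagonal matrix to a power just raises each diagonal entry. A sketch with an illustrative symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, Q = np.linalg.eig(A)

# A^3 via the decomposition: Q . Lambda^3 . Q^{-1},
# where Lambda^3 just cubes each eigenvalue
A_cubed = Q @ np.diag(vals ** 3) @ np.linalg.inv(Q)

print(np.allclose(A_cubed, np.linalg.matrix_power(A, 3)))  # True
```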
Implementing Eigen Decomposition in NumPy
import numpy as np

# create a 4x4 matrix with entries 1 through 16
A = np.linspace(1, 16, 16).reshape(4, 4)
A
Computing the eigenvalues and eigenvectors:
# eigenvalues in `vals`, eigenvectors as the columns of `vect`
vals, vect = np.linalg.eig(A)
vals, vect
We can then use the formula above to reconstruct the matrix $A$:
# Q . Lambda . Q^{-1} recovers A
vect.dot(np.diag(vals)).dot(np.linalg.inv(vect))
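As a sanity check (an addition to the steps above), we can confirm that the reconstruction matches $A$ up to floating-point error:

```python
import numpy as np

A = np.linspace(1, 16, 16).reshape(4, 4)
vals, vect = np.linalg.eig(A)

# rebuild A from its eigendecomposition Q . Lambda . Q^{-1}
A_rec = vect @ np.diag(vals) @ np.linalg.inv(vect)

# the product should recover A within numerical tolerance
print(np.allclose(A_rec, A))
```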