Eigen Decomposition


Eigen decomposition is a fundamental operation in linear algebra. It is a factorization of a matrix into a canonical form, in which the matrix is expressed in terms of its eigenvalues and eigenvectors. Only diagonalizable matrices can be factorized in this way. When the matrix being factorized is a normal matrix (for example, a real symmetric matrix), the decomposition is also called the spectral decomposition.

In this article, we will look at Eigen decomposition in detail. It is a more advanced topic in linear algebra, and one that is especially relevant and useful in machine learning.


The decomposition of a square matrix A into its eigenvalues and eigenvectors is a very important one. This decomposition often goes by the name matrix diagonalization.

However, that name is less than optimal, because the process being described is actually the decomposition of a matrix into a product of three other matrices, only one of which is diagonal. Moreover, all other standard types of matrix decomposition use the term decomposition in their names, for example the Cholesky decomposition and the Hessenberg decomposition.

Therefore, the decomposition of a matrix into the matrices composed of its eigenvectors and eigenvalues is called Eigen decomposition in this article.

A vector v is an eigenvector of a matrix A if it satisfies the following equation:

A . v = lambda . v

This is known as the eigenvalue equation, where A is the square matrix, v is the eigenvector, and lambda is the scalar eigenvalue.

Or, without the dot notation:

Av = lambda v
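As a quick sketch of the eigenvalue equation, we can verify it numerically for a hypothetical 2×2 matrix whose eigenstructure is easy to read off (the matrix, vector, and eigenvalue below are illustrative choices, not from the article):

```python
import numpy as np

# a simple diagonal 2x2 matrix with known eigenstructure
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

v = np.array([1.0, 0.0])   # an eigenvector of A
lam = 2.0                  # its corresponding eigenvalue

# A . v should equal lambda . v
left = A.dot(v)
right = lam * v
print(np.allclose(left, right))  # True
```
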
  • A matrix can have one eigenvector and eigenvalue for each dimension of the parent matrix.
  • Not all square matrices can be decomposed into eigenvectors and eigenvalues.
  • Some can only be decomposed in a way that requires complex numbers.
  • The parent matrix can be shown to be a product of its eigenvectors and eigenvalues.
A = Q . diag(V) . Q^-1

Or, without the dot notation:

A = Qdiag(V)Q^-1


  • Q is a matrix composed of the eigenvectors.
  • diag(V) is a diagonal matrix with the eigenvalues along the diagonal. This is sometimes denoted with a capital lambda.
  • Q^-1 is the inverse of the matrix of eigenvectors.
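A minimal sketch of this reconstruction, using a hypothetical 2×2 matrix (the matrix values are an illustrative choice): eig() gives Q and the eigenvalues, and multiplying Q, diag(V), and Q^-1 recovers the original matrix.

```python
import numpy as np
from numpy.linalg import eig, inv

# an illustrative diagonalizable 2x2 matrix
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

values, vectors = eig(A)

Q = vectors              # columns are the eigenvectors
L = np.diag(values)      # diag(V): eigenvalues along the diagonal
R = inv(Q)               # Q^-1

# reconstruct A = Q . diag(V) . Q^-1
B = Q.dot(L).dot(R)
print(np.allclose(A, B))  # True
```
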

Eigenvectors and Eigenvalues

  • Eigenvectors are unit vectors, meaning their length or magnitude is equal to 1.0.
  • They are often referred to as right vectors, which simply means column vectors (as distinct from row vectors, or left vectors).
  • A right vector is a vector as we typically understand them.
  • Eigenvalues are coefficients applied to eigenvectors that give the vectors their length or magnitude.
  • For example, a negative eigenvalue may reverse the direction of the eigenvector as part of scaling it.
  • A matrix that has only positive eigenvalues is referred to as a positive definite matrix.
  • If the eigenvalues are all negative, it is referred to as a negative definite matrix.
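The points above can be checked numerically; the symmetric matrix below is a hypothetical example chosen to have eigenvalues 1 and 3 (all positive, hence positive definite), and negating it flips the sign of every eigenvalue:

```python
import numpy as np
from numpy.linalg import eig, eigvals, norm

# an illustrative symmetric matrix with eigenvalues 1 and 3
P = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

values, vectors = eig(P)

# eigenvectors returned by eig() are unit vectors
print(norm(vectors[:, 0]))      # 1.0

# all eigenvalues positive -> positive definite
print(np.all(values > 0))       # True

# negating the matrix negates every eigenvalue -> negative definite
print(np.all(eigvals(-P) < 0))  # True
```
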

Calculation of Eigen decomposition

  • An Eigen decomposition is calculated on a square matrix using an efficient iterative algorithm.
  • Often an eigenvalue is found first, and then an eigenvector is found by solving the eigenvalue equation for a set of coefficients.
  • The Eigen decomposition can be calculated in NumPy using the eig() function.
  • The example below first defines a 3×3 square matrix.
  • The Eigen decomposition is then calculated on the matrix, returning the eigenvalues and eigenvectors.

# eigendecomposition
from numpy import array
from numpy.linalg import eig

# define matrix
A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(A)

# calculate eigendecomposition
values, vectors = eig(A)
print(values)
print(vectors)


Running the example prints the defined matrix, followed by the eigenvalues and the eigenvectors:

[[1 2 3]
 [4 5 6]
 [7 8 9]]

[ 1.61168440e+01 -1.11684397e+00 -9.75918483e-16]

[[-0.23197069 -0.78583024  0.40824829]
 [-0.52532209 -0.08675134 -0.81649658]
 [-0.8186735   0.61232756  0.40824829]]
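As a sanity check on the decomposition above, we can confirm that the first returned eigenpair satisfies the eigenvalue equation A . v = lambda . v (eig() returns eigenvectors as the columns of the second result):

```python
import numpy as np
from numpy.linalg import eig

A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
values, vectors = eig(A)

# first eigenpair: eigenvectors are the columns of `vectors`
v = vectors[:, 0]
lam = values[0]

print(np.allclose(A.dot(v), lam * v))  # True
```
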