Matrix analysis


In mathematics, particularly in linear algebra and its applications, matrix analysis is the study of matrices and their algebraic properties. Some particular topics out of many include: operations defined on matrices, functions of matrices, and the eigenvalues of matrices.

Matrix spaces

The set of all m×n matrices over a field F, denoted in this article Mmn, forms a vector space. Examples of F include the set of rational numbers ℚ, the set of real numbers ℝ, and the set of complex numbers ℂ. The spaces Mmn and Mpq are different spaces if m ≠ p or n ≠ q; for instance M32 ≠ M23. Two m×n matrices A and B in Mmn can be added together to form another matrix in the space Mmn:

A + B ∈ Mmn, with entries (A + B)ij = Aij + Bij

and multiplied by a scalar α in F, to obtain another matrix in Mmn:

αA ∈ Mmn, with entries (αA)ij = α Aij

Combining these two properties, a linear combination of matrices A and B in Mmn is another matrix in Mmn:

αA + βB ∈ Mmn

where α and β are numbers in F.
Any matrix can be expressed as a linear combination of basis matrices, which play the role of the basis vectors for the matrix space. For example, for the set of 2×2 matrices over the field of real numbers, M22, one legitimate basis set of matrices is:

E11 = [1 0; 0 0], E12 = [0 1; 0 0], E21 = [0 0; 1 0], E22 = [0 0; 0 1]

(where each matrix is written row by row, rows separated by semicolons), because any 2×2 matrix can be expressed as:

[a b; c d] = a E11 + b E12 + c E21 + d E22

where a, b, c, d are all real numbers. This idea applies to other fields and matrices of higher dimensions.
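As a minimal sketch, the vector-space operations and the basis expansion above can be checked numerically with 2×2 matrices stored as nested lists; the helper name lin_comb and the sample matrices are illustrative, not from the text:

```python
# Sketch: 2x2 real matrices as nested lists. A linear combination
# alpha*A + beta*B stays in the same space M22.

def lin_comb(alpha, A, beta, B):
    """Return alpha*A + beta*B, computed entry-wise."""
    return [[alpha * a + beta * b for a, b in zip(ra, rb)]
            for ra, rb in zip(A, B)]

# Standard basis of M22: any 2x2 matrix is a*E11 + b*E12 + c*E21 + d*E22.
E11, E12 = [[1, 0], [0, 0]], [[0, 1], [0, 0]]
E21, E22 = [[0, 0], [1, 0]], [[0, 0], [0, 1]]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = lin_comb(2, A, -1, B)   # 2A - B, another matrix in M22
```

Expanding A in the basis reproduces it entry for entry: 1·E11 + 2·E12 + 3·E21 + 4·E22 = A.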

Determinants

The determinant of a square matrix is an important property. The determinant indicates whether a matrix is invertible: a matrix is invertible if and only if its determinant is nonzero. Determinants are used for finding eigenvalues of matrices, and for solving systems of linear equations.
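As a sketch of the invertibility criterion, the 2×2 case is explicit: det = ad − bc, and the inverse exists exactly when det ≠ 0. The helper names det2 and inv2 are illustrative, not from the text:

```python
# Sketch: 2x2 determinant and the explicit inverse it controls.

def det2(M):
    (a, b), (c, d) = M
    return a * d - b * c

def inv2(M):
    """Inverse of a 2x2 matrix; exists exactly when det2(M) != 0."""
    (a, b), (c, d) = M
    det = det2(M)
    if det == 0:
        raise ValueError("matrix is singular")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[4, 7], [2, 6]]
print(det2(A))   # 4*6 - 7*2 = 10, nonzero, so A is invertible
```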

Eigenvalues and eigenvectors of matrices

Definitions

An n×n matrix A has eigenvectors x and eigenvalues λ defined by the relation:

Ax = λx

In words, multiplying A by an eigenvector x gives the same result as multiplying the eigenvector by its eigenvalue. For an n×n matrix, there are n eigenvalues, counted with multiplicity. The eigenvalues are the roots of the characteristic polynomial:

det(A − λI) = 0

where I is the n×n identity matrix.
Roots of polynomials, in this context the eigenvalues, can all be different, or some may be equal. After solving for the eigenvalues, the eigenvectors corresponding to each eigenvalue can be found from the defining equation.
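As a sketch of this procedure for the 2×2 case, the characteristic polynomial is λ² − (trace)λ + det, which the quadratic formula solves directly. The helper name eig2 is illustrative, and the code assumes real eigenvalues (a nonnegative discriminant):

```python
import math

# Sketch: eigenvalues of a 2x2 matrix from its characteristic polynomial
# lam^2 - trace*lam + det = 0, assuming the discriminant is nonnegative.

def eig2(M):
    (a, b), (c, d) = M
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det          # assumes real eigenvalues
    r = math.sqrt(disc)
    return (tr + r) / 2, (tr - r) / 2

A = [[2, 1], [1, 2]]
lams = eig2(A)   # eigenvalues 3 and 1
```

For this A, the eigenvectors can then be read off from Ax = λx: (1, 1) for λ = 3 and (1, −1) for λ = 1.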

Perturbations of eigenvalues

Matrix similarity

Two n×n matrices A and B are similar if they are related by a similarity transformation:

B = P⁻¹AP

The matrix P is called a similarity matrix, and is necessarily invertible.

Unitary similarity

Canonical forms

Row echelon form

Jordan normal form

Weyr canonical form

Frobenius normal form

Triangular factorization

LU decomposition

LU decomposition splits a matrix into a matrix product of a lower triangular matrix and an upper triangular matrix.
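A minimal sketch of one such factorization is the Doolittle algorithm without pivoting, which assumes no zero pivots arise; the helper name lu and the sample matrix are illustrative, not from the text:

```python
# Sketch: Doolittle LU factorization without pivoting. Produces A = L*U
# with L unit lower triangular and U upper triangular; assumes every
# pivot U[i][i] is nonzero.

def lu(A):
    n = len(A)
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):        # fill row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):    # fill column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

A = [[4, 3], [6, 3]]
L, U = lu(A)    # L = [[1, 0], [1.5, 1]], U = [[4, 3], [0, -1.5]]
```

Production code would add row pivoting (giving PA = LU) for numerical stability.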

Matrix norms

Since matrices form vector spaces, one can impose axioms to define the "size" of a particular matrix. The norm of a matrix is a non-negative real number.

Definition and axioms

For all matrices A and B in Mmn, and all numbers α in F, a matrix norm, delimited by double vertical bars ||...||, fulfills:

||A|| ≥ 0, with ||A|| = 0 if and only if A is the zero matrix (positive definiteness)
||αA|| = |α| ||A|| (absolute homogeneity)
||A + B|| ≤ ||A|| + ||B|| (triangle inequality)

The Frobenius norm is analogous to the dot product of Euclidean vectors; multiply matrix elements entry-wise, add up the results, then take the positive square root:

||A||F = √(Σi Σj |Aij|²)

It is defined for matrices of any dimension.
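The entry-wise recipe above translates directly into code; the helper name frobenius_norm is illustrative, not from the text:

```python
import math

# Sketch: the Frobenius norm squares every entry, sums the squares,
# and takes the positive square root; it works for any m x n matrix.

def frobenius_norm(A):
    return math.sqrt(sum(x * x for row in A for x in row))

A = [[3, 0], [4, 0], [0, 0]]     # a 3x2 example
print(frobenius_norm(A))         # sqrt(9 + 16) = 5.0
```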

Positive definite and semidefinite matrices

Functions

Matrix elements are not restricted to constant numbers; they can be mathematical variables.

Functions of matrices

A function of a matrix takes in a matrix and returns something else, such as a number, a vector, or another matrix.
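Two small sketches of this idea: a scalar-valued function of a matrix (the trace) and a matrix-valued one (the polynomial A² + A). The helper names are illustrative, not from the text:

```python
# Sketch: functions whose input is a square matrix (nested lists).

def trace(A):
    """Scalar-valued: sum of the diagonal entries."""
    return sum(A[i][i] for i in range(len(A)))

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def poly(A):
    """Matrix-valued: f(A) = A^2 + A, with entry-wise addition."""
    A2 = matmul(A, A)
    n = len(A)
    return [[A2[i][j] + A[i][j] for j in range(n)] for i in range(n)]

A = [[1, 1], [0, 1]]
```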

Matrix-valued functions

A matrix-valued function takes in one or more arguments, such as a scalar, and returns a matrix.
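A standard sketch of a matrix-valued function is the 2D rotation matrix R(θ), a matrix whose entries depend on a scalar argument; the name R is illustrative, not from the text:

```python
import math

# Sketch: a matrix-valued function of a scalar -- the 2D rotation
# matrix R(theta), which rotates the plane by the angle theta.

def R(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

half_turn = R(math.pi)   # rotation by 180 degrees, approximately -I
```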

Other branches of analysis