In mathematics, a Hermitian matrix is a complex square matrix that is equal to its own conjugate transpose—that is, the element in the $i$-th row and $j$-th column is equal to the complex conjugate of the element in the $j$-th row and $i$-th column, for all indices $i$ and $j$:

$a_{ij} = \overline{a_{ji}}$

or in matrix form:

$A = \overline{A^\mathsf{T}}.$

Hermitian matrices can be understood as the complex extension of real symmetric matrices. If the conjugate transpose of a matrix $A$ is denoted by $A^\mathsf{H}$, then the Hermitian property can be written concisely as $A = A^\mathsf{H}$.

Hermitian matrices are named after Charles Hermite, who demonstrated in 1855 that matrices of this form share a property with real symmetric matrices of always having real eigenvalues. Other, equivalent notations in common use are $A^\mathsf{H} = A^\dagger = A^\ast$, although note that in quantum mechanics, $A^\ast$ typically means the complex conjugate only, and not the conjugate transpose.
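As an informal numerical sketch of the defining property (the matrix below is a hypothetical example chosen only for illustration), the condition $A = A^\mathsf{H}$ can be checked directly:

    import numpy as np

    # A small complex matrix with real diagonal entries and conjugate-symmetric off-diagonal entries
    A = np.array([[2.0, 2 + 1j],
                  [2 - 1j, 3.0]])

    # Hermitian means A equals its own conjugate transpose
    print(np.allclose(A, A.conj().T))   # True

    # Hermitian matrices have real eigenvalues (computed with the Hermitian-specific routine)
    print(np.linalg.eigvalsh(A))        # approximately [0.2087, 4.7913]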
Alternative characterizations
Hermitian matrices can be characterized in a number of equivalent ways, some of which are listed below:
A square matrix $A$ is Hermitian if and only if it is equal to its adjoint, that is, it satisfies $\langle w, A v \rangle = \langle A w, v \rangle$ for any pair of vectors $v, w$, where $\langle \cdot , \cdot \rangle$ denotes the inner product operation. This is also the way that the more general concept of self-adjoint operator is defined.
In this section, the conjugate transpose of matrix $A$ is denoted as $A^\mathsf{H}$, the transpose of matrix $A$ is denoted as $A^\mathsf{T}$ and the conjugate of matrix $A$ is denoted as $\overline{A}$. See the following example:

$\begin{bmatrix} 2 & 2+i & 4 \\ 2-i & 3 & i \\ 4 & -i & 1 \end{bmatrix}$

The diagonal elements must be real, as they must be their own complex conjugates. Well-known families of Hermitian matrices include the Pauli matrices, the Gell-Mann matrices and their generalizations. In theoretical physics such Hermitian matrices are often multiplied by imaginary coefficients, which results in skew-Hermitian matrices.

Here, we offer another useful Hermitian matrix using an abstract example. If a square matrix $A$ equals the product of a matrix $B$ with its conjugate transpose, that is, $A = B B^\mathsf{H}$, then $A$ is a Hermitian positive semi-definite matrix. Furthermore, if $B$ is row full-rank, then $A$ is positive definite.
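A rough numerical sketch of this last claim (the matrix $B$ below is an arbitrary illustration, not taken from the text): the product $B B^\mathsf{H}$ is Hermitian, and it is positive definite when $B$ has full row rank.

    import numpy as np

    rng = np.random.default_rng(0)

    # A random 3x5 complex matrix; its 3 rows are linearly independent (full row rank)
    B = rng.standard_normal((3, 5)) + 1j * rng.standard_normal((3, 5))

    # A = B B^H is Hermitian by construction
    A = B @ B.conj().T
    print(np.allclose(A, A.conj().T))   # True

    # Its eigenvalues are real and non-negative (strictly positive here, since B has full row rank)
    print(np.linalg.eigvalsh(A))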
Properties
The entries on the main diagonal of any Hermitian matrix are real.
A matrix that has only real entries is Hermitian if and only if it is symmetric. A real and symmetric matrix is simply a special case of a Hermitian matrix.
Every Hermitian matrix is a normal matrix. That is to say, $A A^\mathsf{H} = A^\mathsf{H} A$.
The finite-dimensional spectral theorem says that any Hermitian matrix can be diagonalized by a unitary matrix, and that the resulting diagonal matrix has only real entries. This implies that all eigenvalues of a Hermitian matrix $A$ with dimension $n$ are real, and that $A$ has $n$ linearly independent eigenvectors. Moreover, a Hermitian matrix has orthogonal eigenvectors for distinct eigenvalues. Even if there are degenerate eigenvalues, it is always possible to find an orthogonal basis of $\mathbb{C}^n$ consisting of $n$ eigenvectors of $A$.
The sum of any two Hermitian matrices is Hermitian.
The inverse of an invertible Hermitian matrix is Hermitian as well.
The product of two Hermitian matrices $A$ and $B$ is Hermitian if and only if $AB = BA$.
For an arbitrary complex valued vector $v$ the product $v^\mathsf{H} A v$ is real because of $v^\mathsf{H} A v = \left(v^\mathsf{H} A v\right)^\mathsf{H}$. This is especially important in quantum physics, where Hermitian matrices are operators that measure properties of a system, e.g. total spin, which have to be real.
The Hermitian complex $n$-by-$n$ matrices do not form a vector space over the complex numbers, $\mathbb{C}$, since the identity matrix $I_n$ is Hermitian, but $i I_n$ is not. However, the complex Hermitian matrices do form a vector space over the real numbers $\mathbb{R}$. In the $2n^2$-dimensional vector space of complex $n \times n$ matrices over $\mathbb{R}$, the complex Hermitian matrices form a subspace of dimension $n^2$. If $E_{jk}$ denotes the $n$-by-$n$ matrix with a $1$ in the $j,k$ position and zeros elsewhere, a basis (over the real numbers) can be described as follows: the $n$ diagonal matrices $E_{jj}$ for $1 \le j \le n$, together with the $\frac{n^2-n}{2}$ matrices of the form $E_{jk} + E_{kj}$ for $1 \le j < k \le n$, and the $\frac{n^2-n}{2}$ matrices of the form $i\left(E_{jk} - E_{kj}\right)$ for $1 \le j < k \le n$, where $i$ denotes the imaginary unit.
If $n$ orthonormal eigenvectors $u_1, \dots, u_n$ of a Hermitian matrix are chosen and written as the columns of the matrix $U$, then one eigendecomposition of $A$ is $A = U \Lambda U^\mathsf{H}$, where $U U^\mathsf{H} = I = U^\mathsf{H} U$ and therefore $A = \sum_j \lambda_j u_j u_j^\mathsf{H}$, where $\lambda_j$ are the eigenvalues on the diagonal of the diagonal matrix $\Lambda$.
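A brief numerical sketch of this eigendecomposition, using NumPy's routine for Hermitian matrices (the matrix $A$ is an arbitrary example):

    import numpy as np

    A = np.array([[2.0, 2 + 1j],
                  [2 - 1j, 3.0]])

    # eigh is specialized for Hermitian matrices: it returns real eigenvalues
    # and orthonormal eigenvectors (the columns of U)
    eigenvalues, U = np.linalg.eigh(A)

    print(eigenvalues)                                              # real values
    print(np.allclose(U @ U.conj().T, np.eye(2)))                   # True: U is unitary
    print(np.allclose(U @ np.diag(eigenvalues) @ U.conj().T, A))    # True: A = U Λ U^H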
Additional facts related to Hermitian matrices include:
The sum of a square matrix and its conjugate transpose, $A + A^\mathsf{H}$, is Hermitian.
The difference of a square matrix and its conjugate transpose, $A - A^\mathsf{H}$, is skew-Hermitian. This implies that the commutator of two Hermitian matrices is skew-Hermitian.
An arbitrary square matrix $C$ can be written as the sum of a Hermitian matrix $A$ and a skew-Hermitian matrix $B$: $C = A + B$ with $A = \frac{1}{2}\left(C + C^\mathsf{H}\right)$ and $B = \frac{1}{2}\left(C - C^\mathsf{H}\right)$. This is known as the Toeplitz decomposition of $C$.
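A small numerical sketch of the Toeplitz decomposition (the matrix $C$ below is a hypothetical example):

    import numpy as np

    rng = np.random.default_rng(1)
    C = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

    # Hermitian and skew-Hermitian parts of C
    A = (C + C.conj().T) / 2
    B = (C - C.conj().T) / 2

    print(np.allclose(A, A.conj().T))    # True: A is Hermitian
    print(np.allclose(B, -B.conj().T))   # True: B is skew-Hermitian
    print(np.allclose(A + B, C))         # True: C = A + B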
Rayleigh quotient

In mathematics, for a given complex Hermitian matrix $M$ and nonzero vector $x$, the Rayleigh quotient $R(M, x)$ is defined as:

$R(M, x) = \frac{x^\mathsf{H} M x}{x^\mathsf{H} x}.$

For real matrices and vectors, the condition of being Hermitian reduces to that of being symmetric, and the conjugate transpose $x^\mathsf{H}$ to the usual transpose $x^\mathsf{T}$. Note that $R(M, cx) = R(M, x)$ for any non-zero real scalar $c$. Also, recall that a Hermitian matrix has real eigenvalues. It can be shown that, for a given matrix, the Rayleigh quotient reaches its minimum value $\lambda_{\min}$ (the smallest eigenvalue of $M$) when $x$ is $v_{\min}$ (the corresponding eigenvector). Similarly, $R(M, x) \le \lambda_{\max}$ and $R(M, v_{\max}) = \lambda_{\max}$.

The Rayleigh quotient is used in the min-max theorem to get exact values of all eigenvalues. It is also used in eigenvalue algorithms to obtain an eigenvalue approximation from an eigenvector approximation. Specifically, this is the basis for Rayleigh quotient iteration.

The range of the Rayleigh quotient is called a numerical range. When the matrix is Hermitian, the numerical radius is equal to the spectral norm. Still in functional analysis, $\lambda_{\max}$ is known as the spectral radius. In the context of C*-algebras or algebraic quantum mechanics, the function that associates the Rayleigh quotient $R(M, x)$ with $M$, for a fixed $x$ and $M$ varying through the algebra, would be referred to as a "vector state" of the algebra.
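A short numerical sketch of these bounds (the matrix $M$ is an arbitrary Hermitian example):

    import numpy as np

    M = np.array([[2.0, 2 + 1j],
                  [2 - 1j, 3.0]])

    def rayleigh_quotient(M, x):
        # R(M, x) = x^H M x / (x^H x); this is real whenever M is Hermitian
        return (x.conj() @ M @ x).real / (x.conj() @ x).real

    eigenvalues, eigenvectors = np.linalg.eigh(M)   # eigenvalues sorted in ascending order

    # The quotient attains lambda_min and lambda_max at the corresponding eigenvectors
    print(rayleigh_quotient(M, eigenvectors[:, 0]), eigenvalues[0])
    print(rayleigh_quotient(M, eigenvectors[:, -1]), eigenvalues[-1])

    # For any nonzero vector, the quotient lies between lambda_min and lambda_max
    rng = np.random.default_rng(2)
    x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
    print(eigenvalues[0] <= rayleigh_quotient(M, x) <= eigenvalues[-1])   # True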