# Hermitian

A number of mathematical entities are named **Hermitian** after the mathematician Charles Hermite. This article discusses the term Hermitian as used in operator and matrix theory to refer to a certain kind of operator (or matrix), together with a number of related concepts, including symmetric and self-adjoint unbounded operators. The reader should be warned that there is a range of conflicting terminology in use in the literature generally, and in Wikipedia in particular.

A **Hermitian matrix** (or **self-adjoint matrix**) is a square matrix with complex entries that is equal to its own conjugate transpose; that is, the element in the *i*th row and *j*th column is equal to the complex conjugate of the element in the *j*th row and *i*th column, for all indices *i* and *j*:

*a*_{*i*,*j*} = *a̅*_{*j*,*i*}.

Every Hermitian matrix is normal, and the finite-dimensional spectral theorem applies: any Hermitian matrix can be diagonalized by a unitary matrix, and the resulting diagonal matrix has only real entries. This means that all eigenvalues of a Hermitian matrix are real, and, moreover, that eigenvectors with distinct eigenvalues are orthogonal. It is possible to find an orthonormal basis of **C**^{n} consisting only of eigenvectors.
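These facts can be checked numerically. The following sketch (using NumPy; the particular 2×2 matrix is chosen for illustration) verifies that a Hermitian matrix equals its conjugate transpose, has real eigenvalues, and is diagonalized by a unitary matrix:

```python
import numpy as np

# A 2x2 Hermitian matrix: equal to its own conjugate transpose.
A = np.array([[2.0, 1 + 1j],
              [1 - 1j, 3.0]])
assert np.allclose(A, A.conj().T)

# eigh exploits the Hermitian structure: the eigenvalues come back
# real, and the eigenvector matrix U is unitary.
eigenvalues, U = np.linalg.eigh(A)
print(eigenvalues)                              # [1. 4.] -- both real

assert np.allclose(U.conj().T @ U, np.eye(2))   # U is unitary
assert np.allclose(U.conj().T @ A @ U,
                   np.diag(eigenvalues))        # U diagonalizes A
```

Since the two eigenvalues 1 and 4 are distinct, the two columns of `U` are orthogonal, as the unitarity check confirms.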

If the eigenvalues of a Hermitian matrix are all positive, then the matrix is positive definite. Matrix theorists sometimes refer to real Hermitian matrices as *symmetric* matrices, since indeed they are symmetric with respect to the diagonal.
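A sketch of the positive-definiteness criterion, using a small real symmetric (hence Hermitian) matrix chosen for illustration; the Cholesky test is an equivalent characterization:

```python
import numpy as np

# A real symmetric (hence Hermitian) tridiagonal matrix.
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# Positive definite iff every eigenvalue is positive.
w = np.linalg.eigvalsh(A)
print(np.all(w > 0))          # True

# Equivalently, a Cholesky factorization A = L L* exists exactly
# when A is Hermitian positive definite.
L = np.linalg.cholesky(A)
assert np.allclose(L @ L.conj().T, A)
```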


## Symmetric and Hermitian operators

For preliminaries see Hilbert space. A partially defined linear operator *A* on a Hilbert space *H* is called **symmetric** iff

⟨*Ax*, *y*⟩ = ⟨*x*, *Ay*⟩

for all elements *x* and *y* in the domain of *A*. This usage is fairly standard in the functional analysis literature.

By the Hellinger-Toeplitz theorem, an everywhere-defined symmetric operator is bounded.

Bounded symmetric operators are also called *Hermitian*.

This definition agrees with the one given above if we take as *H* the Hilbert space **C**^{n} with the standard dot product and interpret a square matrix as a linear operator on this Hilbert space. It is, however, much more general, as there are important infinite-dimensional Hilbert spaces.

The spectrum of any Hermitian operator is real; in particular all its eigenvalues are real.

A version of the spectral theorem also applies to Hermitian operators; while the eigenvectors corresponding to different eigenvalues are orthogonal, in general it is *not* true that the Hilbert space *H* admits an orthonormal basis consisting only of eigenvectors of the operator. In fact, Hermitian operators need not have any eigenvalues or eigenvectors at all.

**Example**. Consider the complex Hilbert space L^{2}[0,1] and the differential operator *A* = d^{2}/d*x*^{2}, defined on the subspace consisting of all twice continuously differentiable functions *f* : [0,1] → **C** with *f*(0) = *f*(1) = 0. Then integration by parts easily proves that *A* is symmetric. Its eigenfunctions are the sinusoids sin(*n*π*x*) for *n* = 1, 2, ..., with the real eigenvalues −*n*^{2}π^{2}; the well-known orthogonality of the sine functions follows as a consequence of *A* being symmetric.
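The orthogonality of these eigenfunctions can be checked numerically. A minimal sketch, approximating the L^{2}[0,1] inner product by a Riemann sum on a fine grid (adequate here, since sin(*n*π*x*) vanishes at both endpoints):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 100001)
dx = x[1] - x[0]

def inner(m, n):
    """Approximate the L^2[0,1] inner product of sin(m*pi*x), sin(n*pi*x)."""
    return np.sum(np.sin(m * np.pi * x) * np.sin(n * np.pi * x)) * dx

# Eigenfunctions with distinct eigenvalues are orthogonal ...
assert abs(inner(2, 3)) < 1e-6          # inner product ~ 0
# ... while each eigenfunction has squared norm 1/2.
assert abs(inner(2, 2) - 0.5) < 1e-6
```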

## Self-adjoint operators

Given a densely defined linear operator *A* on *H*, its adjoint *A** is defined as follows:

- The domain of *A** consists of those vectors *x* in *H* for which the map *y* ↦ ⟨*x*, *Ay*⟩ is a bounded linear functional on the domain of *A*; since that domain is dense, such a functional extends uniquely to all of *H*.
- By the Riesz representation theorem for linear functionals, if *x* is in the domain of *A**, there is a unique vector *z* in *H* such that

⟨*x*, *Ay*⟩ = ⟨*z*, *y*⟩ for all *y* in the domain of *A*.

This vector *z* is defined to be *A***x*.

One can show that *A* is symmetric iff *A** extends *A*, and that *A* is **self-adjoint** iff *A* = *A**.

**Example**. Consider the complex Hilbert space L^{2}(**R**), and the operator which multiplies a given function by *x*:

(*Af*)(*x*) = *xf*(*x*),

with domain the set of all L^{2} functions *f* for which the right-hand side is square-integrable. Then *A* is a symmetric operator without any eigenvalues or eigenfunctions. In fact, the operator is self-adjoint.
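A finite-dimensional caricature of this operator: discretizing multiplication by *x* on a grid gives a diagonal matrix, which is trivially Hermitian with real spectrum. (The caveat is essential: the discretization *does* have eigenvalues, namely the grid points, whereas the true operator on L^{2}(**R**) has none.)

```python
import numpy as np

# Multiplication by x, restricted to a finite grid on [-5, 5],
# becomes a diagonal matrix in the grid basis.
x = np.linspace(-5.0, 5.0, 201)
A = np.diag(x)

assert np.allclose(A, A.conj().T)                            # Hermitian
assert np.allclose(np.sort(np.linalg.eigvalsh(A)), np.sort(x))  # real spectrum
```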

As we will show later, self-adjoint operators have very important spectral properties; up to unitary equivalence, they are in fact multiplication operators on general measure spaces.

## Extensions of symmetric operators

**Theorem**. Suppose *A* is a symmetric operator. Then there is a unique partially defined linear operator W(*A*) : ran(*A* + i) → ran(*A* − i) such that

W(*A*)(*Ax* + i*x*) = *Ax* − i*x* for all *x* in the domain of *A*.

W(*A*) is isometric on its domain. Moreover, the range of 1 − W(*A*) is dense in *H*.

Conversely, given any partially defined operator *U* that is isometric on its domain and such that the range of 1 − *U* is dense, there is a unique densely defined symmetric operator S(*U*) such that

S(*U*)(*x* − *Ux*) = i(*x* + *Ux*) for all *x* in the domain of *U*.

The mappings W and S are inverses of each other.

The mapping W is called the **Cayley transform**. It associates a partially defined isometry to any symmetric densely defined operator. Note that the mappings W and S are monotone: if *B* is a symmetric operator that extends the densely defined symmetric operator *A*, then W(*B*) extends W(*A*), and similarly for S.

**Theorem**. A necessary and sufficient condition for *A* to be self-adjoint is that its Cayley transform W(*A*) be unitary.
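In finite dimensions every Hermitian matrix is self-adjoint, so its Cayley transform should come out unitary. A sketch checking this for a small Hermitian matrix, writing the transform in the equivalent matrix form W(*A*) = (*A* − i)(*A* + i)^{−1}:

```python
import numpy as np

A = np.array([[2.0, 1 + 1j],
              [1 - 1j, 3.0]])     # Hermitian, hence self-adjoint
I = np.eye(2)

# Cayley transform W(A) = (A - iI)(A + iI)^{-1}; A + iI is invertible
# because A has real spectrum.
W = (A - 1j * I) @ np.linalg.inv(A + 1j * I)

assert np.allclose(W.conj().T @ W, I)   # W is unitary
```

Each real eigenvalue λ of *A* is sent to (λ − i)/(λ + i), which has modulus 1; this is why the transform of a self-adjoint operator is unitary.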

This gives a necessary and sufficient condition for *A* to have a self-adjoint extension:

**Theorem**. A necessary and sufficient condition for *A* to have a self-adjoint extension is that W(*A*) have a unitary extension.

A partially defined isometric operator *V* on a Hilbert space *H* has a unique isometric extension to the norm closure of dom(*V*). A partially defined isometric operator with closed domain is called a **partial isometry**.

Given a partial isometry *V*, the **deficiency indices** of *V* are defined as follows:

*n*_{+}(*V*) = dim dom(*V*)^{⊥}, *n*_{−}(*V*) = dim ran(*V*)^{⊥}.

**Theorem**. A partial isometry *V* has a unitary extension iff the deficiency indices are identical.
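A minimal finite-dimensional sketch: on **C**^{3}, take the partial isometry defined only on span{e₁}, sending e₁ to e₂. Its deficiency indices are both 2, so a unitary extension exists; one is constructed explicitly below (the particular extension chosen is just one of many):

```python
import numpy as np

# A partial isometry V on C^3: defined only on span{e1}, with V(e1) = e2.
dom = np.array([[1.0], [0.0], [0.0]])   # orthonormal basis of dom(V)
ran = np.array([[0.0], [1.0], [0.0]])   # orthonormal basis of ran(V)

# Deficiency indices: dimensions of the orthogonal complements.
n_plus = 3 - dom.shape[1]    # dim dom(V)^perp = 2
n_minus = 3 - ran.shape[1]   # dim ran(V)^perp = 2
assert n_plus == n_minus     # equal, so a unitary extension exists

# One unitary extension: keep V on its domain and map the complements
# onto each other, e.g. e2 -> e1 and e3 -> e3 (a permutation matrix).
U = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

assert np.allclose(U.conj().T @ U, np.eye(3))   # U is unitary
assert np.allclose(U @ dom, ran)                # U agrees with V on dom(V)
```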