An operator Ω acting on the elements of a vector space V has certain kets, called eigenkets, on which its action is simply a rescaling: Ω|V> = ω|V>. Here |V> is an eigenket (eigenvector) of Ω and ω is the corresponding eigenvalue.
If |V> is an eigenvector, so is a|V>; the eigenvectors are only fixed up to an overall scale factor. If we require that <V|V> = 1, the ambiguity is partially removed, but a phase factor e^(iβ) is still arbitrary.
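A minimal numerical sketch of these statements (the matrix, a Pauli σx, is my choice for illustration, not from the notes):

```python
import numpy as np

# sigma_x (an illustrative choice) has the eigenket (1, 1) with eigenvalue +1.
Omega = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
V = np.array([1.0, 1.0])
print(np.allclose(Omega @ V, 1.0 * V))               # Omega|V> = +1|V>

# Any multiple a|V> is again an eigenket with the same eigenvalue.
a = 3.7 - 2.1j
print(np.allclose(Omega @ (a * V), 1.0 * (a * V)))   # True

# Normalization fixes the length but not the phase: e^(i*beta)|V> stays
# normalized for any real beta.
V_hat = V / np.linalg.norm(V)
phased = np.exp(1j * 0.4) * V_hat
print(np.isclose(np.vdot(phased, phased).real, 1.0)) # <V|V> = 1 still
```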
How do we find the eigenvalues and eigenvectors of an arbitrary linear operator Ω?
Ω|ψ> = ω|ψ>, or (Ω - ωI)|ψ> = 0.
We can transform this equation into a matrix equation by choosing a basis {|ui>} and expanding |ψ> = ∑i ci|ui>. Then ∑j(Ωij - ωδij)cj = 0.
This matrix equation has a nontrivial solution only if det(Ω - ωI) = 0. (In the trivial solution all the cj are zero.) This is the characteristic equation.
The determinant is
det(Ω - ωI) = ∑ijk... εijk... (Ω - ωI)1i (Ω - ωI)2j (Ω - ωI)3k ··· .
Here εijk... = +1 for even permutations of (1, 2, 3, ...), εijk... = -1 for odd permutations, and εijk... = 0 if any index repeats.
Our characteristic equation is therefore a polynomial of order N in ω, where N is the dimension of the vector space. It has N roots, which can be real or complex, distinct or repeated; these N roots are the eigenvalues. They are computed in a particular basis, but they are independent of the choice of basis. If the vector space V is defined over the field of complex numbers, then every linear operator Ω has at least one eigenvalue, since the characteristic equation has at least one root.
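To make the recipe concrete, here is a small numpy sketch (the 3×3 matrix is an arbitrary illustration): the N roots of the characteristic polynomial det(Ω - ωI) = 0 coincide with the eigenvalues returned by a direct eigenvalue routine.

```python
import numpy as np

# The roots of det(Omega - w*I) = 0 are the eigenvalues.
# This 3x3 matrix is an illustrative choice, not from the notes.
Omega = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 0.0],
                  [0.0, 0.0, 5.0]])

# np.poly gives the coefficients of the characteristic polynomial
# det(w*I - Omega), whose roots agree with those of det(Omega - w*I).
coeffs = np.poly(Omega)
roots = np.roots(coeffs)

print(np.sort(roots.real))                     # [1. 3. 5.]
print(np.sort(np.linalg.eigvals(Omega).real))  # the same N roots
```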
How do we find the eigenvectors associated with these eigenvalues?
Assume one of the eigenvalues is ω. We have to solve the system of N equations ∑j(Ωij - ωδij)cj = 0 (i = 1 to N) for the unknowns cj. If Ω is a linear operator and |ψ> is an eigenvector, then a|ψ> is also an eigenvector, so the cj can only be determined up to a multiplicative constant. That means that at most N - 1 of the above equations can be linearly independent. (Otherwise the determinant could not be zero.) If exactly N - 1 of the equations are linearly independent, then the solution for the cj is unique up to an arbitrary multiplicative constant. If fewer than N - 1 of the equations are linearly independent, then we can find more than one linearly independent solution for the cj. If this happens, the eigenvalue ω is said to be degenerate, and the associated eigenvectors form a subspace Vω of V.
If the characteristic equation has N distinct roots, we can find N unique (up to a phase factor), normalized, linearly independent eigenvectors, which form a basis of V. If the characteristic equation does not have N distinct roots, we may be able to find more than one linearly independent eigenvector associated with a multiple root and so still find N linearly independent eigenvectors. But we may also find that only one eigenvector is associated with a multiple root, and that the operator does not have a basis of eigenvectors.
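A hedged sketch of the eigenvector step (the matrix below is chosen, as an assumption, so that ω = 1 is a double root): the solutions cj span the null space of (Ω - ωI), and a degenerate root yields a multidimensional eigenspace Vω.

```python
import numpy as np
from scipy.linalg import null_space

# Eigenvectors solve (Omega - w*I)c = 0, i.e. they span the null space
# of (Omega - w*I). Matrix chosen (an assumption) so that w = 1 is a
# double root with a two-dimensional eigenspace.
Omega = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 2.0]])
w = 1.0

C = null_space(Omega - w * np.eye(3))     # columns span V_w
print(C.shape[1])                         # 2 -> the root w is degenerate
for c in C.T:
    print(np.allclose(Omega @ c, w * c))  # True for every column
```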
Eigenvalues and eigenvectors of a Hermitian operator
The eigenvalues of a Hermitian operator are real.
Proof:
Let Ω|ω> = ω|ω>; then <ω|Ω|ω> = ω<ω|ω>.
Take the adjoint: <ω|Ω†|ω> = ω*<ω|ω>.
If Ω = Ω†, then <ω|Ω|ω> = ω*<ω|ω>, so (ω - ω*)<ω|ω> = 0.
Since |ω> ≠ 0, we have ω = ω*.
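A quick numerical check of this theorem (the construction H = A + A† is my own illustration): the spectrum of a Hermitian matrix has no imaginary part.

```python
import numpy as np

# Build a Hermitian matrix as H = A + A^dagger (illustrative construction)
# and verify that all of its eigenvalues satisfy omega = omega*.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = A + A.conj().T                 # Hermitian by construction

w = np.linalg.eigvals(H)
print(np.allclose(w.imag, 0.0))    # True: the eigenvalues are real
```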
Every Hermitian operator has at least one basis of orthonormal eigenvectors.
The matrix of the operator is diagonal in this basis and has its eigenvalues as its diagonal entries.
Proof:
The characteristic equation has at least one root; call it ω1. At least one nonzero eigenvector |ω1> corresponds to this eigenvalue. Let V⊥1^(n-1) be the (n - 1)-dimensional subspace of all vectors orthogonal to |ω1>. As a basis for V we can now choose the normalized vector |ω1> and n - 1 orthonormal vectors |V⊥1^1>, |V⊥1^2>, ..., |V⊥1^(n-1)>. In this basis the matrix of Ω has the following form:

[ ω1   0   ...   0 ]
[  0               ]
[  :        X      ]
[  0               ]

Here X is an (n - 1)×(n - 1) Hermitian matrix. The zeros in the first column follow from Ω|ω1> = ω1|ω1>; we use the fact that Ω is Hermitian when we set <ω1|Ω = ω1<ω1|, which gives the zeros in the first row.
The characteristic equation now takes the form
(ω1 - ω) det(X - ωI) = 0,
or (ω1 - ω) ∑ cmω^m = (ω1 - ω)Pn-1(ω) = 0, with the sum running from m = 0 to n - 1, where Pn-1(ω) is a polynomial of order n - 1.
The polynomial Pn-1 must also have at least one root; call it ω2, with a corresponding normalized eigenvector |ω2>.
Define the subspace V⊥1,2^(n-2) of vectors in V⊥1^(n-1) orthogonal to |ω2>, and repeat the procedure.
Finally the matrix of Ω in the basis {|ω1>, |ω2>, ..., |ωn>} becomes

[ ω1   0   ...   0  ]
[  0   ω2  ...   0  ]
[  :   :         :  ]
[  0   0   ...   ωn ]
Every |ωi> is chosen from a subspace that is orthogonal to all the previously chosen eigenvectors; therefore the basis {|ωi>} is orthogonal.
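The single deflation step at the heart of this proof can be checked numerically. In the sketch below the 3×3 Hermitian matrix is an arbitrary choice, and a QR factorization stands in for "complete |ω1> to an orthonormal basis":

```python
import numpy as np

# One deflation step: put the normalized eigenvector |w1> first in an
# orthonormal basis and verify that the matrix of Omega takes the block
# form [[w1, 0], [0, X]] with X Hermitian.
H = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # illustrative Hermitian matrix

w, V = np.linalg.eigh(H)
w1, v1 = w[0], V[:, 0]            # one eigenvalue and its eigenvector

# Complete v1 to an orthonormal basis via QR (v1, e1, e2 are independent).
Q, _ = np.linalg.qr(np.column_stack([v1, np.eye(3)[:, :2]]))

B = Q.conj().T @ H @ Q            # matrix of Omega in the new basis
print(np.round(B, 10))            # first row/column vanish except for w1
```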
Note: We did not assume that the eigenvalues are all distinct (nondegenerate). If the eigenvalues are degenerate, then there exist many bases of eigenvectors that diagonalize Ω.
Assume Ω|ω1> = ω|ω1> and Ω|ω2> = ω|ω2>. Then Ω(a1|ω1> + a2|ω2>) = ω(a1|ω1> + a2|ω2>) for any a1, a2 in F. There exists a whole subspace, spanned by |ω1> and |ω2>, whose elements are eigenvectors of Ω with eigenvalue ω.
Example: Consider a two-dimensional vector space with an orthonormal basis {|1>, |2>}, and an operator whose matrix in that basis is
[matrix not reproduced]
(a) Is the operator Hermitian? Calculate its eigenvalues and eigenvectors.
(b) Calculate the matrices which represent the projectors onto these eigenvectors.
Eigenvalues and eigenvectors of a unitary operator
The eigenvalues of a unitary operator are complex numbers of unit magnitude.
Eigenvectors with different eigenvalues are mutually orthogonal.
Proof:
Let U|ui> = ui|ui> and U|uj> = uj|uj>, where eigenvectors with different labels i ≠ j have different eigenvalues.
Then <uj|U†U|ui> = uj*ui<uj|ui>.
But U†U = I, therefore <uj|ui> = uj*ui<uj|ui>, or (1 - uj*ui)<uj|ui> = 0.
If i = j, then <ui|ui> ≠ 0, therefore ui*ui = 1: every eigenvalue has unit magnitude.
If i ≠ j, then 1 - uj*ui ≠ 0. (Since uj*uj = 1, we have uj* = 1/uj, and uj ≠ ui then implies uj*ui = ui/uj ≠ 1.)
Therefore <uj|ui> = 0.
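As a concrete check (the cyclic-shift matrix below is my example, not from the notes): the shift operator on a three-dimensional space is unitary, its eigenvalues are the three cube roots of unity, and its eigenvectors are mutually orthogonal.

```python
import numpy as np

# Cyclic shift on C^3: unitary, with the cube roots of unity as eigenvalues.
U = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]], dtype=complex)
print(np.allclose(U.conj().T @ U, np.eye(3)))   # U†U = I

u, V = np.linalg.eig(U)
print(np.allclose(np.abs(u), 1.0))              # |u_i| = 1 for all i
# Distinct eigenvalues -> mutually orthogonal eigenvectors:
print(abs(np.vdot(V[:, 0], V[:, 1])) < 1e-12)   # <u_0|u_1> = 0
print(abs(np.vdot(V[:, 1], V[:, 2])) < 1e-12)   # <u_1|u_2> = 0
```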
Commuting Hermitian operators
Let A and B be two commuting Hermitian operators, [A,B] = 0, and assume both operators are degenerate. By ordering the basis vectors we can get the matrix representation of A into the form

[ a1·1(g1)     0      ...      0     ]
[    0      a2·1(g2)  ...      0     ]
[    :         :               :     ]
[    0         0      ...   ak·1(gk) ]

where each eigenvalue ai appears gi times along the diagonal (1(gi) is the gi×gi unit matrix).
This basis, however, is not unique. The eigensubspace corresponding to every degenerate eigenvalue has an infinity of bases.
How does B appear in this basis?
AB|ai,a> = BA|ai,a> = aiB|ai,a>, where a = 1, ..., gi labels the basis vectors within the eigensubspace of ai (gi denotes the degree of degeneracy of the eigenvalue ai). B|ai,a> therefore lies in the eigensubspace associated with the eigenvalue ai. Vectors from different eigensubspaces are orthogonal.
The matrix of B is therefore a block-diagonal matrix:

[ B1   0   ...   0  ]
[  0   B2  ...   0  ]
[  :   :         :  ]
[  0   0   ...   Bk ]

Each block Bi is a gi×gi matrix acting within the eigensubspace of ai.
The matrix of B in each eigensubspace is Hermitian and can therefore be diagonalized by trading the basis {|ai,a>} for an eigenbasis of Bi.
The matrix of A remains diagonal, since we are merely choosing another orthonormal basis within a degenerate eigensubspace. If B is not degenerate within any of the subspaces, we end up with a unique orthonormal basis of eigenvectors of both A and B. If B is degenerate in some subspace, the basis we find is not unique.
A third operator C which commutes with both A and B ([A,C] = 0, [B,C] = 0) may still not be diagonal in this basis, and we may have to diagonalize the matrix of C in the eigensubspaces common to A and B. It is, however, always possible to find a set of operators {A, B, C, ...} which commute pairwise and uniquely define a common eigenbasis. Such a set of operators is called a complete set of commuting observables (C.S.C.O.).
It is generally understood that C.S.C.O. refers to the minimal such set.
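The whole procedure for two commuting operators can be sketched in a few lines of numpy. The matrices A and B below are chosen, as an assumption, so that A has a doubly degenerate eigenvalue and [A,B] = 0: diagonalize A first, then diagonalize the block of B inside each degenerate eigensubspace of A.

```python
import numpy as np

# A = diag(1, 1, 3): a1 = 1 is doubly degenerate (g1 = 2).
A = np.diag([1.0, 1.0, 3.0])
# B mixes the degenerate subspace of A; [A, B] = 0 by construction.
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

a, V = np.linalg.eigh(A)                  # some eigenbasis of A (not unique)
columns = []
for ai in np.unique(np.round(a, 8)):
    S = V[:, np.isclose(a, ai)]           # basis of the eigensubspace of ai
    Bi = S.conj().T @ B @ S               # Hermitian block of B inside it
    _, W = np.linalg.eigh(Bi)             # diagonalize B within the block
    columns.append(S @ W)                 # rotated basis still diagonalizes A
T = np.hstack(columns)                    # common orthonormal eigenbasis

print(np.round(T.conj().T @ A @ T, 10))   # diagonal: diag(1, 1, 3)
print(np.round(T.conj().T @ B @ T, 10))   # diagonal: diag(1, 3, 5)
```

Since B turns out to be nondegenerate within each eigensubspace of A here, the pair {A, B} already forms a C.S.C.O. for this example.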