An operator Ω acting on the elements of the vector space V has certain kets, called eigenkets, on which its action is simply a rescaling: Ω|V> = ω|V>. Here |V> is an eigenket (eigenvector) of Ω and ω is the corresponding eigenvalue.

- If we are solving HΦ = EΦ, or

-(ħ^{2}/(2m)) ∂^{2}Φ(x)/∂x^{2} + U(x)Φ(x) = EΦ(x),

then we are solving for the eigenvectors Φ(x) ∈ V = L_{x}^{2} of

H = -(ħ^{2}/(2m)) ∂^{2}/∂x^{2} + U(x),

and for the corresponding eigenvalues E.

If |V> is an eigenvector, so is a|V>.
The eigenvectors are only fixed up to an overall scale factor. If we require that <V|V> = 1, the ambiguity is partially removed; a phase factor e^{iβ} remains arbitrary.

- I|V> = |V>. Every vector is an eigenvector of the identity operator with eigenvalue 1.
- P_{V} = |V><V|. If <V|V> = 1, then P_{V} is the projector onto the subspace spanned by |V>.

Let P_{V}|V'> = a|V'> = |V><V|V'>.

<V|V'> is a number.

If <V|V'> ≠ 0, then we need |V'> = b|V> for |V'> to be an eigenvector. Then a = 1.

If <V|V'> = 0, then a = 0.

The eigenvalues of P_{V} are therefore either 0 or 1. The eigenvectors of P_{V} are either perpendicular or parallel to |V>, with eigenvalues 0 and 1 respectively.
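These projector properties can be checked numerically. A minimal numpy sketch, with an arbitrarily chosen unit vector (the vector and dimension are illustrative, not from the text):

```python
import numpy as np

# A normalized vector |V> in R^3 (chosen arbitrarily for illustration).
v = np.array([1.0, 2.0, 2.0]) / 3.0          # <V|V> = (1 + 4 + 4)/9 = 1

# Projector P_V = |V><V| onto the subspace spanned by |V>.
P = np.outer(v, v.conj())

# Its eigenvalues are 0 (doubly degenerate here) and 1.
print(np.sort(np.linalg.eigvalsh(P)))        # ~ [0, 0, 1]

# A vector parallel to |V> has eigenvalue 1; one orthogonal to it, eigenvalue 0.
w_perp = np.array([2.0, -1.0, 0.0])          # <V|w_perp> = 0
print(np.allclose(P @ v, v))                 # True
print(np.allclose(P @ w_perp, 0 * w_perp))   # True
```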

**How do we find the eigenvalues and eigenvectors of an arbitrary linear operator Ω?**

Ω|ψ> = ω|ψ>, or (Ω - ωI)|ψ> = 0.

We can transform this equation into a matrix equation by choosing a basis {|u_{i}>}, i = 1, ..., N, and expanding

|ψ> = ∑_{i}c_{i}|u_{i}>, so that ∑_{j}(Ω_{ij} - ωδ_{ij})c_{j} = 0.

This matrix equation has a nontrivial solution only if det(Ω - ωI) = 0. (For a trivial solution all c_{i} = 0.)

This is the characteristic equation.

The determinant is

det(Ω - ωI) = ∑_{i_{1},...,i_{N}} ε_{i_{1}i_{2}...i_{N}} (Ω - ωI)_{1i_{1}}(Ω - ωI)_{2i_{2}} ... (Ω - ωI)_{Ni_{N}}.

Here ε_{i_{1}...i_{N}} is the totally antisymmetric Levi-Civita symbol: +1 for even permutations of (1, 2, ..., N), -1 for odd permutations, and 0 otherwise.
Our characteristic equation therefore is a polynomial of order N in ω, where N is the dimension of the vector space. It has N roots. These roots can be real or complex, distinct or identical. These N roots are the eigenvalues. They are found in a particular basis, but are independent of the choice of basis. If the vector space V is defined over the space of complex numbers, then each linear operator Ω has at least one eigenvalue. The characteristic equation has at least one root.
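These statements can be illustrated with numpy, whose `np.poly` returns the coefficients of the characteristic polynomial of a square matrix; the 3×3 matrix below is an arbitrary example, not from the text:

```python
import numpy as np

# An arbitrary 3x3 example matrix.
omega = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])

# Coefficients of the characteristic polynomial, order N = 3 (highest power first).
coeffs = np.poly(omega)
print(len(coeffs) - 1)          # degree N = 3

# Its N roots are the eigenvalues.
roots = np.sort(np.roots(coeffs).real)
eig = np.sort(np.linalg.eigvals(omega).real)
print(np.allclose(roots, eig))  # True

# Basis independence: changing basis by an orthogonal matrix q leaves them fixed.
q, _ = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))
eig2 = np.sort(np.linalg.eigvals(q @ omega @ q.T).real)
print(np.allclose(eig, eig2))   # True
```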

- Let Ω be the operator rotating the vector **A** clockwise through an angle θ in two dimensions. The matrix of Ω in the {**i**,**j**} basis is

Ω =
( cosθ   sinθ )
( -sinθ  cosθ ).

The eigenvalues are found from det(Ω - ωI) = 0, or (cosθ - ω)^{2} + sin^{2}θ = 0.

We have ω^{2}- 2ωcosθ + 1 = 0, ω = cosθ ± (cos^{2}θ - 1)^{1/2}= cosθ ± i sinθ.

For sinθ ≠ 0 there are no real solutions, but two complex ones exist.
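The complex eigenvalues cosθ ± i sinθ can be confirmed numerically; a quick numpy sketch (θ = π/3 chosen arbitrarily):

```python
import numpy as np

theta = np.pi / 3  # an arbitrary angle with sin(theta) != 0

# Matrix of a clockwise rotation through theta in the {i, j} basis.
omega = np.array([[np.cos(theta),  np.sin(theta)],
                  [-np.sin(theta), np.cos(theta)]])

vals = np.linalg.eigvals(omega)

# The two eigenvalues are cos(theta) +/- i sin(theta) = e^{-i theta}, e^{+i theta}.
expected = np.array([np.cos(theta) + 1j * np.sin(theta),
                     np.cos(theta) - 1j * np.sin(theta)])
print(np.allclose(np.sort_complex(vals), np.sort_complex(expected)))  # True
```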

**How do we find the eigenvectors associated with the eigenvalues?**

Assume one of the eigenvalues is ω. We have to solve the system of N equations

∑_{j}(Ω_{ij} - ωδ_{ij})c_{j} = 0, i = 1, ..., N.

- The operator A is represented in some basis by the matrix

A =
( 1   0   3 )
( 0  -2   0 )
( 3   0   1 ).

It has eigenvalues -2 and 4.

Two independent solutions are associated with the eigenvalue -2,

c_{1}= 0 , c_{2}= 1 , c_{3}= 0 ,

and

c_{1}= 1/√2 , c_{2}= 0 , c_{3}= -1/√2.
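The example can be verified in numpy. The matrix below is a real symmetric matrix consistent with the stated eigenvalues and eigenvectors (a reconstruction, since the original matrix did not survive in the text):

```python
import numpy as np

# Assumed Hermitian matrix reproducing the example's eigenpairs.
A = np.array([[1.0,  0.0, 3.0],
              [0.0, -2.0, 0.0],
              [3.0,  0.0, 1.0]])

# eigh returns eigenvalues in ascending order.
vals, vecs = np.linalg.eigh(A)
print(vals)                            # [-2, -2, 4]

# Two independent eigenvectors share the doubly degenerate eigenvalue -2.
for c in ([0.0, 1.0, 0.0], [1 / np.sqrt(2), 0.0, -1 / np.sqrt(2)]):
    c = np.array(c)
    print(np.allclose(A @ c, -2 * c))  # True
```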

If the characteristic equation has N distinct roots, we can find N unique (up to a phase factor), normalized, linearly independent eigenvectors, which form a basis of V. If the characteristic equation does not have N distinct roots, we may be able to find more than one linearly independent eigenvector associated with a multiple root and so still find N linearly independent eigenvectors. But we may also find that only one eigenvector is associated with a multiple root and that the operator does not have a basis of eigenvectors.

**Eigenvalues and eigenvectors of a Hermitian operator**

The eigenvalues of a Hermitian operator are real.

Let Ω|ω> = ω|ω>; then <ω|Ω|ω> = ω<ω|ω>.

Take the adjoint: <ω|Ω^{†}|ω> = ω^{*}<ω|ω>.

If Ω = Ω^{†}, subtracting the two equations gives (ω - ω^{*})<ω|ω> = 0.

Since |ω> ≠ 0, we have ω = ω^{*}; the eigenvalue is real.

Every Hermitian operator has at least one basis of orthonormal eigenvectors.
The matrix
of the operator is diagonal in this basis and has its eigenvalues as its diagonal entries.

**Proof:**

The characteristic equation has at least one root, call it ω_{1}.
At least one non-zero eigenvector |ω_{1}> corresponds to this eigenvalue. Let V_{⊥1}^{n-1}
be the subspace of all vectors orthogonal to |ω_{1}>.
As a basis for V we can now choose the normalized vector |ω_{1}>
and n - 1 orthonormal vectors |V_{⊥1}^{1}>, |V_{⊥1}^{2}>, |V_{⊥1}^{3}>,
..., |V_{⊥1}^{n - 1}>. In this basis
the matrix of Ω has the following form:

( ω_{1}  0  ...  0 )
( 0                )
( ...       X      )
( 0                )

The first column is (ω_{1}, 0, ..., 0)^{T} because Ω|ω_{1}> = ω_{1}|ω_{1}>. We use the fact that Ω is Hermitian when we set <ω_{1}|Ω = ω_{1}<ω_{1}|, which makes the first row (ω_{1}, 0, ..., 0) as well; X denotes the remaining (n-1)×(n-1) Hermitian block.

The characteristic equation now takes the form

(ω_{1} - ω) det(X - ωI) = 0,

or (ω_{1} - ω)∑_{m=0}^{n-1}c_{m}ω^{m} = (ω_{1} - ω)P^{n-1}(ω) = 0, where P^{n-1}(ω) is a polynomial of order n - 1.

The polynomial P^{n-1}(ω) must also have at least one root, call it ω_{2}, with a normalized eigenvector |ω_{2}>.
Define a subspace V_{⊥1,2}^{n - 2}
of vectors in V_{⊥1}^{n - 1} orthogonal to |ω_{2}> and repeat the procedure.
Finally the matrix of Ω in the basis {|ω_{1}>, |ω_{2}>, ..., |ω_{n}>} becomes

( ω_{1}    0    ...    0    )
( 0      ω_{2}  ...    0    )
( ...     ...   ...   ...   )
( 0        0    ...  ω_{n}  ).

Every |ω_{i}> is chosen from a subspace that is orthogonal to all previously chosen eigenvectors; therefore the basis {|ω_{i}>} is orthogonal.
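The theorem can be checked numerically for any Hermitian matrix; a sketch with a randomly generated Hermitian Ω (the size and random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
omega = (M + M.conj().T) / 2             # an arbitrary Hermitian operator

vals, U = np.linalg.eigh(omega)          # columns of U: orthonormal eigenvectors

# The eigenvalues are real (eigh returns them as real floats).
print(np.allclose(vals.imag, 0))               # True

# The eigenbasis is orthonormal: U is unitary.
print(np.allclose(U.conj().T @ U, np.eye(4)))  # True

# The matrix of omega in this basis is diagonal, with eigenvalues on the diagonal.
D = U.conj().T @ omega @ U
print(np.allclose(D, np.diag(vals)))           # True
```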

Note: We did not assume that the eigenvalues are all distinct (non-degenerate).

If the eigenvalues are degenerate, then there exist many bases of eigenvectors that diagonalize
Ω.

Assume Ω|ω_{1}> = ω|ω_{1}> and Ω|ω_{2}> = ω|ω_{2}>.

Then Ω(a_{1}|ω_{1}> + a_{2}|ω_{2}>) = ω(a_{1}|ω_{1}>
+ a_{2}|ω_{2}>) for any a_{1}, a_{2} in
F. There exists a whole subspace spanned by |ω_{1}> and |ω_{2}>, whose elements are eigenvectors
of Ω with eigenvalue ω.

Consider a two-dimensional vector space with an orthonormal basis {|1>, |2>}.

Consider an operator whose matrix in that basis is

σ_{y} =
( 0  -i )
( i   0 ).

(a) Is the operator Hermitian? Calculate its eigenvalues and eigenvectors.

(b) Calculate the matrices which represent the projectors onto these eigenvectors.

- Solution:

(a) (σ_{y})_{ij} = (σ_{y})_{ji}^{*}, therefore σ_{y} is Hermitian. It is the matrix of a Hermitian operator. Its eigenvalues are real.

To find the eigenvalues β we set

det(σ_{y} - βI) = det
( -β  -i )
(  i  -β )
= 0,

or β^{2} - 1 = 0, β = ±1.

The eigenvector associated with β_{1}= 1 is |ψ_{1}> = ∑_{i=1}^{2}c_{i}|i> with c_{2}= i c_{1}.

|ψ_{1}> = (1/√2)|1> + (i/√2)|2>.

The eigenvector associated with β_{2}= -1 is |ψ_{2}> = ∑_{i=1}^{2}c_{i}|i> with c_{2}= -i c_{1}.

|ψ_{2}> = (1/√2)|1> - (i/√2)|2>.

(b) The projector onto |ψ_{i}> is P_{i}= |ψ_{i}><ψ_{i}|.

The matrix elements are (P_{i})_{jk}= <j|ψ_{i}><ψ_{i}|k>.

matrix of P_{1}: (1/2)
( 1  -i )
( i   1 ),   matrix of P_{2}: (1/2)
(  1  i )
( -i  1 ).
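The solution above can be verified directly in numpy:

```python
import numpy as np

sigma_y = np.array([[0, -1j], [1j, 0]])

# (a) Hermitian: equal to its conjugate transpose.
print(np.allclose(sigma_y, sigma_y.conj().T))   # True

# Eigenvectors (|1> + i|2>)/sqrt(2) and (|1> - i|2>)/sqrt(2).
psi1 = np.array([1, 1j]) / np.sqrt(2)
psi2 = np.array([1, -1j]) / np.sqrt(2)
print(np.allclose(sigma_y @ psi1, psi1))        # True  (beta = +1)
print(np.allclose(sigma_y @ psi2, -psi2))       # True  (beta = -1)

# (b) Projectors P_i = |psi_i><psi_i|.
P1 = np.outer(psi1, psi1.conj())
P2 = np.outer(psi2, psi2.conj())
print(np.allclose(P1, 0.5 * np.array([[1, -1j], [1j, 1]])))  # True
print(np.allclose(P1 + P2, np.eye(2)))          # True  (completeness)
```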

The eigenvalues of a unitary operator are complex numbers of unit magnitude.
Eigenvectors with different eigenvalues are mutually orthogonal.

**Proof:**

Let U|u_{i}> = u_{i}|u_{i}>, U|u_{j}>
= u_{j}|u_{j}>.

Let u_{i} ≠ u_{j} for i ≠ j, i.e. i and j label eigenvectors with different eigenvalues.

Then <u_{j}|U^{†}U|u_{i}> = u_{j}^{*}u_{i}<u_{j}|u_{i}>.

But U^{†}U = I, therefore <u_{j}|u_{i}> = u_{j}^{*}u_{i}<u_{j}|u_{i}>,
or (1 - u_{j}^{*}u_{i})<u_{j}|u_{i}>
= 0.

If i = j then <u_{i}|u_{i}> ≠ 0,
therefore u_{i}^{*}u_{i} = 1.

If i ≠ j then 1 - u_{j}^{*}u_{i} ≠ 0.

(Since |u_{i}| = |u_{j}| = 1, u_{j}^{*}u_{i} = 1 would imply u_{i} = u_{j}, contradicting u_{j} ≠ u_{i}.)

Therefore <u_{j}|u_{i}> = 0.
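Both properties can be checked on a randomly generated unitary matrix (obtained here via QR decomposition; a generic random unitary is non-degenerate, so the whole eigenbasis comes out orthonormal):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(M)                   # QR of a random matrix gives a unitary U

vals, vecs = np.linalg.eig(U)

# Every eigenvalue has unit magnitude.
print(np.allclose(np.abs(vals), 1))      # True

# Eigenvectors with different eigenvalues are mutually orthogonal.
gram = vecs.conj().T @ vecs
print(np.allclose(gram, np.eye(4)))      # True
```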

Assume the operator A is non-degenerate, so only one basis of orthonormal eigenvectors {|a_{i}>} exists, and let B be a Hermitian operator that commutes with A, [A, B] = 0. Then

BA|a_{i}> = a_{i}B|a_{i}> = AB|a_{i}>.

This implies that B|a_{i}> is an eigenvector of A with eigenvalue a_{i}, so B|a_{i}> = b_{i}|a_{i}>; every eigenvector of A is also an eigenvector of B.

<a_{j}|B|a_{i}> = b_{i}δ_{ji}: the matrix of B is diagonal in the eigenbasis of A.

Assume both operators are degenerate. By ordering the basis vectors we can get the matrix representation of A into the form

A = diag(a_{1}, ..., a_{1}, a_{2}, ..., a_{2}, ...),

with each degenerate eigenvalue grouped into a contiguous block.

This basis, however, is not unique.

The eigensubspace corresponding to every degenerate eigenvalue has an infinity of
bases.

**How does B appear in this basis?**

AB|a_{i}> = BA|a_{i}> = a_{i}B|a_{i}>, so B|a_{i}> is again an eigenvector of A with eigenvalue a_{i}: B maps each eigensubspace of A into itself. The matrix of B is therefore block diagonal, with one block for each eigensubspace of A.

The matrix of B in each eigensubspace is Hermitian and therefore can be diagonalized by trading the basis {|a_{i}>} for an eigenbasis of B within that subspace.
The matrix of A remains diagonal, since we are choosing another orthonormal basis
in a degenerate eigenspace. If B is not degenerate in a given subspace, we will end
up with a unique orthonormal basis of eigenvectors of both A and
B.
If B is degenerate in any given subspace, the basis we find is not unique.
A third
operator C, which commutes with A and B, ([A,C] = 0, [B,C] = 0)
may still not be diagonal in this basis, and we may have to diagonalize the matrix of
C
in the eigensubspaces which belong to both A and B. It is however always
possible to find a set of operators {A, B, C, ...} which commute pairwise and uniquely define a common eigenbasis. Such a set of operators is called a complete set of commuting observables (C.S.C.O.). It is generally understood that C.S.C.O. refers to the minimal such set.
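The two-step procedure above (diagonalize A, then diagonalize B inside each degenerate eigenspace of A) can be sketched in numpy. The matrices below are illustrative choices: A is degenerate, [A, B] = 0, and B lifts the degeneracy:

```python
import numpy as np

# A is degenerate (eigenvalue 1 twice); B commutes with A.
A = np.diag([1.0, 1.0, 2.0])
B = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 5.0]])
assert np.allclose(A @ B, B @ A)          # [A, B] = 0

# Step 1: diagonalize A.
a_vals, V = np.linalg.eigh(A)

# Step 2: within each eigenspace of A, diagonalize the corresponding block of B.
basis = np.zeros_like(V)
for a in np.unique(a_vals.round(12)):
    cols = np.isclose(a_vals, a)
    sub = V[:, cols]                      # orthonormal basis of the eigenspace
    _, w = np.linalg.eigh(sub.conj().T @ B @ sub)
    basis[:, cols] = sub @ w              # rotate inside the eigenspace only

# The common eigenbasis diagonalizes both A and B.
Ad = basis.conj().T @ A @ basis
Bd = basis.conj().T @ B @ basis
print(np.allclose(Ad, np.diag(np.diag(Ad))))  # True
print(np.allclose(Bd, np.diag(np.diag(Bd))))  # True
```

Here {A, B} already forms a C.S.C.O., since B is non-degenerate within the degenerate eigenspace of A; otherwise a third commuting operator C would be diagonalized the same way in the remaining common eigensubspaces.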