Computing Eigen Information for Small Matrices
The eigen equation can be rearranged as follows:
Ax = λx
Ax = λInx
Ax - λInx = 0
(A - λIn)x = 0
(1)
The matrix equation (A - λIn)x = 0 is a homogeneous linear system with coefficient
matrix A - λIn. Since an eigenvector x cannot be the zero vector, this means we seek a
nontrivial solution to the linear system (A - λIn)x = 0. Thus ns(A - λIn) ≠ {0}, or
equivalently, rref(A - λIn) must contain a zero row. It follows that matrix A - λIn must be
singular, so from Chapter 3,
det(A - λIn) = 0
(or equivalently det(λIn - A) = 0).
(2)
Equation (2) is called the characteristic equation of matrix A, and solving it for λ gives
us the eigenvalues of A. Because the determinant is a linear combination of particular
products of entries of the matrix, the characteristic equation is really a polynomial
equation in λ of degree n. We call
c(λ) = det(A - λIn)
(3)
the characteristic polynomial of matrix A. The eigenvalues are the solutions of (2), or
equivalently, the roots of the characteristic polynomial (3). Once we have the n
eigenvalues of A, λ1, λ2, ..., λn, the corresponding eigenvectors are nontrivial solutions
of the homogeneous linear systems
(A - λiIn)x = 0
for i = 1, 2, ..., n.
(4)
We summarize the computational approach for determining eigenpairs (λ, x) as a
two-step procedure:
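The transcript does not reproduce the example matrix, so the following sketch runs the two-step procedure numerically on a hypothetical matrix (assuming NumPy) chosen to have eigenvalues 3 and 2, matching the example. Note that np.poly returns the coefficients of det(λIn - A), which differs from c(λ) = det(A - λIn) only by a factor of (-1)^n, so the roots are unchanged.

```python
import numpy as np

# Hypothetical 2x2 matrix (the slide's matrix is not shown in the transcript),
# chosen so that its eigenvalues are 3 and 2.
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])

# Step I: the eigenvalues are the roots of the characteristic polynomial.
coeffs = np.poly(A)              # coefficients of det(lambda*I - A): [1, -5, 6]
eigenvalues = np.roots(coeffs)   # roots: 3 and 2

# Step II: eigenvectors are nontrivial solutions of (A - lambda_i*I)x = 0.
# numpy.linalg.eig performs both steps at once; columns of V are eigenvectors.
w, V = np.linalg.eig(A)
for lam, x in zip(w, V.T):
    assert np.allclose(A @ x, lam * x)   # each pair satisfies A x = lambda x
```

In practice one calls np.linalg.eig directly; the np.poly/np.roots route is shown only to mirror the two steps in the text.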
Example: Find eigenpairs of
Step I. Find the eigenvalues.
The eigenvalues are λ1 = 3 and λ2 = 2.
Step II. To find corresponding eigenvectors we solve
(A - λiIn)x = 0
for i = 1, 2.
Given that λ is an eigenvalue of A, we know that matrix A - λIn is singular and
hence rref(A - λIn) will have at least one zero row.
A homogeneous linear system whose coefficient matrix has rref with at least one zero
row will have a solution set with at least one free variable. The free variables can be
chosen to have any value as long as the resulting solution is not the zero vector.
In the example, for eigenvalue λ1 = 3, the general solution was
The free variable in this case could be any nonzero value. We chose x2 = 2 to avoid
fractions, but this is not required. If we chose x2 = 1/7, then
is a valid eigenvector.
Since there will be at least one free variable when determining an eigenvector
corresponding to an eigenvalue, there are infinitely many ways to express the
entries of an eigenvector.
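Since an eigenvector is determined only up to a nonzero scalar multiple, any rescaling of a valid choice still works. A small numeric check (assuming NumPy; the matrix and eigenvector are hypothetical stand-ins, since the transcript's example matrix is not shown):

```python
import numpy as np

# Hypothetical matrix with eigenvalue 3 and eigenvector x = (2, 1),
# mimicking the choice x2 = 2 described in the text.
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])
x = np.array([2.0, 1.0])
assert np.allclose(A @ x, 3.0 * x)

# Any nonzero scalar multiple of x is again an eigenvector for lambda = 3;
# c = 1/14 corresponds to choosing x2 = 1/7 instead of x2 = 2.
for c in (1.0, -5.0, 1.0 / 14.0):
    assert np.allclose(A @ (c * x), 3.0 * (c * x))
```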
Example:
Find the eigenvalues and corresponding eigenvectors of
The characteristic polynomial is
Its factors are
So the eigenvalues are 1, 2, and 3.
Corresponding eigenvectors are
The eigenvalues are the roots of the characteristic polynomial, thus it is possible that a
numerical value can be a root more than once. For example:
has roots 0, 3, 3. We say 3 is a multiple or repeated root. The number of times the
root is repeated is called its multiplicity. Thus it follows that a matrix can have
repeated eigenvalues.
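The transcript's example polynomial is not shown; λ(λ - 3)² = λ³ - 6λ² + 9λ is one polynomial with roots 0, 3, 3, and a quick NumPy check recovers the repeated root:

```python
import numpy as np

# lambda^3 - 6*lambda^2 + 9*lambda = lambda * (lambda - 3)^2
coeffs = [1.0, -6.0, 9.0, 0.0]
roots = np.roots(coeffs)

# The root 3 appears twice, so it has multiplicity 2.
assert np.allclose(np.sort(roots.real), [0.0, 3.0, 3.0])
```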
Example:
Find eigenpairs of
The characteristic polynomial is
Thus the eigenvalues are λ1 = λ2 = 1 and λ3 = 2. We have a repeated eigenvalue of
multiplicity 2.
To find corresponding eigenvectors we do the following:
We find that one eigenvector is
Explain why {p1, p2, p3} is a basis for R3.
More properties of eigenvalues and eigenvectors.
A and AT have the same eigenvalues.
Proof: We see this by showing that A and AT have the same characteristic polynomial:
det(AT - λIn) = det((A - λIn)T) = det(A - λIn),
since a matrix and its transpose have the same determinant.
Since the characteristic polynomials are the same and eigenvalues are
roots of the characteristic polynomial, the eigenvalues must be the same.
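A quick numeric illustration of this property (assuming NumPy; the matrix is a hypothetical triangular example so its eigenvalues are easy to read off):

```python
import numpy as np

A = np.array([[2.0, 1.0, 4.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# A and A.T are different matrices, yet they share the same
# eigenvalues (here 2, 3, 5).
ev_A  = np.sort(np.linalg.eigvals(A).real)
ev_AT = np.sort(np.linalg.eigvals(A.T).real)
assert np.allclose(ev_A, ev_AT)
```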
Exercise: Let A be the transition matrix of a Markov process; that is, A is a probability
matrix, with all entries in [0, 1] and the sum of the entries in each column equal to 1.
Show that λ = 1 is an eigenvalue of A.
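A numeric check on one hypothetical column-stochastic matrix (assuming NumPy) illustrates the claim, though it does not prove the exercise:

```python
import numpy as np

# Hypothetical probability matrix: entries in [0, 1], columns summing to 1.
A = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])
assert np.allclose(A.sum(axis=0), 1.0)   # column-stochastic

eigenvalues = np.linalg.eigvals(A)
# 1 should appear among the eigenvalues (up to round-off).
assert np.any(np.isclose(eigenvalues, 1.0))
```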
If A is diagonal, upper triangular, or lower triangular, then the eigenvalues of A are its
diagonal entries.
Proof: Construct a proof.
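A numeric check on a hypothetical upper triangular matrix (assuming NumPy); the key observation for the proof is that A - λIn is then also triangular, so its determinant is the product of its diagonal entries:

```python
import numpy as np

U = np.array([[2.0, 7.0,  3.0],
              [0.0, 5.0,  1.0],
              [0.0, 0.0, -1.0]])

# The eigenvalues of a triangular matrix are its diagonal entries: 2, 5, -1.
assert np.allclose(np.sort(np.linalg.eigvals(U).real), np.sort(np.diag(U)))
```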
More properties of eigenvalues and eigenvectors.
Symmetric matrices arise frequently in applications. The next two properties provide
information about their eigenvalues and eigenvectors.
The eigenvalues of a symmetric matrix are real numbers.
Proof: (The steps of a guided proof are in the exercises.)
The eigenvectors corresponding to distinct eigenvalues of a symmetric matrix
are orthogonal.
Proof: (The steps of a guided proof are in the exercises.)
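Both symmetric-matrix properties can be checked numerically (assuming NumPy; the matrix is a hypothetical example). np.linalg.eigh is NumPy's routine for symmetric matrices and returns real eigenvalues:

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric

eigenvalues, V = np.linalg.eigh(S)   # eigh assumes a symmetric matrix
# Real eigenvalues (1 and 3), and eigenvectors for distinct eigenvalues
# are orthogonal (their dot product is zero).
assert np.allclose(eigenvalues, [1.0, 3.0])
assert np.isclose(V[:, 0] @ V[:, 1], 0.0)
```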
More properties of eigenvalues and eigenvectors.
For a square matrix A we have the following important ideas:
Is A nonsingular or singular? {See Chapter 2.}
Is det(A) nonzero or zero? {See Chapter 3.}
Next we connect these to properties of the eigenvalues.
det(A) is the product of the eigenvalues of A.
Proof: If λ1, λ2, ..., λn are the eigenvalues of A, then they are the roots of its
characteristic polynomial. Hence
c(λ) = det(A - λIn) = (-1)^n (λ - λ1)(λ - λ2) ... (λ - λn).
Evaluating this polynomial at λ = 0 gives
det(A) = (-1)^n (-λ1)(-λ2) ... (-λn) = (-1)^(2n) λ1λ2 ... λn = λ1λ2 ... λn.
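A numeric check of the determinant–eigenvalue product identity on a hypothetical matrix (assuming NumPy):

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])   # hypothetical; eigenvalues are 3 and 2

# det(A) = 6 = 3 * 2, the product of the eigenvalues.
assert np.isclose(np.prod(np.linalg.eigvals(A)).real, np.linalg.det(A))
```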
A is nonsingular if and only if 0 is not an eigenvalue of A, or equivalently, A is singular
if and only if 0 is an eigenvalue of A.
Proof: Use the previous property to construct a proof.
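A numeric illustration with a hypothetical singular matrix (row 2 is twice row 1; assuming NumPy):

```python
import numpy as np

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# S is singular, so 0 must be among its eigenvalues (here 0 and 5).
assert np.isclose(np.linalg.det(S), 0.0)
assert np.any(np.isclose(np.linalg.eigvals(S).real, 0.0))
```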
A particular type of matrix based on eigenvector properties.
A matrix A is called defective if A has an eigenvalue λ of multiplicity m > 1 for which
the corresponding eigenspace has a basis of fewer than m vectors; that is, the
dimension of the eigenspace corresponding to λ is less than m.
Example:
Given matrix
which has eigenvalues λ = 0, 0, 4.
Determine if A is defective.
Solution: Determine the number of linearly independent eigenvectors corresponding to
the repeated eigenvalue λ = 0.
Find the number of vectors in a basis for the null space of A – 0I.
A - 0I = A, so we find the rref([A | 0]). We get
We see there are two free variables in the general solution so there will be two
linearly independent eigenvectors corresponding to the eigenvalue of multiplicity two.
Hence A is not defective.
Example:
Determine if
is defective. Its eigenvalues are λ = 1, 6, 6.
Solution: Determine the number of linearly independent eigenvectors corresponding
to the repeated eigenvalue λ = 6. Find the number of vectors in a basis for the null
space of A – 6I.
A - 6I =
, so we find the rref([A | 0]). We get
We see there is one free variable in the general solution, so there will be only one
linearly independent eigenvector corresponding to the eigenvalue of multiplicity two.
Hence A is defective.
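The defective test in these examples amounts to comparing the algebraic multiplicity of λ with the dimension of its eigenspace, which equals n - rank(A - λI). A sketch of that check (assuming NumPy; the matrix is hypothetical, with eigenvalues 1, 6, 6 like the last example but not the transcript's actual matrix):

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """Dimension of the eigenspace for lam: the number of free variables in
    (A - lam*I)x = 0, which equals n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# Hypothetical matrix with eigenvalue 6 of algebraic multiplicity 2
# but a one-dimensional eigenspace, so it is defective.
A = np.array([[6.0, 1.0, 0.0],
              [0.0, 6.0, 0.0],
              [0.0, 0.0, 1.0]])

assert geometric_multiplicity(A, 6.0) == 1   # fewer than 2, so A is defective
```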