
Eigenvalue


In linear algebra an eigenvalue of a (square) matrix \(A\) is a number \(\lambda\) that satisfies the eigenvalue equation,

\[ \det(A - \lambda I) = 0, \]

where det means the determinant, \(I\) is the identity matrix of the same dimension as \(A\), and in general \(\lambda\) can be complex. This equation, whose left-hand side is the characteristic polynomial of \(A\), arises from the eigenvalue problem, which is to find the eigenvalues and associated eigenvectors of \(A\). That is, to find a number \(\lambda\) and a non-zero vector \(\mathbf{v}\) that together satisfy

\[ A\mathbf{v} = \lambda\mathbf{v}. \]
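In practice the eigenvalue problem is usually solved numerically. A minimal sketch using NumPy (an assumed library choice; `np.linalg.eig` returns the eigenvalues together with a matrix whose columns are the corresponding eigenvectors):

```python
import numpy as np

# A symmetric 2x2 example matrix; its eigenvalues work out to 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` solves A v = lambda v for the matching eigenvalue.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues.real))  # [1.0, 3.0]
```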

What this equation says is that even though \(A\) is a matrix, its action on \(\mathbf{v}\) is the same as multiplying the vector by the number \(\lambda\). This means that the vector \(A\mathbf{v}\) and the vector \(\mathbf{v}\) are parallel (or anti-parallel if \(\lambda\) is negative). Note that generally this will not be true. This is most easily seen with a quick example. Suppose

\[ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \quad\text{and}\quad \mathbf{v} = \begin{pmatrix} x \\ y \end{pmatrix}. \]

Then their matrix product is

\[ A\mathbf{v} = \begin{pmatrix} ax + by \\ cx + dy \end{pmatrix}, \]

whereas the scalar product is

\[ \lambda\mathbf{v} = \begin{pmatrix} \lambda x \\ \lambda y \end{pmatrix}. \]

Obviously then \(A\mathbf{v} \neq \lambda\mathbf{v}\) unless \(ax + by = \lambda x\) and simultaneously \(cx + dy = \lambda y\). For a given \(A\), it is easy to pick numbers for the entries of \(\mathbf{v}\) and for \(\lambda\) such that this is not satisfied.
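A concrete numerical check of this point (a sketch assuming NumPy; the matrix is chosen purely for illustration): for an arbitrary vector \(A\mathbf{v}\) is not parallel to \(\mathbf{v}\), but for an eigenvector it is.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# An arbitrary vector: A v is generally not a multiple of v.
v = np.array([1.0, 0.0])
Av = A @ v                 # equals [2, 1], clearly not parallel to [1, 0]
# Two 2D vectors are parallel exactly when the determinant of the matrix
# formed from them vanishes; here it is nonzero.
assert abs(np.linalg.det(np.column_stack([Av, v]))) > 1e-12

# An eigenvector: [1, 1] satisfies A w = 3 w, so A w and w ARE parallel.
w = np.array([1.0, 1.0])
assert np.allclose(A @ w, 3.0 * w)
```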

The eigenvalue equation

So where did the eigenvalue equation \(\det(A - \lambda I) = 0\) come from? Well, we assume that we know the matrix \(A\) and want to find a number \(\lambda\) and a non-zero vector \(\mathbf{v}\) so that \(A\mathbf{v} = \lambda\mathbf{v}\). (Note that if \(\mathbf{v} = \mathbf{0}\) then the equation is always true, and therefore uninteresting.) So now we have \(A\mathbf{v} - \lambda\mathbf{v} = \mathbf{0}\). It doesn't make sense to subtract a number from a matrix, but we can factor out the vector if we first multiply the right-hand term by the identity, giving us

\[ (A - \lambda I)\mathbf{v} = \mathbf{0}. \]

Now we have to remember the fact that \(A - \lambda I\) is a square matrix, and so it might be invertible. If it were invertible then we could simply multiply on the left by its inverse to get

\[ \mathbf{v} = (A - \lambda I)^{-1}\mathbf{0} = \mathbf{0}, \]

but we have already said that \(\mathbf{v}\) can't be the zero vector! The only way around this is if \(A - \lambda I\) is in fact non-invertible. It can be shown that a square matrix is non-invertible if and only if its determinant is zero. That is, we require

\[ \det(A - \lambda I) = 0, \]

which is the eigenvalue equation stated above.
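The determinant condition can be checked directly: at an eigenvalue \(\det(A - \lambda I)\) vanishes, while at any other value of \(\lambda\) the matrix \(A - \lambda I\) stays invertible. A sketch under the same NumPy assumption, reusing the symmetric matrix with eigenvalues 1 and 3:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
I = np.eye(2)

# At the eigenvalues 1 and 3, the matrix A - lambda*I is singular:
for lam in (1.0, 3.0):
    assert abs(np.linalg.det(A - lam * I)) < 1e-12

# At a non-eigenvalue it is invertible (nonzero determinant);
# here det(A - 2I) = det([[0, 1], [1, 0]]) = -1.
assert abs(np.linalg.det(A - 2.0 * I)) > 0.5
```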

A more technical approach

So far we have looked at eigenvalues in terms of square matrices. As usual in mathematics, though, we like things to be as general as possible, since then anything we prove will be true in as many different applications as possible. So instead we can define eigenvalues in the following way.

Definition: Let \(V\) be a vector space over a field \(F\), and let \(T\colon V \to V\) be a linear map. An eigenvalue associated with \(T\) is an element \(\lambda \in F\) for which there exists a non-zero vector \(\mathbf{v} \in V\) such that

\[ T(\mathbf{v}) = \lambda\mathbf{v}. \]

Then \(\mathbf{v}\) is called an eigenvector of \(T\) associated with \(\lambda\).
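The general definition makes clear that eigenvalues depend on the field \(F\). For example, a rotation of the real plane by 90° preserves no direction, so it has no real eigenvalues; over the complex numbers, however, its eigenvalues are \(\pm i\). A sketch of this (again assuming NumPy, which computes over the complex numbers when needed):

```python
import numpy as np

# Rotation of the plane by 90 degrees: no direction is mapped to a
# multiple of itself, so there is no real eigenvalue, but over the
# complex numbers the eigenvalues are i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigenvalues = np.linalg.eig(R)[0]
assert all(abs(lam.imag) > 0 for lam in eigenvalues)  # none are real
assert np.allclose(sorted(eigenvalues, key=lambda z: z.imag), [-1j, 1j])
```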