- First, what happens when we multiply the two vectors
by H?
- What's a better basis for both the domain and codomain in which to
  represent this transformation?
- Okay, so we've found some vectors that have the property that
  their images under the transformation h are
  actually just scalar multiples of themselves.
  These vectors have a special name: eigenvectors. And the scalars
  by which they're scaled are called
  eigenvalues.
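- The transcript's H and the two proposed vectors aren't reproduced in
  this excerpt, so here is a minimal NumPy sketch with a hypothetical
  2×2 matrix that behaves the same way: two vectors whose images under
  H are scalar multiples of themselves, and a change of basis (to those
  eigenvectors) under which the representation becomes diagonal.

```python
import numpy as np

# Hypothetical stand-in for the transcript's matrix H (the original
# isn't reproduced in this excerpt).
H = np.array([[3.0, 1.0],
              [0.0, 2.0]])

v = np.array([1.0, 0.0])    # first candidate vector
w = np.array([1.0, -1.0])   # second candidate vector

print(H @ v)  # [3. 0.]  = 3 * v  -> v is an eigenvector, eigenvalue 3
print(H @ w)  # [2. -2.] = 2 * w  -> w is an eigenvector, eigenvalue 2

# Using (v, w) as the basis for both domain and codomain, the matrix
# representing h becomes diagonal, with the eigenvalues on the diagonal:
P = np.column_stack([v, w])
print(np.linalg.inv(P) @ H @ P)   # [[3. 0.]
                                  #  [0. 2.]]
```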
- Now, an important question is this: under what conditions can we
  accomplish this "diagonalization"? Under what
  conditions can we find eigenvectors and eigenvalues
  that allow us to construct this very simple matrix
  representation of the homomorphism?
  Are all matrices "diagonalizable"? (Can you think of a
  homomorphism we've studied geometrically where no vector
  has a multiple of itself as its image?)
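- One answer to that question, offered here as a sketch rather than as
  the lecture's own example: a rotation of the plane through an angle
  that isn't a multiple of π sends no nonzero vector to a real multiple
  of itself, so it has no real eigenvectors. A quick numerical check:

```python
import numpy as np

theta = np.pi / 2   # rotate the plane by 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigenvalues, _ = np.linalg.eig(R)
print(eigenvalues)  # approximately [0.+1.j, 0.-1.j]: complex, not real,
                    # so no real vector maps to a real multiple of itself
```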
- The vectors I proposed for multiplication by the matrix H
  came "out of the blue". How would we go about finding
  them? Let's think about what we're looking for in terms
  of an equation:
  $H\vec{x} = \lambda\vec{x}$.
  We can re-write that as
  $H\vec{x} = \lambda I\vec{x}$,
  obviously (since $I$ is the identity matrix); now we take
  everything over to the left-hand side,
  i.e.
  $(H - \lambda I)\vec{x} = \vec{0}$.
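- To actually solve $(H - \lambda I)\vec{x} = \vec{0}$ for a nonzero
  $\vec{x}$, the standard next step (just past where this excerpt ends)
  is to require $H - \lambda I$ to be singular, i.e.
  $\det(H - \lambda I) = 0$. A SymPy sketch, reusing the hypothetical
  H from above:

```python
import sympy as sp

lam = sp.symbols('lambda')
H = sp.Matrix([[3, 1],
               [0, 2]])   # the same hypothetical H as above

# (H - lambda*I) x = 0 has a nonzero solution x exactly when
# H - lambda*I is singular, i.e. when its determinant vanishes.
char_poly = sp.expand((H - lam * sp.eye(2)).det())
print(char_poly)                 # lambda**2 - 5*lambda + 6
print(sp.solve(char_poly, lam))  # [2, 3] -- matching the scalars above
```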