Are matrices the only “things” which have eigenvalues?

Date:
-
Location:
CB 247
Speaker(s) / Presenter(s):
Jeff Ovall, University of Kentucky

A beautiful theorem in elementary linear algebra asserts that any symmetric matrix A ∈ ℝⁿˣⁿ has n real eigenvalues λ₁, λ₂, …, λₙ with corresponding eigenvectors v₁, v₂, …, vₙ (Avᵢ = λᵢvᵢ). Furthermore, these eigenvectors vₖ ∈ ℝⁿ may be chosen so that they are unit-length and mutually orthogonal (perpendicular): ‖vᵢ‖₂² = vᵢ · vᵢ = 1 for each i, and vᵢ · vⱼ = 0 when i ≠ j. A nice consequence of this result is that any vector u ∈ ℝⁿ can be uniquely expressed in terms of these eigenvectors in a simple way: u = (u · v₁)v₁ + (u · v₂)v₂ + ⋯ + (u · vₙ)vₙ. We will discuss how one can generalize such ideas and results to vector spaces of functions, with matrix multiplication replaced by differentiation. Although the focus of the talk will be on describing a way of (approximately) computing eigenvalues and eigenfunctions in this broader context, we will also give some intuition as to why such problems are of physical interest. The approximations themselves will bring us right back to where we started: eigenvalue problems for (really huge!) symmetric matrices.
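
To make the parallel concrete, here is a minimal sketch in Python/NumPy (not part of the talk; the grid size, the finite-difference discretization, and the test vector u are illustrative assumptions). It discretizes the model problem −u″ = λu on (0, π) with u(0) = u(π) = 0, whose exact eigenpairs are λₖ = k² with eigenfunctions sin(kx), and checks that the resulting large symmetric matrix reproduces both the eigenvalues and the orthonormal-expansion property described above.

import numpy as np

# Discretize -u'' = lambda * u on (0, pi) with u(0) = u(pi) = 0 using
# centered finite differences; the exact eigenvalues are k^2.
n = 500                           # number of interior grid points
h = np.pi / (n + 1)               # grid spacing
x = np.linspace(h, np.pi - h, n)  # interior nodes

# Symmetric tridiagonal matrix A approximating -d^2/dx^2
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# eigh is for symmetric matrices: real eigenvalues in ascending order,
# orthonormal eigenvectors in the columns of V
lam, V = np.linalg.eigh(A)

print("first eigenvalues:", lam[:4])            # close to 1, 4, 9, 16
print("exact values:     ", [k**2 for k in range(1, 5)])

# Orthonormality: v_i . v_i = 1 and v_i . v_j = 0 for i != j, i.e. V^T V = I
print("max |V^T V - I|:", np.abs(V.T @ V - np.eye(n)).max())

# Spectral expansion: u = (u . v_1)v_1 + ... + (u . v_n)v_n
u = np.exp(-x) * np.sin(3 * x)    # arbitrary test vector
coeffs = V.T @ u                  # coefficients u . v_i
print("reconstruction error:", np.abs(V @ coeffs - u).max())

Here np.linalg.eigh exploits the symmetry of A, returning real eigenvalues and orthonormal eigenvectors, which is exactly the structure the theorem above guarantees; the computed eigenvalues approximate the exact k² of the differential operator.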