Wednesday, November 2, 2016

Eigenvalues and Eigenvectors

Back to linear algebra for a bit. Eigenvalues and eigenvectors are linchpins for much of big data analysis, so I really should know this stuff absolutely cold, not just for the Q (and possibly research) but for my day job as well. Of course, so much of the actual work is now done by off-the-shelf packages that one can be pretty naive about how they function and still get results. However, I don't want to be that guy.

Today, I'll just post the basics.

If A is an n × n matrix, then a scalar λ is an eigenvalue of A if there exists a nonzero vector x such that Ax = λx. In such a case, x is an eigenvector of A corresponding to λ.
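A quick sanity check of the definition using NumPy (the matrix here is just one I made up for illustration):

```python
import numpy as np

# An example matrix (my choice, purely illustrative).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigvals, eigvecs = np.linalg.eig(A)

for lam, x in zip(eigvals, eigvecs.T):
    # Verify the defining property: A x = lambda x.
    print(lam, np.allclose(A @ x, lam * x))
```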

It's clear that such an x can exist only if (A - λI)x = 0 has a nonzero solution, i.e., only if A - λI is singular, which means |A - λI| = 0. Expanding that determinant gives the characteristic polynomial of A: p(λ) = |A - λI|, a degree-n polynomial whose roots are the eigenvalues of A.
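For a small matrix you can see this directly. NumPy's poly will expand the determinant into coefficients for you (again, an example matrix of my own choosing):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly(A) expands |lambda*I - A| into coefficients, highest
# degree first: here lambda^2 - 4*lambda + 3.
coeffs = np.poly(A)
print(coeffs)            # [ 1. -4.  3.]

# The roots of the characteristic polynomial are the eigenvalues.
print(np.roots(coeffs))  # [ 3.  1.]
```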

OK, finding roots of a polynomial. We did that in 9th grade. What's the catch? Well, mainly that it doesn't work. Finding the roots of higher-order polynomials is A LOT harder than High School Algebra texts would have you believe. The Abel-Ruffini theorem says there's no general closed-form solution in radicals once the degree hits five, and even where exact answers exist in principle, root finding is numerically ill-conditioned: tiny perturbations in the coefficients can throw the roots all over the place. Real world polynomials don't have easy solutions. So, we're off to the realm of numerical approximations (which explains why so many people just fire up the software package from the get-go and don't bother asking questions).
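Wilkinson's classic degree-20 example shows just how badly this can go. Here's a sketch of his experiment in NumPy; the single tiny perturbation below wrecks the roots:

```python
import numpy as np

# Wilkinson's famous polynomial: roots are exactly 1, 2, ..., 20.
coeffs = np.poly(np.arange(1, 21))

# Wilkinson's perturbation: nudge the x^19 coefficient by 2^-23.
perturbed = coeffs.copy()
perturbed[1] -= 2.0**-23

# The roots scatter badly; several pairs even turn complex.
print(np.sort_complex(np.roots(perturbed)))
```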

And, that really is the way to get actual solutions. However, understanding what's going on is a big plus when you're trying to figure out what problem you want solved. We'll look at a few applications over the next few days.
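It's worth knowing that the serious packages don't actually build the characteristic polynomial and hunt for its roots; they iterate on the matrix itself. The simplest such scheme is power iteration. A minimal sketch, reusing my example matrix from above:

```python
import numpy as np

def power_iteration(A, iters=200):
    # Repeatedly apply A and renormalize; the iterate converges to
    # the dominant eigenvector, assuming a unique largest-magnitude
    # eigenvalue.
    x = np.random.default_rng(0).standard_normal(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)
    # The Rayleigh quotient recovers the eigenvalue estimate.
    return x @ A @ x, x

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, x = power_iteration(A)
print(lam, x)  # ~3.0 and the unit eigenvector (1,1)/sqrt(2), up to sign
```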

