Given that any finite-dimensional inner product space has an orthonormal basis, the obvious question is, "How do I find one?" It's actually pretty easy. The Gram-Schmidt process starts with any set of vectors that span the space and sequentially converts them into an orthonormal basis.
The process starts by picking any one of the vectors and normalizing it (that is, dividing it by its length so the resulting vector has the same direction, but length 1). Then you take another vector and compute its projection onto the space spanned by the first vector. The difference between the two is necessarily orthogonal to that space. Normalize that, and you've got your next basis vector. Keep going until you've reached the dimension of the space. (Every text I've checked starts with a basis rather than simply a set of vectors that span the space, but there's no reason for that restriction as long as you chuck the dependent vectors as you go and stop when you've reached the dimension of the space.)
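Here are the first two steps worked numerically, a minimal sketch in Python with NumPy (the concrete vectors and the library are my choices for illustration, not part of the process itself):

    import numpy as np

    x1 = np.array([3.0, 4.0])
    x2 = np.array([1.0, 2.0])

    # Normalize the first vector: same direction as x1, length 1.
    u1 = x1 / np.linalg.norm(x1)     # [0.6, 0.8]

    # Project x2 onto the space spanned by u1, then subtract the projection.
    p = np.dot(x2, u1) * u1          # [1.32, 1.76]
    w = x2 - p                       # [-0.32, 0.24], orthogonal to u1

    # Normalize the difference to get the next basis vector.
    u2 = w / np.linalg.norm(w)       # [-0.8, 0.6]

    print(np.dot(u1, u2))            # ~0, as expected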
Formally, the process is as follows:
Let {xi} span an inner product space V of dimension n.
Let u1 = x1/||x1||
For 0 < k < n: if xk+1 is in the space spanned by {ui}, i ≤ k, then drop xk+1 from the set and renumber the remaining vectors (this somewhat ugly step is why the textbooks like to start with a basis rather than a spanning set). Repeat until you have an xk+1 independent of {ui}, i ≤ k.
Let uk+1 = (xk+1 - pk) / ||xk+1 - pk||, where pk = <xk+1, u1>u1 + <xk+1, u2>u2 + ... + <xk+1, uk>uk is the projection of xk+1 onto the space spanned by {ui}, i ≤ k.
Then {ui} is an orthonormal basis of V.
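The whole recipe fits in a few lines of code. Here's a minimal sketch in Python with NumPy (the function name, tolerance, and example vectors are mine, not from the process itself); it takes any spanning set, drops dependent vectors as it goes, and returns an orthonormal basis:

    import numpy as np

    def gram_schmidt(vectors, tol=1e-12):
        """Orthonormal basis for the span of `vectors`; dependent vectors are dropped."""
        basis = []
        for x in vectors:
            w = np.asarray(x, dtype=float)
            # Subtract the projection of x onto the span of the basis built so far.
            for u in basis:
                w = w - np.dot(w, u) * u
            # If what's left is (numerically) zero, x was dependent: chuck it.
            if np.linalg.norm(w) > tol:
                basis.append(w / np.linalg.norm(w))
        return basis

    # Example: three vectors in R^3; the second is dependent and gets dropped.
    vs = [np.array([1.0, 1.0, 0.0]),
          np.array([2.0, 2.0, 0.0]),
          np.array([1.0, 0.0, 1.0])]
    for u in gram_schmidt(vs):
        print(u)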
While slick in its own right, the real power of this shows up when we look at the factorization of matrices. Tune in tomorrow for more linear algebra excitement!