I'm pivoting over to stats results, and what better result to use for that than Cauchy-Schwarz? It's ostensibly a Linear Algebra result, but the application to stats is both elegant and significant.
First, the result.
Given two vectors u, v from an inner product space V, the following inequality holds:
|<u, v>|² ≤ <u, u> <v, v> (equivalently, |<u, v>| ≤ ||u|| ||v||), with equality holding if and only if u and v are linearly dependent (that includes the case where one or both is 0).
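To see it in action, here's a quick numeric sanity check in R³ using Python and numpy (the vectors u, v, and w are arbitrary picks of mine, purely for illustration):

import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

lhs = np.dot(u, v) ** 2            # |<u, v>|²
rhs = np.dot(u, u) * np.dot(v, v)  # <u, u> <v, v>
print(lhs, "<=", rhs)              # 64.0 <= 294.0

w = 2.5 * u                        # w is linearly dependent on u
print(np.isclose(np.dot(u, w) ** 2, np.dot(u, u) * np.dot(w, w)))  # equality: True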
In the real-valued plane with the Euclidean inner product, this yields the familiar triangle inequality: expand ||u + v||², bound the cross term 2<u, v> by 2||u|| ||v||, and you get ||u + v|| ≤ ||u|| + ||v||. In other words, the shortest distance between two points is a straight line; going through any point off the line makes the journey longer.
In the case of less obvious spaces, the result is less obvious, but it means the same thing. If you want to get somewhere, head in that direction. OK, fine, but what does this have to do with statistics?
Let X and Y be random variables with finite second moments (so the expectations below exist). Define the inner product as <X, Y> = E(XY). Then
|E(XY)|² ≤ E(X²) E(Y²)
Transform X' = X - E(X) and Y' = Y - E(Y) and you have
|Cov(X, Y)|² = |E(X'Y')|² ≤ E(X'²) E(Y'²) = Var(X) Var(Y)
which is an important result. Given Cauchy-Schwarz, the proof is simply showing that E(XY) really is an inner product, which is pretty straightforward (the one subtlety: <X, X> = 0 only forces X = 0 almost surely, so you treat random variables that are equal almost surely as the same vector). Thus, while it's really a special case, many stats texts will list Cov(X, Y)² ≤ Var(X) Var(Y) as the Cauchy-Schwarz inequality.
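Here's a minimal empirical sketch of that last inequality, again in Python with numpy; the simulated distributions and coefficients are arbitrary choices of mine, not from any particular text:

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(size=n)        # correlated with x, but not perfectly

cov_xy = np.cov(x, y)[0, 1]
var_x, var_y = x.var(ddof=1), y.var(ddof=1)
print(cov_xy ** 2 <= var_x * var_y)     # True: strict inequality here

z = -3.0 * x + 7.0                      # z is a linear function of x
cov_xz = np.cov(x, z)[0, 1]
print(np.isclose(cov_xz ** 2, var_x * z.var(ddof=1)))  # equality case: True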