Portmanteau Lemma
The following are equivalent characterizations of Xn converging in distribution to X:
- P(Xn <= x) -> P(X <= x) at every continuity point x of the cdf x -> P(X <= x)
- E f(Xn) -> E f(X) for all bounded, continuous f
- E f(Xn) -> E f(X) for all bounded Lipschitz functions f
- lim inf E f(Xn) >= E f(X) for all nonnegative, continuous f
- lim inf P(Xn in G) >= P(X in G) for every open set G
- lim sup P(Xn in F) <= P(X in F) for every closed set F
- P(Xn in B) -> P(X in B) for all Borel sets B with P(X in boundary of B) = 0
Probably not terribly useful beyond the first three, but who knows?
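The bounded-continuous criterion is easy to poke at numerically. Here's a quick Monte Carlo sketch (my own toy example, not from the book): Xn is the standardized mean of n Exponential(1) draws, which by the CLT converges in distribution to X ~ N(0,1), and f = arctan is a bounded continuous test function.

```python
# Numerical sketch (not a proof) of: E f(Xn) -> E f(X) for bounded continuous f.
# Xn = standardized mean of n Exponential(1) draws; by the CLT, Xn -> N(0,1).
import numpy as np

rng = np.random.default_rng(0)
f = np.arctan  # bounded, continuous test function

def E_f_Xn(n, reps=20_000):
    # Monte Carlo estimate of E f(Xn).
    samples = rng.exponential(size=(reps, n))
    xn = np.sqrt(n) * (samples.mean(axis=1) - 1.0)  # Exp(1) has mean 1, sd 1
    return f(xn).mean()

# E f(X) for X ~ N(0,1); arctan is odd, so this is near 0.
E_f_X = f(rng.standard_normal(200_000)).mean()
for n in (1, 10, 100, 1000):
    print(n, E_f_Xn(n), E_f_X)
```

For small n the exponential's skew keeps E f(Xn) visibly away from E f(X); the gap shrinks as n grows.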
Continuous Mapping Theorem
Let G be continuous at every point of a set C such that P(X in C) = 1. Then if Xn -> X, also G(Xn) -> G(X). This holds for all three modes of convergence (in distribution, in probability, and almost surely).
I already knew that one, but I don't think I've recorded it here.
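A small sanity check of the distributional case (again my own illustrative example): if Xn -> X = N(0,1) in distribution, the continuous map g(x) = x^2 gives Xn^2 -> X^2 ~ chi-squared(1), so P(Xn^2 <= 1) should approach P(X^2 <= 1) = 2*Phi(1) - 1 = erf(1/sqrt(2)).

```python
# Sketch: continuous mapping under convergence in distribution.
# Xn = standardized mean of n Uniform(0,1) draws (CLT: Xn -> N(0,1));
# apply the continuous map g(x) = x**2 and compare cdf values at 1.
import math
import numpy as np

rng = np.random.default_rng(1)

def p_gXn_le_1(n, reps=50_000):
    u = rng.uniform(size=(reps, n))
    xn = np.sqrt(12 * n) * (u.mean(axis=1) - 0.5)  # Uniform(0,1): mean 1/2, var 1/12
    return np.mean(xn**2 <= 1.0)

# Exact limit: P(X**2 <= 1) for X ~ N(0,1) is erf(1/sqrt(2)) ~ 0.6827.
limit = math.erf(1 / math.sqrt(2))
print(p_gXn_le_1(200), limit)
```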
Prohorov's Theorem
If Xn converges in distribution to X, then {Xn} is uniformly tight (that is, for every e > 0 there exists M such that P(||Xn|| > M) < e for all n).
Conversely, if {Xn} is uniformly tight, there exists a subsequence that converges in distribution to some X.
Not sure how useful that is; maybe it will become obvious in the next 10 pages of the book.
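For concreteness (my own toy illustration, not from the book): for the standardized means Xn from the sketch above, Var(Xn) = 1 for every n, so Chebyshev gives P(|Xn| > M) <= 1/M^2 uniformly in n, and one M serves all n at once.

```python
# Uniform tightness via Chebyshev: with sup_n Var(Xn) = 1,
# P(|Xn| > M) <= 1 / M**2 for every n, so M = sqrt(1/e) works.
import math

def tightness_bound(eps):
    # Smallest M guaranteed by the Chebyshev bound.
    return math.sqrt(1 / eps)

print(tightness_bound(0.01))  # M = 10 gives P(|Xn| > 10) < 0.01 for all n
```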
Helly's lemma
Each given sequence Fn of cumulative distribution functions on Rk possesses a subsequence Fnj such that Fnj(x) -> F(x) at each continuity point x. Here, F may be a defective cdf (that is, its total mass may be less than 1, because probability can escape to infinity).
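A minimal example of that defect (my own, not from the book): let Xn equal a N(0,1) draw with probability 1/2 and equal n otherwise. Then Fn(x) -> 0.5 * Phi(x) pointwise, a "cdf" with total mass 1/2 because half the probability runs off to infinity.

```python
# Pointwise limit of cdfs with escaping mass: Fn(x) = 0.5*Phi(x) + 0.5*1{x >= n}.
import math

def Fn(x, n):
    phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))  # N(0,1) cdf
    return 0.5 * phi + 0.5 * (1.0 if x >= n else 0.0)

for n in (1, 10, 100):
    print(n, Fn(3.0, n))  # settles at 0.5 * Phi(3), about 0.4993, once n > 3
```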
He also calls out the Markov inequality and Slutsky's lemma, but those two are so well known I won't bother repeating them.
This book has about 450 pages. At this rate, I'll have nearly a thousand results to learn.