It's late, so I'm just going to post a few named results without a lot of commentary. Actually, most are definitions, but they set up two very important results:
Convergence in distribution (definition; also known as weak convergence or convergence in law): The CDFs of a sequence of random variables converge to a single limiting CDF at every value at which the limiting CDF is continuous. (Note it's the distribution functions, not the densities, that have to converge; hence the name.)
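In symbols (the arrow notation is my addition, though it's standard), writing F_{X_n} and F_X for the CDFs:

X_n \xrightarrow{d} X \iff \lim_{n \to \infty} F_{X_n}(x) = F_X(x) \text{ at every } x \text{ where } F_X \text{ is continuous}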
Convergence in probability (definition): A sequence of random variables {Xn} converges in probability to a random variable X if, for every ε > 0, P(|Xn - X| < ε) converges to 1 as n → ∞.
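Equivalently, in LaTeX notation:

X_n \xrightarrow{p} X \iff \lim_{n \to \infty} P(|X_n - X| \ge \varepsilon) = 0 \text{ for every } \varepsilon > 0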
Weak Law of Large Numbers (first important result): The sample mean of a sequence of iid random variables converges in probability to the common mean μ, provided that mean is finite.
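Spelled out (the \bar{X}_n shorthand for the sample mean is mine):

\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{p} \mu \quad \text{whenever } E(X_i) = \mu \text{ is finite}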
Almost sure convergence (another definition): Stronger than convergence in probability; here we move the limit inside the probability. That is, it's not just that the probability that the sequence and the limit are close goes to one, but that the set of sample points on which the sequence actually converges has probability 1. Or, stating the complement, the portion of the sample space where they don't converge is a set of measure zero. You have to construct some goofy cases to show these aren't saying the same thing, but failing to deal with such counterexamples is what got mathematics into so much trouble in the 18th and 19th centuries, so modern mathematicians are rightly careful to consider them.
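Formally, moving the limit inside the probability:

X_n \xrightarrow{a.s.} X \iff P\left( \lim_{n \to \infty} X_n = X \right) = 1, \quad \text{i.e., } P\left( \{ \omega : X_n(\omega) \to X(\omega) \} \right) = 1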
Strong Law of Large Numbers (second, and even more important result): A sequence of iid random variables {Xi} with E(Xi) = μ (finite) and Var(Xi) = σ² < ∞ has a sample mean that converges almost surely to μ. (The finite-variance assumption isn't strictly needed; Kolmogorov's version only requires a finite mean, but assuming a variance makes the proof far easier.)
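In the notation above:

P\left( \lim_{n \to \infty} \bar{X}_n = \mu \right) = 1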
Again, in the vast majority of cases, the two results are a distinction without a difference, but there are cases where the second is strictly stronger. The first simply says that the probability of the sample mean being off by more than some arbitrarily small amount goes to zero. The second says that, plus that the set of sample points on which the convergence fails is of no significance: it has measure zero.
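To make the gap concrete, here's a minimal Python sketch of the textbook counterexample (my own illustration, assuming NumPy; the example isn't from the laws above). Take independent X_n ~ Bernoulli(1/n): P(X_n = 1) = 1/n → 0, so X_n → 0 in probability, but Σ 1/n diverges, so by the second Borel-Cantelli lemma X_n = 1 infinitely often on almost every sample path, and almost sure convergence fails.

import numpy as np

rng = np.random.default_rng(0)
n_max = 100_000   # length of each simulated sequence
n_paths = 200     # number of independent sample paths

# X_n ~ Bernoulli(1/n), independent. Marginally, P(X_n = 1) = 1/n -> 0,
# which is convergence in probability to 0. But sum(1/n) diverges, so
# Borel-Cantelli II says X_n = 1 infinitely often: no path ever settles.
hits = 0
for _ in range(n_paths):
    x = rng.random(n_max) < 1.0 / np.arange(1, n_max + 1)
    if x[n_max // 2:].any():   # does a 1 still show up in the second half?
        hits += 1

# P(no 1 anywhere in the second half) = prod of (1 - 1/k) over that range,
# which telescopes to exactly 1/2 no matter how large n_max is.
print(f"{hits} of {n_paths} paths still hit 1 in the second half")

Run it and roughly half the paths hit a 1 in the second half however far out you push n_max, even though any individual P(X_n = 1) is vanishingly small by then: convergence in probability holds, almost sure convergence doesn't.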