Thursday, September 22, 2016

Moment Generating Functions

I'll be honest, I hate this topic. I can't tell you why. It always seems weird and contrived. But, it's also pretty foundational, so it needs to be noted.

The nth moment of a random variable X is E(X^n). The nth central moment is the same thing, except that you first center the variable at its mean, that is, E[(X - E(X))^n].
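
For concreteness, here's a quick numerical sketch (my own toy example, nothing canonical) of estimating raw and central moments from a sample, using numpy and an exponential draw:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=100_000)  # exponential with mean 2

    n = 2
    raw_moment = np.mean(x**n)                   # estimates E(X^n); here E(X^2) = 8
    central_moment = np.mean((x - x.mean())**n)  # estimates E[(X - E(X))^n]; here Var(X) = 4

    print(raw_moment, central_moment)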

The moment generating function (mgf) is the function that, wait for it... generates the moments. You do this through successive differentiation, which immediately suggests that this is some function involving e. Indeed it is: M_X(t) = E(e^{tX}). The mgf is only considered to exist when this expectation is finite for all t in some neighborhood of 0.
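
As a worked example (mine, not part of the original definition): for X ~ Exponential(λ), with pdf λe^{-λx} on x ≥ 0,

    M_X(t) = E(e^{tX}) = ∫_0^∞ e^{tx} λe^{-λx} dx = λ/(λ - t)   for t < λ,

so the mgf exists on the neighborhood (-∞, λ) of zero.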

To get the actual moments, you differentiate the mgf and evaluate at zero. In the continuous case, you need to be able to move the differentiation inside the integral, which is not always valid, but is for pretty much any distribution a person would actually use. If your pdf is some sort of fractal function that can't be Riemann integrated, the mgf probably isn't well defined in a neighborhood of zero anyway.

Since each differentiation of e^{tX} yields another power of X, we get

M_X^(n)(t) = E(X^n e^{tX}), so M_X^(n)(0) = E(X^n)
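
Continuing the exponential example from above (again, my own illustration), here's a small sympy sketch that recovers the first two moments by differentiating the mgf at zero:

    import sympy as sp

    t = sp.symbols('t')
    lam = sp.Symbol('lambda', positive=True)

    # mgf of an Exponential(lambda) random variable, valid for t < lambda
    M = lam / (lam - t)

    # nth moment = nth derivative of the mgf, evaluated at t = 0
    first_moment = sp.diff(M, t, 1).subs(t, 0)   # simplifies to 1/lambda
    second_moment = sp.diff(M, t, 2).subs(t, 0)  # simplifies to 2/lambda**2

    print(sp.simplify(first_moment), sp.simplify(second_moment))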

OK, fine, and there may be a few odd cases where computing the mgf and then differentiating is easier than just computing the moment directly (but not many). The real power of this exercise is that the mgf uniquely identifies the distribution: if two random variables have mgfs that exist and agree on a neighborhood of zero, they have the same distribution. So, if you don't know the distribution but can estimate moments from a sample, you may be able to take a decent stab at the underlying distribution.
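
Put differently, this is basically the classical method of moments. Here's a toy numpy sketch (my own, with made-up parameters): pretend the data are gamma distributed with unknown shape k and scale θ, and match the first two sample moments using E(X) = kθ and Var(X) = kθ^2:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.gamma(shape=3.0, scale=2.0, size=100_000)  # the "unknown" data-generating process

    # Match moments: theta = Var(X)/E(X), then k = E(X)/theta
    mean, var = x.mean(), x.var()
    theta_hat = var / mean
    k_hat = mean / theta_hat

    print(k_hat, theta_hat)  # should land near the true (3, 2)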

Of course, we Bayesians would just put a non-parametric prior on there and let the likelihood point the way, but I suppose some would call that cheating.
