Saturday, October 22, 2016

Variance 3.0 (continued)

I was not completely joking at the end of yesterday's post. We were having company over in the evening and I was supposed to be helping Kate get things ready. I incurred a bit (but not a lot) of wrath for finishing as much as I did under those circumstances.

Anyway, let's take a look at that integral.



That's a polynomial times an exponential, so we should be able to grind it out by repeated integration by parts. Before we do that, though, let's back it up one step.
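
(As an aside, here's what one piece of that grind would look like on a generic term, a bare λ² against the exponential with s > 0 assumed; this is just an illustration, not the actual integrand, but it shows why a squared term costs exactly two rounds of parts.)

\[
\int_0^\infty \lambda^2 e^{-s\lambda}\,d\lambda
= \Big[{-\tfrac{\lambda^2}{s}}e^{-s\lambda}\Big]_0^\infty
+ \frac{2}{s}\int_0^\infty \lambda e^{-s\lambda}\,d\lambda
= \frac{2}{s}\cdot\frac{1}{s}\int_0^\infty e^{-s\lambda}\,d\lambda
= \frac{2}{s^3}.
\]

Each pass of parts knocks the power of λ down by one, so two passes clear the squared term.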



What's the integrand? It's a function of a random variable times the density of the variable. The integral is over the entire domain of the random variable. That makes the integral the expectation of the function. Let's see where that takes us:
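
In symbols, with g standing in for whatever function of λ the integrand carries (g is just a placeholder name) and the Gamma(h, 1/s) density written out (shape h, scale 1/s, so rate s):

\[
\int_0^\infty g(\lambda)\,\frac{s^h}{\Gamma(h)}\,\lambda^{h-1}e^{-s\lambda}\,d\lambda
\;=\; E\big[g(\lambda)\big],
\qquad \lambda \sim \mathrm{Gamma}(h, 1/s).
\]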



Now, since λ ~ Gamma(h,1/s), β = 1/λ ~ InvGamma(h,s). So,
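
At a minimum, the standard InvGamma(h, s) moments (shape h, scale s) are in play here, and they come with existence conditions attached:

\[
E[\beta] = \frac{s}{h-1} \quad (h > 1),
\qquad
E[\beta^2] = \frac{s^2}{(h-1)(h-2)} \quad (h > 2).
\]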



The constraints on h are not to be taken lightly. The variance of the inverse gamma is not defined unless the shape parameter exceeds two, which here means having at least three observations (if you actually crank through the integration by parts from the original equation, you'll find the same limitation, since you have to repeat the parts twice to get rid of the squared term in the polynomial). Those observations will have to be "provided" in the prior. While that means we will not be able to use a non-informative prior, that's actually a good thing: we really want to bias the variance high until we have significant data to the contrary. Anyway, we now have everything we need to compute the variance. It's a bit messy, but manageable.
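
To make the constraint concrete, the inverse-gamma variance follows from the moments above by the usual identity:

\[
\mathrm{Var}(\beta) = E[\beta^2] - E[\beta]^2
= \frac{s^2}{(h-1)(h-2)} - \frac{s^2}{(h-1)^2}
= \frac{s^2}{(h-1)^2(h-2)},
\qquad h > 2,
\]

and it blows up as h drops toward 2, which is exactly the bias-the-variance-high behavior we want out of the prior until the data say otherwise.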
