The calcs we perform at work tend to be additive rather than multiplicative, so I generally don't think much about numerical stability. I got bit by it fairly hard tonight while implementing my Metropolis algorithm for the final project in Bayesian Stats. The problem is to fit an arbitrary data set to a Poisson predictor model. No problem in theory, but when you compute the joint probability density, the product of all those small likelihood terms underflows to zero pretty quickly.
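To see how fast it falls apart, here's a minimal sketch in R with made-up data (a hypothetical y and rate lambda, nothing to do with the actual project data): the product of a few hundred Poisson densities drops below the smallest representable double and comes back as exactly zero.

```r
## Toy illustration only: multiplying a few hundred Poisson densities
## directly underflows to zero in double precision.
set.seed(1)
y      <- rpois(500, lambda = 10)   # hypothetical data set
lambda <- 10                        # candidate rate parameter

prod(dpois(y, lambda))              # 0 -- the joint density has underflowed
```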
Fortunately, there's a standard way around this: work with logarithms. The product of densities becomes a sum of log densities, and then you're back to adding, which is nice and stable. So that's what I did, and it seems to work OK.
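Same toy setup, evaluated on the log scale instead (again just a sketch with the hypothetical y and lambda, not my actual implementation):

```r
## Same made-up data, but summing log densities instead of multiplying densities.
set.seed(1)
y      <- rpois(500, lambda = 10)
lambda <- 10

sum(dpois(y, lambda, log = TRUE))   # a finite negative number -- no underflow
```

Since the Metropolis acceptance ratio is a ratio of densities, on the log scale it's just a difference of log densities, and you only exponentiate that (modestly sized) difference when deciding whether to accept.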
I'm a little surprised this wasn't mentioned when we were covering the problems in class. I'd expect a math major to figure it out pretty quickly, but many of the students are from other disciplines and are taking the class more for the practical methods than the underlying theory. They may have a tougher time. Well, I don't know if any of them read my blog, but I'm happy to give away that tidbit to any of them that do. The actual implementation in R, I'll keep under wraps until after the assignment is due.