Wednesday, May 4, 2016

Non-parametric priors

We had a colloquium today with a professor from Purdue. It was quite good. Rather than try to summarize it myself, I'll just repost the announcement:

SPEAKER: Prof. Vinayak Rao (Department of Statistics, Purdue University)
TITLE: Non-Parametric Bayes and Random Probability Measures
ABSTRACT: With large and complex datasets becoming commonplace, traditional probabilistic models with a fixed and finite number of parameters can be too inflexible, suffering from issues of underfitting or overfitting. The Bayesian nonparametric approach is an increasingly popular alternative to parametric models, where a model with unbounded complexity is used to represent an infinitely complex reality. Of course, any finite dataset has finite complexity, and the Bayesian approach of maintaining a posterior probability over the latent structure is used to mitigate possible overfitting. In this talk I will review the philosophy of nonparametric Bayes and some methodology, including its workhorse, the Dirichlet process. I will also cover some work on the construction of dependent random probability measures (RPMs) which possess both flexible marginal distributions and rich correlation structure. If time permits, I will show how computation via Markov chain Monte Carlo is straightforward and discuss some applications of these models to clustering and topic modeling.

Afterwards, I talked with him a bit about the work I'm doing. I think I'd like to learn more about the Dirichlet process. As with the Metropolis algorithm, it's one of those things that just doesn't seem like it should work. But it does. I like counter-intuitive stuff like that.
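
To pin down for myself what a draw from a Dirichlet process even looks like, here's a quick stick-breaking sketch I put together afterwards. The concentration parameter, truncation level, and Gaussian base measure are just my own illustrative choices, not anything from the talk:

```python
import numpy as np

def truncated_dp_sample(alpha=1.0, truncation=100, seed=None):
    """Draw one (truncated) random measure G ~ DP(alpha, H) with base measure H = N(0, 1).

    G is approximated by sum_k weights[k] * delta(atoms[k]).
    """
    rng = np.random.default_rng(seed)
    # Stick-breaking: beta_k ~ Beta(1, alpha); w_k = beta_k * prod_{j<k} (1 - beta_j)
    betas = rng.beta(1.0, alpha, size=truncation)
    weights = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    # Atoms are i.i.d. draws from the base measure H
    atoms = rng.standard_normal(truncation)
    return weights, atoms

weights, atoms = truncated_dp_sample(alpha=2.0, seed=0)
print(weights.sum())  # close to 1; the leftover mass is lost to truncation
```

The counter-intuitive part, to me, is that even though the base measure is continuous, every draw G is a discrete distribution with countably many atoms, which is exactly what makes it useful as a prior for clustering.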
