Archive for January, 2011

Bayesian Inference

January 27, 2011

Recently I have been looking for a good, short resource on the fundamentals of Bayesian inference. There are tons of good materials online; unfortunately, some are too long and some are too theoretical. Come on, I want something intuitive, sensible, and readable within 30 minutes! After a couple of hours wandering the internet and skimming some good books, I picked the three books that, in my opinion, contributed the most to my understanding within half an hour.

  1. Bayesian Data Analysis by Andrew Gelman et al. — If you understand the concept of the Bayesian framework and just want to review Bayesian inference, the first pages of Chapter 2 are sufficient (a quick refresher follows this list). Good examples can be found in Section 2.6.
  2. Bayesian Core: A Practical Approach to Computational Bayesian Statistics by Jean-Michel Marin and Christian P. Robert — The book emphasizes the important concepts and has plenty of examples, which are necessary for understanding the basic ideas.
  3. Bayesian Field Theory by Jörg C. Lemm — This book offers a great perspective on the Bayesian framework in relation to learning, information theory, and physics. There are a lot of figures and graphical models, making the book easy to follow.
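
For a 30-second refresher before diving in: Bayesian inference reduces to Bayes' rule, and the binomial model with a beta prior (the conjugate example treated in the opening pages of Gelman's Chapter 2) shows it in closed form. In LaTeX (my notation, not any one book's):

    p(\theta \mid y) \propto p(y \mid \theta)\, p(\theta),
    \qquad
    \theta \sim \mathrm{Beta}(\alpha, \beta),\;
    y \mid \theta \sim \mathrm{Binomial}(n, \theta)
    \;\Longrightarrow\;
    \theta \mid y \sim \mathrm{Beta}(\alpha + y,\, \beta + n - y).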

GMM-BIC: A simple way to determine the number of Gaussian components

January 4, 2011

Gaussian Mixture Models (GMMs) are very popular across a broad range of applications because of their performance and simplicity. However, how to determine the number of Gaussian components in a GMM is still an open problem. One simple solution is to use the Bayesian Information Criterion (BIC) to penalize the complexity of the GMM. That is, the GMM-BIC cost function is composed of two parts: 1) the log-likelihood and 2) a complexity penalty term. Consequently, the final GMM is a model that fits the data well without "overfitting" in the BIC sense. There are tons of tutorials on the internet; here I would like to share my MATLAB demo code.
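
As a minimal sketch of the idea (not the demo code linked above; it assumes the Statistics Toolbox's fitgmdist, and the variable names are mine): fit GMMs with increasing K and keep the model with the lowest BIC, where BIC = -2 log L + p log n:

    % GMM-BIC sketch: pick the number of components by minimum BIC.
    X = randn(500, 2);                      % toy data: 500 points in 2-D
    maxK = 8;
    bic = zeros(1, maxK);
    models = cell(1, maxK);
    for k = 1:maxK
        % Replicates restarts EM several times; RegularizationValue
        % guards against singular covariance estimates.
        models{k} = fitgmdist(X, k, 'Replicates', 5, ...
                              'RegularizationValue', 1e-6);
        bic(k) = models{k}.BIC;             % BIC = -2*logL + p*log(n)
    end
    [~, bestK] = min(bic);                  % lower BIC is better
    fprintf('BIC selects %d component(s)\n', bestK);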

Note that the Variational Bayes GMM (VBGMM) can also solve this problem in a different flavor and is worth studying and comparing with GMM-BIC. I also provide some details of the derivation of VBGMM here.

Closed-Form Expression for Divergence of Gaussian Mixture Models

January 3, 2011

This report presents an efficient approach to calculating the difference between two probability density functions (pdfs), each of which is a mixture of Gaussians (MoG). Unlike the Kullback-Leibler divergence (DKL), the Cauchy-Schwarz (CS) pdf divergence measure (DCS) admits an analytic closed-form expression for MoGs. This property makes fast and efficient calculation possible, which is highly desirable in real-world applications where the dimensionality of the data/features is very high. We show that DCS follows trends similar to DKL but can be computed much faster, especially when the dimensionality is high. Moreover, the proposed method significantly outperforms DKL in classifying real-world 2D and 3D objects based on distances alone. Full paper [pdf]. MATLAB code is available here.
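
For intuition, the closed form follows from D_CS(p, q) = -log( int p q dx / sqrt( int p^2 dx * int q^2 dx ) ) together with the Gaussian identity int N(x; m1, S1) N(x; m2, S2) dx = N(m1; m2, S1 + S2), so every integral reduces to a double sum over component pairs. Here is a minimal MATLAB sketch (my own notation and function names, not the released code; it assumes the Statistics Toolbox for mvnpdf):

    function d = dcs_gmm(w1, mu1, S1, w2, mu2, S2)
    % Cauchy-Schwarz divergence between two GMMs.
    % wX: 1xK weights, muX: dxK means, SX: dxdxK covariances.
        zpq = cross_term(w1, mu1, S1, w2, mu2, S2);   % int p*q dx
        zpp = cross_term(w1, mu1, S1, w1, mu1, S1);   % int p^2 dx
        zqq = cross_term(w2, mu2, S2, w2, mu2, S2);   % int q^2 dx
        d = -log(zpq) + 0.5*log(zpp) + 0.5*log(zqq);
    end

    function z = cross_term(wa, mua, Sa, wb, mub, Sb)
    % sum_{i,j} wa(i)*wb(j) * N(mua_i; mub_j, Sa_i + Sb_j)
        z = 0;
        for i = 1:numel(wa)
            for j = 1:numel(wb)
                z = z + wa(i) * wb(j) * ...
                    mvnpdf(mua(:,i)', mub(:,j)', Sa(:,:,i) + Sb(:,:,j));
            end
        end
    end

As a quick sanity check, calling dcs_gmm with the same mixture for both arguments returns 0, since all three cross terms coincide.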