
Closed-Form Expression for Divergence of Gaussian Mixture Models

This report presents an efficient approach to measuring the divergence between two probability density functions (pdfs), each of which is a mixture of Gaussians (MoG). Unlike the Kullback-Leibler divergence (DKL), the Cauchy-Schwarz (CS) pdf divergence measure (DCS) admits an analytic closed-form expression for MoGs. This property makes fast and efficient computation possible, which is highly desirable in real-world applications where the dimensionality of the data/features is high. We show that DCS follows trends similar to DKL but can be computed much faster, especially when the dimensionality is high. Moreover, the proposed method significantly outperforms DKL in classifying real-world 2D and 3D objects based on distances alone. Full paper [pdf]. MATLAB code is available here.
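
To make the closed-form property concrete, below is a minimal sketch (in Python/NumPy rather than the authors' MATLAB code) of how DCS can be evaluated analytically for two Gaussian mixtures. It uses only the standard definition of the CS divergence, D_CS(p, q) = -log( ∫p q dx / sqrt(∫p² dx ∫q² dx) ), together with the well-known identity that the integral of a product of two Gaussian densities is itself a Gaussian evaluation. The function names and the toy mixtures at the end are illustrative assumptions, not code or data from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal


def gaussian_cross_term(mu1, cov1, mu2, cov2):
    """Closed-form integral of a product of two Gaussian densities:
    int N(x; mu1, cov1) N(x; mu2, cov2) dx = N(mu1; mu2, cov1 + cov2)."""
    return multivariate_normal.pdf(mu1, mean=mu2, cov=cov1 + cov2)


def mog_overlap(weights_a, means_a, covs_a, weights_b, means_b, covs_b):
    """int p(x) q(x) dx for two Gaussian mixtures: a double sum of
    weighted Gaussian cross terms, so no numerical integration is needed."""
    total = 0.0
    for wa, ma, ca in zip(weights_a, means_a, covs_a):
        for wb, mb, cb in zip(weights_b, means_b, covs_b):
            total += wa * wb * gaussian_cross_term(ma, ca, mb, cb)
    return total


def cauchy_schwarz_divergence(mog_p, mog_q):
    """D_CS(p, q) = -log( int pq / sqrt(int p^2 * int q^2) ),
    which is zero if and only if the two mixtures are identical."""
    cross = mog_overlap(*mog_p, *mog_q)
    self_p = mog_overlap(*mog_p, *mog_p)
    self_q = mog_overlap(*mog_q, *mog_q)
    return -np.log(cross / np.sqrt(self_p * self_q))


# Illustrative example: two 2-component mixtures in 3D (random parameters).
rng = np.random.default_rng(0)
d = 3
mog_p = ([0.6, 0.4],
         [rng.normal(size=d), rng.normal(size=d)],
         [np.eye(d), 2.0 * np.eye(d)])
mog_q = ([0.5, 0.5],
         [rng.normal(size=d), rng.normal(size=d)],
         [np.eye(d), 0.5 * np.eye(d)])
print(cauchy_schwarz_divergence(mog_p, mog_q))
```

Because every integral reduces to evaluating a Gaussian at a point, the cost grows with the number of mixture components rather than with the dimensionality of the integration domain, which is why the closed form scales so much better than Monte Carlo estimates of DKL in high dimensions.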
