EM vs. mean-field variational approximation
When deriving mean-field (MF) approximate inference for many problems, I have observed that MF yields update equations very similar to those of EM. The similarities appear not only in the results but also in the derivation process. This afternoon my adviser was out of the office ^_^, so I had some free time to figure out what's going on between EM and MF. I decided to use a simple problem, the Gaussian Mixture Model (GMM), as a running example. Here is the writeup [link]
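The write-up itself is behind the link, but the GMM setting can be sketched as a plain EM loop. The point of comparison is the E-step: the responsibilities computed below are exactly the quantities that reappear as the variational factors q(z_n) in a mean-field derivation for the same model. This is my own illustrative sketch of standard EM for a 1-D GMM, not the write-up's code; all names here are mine.

```python
import numpy as np

def em_gmm(x, K=2, iters=50, seed=0):
    """Minimal EM for a 1-D Gaussian mixture (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    pi = np.full(K, 1.0 / K)                 # mixing weights
    mu = rng.choice(x, K, replace=False)     # component means, init from data
    var = np.full(K, np.var(x))              # component variances
    for _ in range(iters):
        # E-step: responsibilities r[i, k] = p(z_i = k | x_i).
        # In the mean-field derivation, the optimal factor q(z_i)
        # takes this same form.
        logp = (np.log(pi)
                - 0.5 * np.log(2 * np.pi * var)
                - 0.5 * (x[:, None] - mu) ** 2 / var)
        logp -= logp.max(axis=1, keepdims=True)   # stabilize before exp
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from expected counts
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r * x[:, None]).sum(axis=0) / Nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    return pi, mu, var

# Two well-separated components around -3 and +3
x = np.concatenate([np.random.default_rng(1).normal(-3, 1, 200),
                    np.random.default_rng(2).normal(3, 1, 200)])
pi, mu, var = em_gmm(x)
```

Running this recovers means near -3 and +3; the EM E-step and the mean-field update for q(z) compute the same responsibilities, which is the structural similarity discussed above.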
Deep Belief Networks
A reading list for deep belief networks:
http://deeplearning.net/tutorial/DBN.html
http://videolectures.net/icml09_bengio_lecun_tldar/
http://videolectures.net/jul09_hinton_deeplearn/
http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html
http://www.cs.toronto.edu/~hinton/deeprefs.html
http://videolectures.net/icml09_ng_udssrs/