Previously I wrote about Gibbs sampling in pairwise Markov Random Fields, where we generally have to update one variable at a time to obtain a full Gibbs sample. The Restricted Boltzmann Machine is a special MRF whose additional independence assumptions allow efficient conditional inference and block Gibbs sampling; this makes approximate maximum-likelihood learning practical with sampling-based methods such as Contrastive Divergence. I demonstrate this in a Python notebook.
Restricted Boltzmann Machine and Contrastive Divergence
Jul 21 2017
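As a quick illustration of the two ideas mentioned above (block Gibbs sampling and a Contrastive Divergence update), here is a minimal NumPy sketch for a binary RBM. The shapes, learning rate, and variable names (W, b, c) are illustrative assumptions for this sketch, not code from the notebook.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical RBM parameters: W (visible-by-hidden weights), b (visible bias), c (hidden bias).
n_visible, n_hidden = 6, 4
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)
c = np.zeros(n_hidden)

def sample_h_given_v(v):
    """Block-sample all hidden units at once: they are conditionally independent given v."""
    p = sigmoid(v @ W + c)
    return p, (rng.random(p.shape) < p).astype(float)

def sample_v_given_h(h):
    """Block-sample all visible units at once: they are conditionally independent given h."""
    p = sigmoid(h @ W.T + b)
    return p, (rng.random(p.shape) < p).astype(float)

def cd1_update(v0, lr=0.1):
    """One Contrastive Divergence (CD-1) step: a single block Gibbs sweep started at the data."""
    global W, b, c
    ph0, h0 = sample_h_given_v(v0)   # positive phase
    pv1, v1 = sample_v_given_h(h0)   # one Gibbs step away from the data
    ph1, _ = sample_h_given_v(v1)    # negative phase (use probabilities)
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b += lr * (v0 - v1)
    c += lr * (ph0 - ph1)

# Toy usage: one update on a random binary visible vector.
v0 = (rng.random(n_visible) < 0.5).astype(float)
cd1_update(v0)
```

Because every hidden unit is conditionally independent of the others given the visible layer (and vice versa), each Gibbs step updates an entire layer in one vectorized operation rather than one variable at a time.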