Restricted Boltzmann machine (RBM)

14 October 2020


• A restricted Boltzmann machine (RBM) is a generative stochastic artificial neural network that can learn a probability distribution over its set of inputs.

• RBMs were initially invented under the name Harmonium by Paul Smolensky in 1986, and rose to prominence after Geoffrey Hinton and collaborators invented fast learning algorithms for them in the mid-2000s. RBMs have found applications in dimensionality reduction, classification, collaborative filtering, feature learning and topic modelling. They can be trained in either supervised or unsupervised ways, depending on the task.

• As their name implies, RBMs are a variant of Boltzmann machines, with the restriction that their neurons must form a bipartite graph: a pair of nodes from each of the two groups of units (commonly referred to as the "visible" and "hidden" units respectively) may have a symmetric connection between them; and there are no connections between nodes within a group. By contrast, "unrestricted" Boltzmann machines may have connections between hidden units. This restriction allows for more efficient training algorithms than are available for the general class of Boltzmann machines, in particular the gradient-based contrastive divergence algorithm.
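To make the contrastive divergence idea concrete, here is a minimal NumPy sketch of CD-1 for a binary RBM. All names, layer sizes, and hyperparameters are illustrative choices, not taken from the post.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary RBM trained with one-step contrastive divergence (CD-1).
    A sketch under illustrative assumptions, not a reference implementation."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))  # weight matrix
        self.a = np.zeros(n_visible)  # visible biases
        self.b = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        # p(h_j = 1 | v) for each hidden unit
        return sigmoid(v @ self.W + self.b)

    def visible_probs(self, h):
        # p(v_i = 1 | h) for each visible unit
        return sigmoid(h @ self.W.T + self.a)

    def cd1_step(self, v0):
        # Positive phase: hidden probabilities driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden states
        # Negative phase: one Gibbs step down to the visibles and back up.
        pv1 = self.visible_probs(h0)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = self.hidden_probs(v1)
        # Gradient approximation: <v h>_data - <v h>_model (one-step estimate).
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / batch
        self.a += self.lr * (v0 - v1).mean(axis=0)
        self.b += self.lr * (ph0 - ph1).mean(axis=0)
```

The bipartite restriction is what makes this cheap: all hidden units are conditionally independent given the visibles (and vice versa), so each phase is a single matrix multiplication rather than an iterative sampling loop.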


RBMs in deep neural networks (DNNs)

• Restricted Boltzmann machines can also be used in deep learning networks. In particular, deep belief networks can be formed by "stacking" RBMs and optionally fine-tuning the resulting deep network with gradient descent and backpropagation.
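Continuing the sketch above, greedy layer-wise pre-training stacks RBMs by training one layer at a time and feeding its hidden activations upward. The toy data, layer sizes, and epoch count below are arbitrary illustrations.

```python
# Greedy layer-wise pre-training of a deep belief network, reusing the
# RBM class from the CD-1 sketch above.
data = (rng.random((256, 784)) < 0.5).astype(float)  # toy binary data
layer_sizes = [784, 256, 64]

rbms, inputs = [], data
for n_vis, n_hid in zip(layer_sizes[:-1], layer_sizes[1:]):
    rbm = RBM(n_vis, n_hid)
    for epoch in range(10):
        rbm.cd1_step(inputs)           # train this layer with CD-1
    rbms.append(rbm)
    inputs = rbm.hidden_probs(inputs)  # activations become the next layer's data
```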


The standard type of RBM has binary-valued (Boolean/Bernoulli) hidden and visible units, and consists of a matrix of weights W = (w_ij) connecting the visible units v_i to the hidden units h_j, together with bias terms a_i for the visible units and b_j for the hidden units.

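For reference, the energy function of a binary RBM with these weights and biases is a standard textbook result (not spelled out in the original post):

```latex
% Energy of a joint configuration (v, h):
E(v, h) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i,j} v_i \, w_{ij} \, h_j

% Joint distribution, with partition function Z:
P(v, h) = \frac{1}{Z} e^{-E(v, h)}

% The bipartite structure makes the conditionals factorize:
p(h_j = 1 \mid v) = \sigma\Big(b_j + \sum_i w_{ij} v_i\Big), \qquad
p(v_i = 1 \mid h) = \sigma\Big(a_i + \sum_j w_{ij} h_j\Big)
```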
• A restricted Boltzmann machine introduces hidden variables, but since its aim is to model the marginal distribution of the visible variables, the interpretation of the model is essentially unchanged.
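Concretely, the marginal distribution of the visible variables referred to above is obtained by summing the joint distribution over all hidden configurations:

```latex
P(v) = \sum_h P(v, h) = \frac{1}{Z} \sum_h e^{-E(v, h)}
```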

• After one RBM has been trained, the activations of its hidden units (the values those units take) are treated as training data for a higher-level RBM. Stacking RBMs in this way makes it possible to efficiently learn multiple layers of hidden units, and is one of the standard approaches to deep learning: each newly added layer improves the generative model as a whole. There are also extended Boltzmann machine variants whose units can take real values as well as binary ones, typically described by replacing the Bernoulli units of the ordinary Boltzmann machine with normally distributed (Gaussian) units.
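For the real-valued extension mentioned above, one common variant is the Gaussian-Bernoulli RBM; the unit-variance form shown below is one standard convention, not taken from the post:

```latex
% Gaussian-Bernoulli energy (unit-variance visible units):
E(v, h) = \sum_i \frac{(v_i - a_i)^2}{2} - \sum_j b_j h_j - \sum_{i,j} v_i \, w_{ij} \, h_j

% The visible conditional becomes Gaussian instead of Bernoulli:
p(v_i \mid h) = \mathcal{N}\Big(a_i + \sum_j w_{ij} h_j,\; 1\Big)
```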

• One practical application of RBMs is improving the performance of speech recognition software.
