
Boltzmann Machines for Structured or Sequential Outputs

(9th-Dec-2020)


• In the structured output scenario, we wish to train a model that can map from some input x to some output y, where the different entries of y are related to each other and must obey some constraints. For example, in the speech synthesis task, y is a waveform, and the entire waveform must sound like a coherent utterance. A natural way to represent the relationships between the entries of y is to use a probability distribution p(y | x). Boltzmann machines, extended to model conditional distributions, can supply this probabilistic model.

• The same tool of conditional modeling with a Boltzmann machine can be used not just for structured output tasks, but also for sequence modeling. In the latter case, rather than mapping an input x to an output y, the model must estimate a probability distribution over a sequence of variables, p(x^(1), ..., x^(τ)). Conditional Boltzmann machines can represent factors of the form p(x^(t) | x^(1), ..., x^(t-1)) in order to accomplish this task.

• An important sequence modeling task for the video game and film industry is modeling sequences of joint angles of the skeletons used to render 3-D characters. These sequences are often collected using motion capture systems to record the movements of actors. A probabilistic model of a character's movement allows the generation of new, previously unseen, but realistic animations. To solve this sequence modeling task, Taylor et al. (2007) introduced a conditional RBM modeling p(x^(t) | x^(t-1), ..., x^(t-m)) for small m. The model is an RBM over x^(t) whose bias parameters are a linear function of the preceding m values of x. When we condition on different values of x^(t-1) and earlier variables, we get a new RBM over x. The weights in the RBM over x never change, but by conditioning on different past values, we can change the probability of different hidden units in the RBM being active. By activating and deactivating different subsets of hidden units, we can make large changes to the probability distribution induced on x. Other variants of conditional RBM (Mnih et al., 2011) and other variants of sequence modeling using conditional RBMs are possible (Taylor and Hinton, 2009; Sutskever et al., 2009; Boulanger-Lewandowski et al., 2012). A short code sketch of this idea follows below.
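To make the conditional RBM idea concrete, here is a minimal Python/NumPy sketch of generation time: the RBM weight matrix W is fixed, while the visible and hidden biases are computed as a linear function of the m previous frames, and a short Gibbs chain samples the next frame. All names, dimensions, the Gaussian-visible assumption, and the sampling routine are illustrative assumptions, not the exact model or training procedure of Taylor et al. (2007).

import numpy as np

# --- Illustrative sketch of a conditional RBM (assumed shapes and parameters) ---
rng = np.random.default_rng(0)

n_visible = 30   # e.g. joint angles in one motion-capture frame x^(t)
n_hidden = 100
m = 3            # number of past frames the biases are conditioned on

# Static RBM weights: these never change with the history.
W = 0.01 * rng.standard_normal((n_visible, n_hidden))

# Autoregressive weights: the history x^(t-1), ..., x^(t-m) shifts the biases linearly.
A = 0.01 * rng.standard_normal((m * n_visible, n_visible))  # history -> visible bias
B = 0.01 * rng.standard_normal((m * n_visible, n_hidden))   # history -> hidden bias
a0 = np.zeros(n_visible)   # static visible bias
b0 = np.zeros(n_hidden)    # static hidden bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conditional_biases(history):
    """history: concatenation of the m previous frames, shape (m * n_visible,)."""
    a = a0 + history @ A   # dynamic visible bias
    b = b0 + history @ B   # dynamic hidden bias
    return a, b

def sample_next_frame(history, n_gibbs=50):
    """Generate x^(t) by Gibbs sampling in the RBM defined by this history."""
    a, b = conditional_biases(history)
    v = a.copy()                                  # start the chain at the mean
    for _ in range(n_gibbs):
        h_prob = sigmoid(v @ W + b)               # p(h = 1 | v) for binary hidden units
        h = (rng.random(n_hidden) < h_prob).astype(float)
        v = a + h @ W.T                           # mean of Gaussian visibles given h
        v = v + rng.standard_normal(n_visible)    # unit-variance noise (assumed)
    return v

# Roll the model forward: each generated frame becomes part of the next history.
frames = [rng.standard_normal(n_visible) for _ in range(m)]
for _ in range(10):
    history = np.concatenate(frames[-m:])
    frames.append(sample_next_frame(history))

The key point the sketch illustrates is that conditioning only moves the biases a and b; the weight matrix W that defines the interactions between visible and hidden units is shared across all time steps.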
