
Variational approach to unsupervised learning

Published 24 Apr 2019 in cond-mat.dis-nn and cs.LG (arXiv:1904.10869v1)

Abstract: Deep belief networks are used extensively for unsupervised stochastic learning on large datasets. Compared to other deep learning approaches, their layer-by-layer learning makes them highly scalable. Unfortunately, the principles by which they achieve efficient learning are not well understood. Numerous attempts have been made to explain their efficiency and applicability to a wide class of learning problems in terms of principles drawn from cognitive psychology, statistics, information theory, and more recently physics, but quite often these imported principles lack a strong scientific foundation. Here we demonstrate how one can arrive at convolutional deep belief networks as a potential solution to unsupervised learning problems without making assumptions about the underlying framework. To do this, we exploit the notion of symmetry that is fundamental in machine learning, physics, and other fields, utilizing the particular form of the functional renormalization group in physics.
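The "layer-by-layer learning" the abstract credits for scalability is the standard greedy pretraining scheme for deep belief networks: each restricted Boltzmann machine (RBM) is trained on the hidden activations of the layer below, then frozen. The sketch below is a minimal NumPy illustration of that generic scheme (one-step contrastive divergence, fully connected layers), not the convolutional architecture or the renormalization-group construction developed in the paper; all class and function names here are hypothetical.

```python
import numpy as np

class RBM:
    """Restricted Boltzmann machine trained with one-step
    contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, rng):
        self.rng = rng
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def hidden_probs(self, v):
        return self._sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return self._sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0, lr=0.1):
        """One CD-1 parameter update on a batch of visible vectors."""
        ph0 = self.hidden_probs(v0)
        h0 = (self.rng.random(ph0.shape) < ph0).astype(float)  # sample hidden units
        v1 = self.visible_probs(h0)                            # reconstruction
        ph1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (ph0 - ph1).mean(axis=0)

def pretrain_dbn(data, layer_sizes, epochs=5, seed=0):
    """Greedy layer-by-layer pretraining: train each RBM on the
    hidden activations of the layer below, then freeze it."""
    rng = np.random.default_rng(seed)
    rbms, x = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(x.shape[1], n_hidden, rng)
        for _ in range(epochs):
            rbm.cd1_step(x)
        rbms.append(rbm)
        x = rbm.hidden_probs(x)  # features fed to the next layer
    return rbms, x
```

Because each layer is trained independently on fixed inputs, the cost grows linearly with depth, which is the scalability property the abstract refers to.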

Authors (1)
Citations (4)
