Variational Predictive Information Bottleneck
Published 23 Oct 2019 in cs.LG, cs.IT, math.IT, and stat.ML (arXiv:1910.10831v1)
Abstract: In classic papers, Zellner demonstrated that Bayesian inference could be derived as the solution to an information-theoretic functional. Below we derive a generalized form of this functional as a variational lower bound of a predictive information bottleneck objective. This generalized functional encompasses most modern inference procedures and suggests novel ones.
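For orientation, the Zellner result referenced above can be stated as a variational principle: over all distributions q(θ), Bayes' rule is the unique minimizer of a free-energy-style functional. The LaTeX sketch below records that standard form together with a β-weighted generalization consistent with the abstract's description; the notation (q, θ, D, β) is assumed here, and the paper's exact functional may differ in detail.

% Zellner (1988): over all distributions q(theta), the functional
%   F[q] = -E_{q(theta)}[log p(D | theta)] + KL(q(theta) || p(theta))
% is minimized exactly by the Bayesian posterior.
\[
  \mathcal{F}[q] \;=\; -\,\mathbb{E}_{q(\theta)}\!\bigl[\log p(D \mid \theta)\bigr]
  \;+\; \mathrm{KL}\bigl(q(\theta)\,\|\,p(\theta)\bigr),
  \qquad
  \arg\min_{q} \mathcal{F}[q] \;=\; p(\theta \mid D).
\]
% A one-parameter generalization (notation assumed) reweights the KL
% regularizer; beta = 1 recovers exact Bayesian inference, while other
% values yield tempered or otherwise regularized posteriors.
\[
  \mathcal{F}_{\beta}[q] \;=\; -\,\mathbb{E}_{q(\theta)}\!\bigl[\log p(D \mid \theta)\bigr]
  \;+\; \beta\,\mathrm{KL}\bigl(q(\theta)\,\|\,p(\theta)\bigr).
\]

Restricting q to a tractable family turns the first functional into the negative evidence lower bound of standard variational inference, which is one sense in which a functional of this shape can encompass modern inference procedures.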