Probabilistic models, compressible interactions, and neural coding

Published 28 Dec 2021 in q-bio.NC and cond-mat.stat-mech | arXiv:2112.14334v2

Abstract: In physics we often use very simple models to describe systems with many degrees of freedom, but it is not clear why or how this success can be transferred to the more complex biological context. We consider models for the joint distribution of many variables, as with the combinations of spiking and silence in large networks of neurons. In this probabilistic framework, we argue that simple models are possible if the mutual information between two halves of the system is consistently sub-extensive, and if this shared information is compressible. These conditions are not met generically, but they are met by real-world data such as natural images and the activity in a population of retinal output neurons. We introduce compression strategies that combine the information bottleneck with an iteration scheme inspired by the renormalization group, and find that the number of parameters needed to describe the distribution of joint activity scales with the square of the number of neurons, even though the interactions are not well approximated as pairwise. Our results also show that this shared information is essentially equal to the information that individual neurons carry about natural visual inputs, which has surprising implications for the neural code.
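
The abstract's first condition, sub-extensive mutual information between two halves of the system, can be made concrete with a toy calculation. The sketch below is an illustration only, not the authors' pipeline: the shared-drive spiking model, the population sizes, and the plug-in estimator are all assumptions chosen for the demo. It draws binary "spike" patterns whose neurons share a weak common input and estimates the information one half of the population carries about the other; in this toy the shared information saturates near one bit (the entropy of the common drive) instead of growing with N.

```python
import numpy as np
from collections import Counter

def mutual_information(a, b):
    """Plug-in estimate of I(A;B) in bits; each row of a and b is one
    sample of a discrete (here binary) activity pattern."""
    ja = [tuple(row) for row in a]
    jb = [tuple(row) for row in b]
    n = len(ja)
    pa, pb, pab = Counter(ja), Counter(jb), Counter(zip(ja, jb))
    # I(A;B) = sum_xy p(x,y) * log2[ p(x,y) / (p(x) p(y)) ]
    return sum(
        (c / n) * np.log2(c * n / (pa[x] * pb[y]))
        for (x, y), c in pab.items()
    )

# Toy surrogate population: N binary "neurons", each spiking with
# higher probability when a shared binary drive is on. The two halves
# are conditionally independent given the drive, so they can share at
# most its one bit of entropy; the shared information is sub-extensive
# by construction, and the printed values should saturate as N grows.
rng = np.random.default_rng(0)
n_samples = 200_000
for N in (4, 8, 12):
    drive = rng.random((n_samples, 1)) < 0.5
    spikes = rng.random((n_samples, N)) < (0.1 + 0.3 * drive)
    i_half = mutual_information(spikes[:, : N // 2], spikes[:, N // 2 :])
    print(f"N = {N:2d}   I(half; half) ~ {i_half:.3f} bits")
```

Two caveats keep this honest: the plug-in estimator is biased upward when the sample count is not large compared to the number of joint patterns, and the paper's actual compression scheme is richer, combining the information bottleneck (minimizing I(Z;X) - beta * I(Z;Y) over encodings p(z|x)) with an iteration inspired by the renormalization group. The toy above only checks the sub-extensivity premise.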
