Structure of activity in multiregion recurrent neural networks
Abstract: Neural circuits comprise multiple interconnected regions, each with complex dynamics. The interplay between local and global activity is thought to underlie computational flexibility, yet the structure of multiregion neural activity and its origins in synaptic connectivity remain poorly understood. We investigate recurrent neural networks with multiple regions, each containing neurons with random and structured connections. Inspired by experimental evidence of communication subspaces, we use low-rank connectivity between regions to enable selective activity routing. These networks exhibit high-dimensional fluctuations within regions and low-dimensional signal transmission between them. Using dynamical mean-field theory, with cross-region currents as order parameters, we show that regions act as both generators and transmitters of activity -- roles that are often in tension. Taming within-region activity can be crucial for effective signal routing. Unlike previous models that suppressed neural activity to control signal flow, our model achieves routing by exciting different high-dimensional activity patterns through connectivity structure and nonlinear dynamics. Our analysis offers insights into multiregion neural data and trained neural networks.
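The architecture the abstract describes can be illustrated with a minimal simulation sketch: two regions with dense random within-region connectivity and a rank-1 "communication subspace" between them, integrated with tanh rate dynamics and Euler steps. All parameter values, the choice of two regions, and rank-1 (rather than general low-rank) coupling are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200           # neurons per region (illustrative)
g = 1.5           # within-region random gain (chaotic regime for g > 1)
dt, T = 0.1, 500  # Euler step size and number of steps

# Within-region connectivity: dense i.i.d. Gaussian blocks, variance g^2/N
J_aa = g * rng.standard_normal((N, N)) / np.sqrt(N)
J_bb = g * rng.standard_normal((N, N)) / np.sqrt(N)

# Between-region connectivity: rank-1, so only one activity direction is transmitted
u_ab, v_ab = rng.standard_normal(N), rng.standard_normal(N)
u_ba, v_ba = rng.standard_normal(N), rng.standard_normal(N)
J_ab = np.outer(u_ab, v_ab) / N   # region b -> region a
J_ba = np.outer(u_ba, v_ba) / N   # region a -> region b

x_a = 0.1 * rng.standard_normal(N)
x_b = 0.1 * rng.standard_normal(N)
cross_current = []  # scalar current carried by the rank-1 link into region a

for _ in range(T):
    r_a, r_b = np.tanh(x_a), np.tanh(x_b)  # firing rates
    # Rate dynamics: dx/dt = -x + (recurrent input) + (cross-region input)
    x_a = x_a + dt * (-x_a + J_aa @ r_a + J_ab @ r_b)
    x_b = x_b + dt * (-x_b + J_bb @ r_b + J_ba @ r_a)
    cross_current.append(v_ab @ r_b / N)  # the low-dimensional transmitted signal
```

Within each region the activity is high-dimensional (driven by the random block), while the signal reaching the other region is one-dimensional: the projection `v_ab @ r_b / N`, a scalar of the kind the paper treats as an order parameter in its mean-field analysis.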