
In-context learning emerges in chemical reaction networks without attention

Published 10 Jan 2026 in cond-mat.dis-nn, cond-mat.stat-mech, and q-bio.MN | (2601.06712v1)

Abstract: We investigate whether chemical processes can perform in-context learning (ICL), a mode of computation typically associated with transformer architectures. ICL allows a system to infer task-specific rules from a sequence of examples without relying solely on fixed parameters. Traditional ICL relies on a pairwise attention mechanism which is not obviously implementable in chemical systems. However, we show theoretically and numerically that chemical processes can achieve ICL through a mechanism we call subspace projection, in which the entire input vector is mapped onto comparison subspaces, with the dominant projection determining the computational output. We illustrate this mechanism analytically in small chemical systems and show numerically that performance is robust to input encoding and dynamical choices, with the number of tunable degrees of freedom in the input encoding as a key limitation. Our results provide a blueprint for realizing ICL in chemical or other physical media and suggest new directions for designing adaptive synthetic chemical systems and understanding possible biological computation in cells.
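The "subspace projection" mechanism described in the abstract can be sketched numerically. The code below is an illustrative toy, not the paper's implementation: all names and the orthogonal-block construction are assumptions. It builds K comparison subspaces, projects an input vector onto each, and lets the largest projection norm determine the output, standing in for the dominant chemical pathway.

```python
import numpy as np

# Hypothetical sketch of subspace-projection readout (setup is my own, not
# the paper's): K comparison subspaces, here chosen mutually orthogonal for
# clarity, each spanned by a block of standard basis vectors.
DIM, RANK, K = 12, 4, 3
BASES = [np.eye(DIM)[:, k * RANK:(k + 1) * RANK] for k in range(K)]

def classify(x, bases):
    # ||B_k^T x|| is the norm of the orthogonal projection of x onto
    # subspace k; the winner-take-all argmax plays the role of the
    # dominant projection selecting the computational output.
    scores = [np.linalg.norm(b.T @ x) for b in bases]
    return int(np.argmax(scores))

# An input lying mostly in subspace 1, plus small off-subspace noise.
rng = np.random.default_rng(0)
x = BASES[1] @ np.ones(RANK) + 0.05 * rng.standard_normal(DIM)
print(classify(x, BASES))  # prints 1
```

In this toy the subspaces are orthogonal, so projections do not interfere; the paper's chemical setting presumably realizes the same comparison through reaction dynamics rather than explicit matrix products.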
