Information-theoretic vs. thermodynamic entropy production in autonomous sensory networks
Abstract: For sensory networks, we determine the rate at which they acquire information about changing external conditions. Comparing this rate with the thermodynamic entropy production that quantifies the cost of maintaining the network, we find that there is no universal bound restricting the rate of acquiring information to be less than this thermodynamic cost. These results are obtained within a general bipartite model consisting of a stochastically changing environment that affects the instantaneous transition rates within the system. They are illustrated with a simple four-state model motivated by cellular sensing. On the technical level, we obtain an upper bound on the rate of mutual information analytically and calculate this rate with a numerical method that estimates the entropy of a time series generated by simulation.
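The bipartite setup described in the abstract can be sketched as a minimal toy example: a two-state environment x that flips at a fixed rate and a two-state sensor y whose transition rates depend on the current x, giving four joint states in total, with at most one of the two variables jumping at a time (the bipartite property). The sketch below, which is an illustration and not the paper's actual model or parameters (the rates `gamma`, `k_on`, `k_off` and the plug-in block-entropy estimator are assumptions), simulates such a network with the Gillespie algorithm and estimates the entropy rate of the sensor's sampled time series, in the spirit of the numerical method mentioned:

```python
import math
import random


def gillespie_bipartite(gamma=1.0, k_on=2.0, k_off=0.5, t_max=500.0, seed=1):
    """Simulate a minimal bipartite network: the environment x in {0,1}
    flips at rate gamma; the sensor y in {0,1} flips at rate k_on when it
    mismatches x and at rate k_off when it matches. x and y never jump
    simultaneously, which is the defining bipartite property.
    Returns the jump trajectory as a list of (time, x, y) tuples."""
    rng = random.Random(seed)
    t, x, y = 0.0, 0, 0
    traj = [(t, x, y)]
    while t < t_max:
        r_env = gamma                       # environment flip rate
        r_sys = k_on if y != x else k_off   # sensor rate depends on x
        r_tot = r_env + r_sys
        t += rng.expovariate(r_tot)         # waiting time to next jump
        if rng.random() < r_env / r_tot:
            x ^= 1                          # environment jumps
        else:
            y ^= 1                          # sensor jumps
        traj.append((t, x, y))
    return traj


def sample_sensor(traj, dt, n_samples):
    """Sample the piecewise-constant sensor state y on a regular time grid."""
    out, j = [], 0
    for k in range(n_samples):
        t = k * dt
        while j + 1 < len(traj) and traj[j + 1][0] <= t:
            j += 1
        out.append(traj[j][2])
    return out


def plugin_entropy_rate(symbols, block=3):
    """Naive plug-in estimate of the Shannon entropy rate (bits per sample)
    of a discrete time series, via the block-entropy difference
    H(block) - H(block - 1)."""
    def block_entropy(n):
        counts = {}
        for i in range(len(symbols) - n + 1):
            w = tuple(symbols[i:i + n])
            counts[w] = counts.get(w, 0) + 1
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total) for c in counts.values())
    return block_entropy(block) - block_entropy(block - 1)
```

A typical use would simulate one long trajectory, sample y with a spacing `dt`, and feed the resulting symbol sequence to `plugin_entropy_rate`; the plug-in estimator is the simplest choice and is known to be biased for short series, so it stands in for, rather than reproduces, the paper's estimation method.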