Time-information uncertainty relations in thermodynamics

Published 15 Jan 2020 in cond-mat.stat-mech (arXiv:2001.05418v1)

Abstract: Physical systems that power motion and create structure in a fixed amount of time dissipate energy and produce entropy. Whether living or synthetic, systems performing these dynamic functions must balance dissipation and speed. Here, we show that rates of energy and entropy exchange are subject to a speed limit -- a time-information uncertainty relation -- imposed by the rates of change in the information content of the system. This uncertainty relation bounds the time that elapses before the change in a thermodynamic quantity has the same magnitude as its initial standard deviation. From this general bound, we establish a family of speed limits for heat, work, entropy production, and entropy flow depending on the experimental constraints on the system. In all of these inequalities, the time scale of transient dynamical fluctuations is universally bounded by the Fisher information. Moreover, they all have a mathematical form that mirrors the Mandelstam-Tamm version of the time-energy uncertainty relation in quantum mechanics. These bounds on the speed of arbitrary observables apply to transient systems away from thermodynamic equilibrium, independent of the physical assumptions about the stochastic dynamics or their function.
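The central inequality can be illustrated numerically. The bound stated in the abstract takes the form |d⟨A⟩/dt| ≤ ΔA·√I_F(t), where ΔA is the standard deviation of an observable A and I_F(t) = Σ_x ṗ_x²/p_x is the Fisher information of the time-dependent distribution (a consequence of the Cauchy-Schwarz inequality). The sketch below is not from the paper: the three-state rate matrix, observable, and initial condition are arbitrary choices used only to check that the bound holds along a relaxation trajectory.

```python
import numpy as np

# Minimal numerical check (illustration only, not the paper's derivation) of
# the time-information bound |d<A>/dt| <= DeltaA(t) * sqrt(I_F(t)), where
# I_F(t) = sum_x pdot_x^2 / p_x is the Fisher information of p(t).

# Arbitrary three-state rate matrix: columns sum to zero so probability
# is conserved under the master equation dp/dt = W p.
W = np.array([[-3.0,  1.0,  0.5],
              [ 2.0, -2.0,  1.5],
              [ 1.0,  1.0, -2.0]])
A = np.array([0.0, 1.0, 2.5])        # arbitrary observable on the three states

def check_bound(p):
    """Return (|d<A>/dt|, DeltaA * sqrt(I_F)) for distribution p."""
    pdot = W @ p
    mean = A @ p
    std = np.sqrt(((A - mean) ** 2) @ p)   # standard deviation Delta A
    fisher = np.sum(pdot ** 2 / p)         # Fisher information I_F(t)
    return abs(A @ pdot), std * np.sqrt(fisher)

p = np.array([0.8, 0.15, 0.05])      # start away from the steady state
dt, ok = 1e-3, True
for _ in range(3000):
    speed, bound = check_bound(p)
    ok = ok and speed <= bound + 1e-12
    p = p + dt * (W @ p)             # forward-Euler step of the master equation

print(ok)  # True: the bound holds at every step of the relaxation
```

Because the bound follows from Cauchy-Schwarz, it holds for any rate matrix and any observable; the transient trajectory here merely makes the inequality concrete, with the gap between speed and bound closing as the system approaches its steady state (where both sides vanish).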
