Fluctuation Theorems from a Continuous-Time Markov Model of Information-Thermodynamic Capacity in Biochemical Signal Cascades
Abstract: Biochemical signaling cascades transmit intracellular information while dissipating energy under nonequilibrium conditions. We model a cascade as a code string and apply information-entropy ideas to quantify an optimal transmission rate. A time-normalized entropy functional is maximized to define a capacity-like quantity governed by a conserved multiplier. To place the theory on a rigorous stochastic-thermodynamic footing, we formulate stepwise signaling as a continuous-time Markov jump process with forward and reverse competing rates. The embedded jump chain yields well-defined transition probabilities that justify time-scale-based expressions. Under local detailed balance, the log ratio of forward and reverse rates can be interpreted as entropy production per event, enabling a trajectory-level derivation of detailed and integral fluctuation theorems. We further connect the information-theoretic capacity to the mean dissipation rate and outline finite-time fluctuation structure via the scaled cumulant generating function (SCGF) and Gallavotti--Cohen symmetry, including a worked example using MAPK/ERK timescales.
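As a minimal illustrative sketch (not the paper's actual cascade model), the integral fluctuation theorem mentioned in the abstract can be checked numerically on the simplest driven jump process: a three-state unicyclic continuous-time Markov chain with a uniform forward rate `K_PLUS` and reverse rate `K_MINUS` (both assumed values). Under local detailed balance each forward jump produces entropy `ln(K_PLUS/K_MINUS)`; the stationary distribution is uniform, so boundary terms vanish and the trajectory entropy production is `Sigma = (n_plus - n_minus) * ln(K_PLUS/K_MINUS)`. The IFT then predicts `<exp(-Sigma)> = 1` while the second law gives `<Sigma> >= 0`:

```python
import numpy as np

# Hypothetical toy model: 3-state ring, forward rate K_PLUS, reverse
# rate K_MINUS (these values are assumptions for illustration only).
K_PLUS, K_MINUS = 2.0, 1.0
T = 2.0                  # trajectory duration
N_TRAJ = 50_000          # number of simulated trajectories
SIGMA = np.log(K_PLUS / K_MINUS)   # entropy production per forward jump

rng = np.random.default_rng(0)

def net_forward_jumps():
    """Gillespie simulation of one trajectory of duration T.

    Because the escape rate K_PLUS + K_MINUS is the same from every
    state of the ring, only the net number of forward jumps matters
    for the trajectory entropy production.
    """
    t, net = 0.0, 0
    rate = K_PLUS + K_MINUS
    while True:
        t += rng.exponential(1.0 / rate)   # waiting time to next jump
        if t > T:
            return net
        # Choose jump direction with probability proportional to rates.
        net += 1 if rng.random() < K_PLUS / rate else -1

entropy = SIGMA * np.array([net_forward_jumps() for _ in range(N_TRAJ)])

ift = np.mean(np.exp(-entropy))   # integral fluctuation theorem estimate
mean_sigma = entropy.mean()       # mean dissipation over the ensemble
print(f"<exp(-Sigma)> = {ift:.3f}  (IFT predicts 1)")
print(f"<Sigma>       = {mean_sigma:.3f}  (second law predicts >= 0)")
```

Even though `Sigma` is positive on average (the cycle is driven), rare trajectories with negative entropy production are weighted exponentially and restore the identity `<exp(-Sigma)> = 1`, which is the content of the detailed and integral fluctuation theorems discussed in the abstract.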