Universal Bounds on Information-Processing Capabilities of Markov Processes

arXiv (Cornell University), 2023

Abstract
We consider a finite-state, continuous-time Markov process, represented in the "linear framework" by a directed graph with labelled edges, which specifies the infinitesimal generator of the process. If the graph is strongly connected, the process has a unique steady-state probability distribution, $p$, which may not be one of thermodynamic equilibrium. If the label (rate) of any edge (transition) is perturbed, so that the process reaches a new steady-state probability distribution $p'$, we find that the Kullback-Leibler (KL) divergence between these distributions is bounded by the change in the thermodynamic affinity, $\Delta A(C)$, of any cycle, $C$, that includes the altered transition, $D_{KL}(p' \| p) \leq |\Delta A(C)|$, irrespective of the structure of the graph. It follows that, if an equilibrium distribution is shifted away from equilibrium by perturbing a single rate, then the free energy difference between these distributions is similarly bounded: $F^{neq} - F^{eq} \leq |\Delta A(C)|$. Our analysis reveals universal, energy-induced bounds on the information-processing capabilities of Markov systems operating arbitrarily far from thermodynamic equilibrium.
Key words
information processing, Markov processes
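The bound stated in the abstract can be checked numerically on a small example. The sketch below, which is not from the paper, builds the generator of a hypothetical three-state cycle with freely chosen rates, computes the steady-state distribution before and after perturbing a single edge rate, and verifies $D_{KL}(p' \| p) \leq |\Delta A(C)|$, where for a perturbation of one rate $k \to k'$ on the cycle, $|\Delta A(C)| = |\ln(k'/k)|$.

```python
import numpy as np

def generator(rates):
    """Build the infinitesimal generator Q from a dict {(i, j): k_ij}
    of transition rates; rows of Q sum to zero."""
    n = 1 + max(max(e) for e in rates)
    Q = np.zeros((n, n))
    for (i, j), k in rates.items():
        Q[i, j] = k
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def steady_state(Q):
    """Solve p Q = 0 with sum(p) = 1 via least squares."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Hypothetical 3-state cycle 0 -> 1 -> 2 -> 0 with reverse edges
# (rates chosen arbitrarily for illustration).
rates = {(0, 1): 2.0, (1, 2): 1.0, (2, 0): 3.0,
         (1, 0): 1.0, (2, 1): 2.0, (0, 2): 1.0}
p = steady_state(generator(rates))

# Perturb a single rate on the cycle: k(0->1) changes from 2 to 5.
rates2 = dict(rates)
rates2[(0, 1)] = 5.0
p2 = steady_state(generator(rates2))

# KL divergence (in nats) and the change in cycle affinity.
kl = float(np.sum(p2 * np.log(p2 / p)))
delta_A = abs(np.log(rates2[(0, 1)] / rates[(0, 1)]))

assert kl <= delta_A  # the universal bound from the abstract
```

For these rates the unperturbed steady state is $(1/3, 1/2, 1/6)$, the KL divergence is about $0.055$ nats, and $|\Delta A(C)| = \ln(5/2) \approx 0.916$, so the bound holds with room to spare; the inequality is far from tight for this particular perturbation.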