
# Effects of error on fluctuations under feedback control

Department of Physics, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8654, Japan.

Physical Review E (Impact Factor: 2.31). 08/2011; 84(2 Pt 1):021123. DOI: 10.1103/PhysRevE.84.021123. Source: PubMed.

## Related publications

**ABSTRACT:** The efficiency of a feedback mechanism depends on the precision of the measurement outcomes obtained from the controlled system. Accordingly, measurement errors affect the entropy production in the system. We explore this issue in the context of active feedback cooling by modeling a typical cold-damping setup as a harmonic oscillator in contact with a heat reservoir and subjected to a velocity-dependent feedback force that reduces the random motion. We consider two models that distinguish whether the sensor continuously measures the position of the resonator or its velocity directly (in practice, an electric current). Adopting the standpoint of the controlled system, we identify the "entropy pumping" contribution that describes the entropy reduction due to the feedback control and that modifies the second law of thermodynamics. We also assign a relaxation dynamics to the feedback mechanism and compare the apparent entropy production in the system and the heat bath to the total entropy production in the super-system that includes the controller. In this context, entropy pumping reflects the existence of hidden degrees of freedom, and the apparent entropy production satisfies fluctuation theorems associated with an effective Langevin dynamics.

Journal of Statistical Mechanics: Theory and Experiment 03/2013; 2013(06). (Impact Factor: 1.87)
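The cold-damping setup described in this abstract can be illustrated with a minimal numerical sketch: an underdamped harmonic oscillator coupled to a heat bath, with a feedback force proportional to the measured velocity. All parameter values (mass, stiffness, bath friction, feedback gain) are illustrative choices, not taken from the paper, and the Euler-Maruyama integrator below is a naive discretization, not the authors' method.

```python
import numpy as np

def simulate(gamma_fb=0.0, n_steps=200_000, dt=1e-3, seed=0):
    """Euler-Maruyama simulation of an underdamped harmonic oscillator
    in contact with a heat bath, with an added velocity-proportional
    feedback force -gamma_fb * v (cold damping). Illustrative parameters."""
    rng = np.random.default_rng(seed)
    m, k, gamma, kT = 1.0, 1.0, 0.5, 1.0   # mass, stiffness, bath friction, k_B T
    noise = np.sqrt(2.0 * gamma * kT / dt)  # bath noise intensity (fluctuation-dissipation)
    x, v, v2_sum = 0.0, 0.0, 0.0
    for _ in range(n_steps):
        f = -k * x - gamma * v - gamma_fb * v + noise * rng.standard_normal()
        v += f / m * dt
        x += v * dt
        v2_sum += v * v
    return m * v2_sum / n_steps  # kinetic-temperature estimate, <m v^2>

T_free = simulate(gamma_fb=0.0)    # equipartition: close to kT = 1
T_cooled = simulate(gamma_fb=4.0)  # feedback reduces the effective kinetic temperature
print(T_free, T_cooled)
```

For ideal velocity feedback the effective temperature scales like kT · γ/(γ + γ_fb), so the cooled run should sit well below the free one.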

**ABSTRACT:** We study nonequilibrium thermodynamics of complex information flows induced by interactions between multiple fluctuating systems. Characterizing nonequilibrium dynamics by causal networks (i.e., Bayesian networks), we obtain novel generalizations of the second law of thermodynamics and the fluctuation theorem, which include an informational quantity characterized by the topology of the causal network. Our result implies that the entropy production in a single system in the presence of multiple other systems is bounded by the information flow between these systems. We demonstrate our general result by a simple model of biochemical adaptation.

Physical Review Letters 11/2013; 111(18):180603. (Impact Factor: 7.94)
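The informational quantity appearing in such bounds reduces, in the simplest causal network y → x, to the mutual information between a signal and its noisy measurement. The sketch below computes it for a binary signal copied with error probability eps; the distribution and parameter are illustrative, not the paper's biochemical model.

```python
import numpy as np

# Minimal causal network y -> x: y is a binary signal, x copies y
# with probability 1 - eps. Values are illustrative.
eps = 0.1
p_y = np.array([0.5, 0.5])
p_x_given_y = np.array([[1 - eps, eps],
                        [eps, 1 - eps]])
p_xy = p_x_given_y * p_y[:, None]   # joint p(y, x)
p_x = p_xy.sum(axis=0)              # marginal p(x)

def mutual_information(p_xy, p_y, p_x):
    """I(X;Y) in nats from a discrete joint distribution."""
    mi = 0.0
    for i in range(len(p_y)):
        for j in range(len(p_x)):
            if p_xy[i, j] > 0:
                mi += p_xy[i, j] * np.log(p_xy[i, j] / (p_y[i] * p_x[j]))
    return mi

I_xy = mutual_information(p_xy, p_y, p_x)
print(I_xy)  # ln 2 - H(eps) in nats
```

For eps = 0.1 this evaluates to ln 2 minus the binary entropy of 0.1, about 0.368 nats, which is the amount of information the measurement makes available to a feedback controller.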

**ABSTRACT:** The problem of calculating the rate of mutual information between two coarse-grained variables that together specify a continuous-time Markov process is addressed. As a main obstacle, the coarse-grained variables are in general non-Markovian; therefore, an expression for their Shannon entropy rates in terms of the stationary probability distribution is not known. A numerical method to estimate the Shannon entropy rate of continuous-time hidden-Markov processes from a single time series is developed. With this method, the rate of mutual information can be determined numerically. Moreover, an analytical upper bound on the rate of mutual information is calculated for a class of Markov processes for which the transition rates have a bipartite character. Our general results are illustrated with explicit calculations for four-state networks.

Journal of Statistical Physics 06/2013; 153(3). (Impact Factor: 1.40)
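The core difficulty named in this abstract — a coarse-grained Markov process is generally non-Markovian, so its entropy rate must be estimated from data — can be sketched with a naive block-entropy estimator, h ≈ H(n) − H(n−1). This is a generic plug-in estimate applied to a discrete-time four-state chain with an illustrative transition matrix, not the paper's continuous-time method.

```python
import numpy as np
from collections import Counter

def entropy_rate_estimate(symbols, n=6):
    """Naive plug-in estimate of the Shannon entropy rate (nats per step)
    of a symbol sequence, via block entropies: h ~ H(n) - H(n-1)."""
    def block_entropy(k):
        counts = Counter(tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1))
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return -(p * np.log(p)).sum()
    return block_entropy(n) - block_entropy(n - 1)

# Four-state Markov chain, coarse-grained to two symbols (states 0,1 -> 0; 2,3 -> 1).
# The coarse-grained sequence is a hidden-Markov process and in general non-Markovian.
rng = np.random.default_rng(1)
T = np.array([[0.6, 0.2, 0.1, 0.1],
              [0.1, 0.6, 0.2, 0.1],
              [0.1, 0.1, 0.6, 0.2],
              [0.2, 0.1, 0.1, 0.6]])
state, traj = 0, []
for _ in range(100_000):
    state = rng.choice(4, p=T[state])
    traj.append(int(state) // 2)  # lump pairs of states into one symbol
h = entropy_rate_estimate(traj)
print(h)  # nats per step; necessarily below ln 2 for a binary alphabet
```

Block-entropy differences converge to the true entropy rate from above as n grows, at the cost of exponentially growing data requirements — which is why dedicated estimators like the one the abstract describes are needed in practice.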
