Article

Normal Typicality and von Neumann's Quantum Ergodic Theorem

Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 07/2009; 466(2123). DOI: 10.1098/rspa.2009.0635
Source: arXiv

ABSTRACT: We discuss the content and significance of John von Neumann's quantum ergodic theorem (QET) of 1929, a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what we call normal typicality, i.e., the statement that, for typical large systems, every initial wave function $\psi_0$ from an energy shell is "normal": it evolves in such a way that $|\psi_t\rangle\langle\psi_t|$ is, for most $t$, macroscopically equivalent to the micro-canonical density matrix. The QET was mostly forgotten after it was criticized as a dynamically vacuous statement in several papers in the 1950s. However, we point out that this criticism does not apply to the actual QET, a correct statement of which does not appear in these papers, but to a different (indeed weaker) statement. Furthermore, we formulate a stronger statement of normal typicality, based on the observation that the bound on the deviations from the average specified by von Neumann is unnecessarily coarse, and that a much tighter (and more relevant) bound actually follows from his proof.
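
In schematic form (our notation; the paper gives the precise hypotheses): decompose the energy shell into mutually orthogonal macro-spaces, $\mathcal{H} = \bigoplus_\nu \mathcal{H}_\nu$, one for each macrostate $\nu$, with projectors $P_\nu$. Then $\psi_0$ is normal if, for most $t$ and every $\nu$, $\langle\psi_t|P_\nu|\psi_t\rangle \approx \mathrm{tr}(\rho_{mc} P_\nu) = \dim\mathcal{H}_\nu / \dim\mathcal{H}$, i.e., the macrostate occupation probabilities determined by $|\psi_t\rangle\langle\psi_t|$ agree with those of the micro-canonical density matrix $\rho_{mc}$.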

Related articles:
  • Source
    ABSTRACT: An approach to quantum mechanics is developed which makes explicit the Heisenberg cut between the deterministic microscopic quantum world and the partly deterministic, partly stochastic macroscopic world. The microscopic system evolves according to the Schrödinger equation, with stochastic behaviour arising when the system is probed by a set of coarse-grained macroscopic observables whose resolution scale defines the Heisenberg cut. The resulting stochastic process can account for the different facets of the classical limit: Newton's laws (ergodicity broken), the statistical mechanics of thermal ensembles (ergodic), and the measurement problem (partial ergodicity breaking). In particular, the usual rules of the Copenhagen interpretation, such as the Born rule, emerge, along with completely local descriptions of EPR-type experiments. The formalism also re-introduces a dynamical picture of equilibration and thermalization in quantum statistical mechanics and provides insight into how classical statistical mechanics can arise in the classical limit in a way that alleviates various conceptual problems.
  • Source
    ABSTRACT: The study of electron transport in quantum devices is mainly devoted to DC properties, while the fluctuations of the electrical current (or voltage) around these DC values, the so-called quantum noise, are much less analyzed. The computation of quantum noise is intrinsically linked (by temporal correlations) to our ability to understand and compute the time evolution of a quantum system that is measured several times. Quantum noise therefore requires a sound understanding of the perturbation (collapse) of the wave function while it is being measured. Several quantum theories in the literature provide different (but empirically equivalent) ways of understanding and computing it. In this work, the quantum noise (and the collapse) associated with an electron impinging upon a semitransparent barrier is explained using Bohmian mechanics. The ability of this theory (which deals with waves and point-like particles) to explain the collapse in a natural way, through the conditional wave function, is emphasized. From this result, the fundamental understanding and practical computation of quantum noise with Bohmian trajectories are discussed (a minimal trajectory sketch appears after this list). Numerical simulations of low- and high-frequency features of quantum shot noise in a resonant tunneling diode are presented (through the BITLLES simulator), showing the usefulness of the Bohmian approach.
    Journal of Computational Electronics, 10/2014. DOI: 10.1007/s10825-015-0672-6
  • Source
    ABSTRACT: When an isolated quantum system is driven out of equilibrium, expectation values of general observables start oscillating in time. This article reviews the general theory of such temporal fluctuations. We first survey some results on the strength of such temporal fluctuations: for example, temporal fluctuations are exponentially small in the system's volume for generic systems, whereas they fall off algebraically in integrable systems (a toy numerical check appears after this list). We then concentrate on the so-called quench scenario, where the system is driven out of equilibrium by the application of a sudden perturbation. For sufficiently small perturbations, temporal fluctuations of physical observables can be characterized in full generality and can be used as an effective tool to probe quantum criticality of the underlying model. In the off-critical region the distribution becomes Gaussian. Close to criticality the distribution becomes a universal function uniquely characterized by a single critical exponent, which we compute explicitly. This contrasts with standard equilibrium quantum fluctuations, for which the critical distribution depends on a countable set of critical coefficients and is known only for limited examples. The possibility of using temporal fluctuations to determine pseudo-critical boundaries in optical lattice experiments is further reviewed.
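
As a toy illustration of the machinery invoked in the second abstract above, the following minimal sketch (our own construction, not the BITLLES simulator; the packet, barrier, and grid parameters are all assumed for illustration) propagates a 1D Gaussian packet against a semitransparent barrier with the split-step Fourier method and integrates a few Bohmian trajectories along the guidance velocity $v = (\hbar/m)\,\mathrm{Im}(\partial_x \psi / \psi)$:

    # Minimal 1D Bohmian-trajectory sketch (illustrative only; not the BITLLES code).
    # A Gaussian packet hits a semitransparent barrier; trajectories follow the
    # guidance velocity v(x,t) = (hbar/m) * Im( d_x psi / psi ).
    import numpy as np

    hbar = m = 1.0
    N, L = 1024, 200.0
    x = np.linspace(-L/2, L/2, N, endpoint=False)
    k = 2*np.pi*np.fft.fftfreq(N, d=L/N)
    V = np.where(np.abs(x - 20.0) < 1.0, 0.3, 0.0)      # barrier height ~ packet energy

    psi = np.exp(-(x + 30.0)**2/(2*5.0**2) + 1j*0.8*x)  # incoming packet, k0 = 0.8
    psi /= np.sqrt(np.sum(np.abs(psi)**2)*(L/N))

    dt, steps = 0.05, 1500
    expV = np.exp(-1j*V*dt/(2*hbar))                    # half-step potential propagator
    expK = np.exp(-1j*hbar*k**2*dt/(2*m))               # full-step kinetic propagator

    traj = np.linspace(-40.0, -20.0, 9)                 # initial trajectory positions
    dx = L/N
    for _ in range(steps):
        psi = expV*np.fft.ifft(expK*np.fft.fft(expV*psi))   # split-step evolution
        v = (hbar/m)*np.imag(np.gradient(psi, dx)/(psi + 1e-300))
        traj += dt*np.interp(traj, x, v)                # Euler step along the flow

    print("final trajectory positions:", np.round(traj, 1))

Trajectories ending beyond the barrier correspond to transmitted (tunneled) electrons, and those returning to the left to reflected ones; the randomness of the initial positions within $|\psi_0|^2$ is what generates the noise statistics in the Bohmian picture.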
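
As a back-of-the-envelope companion to the third abstract, this sketch (our toy stand-in: a random-matrix Hamiltonian in place of a genuine many-body model, so Hilbert-space dimension plays the role of volume) quenches a state into a GOE-like Hamiltonian and checks that the temporal fluctuations of an observable's expectation value shrink as the dimension grows:

    # Temporal fluctuations of <A>_t after a "quench" into a generic (random-matrix)
    # Hamiltonian: a toy check that fluctuations shrink with Hilbert-space dimension.
    import numpy as np

    rng = np.random.default_rng(0)

    def temporal_std(d, nt=400, tmax=200.0):
        H = rng.normal(size=(d, d)); H = (H + H.T)/2   # GOE-like Hamiltonian
        E, U = np.linalg.eigh(H)
        A = np.diag(rng.normal(size=d))                # a generic observable
        psi0 = rng.normal(size=d); psi0 /= np.linalg.norm(psi0)
        c = U.T @ psi0                                 # state in the energy eigenbasis
        Aeig = U.T @ A @ U                             # observable in the energy eigenbasis
        vals = []
        for t in np.linspace(0.0, tmax, nt):
            psi_t = c*np.exp(-1j*E*t)                  # phase evolution of each eigenmode
            vals.append((psi_t.conj() @ Aeig @ psi_t).real)
        return np.std(vals)

    for d in (50, 200, 800):
        print(d, temporal_std(d))

The printed standard deviation decreases with $d$ (roughly like $1/\sqrt{d}$ for this random-matrix toy), a finite-dimensional shadow of the exponential-in-volume suppression the review describes for generic many-body systems.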
