Decoherence Time

Sergei Viznyuk

I critically assess some published values for decoherence times. I show there are two characteristic times, inversely proportional to each other: the decoherence time and the probability decay time, with the second often mistaken for the first. I present formulas for both the decoherence and the decay times.
While decoherence has been extensively studied, surprisingly little is available in terms of numeric results, either calculated from formulas [1, 2] or measured in experiments [3, 4]. The reason researchers shy away from publishing numbers becomes apparent once these numbers are estimated from widely quoted expressions [5, 1]. The estimated decoherence times span an incredible range, from  for a large molecule immersed in the cosmic microwave background [1], to  for a “canonical” classical object [5] with mass , characteristic length , temperature , and an assumed characteristic “relaxation” time of . An interval of  cannot conceivably correspond to any physical process in a macroscopic object, as no parameter can change in any meaningful way during such a time. For comparison, it takes  times longer, i.e. , for light to cross a proton radius. Due to time-energy uncertainty, a change in the object’s state of duration  would result in an energy uncertainty of , enough to evaporate a  object. Such improbable times call into question the premises the decoherence estimates are based on.
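These magnitudes are easy to sanity-check. The sketch below is a minimal illustration of my own (the value Δt = 10⁻⁴⁰ s is an assumed stand-in, not a figure from the text); it computes the light-crossing time of a proton radius and the energy uncertainty ΔE ≈ ħ/Δt implied by a state change of duration Δt:

```python
# Sanity check of the time-energy-uncertainty argument.
# delta_t = 1e-40 s is an illustrative assumption, not a value from the text.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
r_p = 8.414e-16          # proton charge radius, m (CODATA 2018)

t_cross = r_p / c        # time for light to cross a proton radius, s

def energy_uncertainty(delta_t):
    """Energy uncertainty implied by a state change of duration delta_t."""
    return hbar / delta_t  # joules

dt = 1e-40                            # assumed duration of the state change, s
dE = energy_uncertainty(dt)           # energy uncertainty, J
m_equiv = dE / c**2                   # mass equivalent of that energy, kg

print(f"light crossing time of proton radius: {t_cross:.2e} s")
print(f"energy uncertainty for dt = {dt:.0e} s: {dE:.2e} J "
      f"(mass equivalent {m_equiv:.2e} kg)")
```

A shorter assumed Δt only makes the implied energy uncertainty larger.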
Part of the problem, at least for some authors, is that no clear, unambiguous, and universally accepted definition of coherence (which is supposed to get lost in the process) is available [6]. This leads to confusion as to which process’s characteristic time to take as the decoherence time. The common view links coherence with the presence of interference terms in the density matrix, and their reduction with decoherence [1]. With this understanding, can the characteristic time of the reduction of interference be taken as the decoherence time? As I explain below, it cannot.
The reduction of the object’s density matrix is achieved by tracing out the entangled ancilla system, often assumed to be the environment [1]. As has been shown elsewhere [7], the “tracing out” operation is nothing else but a measurement performed on the ancilla (by an external device, not by the ancilla itself, as often mistakenly claimed [1]). The decoherence time τ_D is, therefore, the interval between state preparation and the subsequent measurement, even if the measurement is performed only on the ancilla part of the state. (The preparation is itself a measurement, just with a different device, in a different measurement basis.) The classical information extracted by the measurement on the ancilla reduces the ambiguity in the measurement of the entangled object’s state [7], i.e. it reduces the interference terms in the density matrix. The reduction of the density matrix signifies the decay of the object’s probability distribution.

One would intuitively expect that the faster the ancilla is measured, the faster the interference decays. In fact, the relation between the decoherence time defined above and the decay of the probability distribution is inverse: the smaller τ_D, the slower the decay. This is also known as the quantum Zeno effect [12]. One can understand this by considering measurements on the ancilla as a random walk from the surface of the generalized Bloch ball [8] towards its interior. A point on the surface of the Bloch ball corresponds to the initial pure state, and the point at the center corresponds to the complete mixture. The larger the steps (i.e. the larger τ_D), the fewer steps are needed to reduce the density matrix, and the faster the probability decays.

It can also be shown schematically as follows. The distance from the surface of the Bloch ball signifies the probability decay. The random walk is described by the binomial distribution with probability p = 1/2 for a step in either direction. The variance of the binomial distribution after n random steps is given by:

σ² = n p (1 − p) = n/4   (1)

The standard deviation σ, multiplied by the length of each step, gives the distance gained from the surface of the Bloch ball, i.e. the probability decay. The step length is proportional to τ_D, and if the elapsed time is t, then n = t/τ_D. Using (1):

σ τ_D = (τ_D/2) √(t/τ_D) = √(τ_D t)/2   (2)

From (2), the time to cover a given distance scales as 1/τ_D, i.e. the probability decay time is inversely proportional to the decoherence time τ_D. The expression for the probability decay has been obtained in Section 4 of [9]:
 
where t is the elapsed time, τ is the characteristic decay time, one parameter is explained below, and the remaining quantity is calculated from the object’s energy spectrum as:

(4)

A generic formula for the decoherence time has been obtained in [9] using Fermi’s golden rule:

(5)

where g is the density of states, i.e. the number of states per unit energy interval around the energy of the object. From (3, 5), the characteristic probability decay time is:

(6)

For a two-level system, the relevant energy interval is simply the difference in energy levels, ΔE. Assuming no degeneracy, there is one state per ΔE energy interval, i.e. g = 1/ΔE. Then, from (5, 6):

(7)

Thus, for a non-degenerate two-level system, the probability decay time and the decoherence time are both equal to the Margolus-Levitin bound [10]. A sensible value for the decoherence time of  has been obtained in [4] from experimental data on spectral linewidth, effectively using formula (7), even though (7) is only applicable to non-degenerate two-level systems.
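The Margolus-Levitin bound t = πħ/(2E) [10] referenced above is straightforward to evaluate. Below is a small illustration of my own; the 1 eV level splitting is an assumed example value, not one taken from the cited experiments:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
EV = 1.602176634e-19     # joules per electron-volt

def margolus_levitin_time(energy_joules):
    """Minimum time to evolve to an orthogonal state: t = pi*hbar/(2E)."""
    return math.pi * HBAR / (2.0 * energy_joules)

delta_e = 1.0 * EV                     # assumed 1 eV level splitting
t_min = margolus_levitin_time(delta_e)
print(f"Margolus-Levitin time for E = 1 eV: {t_min:.2e} s")  # ~1e-15 s
```

Femtosecond-scale times for eV-scale splittings are consistent with the linewidth-based estimates the text refers to.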
The value of τ_D can be quite large for macroscopic objects. That explains why the probability decay time (6) can be incredibly small. One has to keep in mind that the probability decay (3) does not describe a change in any particular object’s state. It only means a reduction in the correlation between measurement outcomes obtained by different devices performing measurements on an ensemble of identically prepared objects. Each measurement, and the ensuing [partial] decoherence, has characteristic time (5). One cannot measure the probability distribution (3) on timescales shorter than (5); on such timescales, (3) is a mere abstraction.
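The random-walk picture used above to derive the inverse relation can be checked numerically. The sketch below is my own illustration (all constants are arbitrary): it verifies that the r.m.s. displacement of a ±step walk grows as the step length times √n, and that a decay time of the form 4R²/τ_D, as implied by a distance growing as √(τ_D t)/2, halves when τ_D doubles:

```python
import random
import math

random.seed(1)

def rms_distance(n_steps, step, trials=5000):
    """R.m.s. displacement of a +/-step random walk after n_steps."""
    total = 0.0
    for _ in range(trials):
        d = sum(step if random.random() < 0.5 else -step
                for _ in range(n_steps))
        total += d * d
    return math.sqrt(total / trials)

# r.m.s. displacement ~ step * sqrt(n); for n = 400 and unit steps
# this is close to 20.
print(rms_distance(400, 1.0))

def decay_time(tau_d, radius=1.0):
    """Time for a distance growing as sqrt(tau_d * t)/2 to reach radius."""
    return 4.0 * radius**2 / tau_d

# Doubling tau_D doubles the step but halves the number of steps per
# unit time, so the decay time falls as 1/tau_D.
print(decay_time(2.0) / decay_time(4.0))   # 2.0: inverse scaling
```

The larger-steps-but-fewer-of-them trade-off is exactly why a longer decoherence time yields a shorter probability decay time.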
I shall now expound on the value of the remaining parameter in (3). A measurement outcome is one of K possible events. The measurement event sample represents the collected information about the object’s state [9]; here n_k is the number of occurrences of event k in the sample. There are Ω distinct ways to collect the event sample, where Ω equals the statistical weight of the sample:

Ω = exp(S_B)

where S_B is Boltzmann’s entropy of the sample. Different ways to collect the sample (i.e. different event sequences) correspond to different correlations between events. One of the ways to collect the event sample would have the same sequence of events as that of the measurement sample collected on the initial pure state. In full decoherence, all ways to collect the event sample have equal probability 1/Ω. Thus, 1/Ω is the minimum probability that the measured object is in the initial pure state, with confidence (fidelity) as provided in [11].
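Assuming the statistical weight is the multinomial coefficient (my reading of the construction above: the number of distinct orderings of the event sample), Ω, the Boltzmann entropy S_B, and the minimum probability 1/Ω can be computed directly:

```python
from math import factorial, log

def statistical_weight(counts):
    """Number of distinct event sequences producing the sample `counts`.
    Assumed to be the multinomial coefficient N!/(n1! * ... * nK!)."""
    n_total = sum(counts)
    w = factorial(n_total)
    for n_k in counts:
        w //= factorial(n_k)
    return w

sample = [3, 1]                     # e.g. event 1 seen 3 times, event 2 once
omega = statistical_weight(sample)  # 4!/(3! * 1!) = 4
s_b = log(omega)                    # Boltzmann entropy of the sample (k_B = 1)
p_min = 1 / omega                   # minimum probability of the initial pure state
print(omega, round(s_b, 3), p_min)  # 4 1.386 0.25
```

For macroscopic samples the factorials make Ω astronomically large, so 1/Ω, and with it the residual probability of the initial pure state, is vanishingly small.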
References

[1] M. Schlosshauer, "Quantum Decoherence," arXiv:1911.06282 [quant-ph], 2019.
[2] J. Anglin, J. Paz and W. Zurek, "Deconstructing Decoherence," arXiv:quant-ph/9611045, 1996.
[3] A. Berkley, H. Xu, R. Ramos, M. Gubrud, F. Strauch, P. Johnson, J. Anderson, A. Dragt, C. Lobb and F. Wellstood, "Entangled Macroscopic Quantum States in Two Superconducting Qubits," Science, vol. 300, no. 5625, pp. 1548-1550, 2003.
[4] M. Steffen, M. Ansmann, R. Bialczak, N. Katz, E. Lucero, R. McDermott, M. Neeley, E. Weig, A. Cleland and J. Martinis, "Measurement of the Entanglement of Two Superconducting Qubits via State Tomography," Science, vol. 313, no. 5792, pp. 1423-1425, 2006.
[5] W. Zurek, "Reduction of the Wavepacket: How Long Does it Take?," arXiv:quant-ph/0302044, 2003.
[6] E. Okon and D. Sudarsky, "Less Decoherence and More Coherence in Quantum Gravity, Inflationary Cosmology and Elsewhere," arXiv:1512.05298 [quant-ph], 2015.
[7] S. Viznyuk, "No decoherence by entanglement," 2020. [Online]. Available:
[8] D. Aerts and M. Sassoli de Bianchi, "The extended Bloch representation of quantum mechanics and the hidden-measurement solution to the measurement problem," Annals of Physics, vol. 351, pp. 975-1025, 2014.
[9] S. Viznyuk, "From QM to KM," 2020. [Online]. Available:
[10] N. Margolus and L. Levitin, "The maximum speed of dynamic evolution," arXiv:quant-ph/9710043, 1998.
[11] S. Viznyuk, "The measurement and the state evaluation," 2020. [Online]. Available:
[12] B. Misra and E. Sudarshan, "The Zeno's paradox in quantum theory," Journal of Mathematical Physics, vol. 18, no. 4, pp. 756-763, 1977.