arXiv:0811.2068v1 [quant-ph] 13 Nov 2008
Testing Born’s Rule in Quantum Mechanics with a
Triple Slit Experiment
Urbasi Sinha∗, Christophe Couteau∗, Zachari Medendorp∗, Immo Söllner∗,†, Raymond Laflamme∗,∗∗, Rafael Sorkin‡,∗∗ and Gregor Weihs∗,†
∗Institute for Quantum Computing, University of Waterloo, 200 University Ave W, Waterloo, Ontario N2L 3G1, Canada
†Institut für Experimentalphysik, Universität Innsbruck, Technikerstrasse 25, 6020 Innsbruck, Austria
∗∗Perimeter Institute for Theoretical Physics, 31 Caroline St, Waterloo, Ontario N2L 2Y5, Canada
‡Department of Physics, Syracuse University, Syracuse, NY 13244-1130
Abstract. In Mod. Phys. Lett. A 9, 3119 (1994), one of us (R.D.S) investigated a formulation of quantum mechanics as a
generalized measure theory. Quantum mechanics computes probabilities from the absolute squares of complex amplitudes, and
the resulting interference violates the (Kolmogorov) sum rule expressing the additivity of probabilities of mutually exclusive
events. However, there is a higher order sum rule that quantum mechanics does obey, involving the probabilities of three
mutually exclusive possibilities. We could imagine a yet more general theory by assuming that it violates the next higher
sum rule. In this paper, we report results from an ongoing experiment that sets out to test the validity of this second sum rule
by measuring the interference patterns produced by three slits and all the possible combinations of those slits being open or
closed. We use attenuated laser light combined with single photon counting to confirm the particle character of the measured light.
Keywords: Probability, Quantum mechanics, Born’s rule, Interference, Foundations of Quantum Mechanics
PACS: 03.65.Ta, 42.50.Ct
Quantum Mechanics has been one of the most successful tools in the history of Physics. It has revolutionized Modern
Physics and helped explain many phenomena. However, in spite of all its successes, there are still some gaps in
our understanding of the subject and there may be more to it than meets the eye. This makes it very important to have
experimental verifications of all the fundamental postulates of Quantum Mechanics. In this paper, we aim to test Born’s
interpretation of probability [1], which states that if a quantum mechanical state is specified by the wavefunction ψ(r, t)
[2], then the probability p(r, t) that a particle lies in the volume element d^3r located at r at time t is given by

p(r, t) d^3r = |ψ(r, t)|^2 d^3r. (1)
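As a minimal numerical sanity check of this rule (our own illustrative sketch, not part of the paper; the Gaussian wavepacket and the grid are arbitrary choices), the squared modulus of a normalized wavefunction integrates to unit total probability:

```python
import numpy as np

# 1D illustration of Born's rule: p(x, t) dx = |psi(x, t)|^2 dx.
# Normalized Gaussian wavepacket with an arbitrary plane-wave phase factor.
x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
psi = np.pi ** -0.25 * np.exp(-x**2 / 2) * np.exp(1j * 3.0 * x)
prob_density = np.abs(psi) ** 2       # Born probability density
total = np.sum(prob_density * dx)     # ~1: the particle is somewhere
print(total)
```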
Although this definition of probability has been assumed to be true in describing several experimental results, no
experiment has ever been performed to specifically test this definition alone. Already in his Nobel lecture in 1954, Born
raised the issue of proving his postulate. Yet, 54 years have passed without there being a dedicated attempt at such
a direct experimental verification, although the overwhelming majority of experiments indirectly verify the postulate
when they show results that obey quantum mechanics. In this paper, we report the results of an ongoing experiment that
directly tests Born’s rule.
In Ref. [3], one of us (R.D.S) proposed a triple slit experiment motivated by the “sum over histories” approach to
Quantum Mechanics. According to this approach, Quantum theory differs from classical mechanics not so much in
its kinematics, but in its dynamics, which is stochastic rather than deterministic. But if it differs from deterministic
theories, it also differs from previous stochastic theories through the new phenomenon of interference. Although the
quantum type of randomness is thus non-classical, the formalism closely resembles that of classical probability theory
when expressed in terms of a sum over histories. Each set A of histories is associated with a non-negative real number
p_A = |A| called the "quantum measure", and this measure can in certain circumstances be interpreted as a probability
(but not in all circumstances, because of the failure of the classical sum rules as described below). It is this measure (or
the corresponding probability) that enters the sum rules we are concerned with. Details of the quantum measure theory
following a sum over histories approach can be found in [3, 4].
Interference expresses a deviation from the classical additivity of the probabilities of mutually exclusive events.
This additivity can be expressed as a "sum rule" I = 0, which says that the interference between arbitrary pairs of
alternatives vanishes. In fact, however, one can define a whole hierarchy of interference terms and corresponding sum
rules, as given by the following equations. They measure not only pairwise interference, but also higher types involving
three or more alternatives, types which could in principle exist, but which quantum mechanics does not recognize.

I_A = p_A (2)
I_AB = p_AB - p_A - p_B (3)
I_ABC = p_ABC - p_AB - p_BC - p_CA + p_A + p_B + p_C (4)
Equations (2), (3), and (4) refer to the zeroth, first, and second sum rule, respectively. Here, p_ABC means the
probability of the disjoint union of the sets A, B, and C. A physical system in which such probability terms appear
would be a system with three classes of paths [5], for example three slits A, B and C in an opaque aperture. For particles
incident on the slits, p_A would refer to the probability of a particle being detected at a chosen detector position having
traveled through slit A, and p_B and p_C would refer to similar probabilities through slits B and C.
The zeroth sum rule needs to be violated (I_A ≠ 0) for a non-trivial measure. If the first sum rule holds, i.e. I_AB = 0,
we recover ordinary probability theory, for example that of classical stochastic processes. Violation of the first sum rule
(I_AB ≠ 0) is consistent with Quantum Mechanics. Each sum rule entails that all the higher ones in the hierarchy hold.
However, since the first sum rule is violated in Quantum Mechanical systems, one needs to go on to check the second
sum rule. In known systems, triadditivity of mutually exclusive probabilities holds, i.e., the second sum rule is satisfied,
I_ABC = 0. This follows from the algebra shown below and is based on the assumption that Born's rule holds.
p_ABC = |ψ_A + ψ_B + ψ_C|^2
= p_A + p_B + p_C + (p_AB - p_A - p_B) + (p_BC - p_B - p_C) + (p_CA - p_C - p_A)
= p_AB + p_BC + p_CA - p_A - p_B - p_C (5)

I_ABC ≡ p_ABC - p_AB - p_BC - p_CA + p_A + p_B + p_C = 0 (6)
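The algebra above can also be checked numerically. The following sketch (ours, with arbitrary complex amplitudes) confirms that the pairwise interference I_AB is generically nonzero, while I_ABC vanishes identically under Born's rule:

```python
# Arbitrary complex path amplitudes for slits A, B, C at one detector point.
a, b, c = 1 + 2j, 0.5 - 1j, -1.5 + 0.3j

p = lambda amp: abs(amp) ** 2                      # Born's rule
pA, pB, pC = p(a), p(b), p(c)
pAB, pBC, pCA = p(a + b), p(b + c), p(c + a)
pABC = p(a + b + c)

I_AB = pAB - pA - pB                               # Eq. (3): generically nonzero
I_ABC = pABC - pAB - pBC - pCA + pA + pB + pC      # Eq. (4): vanishes identically
print(I_AB, I_ABC)
```

Expanding |ψ_A + ψ_B + ψ_C|^2 shows the cancellation is exact for any choice of amplitudes, which is why the second sum rule is a genuine prediction of Born's rule rather than a property of special states.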
If, however, there is a higher order correction to Born's rule (however small that correction might be), equation (5)
will no longer hold, leading to a violation of the second sum rule. The triple slit experiment proposes to test the second sum rule, or
in more physical language, to look for a possible "three way interference" beyond the pairwise interference seen in
FIGURE 1. Pictorial representation of how the different probability terms are measured. The leftmost configuration has all slits
open, whereas the rightmost has all three slits blocked. The black bars represent the slits, which are never changed or moved
throughout the experiment. The thick grey bars represent the opening mask, which is moved in order to make different
combinations of openings overlap with the slits, thus switching between the different combinations of open and closed slits.
quantum mechanics. For this purpose we define a quantity

ε ≡ p_ABC - p_AB - p_BC - p_CA + p_A + p_B + p_C - p_0. (7)
Figure 1 shows how the various probabilities could be measured in a triple slit configuration. As opposed to the ideal
formulation where empty sets have zero measure, we need to provide for a non-zero p0, the background probability of
detecting particles when all paths are closed. This takes care of any experimental background, such as detector dark
counts. For better comparison between possible realizations of such an experiment, we further define a normalized
variant κ of ε,

κ ≡ ε/δ, where (8)

δ = |p_AB - p_A - p_B + p_0| + |p_BC - p_B - p_C + p_0| + |p_CA - p_C - p_A + p_0|. (9)

δ is a measure of the regular interference contrast, so κ can be seen as the ratio of the violation of the second sum
rule versus the expected violation of the first sum rule. (If δ = 0 then ε = 0 trivially, and we really are not dealing
with quantum behavior at all, but only classical probabilities.) In the following sections we will describe how we
implemented the measurements of all the terms that compose ε and analyze our results.
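In code, the eight measured quantities map onto ε, δ, and κ as follows (a sketch of ours; the helper name and the toy amplitudes and background level are purely illustrative):

```python
def kappa(pABC, pAB, pBC, pCA, pA, pB, pC, p0):
    """Normalized three-path interference, Eqs. (7)-(9): kappa = eps / delta."""
    eps = pABC - pAB - pBC - pCA + pA + pB + pC - p0
    delta = (abs(pAB - pA - pB + p0)
             + abs(pBC - pB - pC + p0)
             + abs(pCA - pC - pA + p0))
    return eps / delta

# Toy check: Born-rule intensities plus a constant background give kappa = 0,
# because the background p0 cancels out of eps by construction (+1-3+3-1 = 0).
a, b, c = 1 + 2j, 0.5 - 1j, -1.5 + 0.3j
bg = 0.05                                  # dark counts / stray light
P = lambda amp: abs(amp) ** 2 + bg
k = kappa(P(a + b + c), P(a + b), P(b + c), P(c + a), P(a), P(b), P(c), bg)
print(k)                                   # ~0
```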
2.1. Making the slits
Our first step in designing the experiment was to find a way to reliably block and unblock the slits, which we
expected to be very close together, so that simple shutters wouldn’t work. Therefore we decided to use a set of two
plates, one containing the slit pattern and one containing patterns to block or unblock the slits. The slits were fabricated
by etching them into a material which covered a glass plate. The portions of the material with the slits etched in
would be transparent to light, and the rest of the glass plate, which was still covered, would be opaque. However,
not all materials exhibit the same degree of opacity to infra-red light, and this leads to spurious transmission through
portions of the glass plate which should in theory be opaque.
Various types of materials were used for etching the slits and each modification led to a decrease in spurious
transmission through the glass plate. At first, a photo-emulsion plate was used which had a spurious transmission
of around 5%. This was followed by a glass plate with a chromium layer on top. This had a spurious transmission of
around 3%. The plate currently in use has an aluminium layer of 500 nm thickness on top. Aluminium is known to
have a very high absorption coefficient for infrared light and this led to a spurious transmission of less than 0.1%.
FIGURE 2. Different ways of measuring the eight intensities. The LHS shows a schematic of a 3 slit pattern. In the center, the
first blocking scheme is demonstrated, in which the slits are blocked according to the terms being measured. The whole glass plate
is thus transparent with only the blocking portions opaque. The RHS shows the second blocking scheme in which the slits are
opened up as needed on a glass plate which is completely opaque except for the unblocking openings.
FIGURE 3. Schematic of experimental set-up
The blocking patterns were etched on a different glass plate covered with the same material as the first glass plate.
Figure 2 shows an example of a set of blocking patterns which would give rise to the eight intensities corresponding to
the probability terms related to the 3-slit open, 2-slit open and 1-slit open configurations as discussed in the previous section.
Another way of achieving the eight intensities would be to open up the right number and position of slits instead of
blocking them off. This is also shown in Figure 2 and leads to a big change in the appearance of the second glass plate.
In the first instance, when the slits were being blocked for the different cases, the rest of the glass plate was transparent
and only the portions which were being used for blocking off the slits in the first glass plate were opaque to light. This
led to spurious effects as a lot of light was being let through the glass plate this way, which caused background features
in the diffraction patterns. However, with the second design, the whole plate was covered with the opaque material
and only portions which were being used to open up slits allowed light to go through, thus leading to diminishing
background effects.
2.2. The experimental set-up
Figure 3 shows a schematic of the complete experimental set-up. The He-Ne laser beam passes through an
arrangement of mirrors and collimators before being incident on a 50/50 beam splitter. In the near future we will
replace the laser by a heralded single photon source [6]. The beam then splits into two: one of the beams is used as a
reference arm for measuring fluctuations in laser power, whereas the other beam is incident on the glass plate, which
has the slit pattern etched on it. The beam height and waist are adjusted so that the beam is incident on a set of three slits, the slits
being centered on the beam. There is another glass plate in front which has the corresponding blocking designs on it
such that one can measure the seven probabilities in equation (4). The slit plate remains stationary whereas the blocking
plate is moved up and down in front of the slits to yield the various combinations of opened slits needed to measure
the seven probabilities. As mentioned above, in our experimental set-up, we also measure an eighth probability which
corresponds to all three slits being closed in order to account for dark counts and any background light. Figure 1 shows
this pictorially. There is a horizontal microscope (not shown in Figure 3) for initial alignment between the slits and
the corresponding openings. A multi-mode optical fiber is placed at a point in the diffraction pattern and connected to
an avalanche photo-diode (APD) which measures the photon counts corresponding to the various probabilities. Using
a single photon detector confirms the particle character of light at the detection level. The optical fiber can be moved
to different positions in the diffraction pattern in order to obtain the value of κ at different positions in the pattern.
Figure 4 shows a measurement of the eight diffraction patterns corresponding to the eight configurations of open and
closed slits as required by equation (7).
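The ideal shapes of these eight patterns can be sketched numerically. The code below is our own illustration of scalar Fraunhofer diffraction, using the nominal parameters quoted later for Figure 7 (30 µm slits, 100 µm separation, 800 nm); it is not the experimental data:

```python
import numpy as np

wl, a, d = 800e-9, 30e-6, 100e-6          # wavelength, slit width, slit separation
theta = np.linspace(-0.02, 0.02, 4001)    # far-field angle

def pattern(open_slits):
    """Fraunhofer intensity for a subset of slits A, B, C at x = -d, 0, +d."""
    pos = {'A': -d, 'B': 0.0, 'C': d}
    amp = np.zeros_like(theta, dtype=complex)
    for s in open_slits:
        amp += (np.sinc(a * np.sin(theta) / wl)                # single-slit envelope
                * np.exp(2j * np.pi * pos[s] * np.sin(theta) / wl))
    return np.abs(amp) ** 2

patterns = {c: pattern(c) for c in ['ABC', 'AB', 'BC', 'CA', 'A', 'B', 'C', '']}
# Central three-slit maximum is 9x the single-slit value; '' (all closed) is dark.
print(patterns['ABC'][2000], patterns['A'][2000], patterns[''][2000])
```

Note that for these ideal patterns the second sum rule holds pointwise across the whole diffraction pattern, as Born's rule demands.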
FIGURE 4. Diffraction patterns of the eight combinations of open and closed slits, including all slits closed ("0"), measured using
a He-Ne laser. The vertical axis is in units of 1000 photocounts.
FIGURE 5. Overnight measurement of κ. Each data point corresponds to approximately 5 min of total measurement time. A
slight drift of the mean is visible. The error bars are the size of the standard deviation of the individual κ values.
2.3. Results
In a null experiment like ours, where we try to prove the existence or absence of an effect, proper analysis of
possible sources of errors is of utmost importance. It is essential to have a good estimation both of the random errors
in experimental quantities and of potential sources of systematic errors. For each error mechanism we
calculate, within the framework of some accepted theory, how much of a deviation from the ideally expected value
it will cause. Drifts in time or with repetition can often be corrected by better stabilization of the apparatus, but any
errors that do not change in time can only be characterized by additional measurements.
We have measured κ for various detector points using a He-Ne laser. Initially, the value of κ showed strong variations
with time; this was solved by having better temperature control in the lab and also by enclosing the set-up in a
black box so that it is not affected by stray photons in the lab. Fig. 5 shows a recent overnight run in which κ was
measured around a hundred times at a position near the center of the diffraction pattern. Only a slight drift in the
mean can be discerned. The typical value of κ is in the range of 10^-2 ± 10^-3. The random error is the standard error
of the mean of κ, and obviously it is too small to explain the deviation of the mean of κ from zero. Next we analyze
some systematic errors which may affect our experiment, to see if these can be big enough to explain the deviation of
κ from the zero expected from Born's rule.
By virtue of the definition of the measured quantity ε (or its normalized variant κ), some potential sources of errors do
not play a role. For example, it is unimportant whether the three slits in the aperture have the same size, shape,
open transmission, or closed light leakage. However, in the current set-up we are measuring the eight different
combinations of open and closed slits using a mechanism that does not block individual slits, but instead changes
a global unblocking mask. Also, the measurements of the different combinations occur sequentially, which makes
the experiment prone to the effects of fluctuations and drifts. In the following we will analyze the effects of three
systematic error mechanisms: power drifts or uneven mask transmission, spurious mask transmission combined with
misalignment, and detector nonlinearities.
The power of a light source is never perfectly stable and the fact that we measure the eight individual combinations at
different times leads to a difference in the total energy received by a certain aperture combination over the time interval
it is being measured for. Since in practice we don't know how the power will change, and because we may choose a
random order of our measurements, we can effectively convert this systematic drift into a random error. Conversely,
if in the experiment we found that the power was indeed drifting slowly in one direction, then randomization of the
measurement sequence would mitigate a non-zero mean.
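This randomization argument can be illustrated with a small simulation (entirely ours; the ideal powers and the 1% linear drift are made-up numbers, chosen so that the true ε is zero):

```python
import numpy as np

rng = np.random.default_rng(2)
# Illustrative ideal powers P_ABC, P_AB, P_BC, P_CA, P_A, P_B, P_C, P_0.
true_P = np.array([9.0, 4.0, 4.0, 4.0, 1.0, 1.0, 1.0, 0.0])
signs = np.array([1, -1, -1, -1, 1, 1, 1, -1])     # coefficients in eps
drift = 1 + 0.01 * np.arange(8) / 7                # 1% drift across the 8 time slots

# Fixed measurement order: the drift leaks into eps as a systematic bias.
eps_fixed = signs @ (true_P * drift)

# Randomized order: the bias averages away, leaving only random scatter.
eps_rand = []
for _ in range(2000):
    order = rng.permutation(8)                     # combination measured in slot t
    measured = np.empty(8)
    measured[order] = true_P[order] * drift        # slot t contributes drift[t]
    eps_rand.append(signs @ measured)
print(eps_fixed, np.mean(eps_rand), np.std(eps_rand))
```

With a fixed order the drift produces a reproducible offset in ε; with a randomized order the mean is consistent with zero, at the cost of an enlarged run-to-run scatter.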
Let us therefore assume a stationary mean power P and a constant level of fluctuations ΔP around that power, for an
averaging time that is equal to the time we take to measure one of the eight combinations. Let the relative fluctuation be

p = ΔP/P. (10)

Using Gaussian error propagation, the fluctuation Δκ of κ, whose quantum theoretical mean is zero, is then given by

Δκ = (p/δ) [ P_ABC^2 + (1 + s_BC)^2 P_BC^2 + (1 + s_AC)^2 P_AC^2 + (1 + s_AB)^2 P_AB^2
+ (1 + (s_BC + s_AC))^2 P_C^2 + (1 + (s_BC + s_AB))^2 P_B^2 + (1 + (s_AC + s_AB))^2 P_A^2
+ (1 + (s_BC + s_AC + s_AB))^2 P_0^2 ]^(1/2), (11)

where the quantities s are the signs of the binary interference terms that appear in δ, e.g. s_AB = sign(I_AB). Fig. 6
shows a plot of Δκ/p as a function of the position in the diffraction pattern. The curve has divergences wherever δ
has a zero. These are the only points that have to be avoided. Otherwise, the relative power stability of the source
translates with factors close to unity into the relative error of κ.
Obviously, Eq. (11) is also exactly the formula for the propagation of independent random errors of any origin in the
measurements, if they are all of the same relative magnitude. However, if we use a photon counting technique, then
the random error of each measurement follows from the Poissonian distribution of the photocounts. In this case, the
(relative) random error of P_x is proportional to 1/√P_x, where x is any of the eight combinations. As a consequence,
the random error of κ will be proportional to the same expression with all the P_x^2 replaced by P_x.
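A sketch (ours, with hypothetical count numbers, and simplified to unit coefficients on the terms of ε rather than the full sign-dependent expression) of how Poissonian counting errors propagate into κ:

```python
import numpy as np

# Hypothetical photocounts for the eight combinations (NOT measured values).
N = {'ABC': 720000, 'AB': 320000, 'BC': 310000, 'CA': 330000,
     'A': 90000, 'B': 95000, 'C': 88000, '0': 500}

eps = N['ABC'] - N['AB'] - N['BC'] - N['CA'] + N['A'] + N['B'] + N['C'] - N['0']
delta = sum(abs(N[p] - N[p[0]] - N[p[1]] + N['0']) for p in ('AB', 'BC', 'CA'))

# Poisson statistics: Var(N_x) = N_x, so in the quadrature sum each squared
# power is replaced by the count itself.
d_eps = np.sqrt(sum(N.values()))
print(eps / delta, d_eps / delta)      # kappa and its shot-noise-limited error
```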
While it appears that drifting or fluctuating power can be mitigated, a worse problem is that in our realization of the
unblocking of slits every pattern could potentially have slightly different transmission. Possible reasons for this are
dirt, or incomplete etching of the metal layer, or inhomogeneities in the glass substrate or the antireflection coating
layers. In order to avoid any of these detrimental possibilities the next implementation of the slits will be air slits in a
steel membrane.
As a second source of systematic errors we have identified the unwanted transmission of supposedly opaque parts
of the slit and blocking masks. This by itself would not cause a non-zero κ, but combined with small errors in the
FIGURE 6. Fluctuation Δκ/p caused by fluctuating source power p (solid line). The horizontal axis is the spatial coordinate
in the far field of the three slits. The dotted line shows a scaled three-slit diffraction pattern as a position reference.
FIGURE 7. Value of κ in the diffraction pattern of three slits for the following set of parameters: 30 µm slit size, 100 µm slit
separation, 800 nm wavelength, 100 µm opening size, 5% unwanted mask transmission, and a set of displacements of the blocking
mask uniformly chosen at random from the interval [0, 10 µm].
alignment of the blocking mask, it will yield aperture transmission functions that are not simply always the same open
and closed slits: in every one of the eight combinations we have a particular aperture transmission function. If the
slits were openings in a perfectly opaque mask there would be no effect, since they are not being moved between the
measurements of different combinations. In practice, we found that all our earlier masks had a few percent of unwanted
transmission, as opposed to the current one, which has an unwanted transmission smaller than 0.1%. Fig. 7 shows the
results of a simulation assuming the parameters of the current mask, which seems to be good enough to avoid this kind
of systematic error at the current level of precision.
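The mechanism can be reproduced in a toy 1D Fraunhofer simulation (our own sketch, not the authors' code; we arbitrarily put the 5% intensity leakage on the slit plate and treat the blocking mask as perfectly opaque, and all parameters follow the Figure 7 caption):

```python
import numpy as np

rng = np.random.default_rng(3)
wl = 800e-9
k = 2 * np.pi / wl
slit_w, sep, open_w = 30e-6, 100e-6, 100e-6     # Figure 7 parameters
t_leak = np.sqrt(0.05)                          # 5% intensity leak -> amplitude

x = np.linspace(-400e-6, 400e-6, 8001)
dx = x[1] - x[0]
centers = {'A': -sep, 'B': 0.0, 'C': sep}

slit_plate = np.full_like(x, t_leak)            # leaky "opaque" regions
for c in centers.values():
    slit_plate[np.abs(x - c) < slit_w / 2] = 1.0

def power(open_slits, shift, theta):
    mask = np.zeros_like(x)                     # blocking mask: fully opaque
    for s in open_slits:
        mask[np.abs(x - centers[s] - shift) < open_w / 2] = 1.0
    amp = np.sum(slit_plate * mask * np.exp(1j * k * x * np.sin(theta))) * dx
    return abs(amp) ** 2

theta = 0.004                                   # a fixed detector position
P = {c: power(c, rng.uniform(0, 10e-6), theta)  # random shift per combination
     for c in ['ABC', 'AB', 'BC', 'CA', 'A', 'B', 'C', '']}
eps = P['ABC'] - P['AB'] - P['BC'] - P['CA'] + P['A'] + P['B'] + P['C'] - P['']
delta = sum(abs(P[p] - P[p[0]] - P[p[1]] + P['']) for p in ('AB', 'BC', 'CA'))
print(eps / delta)    # nonzero kappa induced purely by leakage + misalignment
```

With the leakage set to zero, the mask shifts have no effect and κ vanishes, which is the point made in the text: it is the combination of spurious transmission and misalignment that mimics a sum-rule violation.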
Finally, there is a source of systematic error which is intrinsically linked to the actual objective of this measurement.
We set out to check the validity of Born’s rule, that probabilities are given by absolute squares of amplitudes. Yet,
any real detector will have some nonlinearity. In a counting measurement the effect of dead-time will limit the
linearity severely, even at relatively low average count rates. A typical specification for an optical power meter is
0.5% nonlinearity within a given measurement range. The measurement of all eight combinations involves a large
dynamic range. From the background intensity to the maximum with all three slits open, this could be as much as
six orders of magnitude. Fig. 8 shows that 1% nonlinearity translates into a non-zero value of κ of up to 0.007. For
the measurements shown above the mean count rate was about 80,000 counts per second. Given a specified dead time
of our detector of 50 ns, we expect the deviation from linearity to be about 0.4%, with a correspondingly non-zero
apparent value of κ.
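The 0.4% figure quoted above is consistent with a simple non-paralyzable dead-time model (the detector model is our assumption):

```python
tau = 50e-9      # specified detector dead time (s)
r = 80e3         # mean count rate (counts per second)

# Non-paralyzable detector: measured rate m = r / (1 + r * tau),
# so the fractional departure from linearity is r*tau / (1 + r*tau).
measured = r / (1 + r * tau)
deviation = 1 - measured / r
print(deviation)                      # ~0.004, i.e. about 0.4%
```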
All of these systematics are potential contributors to a non-zero mean κ. From the above calculations, our efforts
to stabilize the incident power, and the improvements in the mask properties, we conclude that while detector nonlinearities
FIGURE 8. Value of κ in the diffraction pattern of three slits for a 0.5% nonlinear detector, where the ratio between the maximum
detected power and the minimum detected power is 100.
may have contributed something, the main source of systematic error must be the inhomogeneities in the unblocking
mask. Hopefully air slits will bring a significant improvement.
In this experiment, we have attempted to test Born's rule for probabilities. This is a null experiment, but due to
experimental inaccuracies we have measured a value of κ which is about 10^-2 ± 10^-3. We have analyzed some
major sources of systematic errors that could affect our experiments and we will try to reduce their influence in future
implementations. Further, we plan to replace the laser source by a heralded single photon source [6]. This will
ensure the particle nature of light both during emission and detection and give us the advantage that we can count
the exact number of particles entering the experiment. At this point we don't know of any other experiment that has
tried to test Born’s rule using three-path interference, therefore we cannot judge how well we are doing. However,
our collaborators [7] are undertaking an interferometric experiment using neutrons, which will perform the test in a
completely different system. These two approaches are complementary and help us in our quest to estimate the extent
of the validity of the Born interpretation of the wavefunction.
Research at IQC and Perimeter Institute was funded in part by the Government of Canada through NSERC and by
the Province of Ontario through MRI. Research at IQC was also funded in part by CIFAR. This research was partly
supported by NSF grant PHY-0404646. U.S. thanks Aninda Sinha for useful discussions.
1. M. Born, Zeitschrift für Physik 37, 863–867 (1926).
2. E. Schrödinger, Phys. Rev. 28, 1049–1070 (1926).
3. R. D. Sorkin, Modern Physics Letters A 9, 3119–3127 (1994), arXiv:gr-qc/9401003.
4. R. D. Sorkin, "Quantum Measure Theory and its Interpretation," in Quantum Classical Correspondence: Proceedings of the
4th Drexel Symposium on Quantum Nonintegrability, edited by D. Feng and B.-L. Hu, International Press, Cambridge, MA,
1997, pp. 229–251, arXiv:gr-qc/9507057v2.
5. G. Weihs, M. Reck, H. Weinfurter, and A. Zeilinger, Optics Lett. 21, 302–304 (1996).
6. E. Bocquillon, C. Couteau, M. Razavi, R. Laflamme, and G. Weihs, Coherence measures for heralded single-photon sources
(2008), arXiv:0807.1725.
7. D. G. Cory (2008), private communication.
... Its purposeful experimental testing, however, came into implementation relatively recently. The pioneering works of Sinha et al. [3,4] have demonstrated, in a three-slit interference laser experiment, 1 the null effect within an accuracy of 10 −2 ± 10 −3 . ...
... The rudimentary physics at the moment is just the click collections. Because of this, the rule (5.13) does not require-it should also be emphasized-any physical terminology: dynamics, evolution (Schrödinger's) equation, measuring process, apparatus, degrees of freedom, particles, Hamiltonians, interactions, environment, etc. Nor does the derivation address a density matrixmixture of |α 's-and such concepts as space/time/locality/causality (as in the EPR-controversy, say), (non)relativity, (non)inertial reference frames and gravity 4 and debatable [12,20,33,34] notions like collapses, information, the 'world(s)/mind(s)', the MWI-bifurcations of the universe [8,12], (classical/objective) reality or a subjective/anthropic [19, pp. 155-65] category of the rational belief/preference [12]. ...
... Subsequent actions, including the non-operatorial reading of the device non-commutativity A = B, do not then require any postulations and have been described in the previous section. The sequence (4.9) · · · (4.10) and point (4) in the theorem remain in force. ...
We deduce the Born rule from a purely statistical take on quantum theory within minimalistic math-setup. No use is required of quantum postulates. One exploits only rudimentary quantum mathematics—a linear, not Hilbert’, vector space—and empirical notion of the Statistical Length of a state. Its statistical nature comes from the lab micro-events (detector-clicks) being formalized into the C -coefficients of quantum superpositions. We also comment that not only has the use not been made of quantum axioms (scalar-product, operators, interpretations , etc.), but that the involving thereof would be, in a sense, inconsistent when deriving the rule. In point of fact, the quadratic character of the statistical length, and even not (the ‘physics’ of) Born’s formula, represents a first step in constructing the mathematical structure we name the Hilbert space of quantum states.
... Its purposeful experimental testing, however, came into implementation relatively recently. The pioneering works of U. Sinha et al [20,21] have demonstrated, in a 3-slit interference laser experiment, the null-effect within an accuracy 10 −2 ± 10 −3 . The rule is considered as one of the cornerstone of the theory, although many researchers have long pointed out [2,6,8,12,18,19,[24][25][26], and it seems to be a majority opinion, that this Born formula is not a fundamental 'mantra' and can be derived from other tenets of quantum mechanics (QM). ...
... That said, equality (21) should be supplemented with (22) and obeyed under all a's. Simplifying notation (a 1 , a 2 ) (x, y), we require (xx) p + (yy) p = (ax + by) p (ax + by) p + (cx + dy) p (cx + dy) p for all (x, x, y, y), which are understood to be independent variables. ...
We deduce the Born rule. No use is required of quantum postulates. One exploits only rudimentary quantum mathematics -- a linear, not Hilbert's, vector space -- and empirical notion of the statistical length of a state. Its statistical nature comes from experimental micro-events: the abstract quantum clicks.
... Interference and coherence effects are some of the most useful measures in studying quantum mechanical effects. In this work, we will investigate the contributions of nonclassical paths [4][5][6][7] in the precise measurement of interference effects. ...
In the the double-slit experiment, non-classical paths are Feynman paths that go through both slits. Prior work with atom cavities as which-way detectors in the double-slit experiment, has shown these paths to be experimentally inaccessible. In this paper, we show how such a setup can indeed detect non-classical paths with 1\% probability, if one considers a different type of non-classical path than previously investigated. We also show how this setup can be used to erase and restore the coherence of the non-classical paths. Finally, we also show how atom cavities may be used to implement a Born-rule violation measure (the Quach parameter), which up until now has only been a formal construct.
... The validity of the quantum approach to physics was confirmed by thousands of successful experiments and technological inventions. However, one can still have doubts that the whole body of micro-physics can be covered by quantum formalism (e.g., experiments of the group of Weihs [62]). In psychology, the situation is worse. ...
Full-text available
Recently, quantum formalism started to be actively used outside of quantum physics: in psychology, decision-making, economics, finances, and social science. Human psychological behavior is characterized by a few basic effects; one of them is the question order effect (QOE). This effect was successfully modeled (Busemeyer–Wang) by representing questions A and B by Hermitian observables and mental-state transformations (back action of answering) by orthogonal projectors. However, then it was demonstrated that such representation cannot be combined with another psychological effect, known as the response replicability effect (RRE). Later, this no-go result was generalized to representation of questions and state transformations by quantum instruments of the atomic type. In light of these results, the possibility of using quantum formalism in psychology was questioned. In this paper, we show that, nevertheless, the combination of the QOE and RRE can be modeled within quantum formalism, in the framework of theory of non-atomic quantum instruments.
... 8 It has been shown recently though that the tripartite interference condition I 3 = 0 holds in quantum mechanics by using a 3-slit experiment [12,13]. 9 Without entering such discussion here, Sorkin has proposed an interpretation of the number µ(A), assigned to an event A by the quantum measure µ, in terms of the notion of 'preclusion' instead of the of notion of 'expectation' [14]. ...
Full-text available
Schwinger’s algebra of selective measurements has a natural interpretation in the formalism of groupoids. Its kinematical foundations, as well as the structure of the algebra of observables of the theory, were presented in [F. M. Ciaglia, A. Ibort and G. Marmo, Schwinger’s picture of quantum mechanics I: Groupoids, Int. J. Geom. Meth. Mod. Phys. (2019), arXiv:1905.12274 [math-ph], doi:10.1142/S0219887819501196. F. M. Ciaglia, A. Ibort and G. Marmo, Schwinger’s picture of quantum mechanics II: Algebras and observables, Int. J. Geom. Meth. Mod. Phys. (2019), doi:10.1142/ S0219887819501366]. In this paper, a closer look to the statistical interpretation of the theory is taken and it is found that an interpretation in terms of Sorkin’s quantum measure emerges naturally. It is proven that a suitable class of states of the algebra of virtual transitions of the theory allows to define quantum measures by means of the corresponding decoherence functionals. Quantum measures satisfying a reproducing property are described and a class of states, called factorizable states, possessing the Dirac–Feynman “exponential of the action” form are characterized. Finally, Schwinger’s transformation functions are interpreted similarly as transition amplitudes defined by suitable states. The simple examples of the qubit and the double slit experiment are described in detail, illustrating the main aspects of the theory.
... Since the quantum formalism admits only linear vector spaces, higher-order interference effects, which can exist in generalized probabilistic theories [66–70], are simply ruled out for a single quantum in YDS-like experiments (see Eq. ...
The main ideas of the wave-particle non-dualistic interpretation of quantum mechanics are elucidated using two well-known examples, viz., (i) a spin-1/2 system in the Stern-Gerlach experiment and (ii) Young's double-slit experiment, representing the cases of observables with discrete and continuous eigenvalues, respectively. It is proved that only Born's rule can arise from the quantum formalism as a limiting case of the relative frequency of detection. Finally, non-duality is used to unambiguously explain the Hanbury Brown–Twiss effect, at the level of individual quanta, for two-particle coincidence detection.
A well-known result for the interference of two single-mode fields is that the degree of coherence and the degree of indistinguishability are the same when we consider the detection of a single photon. In this article, we present the relation between the degree of coherence, path indistinguishability, and fringe visibility for the interference of an arbitrary number of single-mode fields, again restricting attention to the detection of a single photon. We also discuss how Born’s rule of interference for multiple sources is reflected in these results.
In explanations of the double-slit experiment it is usually assumed that superposing the waves diffracted by each slit separately (one open, the other closed) gives the same result as the diffraction when both slits are open at the same time. This naïve use of the superposition principle is generally not valid in either classical electromagnetism or quantum mechanics. As we will see, the diffraction pattern of one of the slits is altered when the other is open, and the closer the slits, the greater the interaction between them. In this work we analyse the interaction between different types of sources (antennas, loudspeakers and, of course, slits) and show how this interaction explains the apparent anomalies in energy conservation, as well as recent Young-interference results showing that the total diffracted intensity is reduced or increased depending on the distance between the slits. We also comment briefly on the contributions of our work to the supposed looped trajectories of energy in interference phenomena.
Leggett-Garg (LG) tests for macrorealism were originally designed to explore quantum coherence on the macroscopic scale. Interference experiments and systems modeled by harmonic oscillators provide useful examples of situations in which macroscopicity has been approached experimentally, and they are readily turned into LG tests for a single dichotomic variable Q. Applying this approach to the double-slit experiment with a noninvasive measurement at the slits included, we exhibit LG violations. We find that these violations are always accompanied by destructive interference. The converse is not true in general, and we find that there are nontrivial regimes in which there is destructive interference but the two-time LG inequalities are satisfied, which implies that it is in fact often possible to assign (indirectly determined) probabilities for the interferometer paths. Similar features have been observed in recent work involving an LG analysis of a Mach-Zehnder interferometer, and we compare with those results. We extend the analysis to the triple-slit experiment, again finding LG violations, and we also exhibit examples of some surprising relationships between LG inequalities and no-signaling-in-time (NSIT) conditions that do not exist for dichotomic variables. For the simple harmonic oscillator, we find an analytically tractable example showing a two-time LG violation with a Gaussian initial state, echoing in simpler form recent results of Bose et al. [Phys. Rev. Lett. 120, 210402 (2018)].
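For readers unfamiliar with LG tests, the standard textbook illustration, a spin-1/2 precessing at angular frequency ω with the dichotomic variable Q = σ_z measured at three equally spaced times (not the interferometric setting of the abstract above), already exhibits a violation. A minimal sketch:

```python
# Two-time correlators for a precessing spin-1/2: C(t_i, t_j) = cos(omega * (t_j - t_i)).
# Under macrorealism the combination K3 = C12 + C23 - C13 obeys K3 <= 1;
# quantum mechanics reaches K3 = 3/2 at omega * tau = pi / 3.
import math

def K3(omega_tau):
    C12 = C23 = math.cos(omega_tau)   # correlator across one time step
    C13 = math.cos(2 * omega_tau)     # correlator across two time steps
    return C12 + C23 - C13

K3_max = K3(math.pi / 3)  # exceeds the macrorealist bound of 1
```

The same three-term structure underlies the double- and triple-slit LG analyses; only the correlators change.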
We report the realization of a three-path Mach-Zehnder interferometer using single-mode fibers and two integrated 3×3 fiber couplers. We observed enhanced phase sensitivity, as compared with two-path interferometers, with a visibility of the interference pattern of more than 97%. This interferometer has an analog in two-photon interferometry, and we believe it to be the first nontrivial example of N×N multiport interferometers.
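A toy model of such a three-path interferometer can be built by idealizing each symmetric 3×3 coupler as the discrete-Fourier-transform unitary (a common idealization; a real fibre coupler will deviate from it), with a tunable phase on the middle path:

```python
# Ideal three-path Mach-Zehnder: tritter -> phase on path 1 -> tritter.
# The 3x3 DFT matrix U[j][k] = w^(jk) / sqrt(3), w = exp(2*pi*i/3), is the
# assumed ideal lossless symmetric coupler, not the actual device of the paper.
import cmath

N = 3
w = cmath.exp(2j * cmath.pi / N)
U = [[w ** (j * k) / N ** 0.5 for k in range(N)] for j in range(N)]

def matvec(M, v):
    return [sum(M[j][k] * v[k] for k in range(N)) for j in range(N)]

def output_intensities(phi):
    v = matvec(U, [1, 0, 0])                      # first coupler, input port 0
    v = [v[0], v[1] * cmath.exp(1j * phi), v[2]]  # phase shift on middle path
    v = matvec(U, v)                              # second coupler
    return [abs(a) ** 2 for a in v]               # detector probabilities
```

At zero phase all the light exits one port; scanning phi redistributes it among the three outputs, and the total is conserved because both couplers are unitary.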
Single-photon sources (SPSs) are mainly characterized by the minimum value of their second-order coherence function, viz. their $g^{(2)}$ function. A precise measurement of $g^{(2)}$ may, however, require high-time-resolution devices, in whose absence only time-averaged measurements are accessible. These time-averaged measures, standing alone, do not carry sufficient information for a proper characterization of SPSs. Here, we develop a theory, corroborated by an experiment, that allows us to scrutinize the coherence properties of heralded SPSs that rely on continuous-wave parametric down-conversion. Our proposed measures and analysis enable proper standardization of such SPSs.
We propose a realistic, spacetime interpretation of quantum theory in which reality constitutes a *single* history obeying a "law of motion" that makes definite, but incomplete, predictions about its behavior. We associate a "quantum measure" |S| to the set S of histories, and point out that |S| fulfills a sum rule generalizing that of classical probability theory. We interpret |S| as a "propensity", making this precise by stating a criterion for |S|=0 to imply "preclusion" (meaning that the true history will not lie in S). The criterion involves triads of correlated events, and in application to electron-electron scattering, for example, it yields definite predictions about the electron trajectories themselves, independently of any measuring devices which might or might not be present. (So we can give an objective account of measurements.) Two unfinished aspects of the interpretation involve *conditional* preclusion (which apparently requires a notion of coarse-graining for its formulation) and the need to "locate spacetime regions in advance" without the aid of a fixed background metric (which can be achieved in the context of conditional preclusion via a construction that makes sense both in continuum gravity and in the discrete setting of causal set theory).
The additivity of classical probabilities is only the first in a hierarchy of possible sum rules, each of which implies its successor. The first and most restrictive sum rule of the hierarchy yields measure theory in the Kolmogorov sense, which physically is appropriate for the description of stochastic processes such as Brownian motion. The next weaker sum rule defines a generalized measure theory which includes quantum mechanics as a special case. The fact that quantum probabilities can be expressed "as the squares of quantum amplitudes" is thus derived in a natural manner, and a series of natural generalizations of the quantum formalism is delineated. Conversely, the mathematical sense in which classical physics is a special case of quantum physics is clarified. The present paper presents these relationships in the context of a "realistic" interpretation of quantum mechanics.
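The hierarchy of sum rules described in this abstract can be probed numerically via the inclusion-exclusion form of the k-th interference term. A sketch with made-up amplitudes, showing that a Born-rule (quadratic) measure has I2 ≠ 0 but I3 = I4 = 0, consistent with each sum rule implying its successor:

```python
# Sorkin's k-th interference term as an alternating inclusion-exclusion sum
# over non-empty subsets of k mutually exclusive alternatives. The amplitudes
# are arbitrary illustrative values.
from itertools import combinations

amps = [0.5 + 0.1j, -0.2 + 0.4j, 0.3 - 0.3j, 0.1 + 0.2j]

def mu(subset):
    """Quantum measure of a union of alternatives: |sum of amplitudes|^2."""
    return abs(sum(amps[i] for i in subset)) ** 2

def I(k):
    """k-th interference term for the first k alternatives."""
    total = 0.0
    for r in range(1, k + 1):
        for S in combinations(range(k), r):
            total += (-1) ** (k - r) * mu(S)
    return total
```

For k = 2 this reproduces the familiar two-slit term mu({0,1}) - mu({0}) - mu({1}); for any quadratic measure, every term with k >= 3 vanishes identically, which is the sense in which quantum mechanics sits exactly one level below classical measure theory in the hierarchy.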
E. Schrödinger, Phys. Rev. 28, 1049–1070 (1926).