## No full-text available

To read the full-text of this research, you can request a copy directly from the author.

... The objective of this paper is to explore the origin and function of the phase of wavefunctions in the solutions of Schrödinger's equation, building on and clarifying some previous work [1,2]. As in emergent quantum mechanics [3], the goal is to find an underpinning of Schrödinger's equation, rather than an interpretation. ...

... where a normalization constant α has been inserted. The constant is necessary to allow the filtered paths in the continuum limit to survive diffusive scaling. If we keep α = 1, the parity-filtered ensemble is dominated by the full ensemble of diffusive paths, and the effect of the filtered paths in relation to all diffusive paths will be lost in the continuum limit. ...

By associating a binary signal with the relativistic worldline of a particle, a binary form of the phase of non-relativistic wavefunctions is naturally produced by time dilation. An analog of superposition also appears as a Lorentz filtering process, removing paths that are relativistically inequivalent. In a model that includes a stochastic component, the free-particle Schrödinger equation emerges from a completely relativistic context in which its origin and function are known. The result establishes that the phase of wavefunctions in Schrödinger's equation and the attendant superposition principle may both be considered remnants of time dilation. This strongly argues that quantum mechanics has its origins in special relativity.

... is a distribution, meaning that the first term on the right-hand side of (15) requires that the integrations in k and k′ be interpreted as Cauchy principal value integrals. It is possible to calculate τ_{νn}(z) in the limit of large z provided that the initial state g_{mk} is regular enough as a function of k; see Appendix B for mathematical details. ...

... is written as the sum of three terms: the first term, τ^{P}_{nm}(z), originates from the Cauchy principal value integration based on the first term in (15); the other two terms, τ^{δ±}_{nm}(z), come from the integration of δ^{(n)}(ω_{mk′ε} − ω_{mkε}), giving contributions at k = ±k′, as indicated by the superscript δ±. ...

The paper explores the fundamental physical principles of quantum mechanics (in fact, quantum field theory) that limit the bit rate over long distances, and examines the assumption, used in this exploration, that losses can be ignored. Propagation of photons in optical fibers is modelled using methods of quantum electrodynamics. We define the "photon duration" as the standard deviation of the photon arrival time; we find its asymptotics for long distances and then obtain the main result of the paper: the linear dependence of photon duration on distance when losses can be ignored. This effect limits the joint increase of the photon flux and the distance from the source, and it has consequences for quantum communication. Once quantum communication develops into a real technology (including an essential decrease of losses in optical fibres), it will be appealing to engineers to increase both the photon flux and the distance, and here our "photon flux/distance effect" has to be taken into account. The effect may also set an additional constraint on the performance of a loophole-free Bell-type test that jointly closes the detection and locality loopholes.

... This problem also has a foundational dimension, playing a crucial role in the performance of a loophole-free test of Bell-type [2] inequalities, see [3]-[8]. Such a test should finally close off all possibilities of interpreting quantum mechanics as emergent from a local realistic model (although see, e.g., [9]-[18] for discussions, cf., e.g., [19]). It is clear that without a test which is free from every loophole, the present foundational grounds of quantum mechanics can be questioned. And it is not only the foundations that can be questioned, but even the most successful quantum technologies, such as quantum cryptography and quantum random-number generators. ...

... Now, (17) is introduced into (20). The integration in (18) related to the second term in (20) can be performed using the definition of the δ^{(n)} distribution, whereas for the first term an asymptotic analysis for large z, based on the standard Laplace transform ...

The paper explores the fundamental physical principles of quantum mechanics (in fact, quantum field theory) that limit the bit rate over long distances. Propagation of photons in optical fibers is modeled using methods of quantum electrodynamics. We define the photon "duration" as the standard deviation of the photon arrival time; we find its asymptotics for long distances and then obtain the main result of the paper: the linear dependence of photon duration on distance. This effect limits the joint increase of the photon flux and the distance from the source, and it has important consequences both for quantum information technologies and for quantum foundations. Once quantum communication develops into a real technology, it will be appealing to engineers to increase both the photon flux and the distance, and here our "photon flux/distance effect" has to be taken into account (at least if successively emitted photons are considered independent). The effect also has to be taken into account in a loophole-free Bell-type test that jointly closes the detection and locality loopholes.
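The linear growth of photon duration with distance has a familiar classical counterpart in group-velocity-dispersion pulse broadening, where the RMS width of a Gaussian pulse also grows asymptotically linearly with propagation distance. Below is a minimal sketch of that classical analogue only; the formula and the fibre parameters are standard textbook values, not quantities taken from the paper's QED calculation.

```python
import math

def rms_duration(T0, beta2, z):
    """RMS duration of an initially transform-limited Gaussian pulse
    after propagating a distance z in a medium with group-velocity
    dispersion beta2. Standard dispersive-broadening formula; for
    large z the width grows ~ |beta2| * z / T0, i.e. linearly in z."""
    return T0 * math.sqrt(1.0 + (beta2 * z / T0**2) ** 2)

# Illustrative numbers (assumed, typical for standard fibre at 1550 nm):
T0 = 10e-12          # 10 ps initial pulse width
beta2 = -21.7e-27    # GVD parameter in s^2/m

for z in (1e3, 1e4, 1e5, 1e6):   # 1 km .. 1000 km
    print(f"z = {z:8.0f} m   duration = {rms_duration(T0, beta2, z):.3e} s")
```

For the distances shown, doubling z essentially doubles the duration, which is the flux/distance trade-off the abstract describes, here in its classical-dispersion guise.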

... On the contrary, as we have already pointed out in previous papers, it is a more detailed model, inspired by the bouncer/walker experiments, that can show the fertility of said analogy. It enables us to show that our model, being of the type of an "emergent quantum mechanics" [13][14], can provide a deeper-level explanation of the dBB version of quantum mechanics (Chapter 2). ...

Elements of a "deeper-level" explanation of the de Broglie-Bohm (dBB) version of quantum mechanics are presented. Our explanation is based on an analogy of quantum wave-particle duality with bouncing droplets in an oscillating medium, the latter being identified as the vacuum's zero-point field. A hydrodynamic analogy of a similar type has recently come under criticism by Richardson et al., because, despite striking similarities at a phenomenological level, the governing equations related to the force on the particle are evidently different for the hydrodynamic and the quantum descriptions, respectively. However, these differences are not relevant if a radically different use of the analogy is made, one referring essentially to emergent processes in our model. If the latter are taken into account, one can show that the forces on the particles are identical in both the dBB and our model. In particular, this identity results from an exact matching of our emergent velocity field with the Bohmian "guiding equation". One thus arrives at an explanation involving a deeper, i.e. subquantum, level of the dBB version of quantum mechanics. We show in particular how the classically-local approach of the usual hydrodynamical modeling can be overcome and how, as a consequence, the configuration-space version of dBB theory for $N$ particles can be completely substituted by a "superclassical" emergent dynamics of $N$ particles in real 3-dimensional space.

... Our research program thus pertains to the scope of theories on "Emergent Quantum Mechanics". (For the proceedings of a first international conference exclusively devoted to this topic, see Grössing (2012) [13]. For their original models, see in particular the papers by Adler, Elze, Ord, Grössing et al., ...

A research program within the scope of theories on "Emergent Quantum
Mechanics" is presented, which has gained some momentum in recent years. Via
the modeling of a quantum system as a non-equilibrium steady-state maintained
by a permanent throughput of energy from the zero-point vacuum, the quantum is
considered as an emergent system. We implement a specific "bouncer-walker"
model in the context of an assumed sub-quantum statistical physics, in analogy
to the results of experiments by Couder's group on a classical wave-particle
duality. We can thus give an explanation of various quantum mechanical features
and results on the basis of a "21st century classical physics", such as the
appearance of Planck's constant, the Schrödinger equation, etc. An essential
result is given by the proof that averaged particle trajectories' behaviors
correspond to a specific type of anomalous diffusion termed "ballistic"
diffusion on a sub-quantum level. It is further demonstrated both analytically
and with the aid of computer simulations that our model provides explanations
for various quantum effects such as double-slit or n-slit interference. We show
the averaged trajectories emerging from our model to be identical to Bohmian
trajectories, albeit without the need to invoke complex wave functions or any
other quantum mechanical tool. Finally, the model provides new insights into
the origins of entanglement, and, in particular, into the phenomenon of a
"systemic" nonlocality.

... Transmission of quantum information over long distances is one of the most important problems of theoretical and experimental research [1]. This problem also has a foundational dimension: the performance of a loophole-free test of Bell-type [2] inequalities, see [3]-[7], to close the century-long debate on the possibility of peacefully combining local realism with the quantum formalism, cf. [8]-[16]. In this paper we study spatial and temporal dependencies of detection probabilities for photons propagating in optical fibres. ...

The electromagnetic fields of a single optic-fibre mode are quantized based on the observation that these fields can be derived from a scalar harmonic-oscillator function depending only on time and the axial wavenumber. Asymptotic results for both the one-photon probability density and the two-photon correlation density functions within the forward light cone are presented, showing an algebraic decay for large times or distances. This algebraic decay, which increases the uncertainty in the arrival time of the photons, also remains in the presence of dispersion shift, in qualitative agreement with experimental results. Also presented are explicit formulae to be used in parameter studies to optimize quantum optic-fibre communications.

Quantization is derived as an emergent phenomenon, resulting from the permanent interaction between matter and the radiation field. The starting point for the derivation is the existence of the (continuous) random zero-point electromagnetic radiation field (zpf) of mean energy ℏω/2 per normal mode. A thermodynamic and statistical analysis leads unequivocally (and without quantum assumptions) to the Planck distribution law for the complete field in equilibrium. The problem of the quantization of matter is then approached from the same perspective: a detailed study of the dynamics of a particle embedded in the zpf shows that when the entire system eventually reaches a situation of energy balance, thanks to the combined effect of diffusion and dissipation, the particle has acquired its characteristic quantum properties. To obtain the quantum-mechanical description it has been necessary to perform a partial averaging and take the radiationless approximation. Consideration of the neglected radiative terms allows us to establish contact with nonrelativistic quantum electrodynamics and derive the correct formulas for the first-order radiative corrections. Quantum mechanics emerges, therefore, as a partial, approximate and time-asymptotic description of a phenomenon that in its original (pre-quantum) description is entirely local and causal.
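The equilibrium spectrum referred to above is the Planck law with the zero-point term included; in standard notation this is the well-known identity (stated here for orientation, not quoted from the paper):

```latex
U(\omega, T) \;=\; \frac{\hbar\omega}{2}\,
\coth\!\left(\frac{\hbar\omega}{2 k_B T}\right)
\;=\; \frac{\hbar\omega}{2} \;+\; \frac{\hbar\omega}{e^{\hbar\omega/k_B T}-1},
```

so that as T → 0 each normal mode retains the zero-point energy ℏω/2, which is precisely the zpf the derivation starts from.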

We explain how weak values and the local momentum can be better understood in terms of Bohm's notion of structure process. The basic ideas of this approach can be expressed in a fully algebraic way, generalising Heisenberg's original matrix mechanics. This approach leads to questions that are now being experimentally investigated by our group at University College London.

What defines an emergent quantum mechanics (EmQM)? Can new insight be advanced into the nature of quantum nonlocality by seeking new links between quantum and emergent phenomena as described by self-organization, complexity, or emergence theory? Could the development of a future EmQM lead to a unified, relational image of the cosmos? One key motivation for adopting the concept of emergence in relation to quantum theory concerns the persistent failure in standard physics to unify the two pillars in the foundations of physics: quantum theory and general relativity theory (GRT). The total contradiction in the foundational, metaphysical assumptions that define orthodox quantum theory versus GRT might render inter-theoretic unification impossible. On the one hand, indeterminism and non-causality define orthodox quantum mechanics; on the other hand, GRT is governed by causality and determinism. How could these two metaphysically-contradictory theories ever be reconciled? The present work argues that metaphysical contradiction necessarily implies physical contradiction. The contradictions are essentially responsible also for the measurement problem in quantum mechanics. A common foundation may be needed for overcoming the contradictions between the two foundational theories. The concept of emergence, and the development of an EmQM, might help advance a common foundation - physical and metaphysical - as required for successful inter-theory unification.

We provide support for the claim that momentum is conserved for individual events in the electron double-slit experiment. The natural consequence is that a physical mechanism is responsible for this momentum exchange; but while the fundamental mechanism is known for electron crystal diffraction and the Kapitza-Dirac effect, it is unknown for electron diffraction from nano-fabricated double slits. Work towards a proposed explanation in terms of particle trajectories affected by a vacuum field is discussed. The contentious use of trajectories is discussed within the context of oil-droplet analogues of double-slit diffraction.

In light of a recent reformulation of Bell's theorem from causal principles by Wiseman and the author, I argue that the conflict between quantum theory and relativity brought up by Bell's work can be softened by a revision of our classical notions of causation. I review some recent proposals for a quantum theory of causation that make great strides towards that end, but highlight a property that is shared by all those theories that would not have satisfied Bell's realist inclinations. They require (implicitly or explicitly) agent-centric notions such as "controllables" and "uncontrollables", or "observed" and "unobserved". Thus they relieve the tensions around Bell's theorem by highlighting an issue more often associated with another deep conceptual issue in quantum theory: the measurement problem. Rather than rejecting those terms, however, I argue that we should understand why they seem to be, at least at face-value, needed in order to reach compatibility between quantum theory and relativity. This seems to suggest that causation, and thus causal structure, are emergent phenomena, and lends support to the idea that a resolution of the conflict between quantum theory and relativity necessitates a solution to the measurement problem.

The de Broglie - Bohm pilot-wave theory - uniquely among realistic candidate quantum theories - allows a straightforward and simple definition of the wave function of a subsystem of some larger system (such as the entire universe). Such sub-system wave functions are called "Conditional Wave Functions" (CWFs). Here we explain this concept and indicate the CWF's role in the Bohmian explanation of the usual quantum formalism, and then develop (and motivate) the more speculative idea that something like single-particle wave functions could replace the (ontologically problematical) universal wave function in some future, empirically adequate, pilot-wave-type theory. Throughout, the presentation is pedagogical, and points are illustrated with simple toy models.

The concept of 'super-indeterminism' captures the notion that the free-choice assumption of orthodox quantum mechanics necessitates only the following requirement: an agent's free-choice performance in the selection of measurement settings must not represent an exception to the rule of irreducible quantum indeterminism in the physical universe (i.e., "universal indeterminism"). Any additional metaphysical speculation, such as whether quantum indeterminism, i.e., intrinsic randomness, implicates the reality of experimenter "freedom", "free will", or "free choice", is redundant in relation to the predictive success of orthodox quantum mechanics. Accordingly, super-indeterminism also views as redundant, from a technical standpoint, whether an affirmative or a negative answer is claimed in reference to universal indeterminism as a necessary precondition for experimenter freedom. Super-indeterminism accounts, for example, for the circular reasoning which is implicit in the free will theorem by Conway and Kochen [1,2]. The concept of super-indeterminism is of great assistance in clarifying the often misunderstood meaning of the concept of "free variables" as used by John Bell [3]. The present work argues that Bell sought an operational, effective free will theorem, one based upon the notion of "determinism without predetermination", i.e., one wherein "free variables" represent universally uncomputable variables. In conclusion, the standard interpretation of quantum theory does not answer, and does not need to answer in order to ensure the predictive success of orthodox theory, the question of whether either incompatibilism or compatibilism is valid in relation to free-will metaphysics and to the free-will phenomenology of experimenter agents in quantum mechanics.

It is demonstrated how quantum mechanics emerges from the stochastic dynamics of force carriers. It is shown that the quantum Moyal equation corresponds to certain dynamic correlations between the momentum of a real particle and the position of a virtual particle, which are not present in classical mechanics. The new concept throws light on the physical meaning of quantum theory, showing that the square of the Planck constant is a second-second cross-cumulant.

We call attention to the fact that recent unprecedented technological achievements, in particular in the field of quantum optics, seem to open the way to new experimental tests which might be relevant both for the foundational problems of quantum mechanics and for investigating perceptual processes.

I review the proposal made in my 2004 book [1], that quantum theory is an emergent theory arising from a deeper level of dynamics. The dynamics at this deeper level is taken to be an extension of classical dynamics to non-commuting matrix variables, with cyclic permutation inside a trace used as the basic calculational tool. With plausible assumptions, quantum theory is shown to emerge as the statistical thermodynamics of this underlying theory, with the canonical commutation-anticommutation relations derived from a generalized equipartition theorem. Brownian motion corrections to this thermodynamics are argued to lead to state vector reduction and to the probabilistic interpretation of quantum theory, making contact with phenomenological proposals [2, 3] for stochastic modifications to Schrödinger dynamics.

In 2005, Couder, Protière, Fort and Boudaoud showed that oil droplets bouncing on a vibrating tray of oil can display nonlocal interactions reminiscent of the particle-wave associations in quantum mechanics; in particular they can move, attract, repel and orbit each other. Subsequent experimental work by Couder, Fort, Protière, Eddi, Sultan, Moukhtar, Rossi, Moláček, Bush and Sbitnev has established that bouncing drops exhibit single-slit and double-slit diffraction, tunnelling, quantised energy levels, Anderson localisation and the creation/annihilation of droplet/bubble pairs.

In this paper we explain why. We show first that the surface waves guiding the droplets are Lorentz covariant with the characteristic speed c of the surface waves; second, that pairs of bouncing droplets experience an inverse-square force of attraction or repulsion according to their relative phase, and an analogue of the magnetic force; third, that bouncing droplets are governed by an analogue of Schrödinger's equation where Planck's constant is replaced by an appropriate constant of the motion; and fourth, that orbiting droplet pairs exhibit spin-half symmetry and align antisymmetrically as in the Pauli exclusion principle. Our analysis explains the similarities between bouncing-droplet experiments and the behaviour of quantum-mechanical particles. It also enables us to highlight some differences, and to predict some surprising phenomena that can be tested in feasible experiments.
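The second claim above (an inverse-square force whose sign depends on the droplets' relative phase) can be rendered as a one-line toy model. The function name and the coupling constant k below are placeholders for illustration, not quantities from the paper:

```python
import math

def droplet_force(k, r, dphi):
    """Toy model of the claimed interaction between two bouncing droplets:
    inverse-square magnitude, with the sign set by the relative bouncing
    phase dphi. Convention here: negative return value = attraction.
    k (coupling) and this exact form are assumptions, not from the paper."""
    return -k * math.cos(dphi) / r**2

print(droplet_force(1.0, 2.0, 0.0))      # in phase: attractive (negative)
print(droplet_force(1.0, 2.0, math.pi))  # out of phase: repulsive (positive)
```

Droplets bouncing in phase attract, those bouncing out of phase repel, and the magnitude falls off as 1/r², which is the qualitative behaviour the abstract reports.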

This is a progress report on a preliminary feasibility study of experimental setups for preparing and probing a gravitational cat state [1].

The Schrödinger equation for the particle wave function, introduced via its action, is derived from Newton's equation for a point-like particle moving under the combined effect of a potential force and a fluctuation-dissipation environment, provided only stable motion of the particle is considered. The model assumes that the wave function exists as a physical field rather than a mere mathematical abstraction.

Neutron interferometry provides a powerful tool to investigate particle and wave features in quantum physics. Single-particle interference phenomena can be observed with neutrons, and the entanglement of degrees of freedom, i.e., contextuality, can be verified and used in further experiments. Entanglement of two photons, or atoms, is analogous to double-slit diffraction of a single photon, neutron or atom. Neutrons are proper tools for testing quantum mechanics because they are massive, they couple to electromagnetic fields due to their magnetic moment, they are subject to all basic interactions, and they are sensitive to topological effects as well. The 4π-symmetry of spinor wave functions, the spin-superposition law and many topological phenomena can be made visible, thus showing interesting intrinsic features of quantum physics. Related experiments will be discussed. Deterministic and stochastic partial absorption experiments can be described by Bell-type inequalities. Neutron interferometry experiments based on post-selection methods have renewed the discussion about quantum non-locality and the quantum measuring process. It has been shown that interference phenomena can be revived even when the overall interference pattern has lost its contrast. This indicates a persisting coupling in phase space even in cases of spatially separated Schrödinger-cat-like situations. These states are extremely fragile and sensitive to any kind of fluctuations and other decoherence processes. More complete quantum experiments also show that a complete retrieval of quantum states behind an interaction volume becomes impossible in principle, but where and when a collapse of the wave field occurs depends on the level of the experiment.

Several recent experiments were devoted to walkers, structures that associate a droplet bouncing on a vibrated liquid with the surface waves it excites. They reveal that a form of wave-particle duality exists in this classical system, with the emergence of quantum-like behaviours. Here we revisit the single-particle diffraction experiment and show the coexistence of two waves. The measured probability distributions are ruled by the diffraction of a quantum-like probability wave. But the observation of a single walker reveals that the droplet is driven by a pilot wave of different spatial structure that determines its trajectory in real space. The existence of two waves of these types had been proposed by de Broglie in his "double solution" model of quantum mechanics. A difference with the latter is that the pilot wave is, in our experiment, endowed with a "path memory". When intrusive measurements are performed, this memory effect induces transient chaotic individual trajectories that generate the resulting statistical behaviour.

The modern concept of spacetime usually emerges from the consideration of moving clocks on the assumption that world-lines are continuous. In this paper we start with the assumption that natural clocks are digital and that events are discrete. By taking different continuum limits we show that the phase of non-relativistic quantum mechanics and the odd metric of spacetime both emerge from the consideration of discrete clocks in relative motion. From this perspective, the continuum limit that manifests itself in 'spacetime' is an infinite mass limit. The continuum limit that gives rise to the Schrödinger equation retains a finite mass as a beat frequency superimposed on the 'Zitterbewegung' at the Compton frequency. We illustrate this in a simple model in which a Poisson process drives a relativistic clock that gives rise to a Feynman path integral, where the phase is a manifestation of the twin paradox. The example shows that the non-Euclidean character of spacetime and the wave-particle duality of quantum mechanics share a common origin. They both emerge from the necessity that clocks age at rates that are path dependent.
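The path-dependent ageing invoked above (the twin paradox as the source of phase) can be made concrete with a toy calculation of proper time along piecewise-inertial worldlines. This is a standard special-relativity exercise, not code from the paper; working in units with c = 1 is an assumption of the sketch:

```python
import math

C = 1.0  # units where the speed of light c = 1 (assumed convention)

def proper_time(segments):
    """Proper time accumulated along a piecewise-inertial worldline.
    segments: list of (coordinate_duration, velocity) pairs; each segment
    contributes dt * sqrt(1 - v^2/c^2), the time-dilation factor."""
    return sum(dt * math.sqrt(1.0 - (v / C) ** 2) for dt, v in segments)

stay   = proper_time([(10.0, 0.0)])              # twin at rest
travel = proper_time([(5.0, 0.6), (5.0, -0.6)])  # out and back at 0.6c

print(stay, travel)
# The travelling twin accumulates less proper time. In the clock picture
# of the abstract, the phase along each path goes as exp(-i m c^2 tau / hbar),
# so the phase difference between paths is set by this proper-time difference.
```

The 20% deficit at 0.6c (8 units of proper time versus 10) is exactly the path dependence of ageing from which, on the abstract's account, the quantum phase emerges.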

Closely associated with the notion of weak value is the problem of reconstructing the post-selected state: this is the so-called reconstruction problem. We show that the reconstruction problem can be solved by inversion of the cross-Wigner transform, using an ancillary state. We thereafter show, using the multidimensional Hardy uncertainty principle, that maximally concentrated cross-Wigner transforms correspond to the case where a weak measurement reduces to an ordinary von Neumann measurement.

The contextuality of quantum mechanics can be shown by the violation of inequalities based on measurements of well chosen observables. These inequalities have been designed separately for both discrete and continuous variables. Here we unify both strategies by introducing general conditions to demonstrate the contextuality of quantum mechanics from measurements of observables of arbitrary dimensions. Among the consequences of our results is the impossibility of having a maximal violation of contextuality in the Peres-Mermin scenario with discrete observables of odd dimensions. In addition, we show how to construct a large class of observables with a continuous spectrum enabling the realization of contextuality tests both in the Gaussian and non-Gaussian regimes.

Eighty years ago Einstein demonstrated that a particular interpretation of the reduction of the wave function led to a paradox, and that this paradox disappeared if the statistical interpretation of quantum mechanics was adopted. According to the statistical interpretation, a wave function describes only an ensemble of identically prepared physical systems. Searching for an intuitive explanation of long-range correlations between outcomes of distant measurements, performed on pairs of physical systems prepared in a spin singlet state, John Bell analysed local realistic hidden-variable models and proved that correlations consistent with these models satisfy Bell inequalities, which are violated by some predictions of quantum mechanics. Several different local models were constructed, various inequalities proven and shown to be violated by experimental data. Some physicists concluded that Nature is definitely not local. We strongly disagree with this conclusion, and we critically analyze some influential finite-sample proofs of various inequalities and the so-called quantum Randi challenges. We also show how one can win the so-called Bell game. The violation of inequalities does not prove that a local and causal explanation of the correlations is impossible; it gives only a strong argument against counterfactual definiteness and against the point of view according to which experimental outcomes are produced in an irreducibly random way. We also explain the meaning of the sample-homogeneity loophole and show how it can invalidate statistical significance tests. We point out that this loophole was not closed in several recent experiments testing local realism.

We show that quantum predictions for the dual-rail realisation of a qubit can be faithfully simulated with classical stochastic gates and particles which interact entirely in a local manner. In the presented model 'non-locality' appears only on the epistemic level of description.

We argue that Hartle-Hawking states in the Regge quantum gravity model generically contain non-trivial entanglement between gravity and matter fields. The generic impossibility of talking about "matter in a point of space" is in line with the idea of an emergent spacetime, and as such could be taken as a candidate criterion for a plausible theory of quantum gravity. Finally, this new entanglement could be seen as an additional "effective interaction", which could possibly bring corrections to the weak equivalence principle.

The dark matter in the galaxy cluster Abell 1689 is modelled as an isothermal sphere of neutrinos. New data on the 2d mass density allow an accurate description of its core and halo. The model has no "missing baryon problem" and beyond 2.1 Mpc the baryons have the cosmic mass abundance. Combination of cluster data with the cosmic dark matter fraction - here supposed to stem from the neutrinos - leads to a solution of the dark matter riddle by left and right handed neutrinos with mass (1.861 ± 0.016) h₇₀⁻² eV/c². The thus far observed absence of neutrinoless double beta decay points to (quasi-)Dirac neutrinos: uncharged electrons with different flavour and mass eigenbasis, as for quarks. Though the cosmic microwave background spectrum is matched up to some 10% accuracy only, the case is not ruled out because the plasma phase of the early Universe may be turbulent.

In the context of nonrelativistic quantum mechanics, Gaussian wavepacket solutions of the time-dependent Schrödinger equation provide useful physical insight. This is not the case for relativistic quantum mechanics, however, for which both the Klein-Gordon and Dirac wave equations result in strange and counterintuitive wavepacket behaviors, even for free-particle Gaussians. These behaviors include zitterbewegung and other interference effects. As a potential remedy, this paper explores a new trajectory-based formulation of quantum mechanics, in which the wavefunction plays no role [Phys. Rev. X, 4, 040002 (2014)]. Quantum states are represented as ensembles of trajectories, whose mutual interaction is the source of all quantum effects observed in nature—suggesting a "many interacting worlds" interpretation. It is shown that the relativistic generalization of the trajectory-based formulation results in well-behaved free-particle Gaussian wavepacket solutions. In particular, probability density is positive and well-localized everywhere, and its spatial integral is conserved over time—in any inertial frame. Finally, the ensemble-averaged wavepacket motion is along a straight line path through spacetime. In this manner, the pathologies of the wave-based relativistic quantum theory, as applied to wavepacket propagation, are avoided.

The idea that the Planck length is the smallest unit of length, and the Planck time the smallest unit of time, is natural, and has been suggested many times. One can, however, also derive this more rigorously, using nothing more than the fact that black holes emit particles, according to Hawking's theory, and that these particles interact gravitationally. It is then observed that the particles, going in and out, form quantum states bouncing against the horizon. The dynamics of these microstates can be described in a partial wave expansion, but Hawking's expression for the entropy then requires a cut-off in the transverse momentum, in the form of a Brillouin zone, and this implies that these particles live on a lattice.

There are few proposals that explicitly allow for (experimentally testable) deviations from standard quantum theory. Collapse models are among the most widely studied proposals of this kind. The Schrödinger equation is modified by including nonlinear and stochastic terms, which describe the collapse of the wave function in space. These spontaneous collapses are rare for microscopic systems, hence their quantum properties are left almost unaltered. On the other hand, collapses become more and more frequent the larger the object, to the point that macroscopic superpositions are rapidly suppressed. The main features of collapse models will be reviewed, and an update of the most promising experimental tests will be presented.
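As a toy numerical sketch only (a hypothetical discretization for illustration, not any specific collapse model from the literature; the rate lam and the collapse operator are invented for the example), the nonlinear stochastic term described above can be seen driving an equal superposition of a two-state system toward a single outcome:

```python
import math
import random

def collapse_step(psi, lam, dt, rng):
    # One step of a toy norm-preserving collapse update for a two-state
    # system with collapse operator s = diag(+1, -1) (illustrative choice).
    # The stochastic kick is centered on the current expectation <s>, so
    # states near an eigenstate of s are barely disturbed.
    s = [1.0, -1.0]
    p = [abs(c) ** 2 for c in psi]
    mean_s = s[0] * p[0] + s[1] * p[1]
    dw = rng.gauss(0.0, math.sqrt(dt))  # Wiener increment
    new = [psi[k] * math.exp(-lam * dt * (s[k] - mean_s) ** 2
                             + math.sqrt(lam) * (s[k] - mean_s) * dw)
           for k in range(2)]
    norm = math.sqrt(sum(abs(c) ** 2 for c in new))
    return [c / norm for c in new]

rng = random.Random(1)
psi = [2 ** -0.5, 2 ** -0.5]  # macroscopic-style equal superposition
for _ in range(5000):
    psi = collapse_step(psi, lam=1.0, dt=0.01, rng=rng)
mean_s = abs(psi[0]) ** 2 - abs(psi[1]) ** 2  # near +1 or -1 after collapse
```

Run over many realizations, the fraction of trajectories ending near each eigenstate reproduces the initial Born weights, which is the qualitative behavior collapse models are built to achieve.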

We describe the development of an experiment to measure the weak value of the transverse momentum operator (local momentum [1]) of cold atoms passing through a matter-wave interferometer. The results will be used to reconstruct the atoms' average trajectories. We describe our progress towards this goal using laser cooled argon atoms.

Emergent Quantum Mechanics (EmQM) seeks to construct quantum mechanical theory and behaviour from classical underpinnings. This paper explores the possibility that the field of classical general relativity (GR) could supply the sub-quantum medium for such sub-quantum mechanics. Firstly, I present arguments which show that GR satisfies many of the a priori requirements for a sub-quantum medium. Secondly, some potential obstacles to using GR as the underlying field are noted, for example field strength (isn't gravity a very weak force?) and spin 2. Thirdly, the ability of dynamical exchange processes to create very strong effective fields is demonstrated through the use of a simple model, which resolves many of the issues raised in the second section. I conclude that there appears to be enough evidence to pursue this direction of study further, particularly as this line of research also has the possibility to help unify quantum mechanics and general relativity.

Fundamental modifications of the standard Schrödinger equation by additional nonlinear terms have been considered for various purposes over the recent decades. It came as a surprise when, inverting Abner Shimony's observation of "peaceful coexistence" between standard quantum mechanics and relativity, N. Gisin proved in 1990 that any (deterministic) nonlinear Schrödinger equation would allow for superluminal communication. This is by now the most spectacular and best known anomaly. We discuss further anomalies, simple but foundational, less spectacular but not less dramatic.

We argue that insights in fundamental physics could be provided by fields such as mathematical logic, and, as an example, briefly discuss some of the issues that arise from the incompleteness theorems. The advantage of this type of approach would be that it could perhaps give access to information about physics at the most fundamental level, without having to rely on increasingly accurate although always approximate empirical models alone.

A system is being designed and constructed in order to measure the weak value of spin for atomic systems. The experiment utilises spin-1 metastable helium atoms in the 2³S₁ state. This paper outlines the experiment and its features.

It is argued that standard quantum theory without collapse provides a satisfactory explanation of everything we experience in this and in numerous parallel worlds. The only fundamental ontology is the universal wave function evolving in a deterministic way without action at a distance.

Bell's Theorem may well be the best known result in the foundations of quantum mechanics. Here, it is presented as stating that for any hidden variable theory the combination of the conditions Parameter Independence, Outcome Independence, Source Independence and Compatibility with Quantum Theory leads to a contradiction. Based on work by Roger Colbeck and Renato Renner, an extension of Bell's Theorem is considered. In this extension the theorem is strengthened by replacing Outcome Independence by a strictly weaker condition.

We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the perihelion precession for planets in the solar system, and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.
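For context (a standard fact about this line of work, though conventions and the placement of the deformation parameter vary between papers), the GUP referred to here is commonly written as a momentum-dependent correction to the Heisenberg relation:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}\left[\, 1 + \beta \left(\frac{\Delta p}{m_P c}\right)^{2} \right],
\qquad m_P = \sqrt{\hbar c / G},
```

where $\beta$ is the dimensionless deformation parameter. Bounds on $\beta$ of the kind described in the abstract follow by propagating this deformation into the Schwarzschild metric and comparing the resulting extra perihelion shift with astronomical measurements.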

I clarify the differences between various approaches in the literature which attempt to link gravity and thermodynamics. I then describe a new perspective based on the following features: (1) As in the case of any other matter field, the gravitational field equations should also remain unchanged if a constant is added to the Lagrangian; in other words, the field equations of gravity should remain invariant under the transformation $T^a_b \to T^a_b + (\text{constant})\,\delta^a_b$. (2) Each event of spacetime has a certain number (f) of microscopic degrees of freedom ('atoms of spacetime'). This quantity f is proportional to the area measure of an equi-geodesic surface, centered at that event, when the geodesic distance tends to zero. The spacetime should have a zero-point length in order for f to remain finite. (3) The dynamics is determined by extremizing the heat density at all events of the spacetime. The heat density is the sum of a part contributed by matter and a part contributed by the atoms of spacetime, with the latter being $L_P^{-4} f$. The implications of this approach are discussed.

Cellular automata can show well known features of quantum mechanics, such as a linear rule according to which they evolve and which resembles a discretized version of the Schrödinger equation. This includes corresponding conservation laws. The class of "natural" Hamiltonian cellular automata is based exclusively on integer-valued variables and couplings and their dynamics derives from an Action Principle. They can be mapped reversibly to continuum models by applying Sampling Theory. Thus, "deformed" quantum mechanical models with a finite discreteness scale l are obtained, which for l → 0 reproduce familiar continuum results. We have recently demonstrated that such automata can form "multipartite" systems consistently with the tensor product structures of nonrelativistic many-body quantum mechanics, while interacting and maintaining the linear evolution. Consequently, the Superposition Principle fully applies for such primitive discrete deterministic automata and their composites and can produce the essential quantum effects of interference and entanglement.
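As an illustration only (a minimal continuum-valued sketch, not the integer-valued automaton of the abstract; the two-level "Hamiltonian" and the step size eps are invented for the example), a linear rule that "resembles a discretized version of the Schrödinger equation" can be written as a symmetric-difference update, ψ(t+1) − ψ(t−1) = −2iε H ψ(t), which approximately conserves the norm:

```python
def apply_H(psi):
    # Hypothetical two-level "Hamiltonian" H = [[0, 1], [1, 0]].
    return [psi[1], psi[0]]

def leapfrog(psi_prev, psi_curr, eps, steps):
    # Symmetric-difference (leapfrog) update: a discrete, reversible,
    # linear evolution rule approximating i dpsi/dt = H psi.
    for _ in range(steps):
        h = apply_H(psi_curr)
        psi_next = [psi_prev[k] - 2j * eps * h[k] for k in range(2)]
        psi_prev, psi_curr = psi_curr, psi_next
    return psi_curr

eps = 0.01
psi0 = [1 + 0j, 0 + 0j]
# Bootstrap the staggered scheme with one first-order Euler step.
h0 = apply_H(psi0)
psi1 = [psi0[k] - 1j * eps * h0[k] for k in range(2)]
out = leapfrog(psi0, psi1, eps, steps=100)
norm = sum(abs(c) ** 2 for c in out)  # stays close to 1
```

The update is manifestly linear, so superpositions of solutions are again solutions, which is the point the abstract makes about the Superposition Principle surviving discretization.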

Entropic Dynamics (ED) is a framework that allows the formulation of dynamical theories as an application of entropic methods of inference. In the generic application of ED to derive the Schrödinger equation for N particles the dynamics is a non-dissipative diffusion in which the system follows a "Brownian" trajectory with fluctuations superposed on a smooth drift. We show that there is a family of ED models that differ at the "microscopic" or sub-quantum level in that one can enhance or suppress the fluctuations relative to the drift. Nevertheless, members of this family belong to the same universality class in that they all lead to the same emergent Schrödinger behavior at the "macroscopic" or quantum level. The model in which fluctuations are totally suppressed is of particular interest: the system evolves along the smooth lines of probability flow. Thus ED includes the Bohmian or causal form of quantum mechanics as a special limiting case. We briefly explore a different universality class - a nondissipative dynamics with microscopic fluctuations but no quantum potential. The Bohmian limit of these hybrid models is equivalent to classical mechanics. Finally we show that the Heisenberg uncertainty relation is unaffected either by enhancing or suppressing microscopic fluctuations or by switching off the quantum potential.
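A toy sketch of the drift-plus-fluctuation picture (illustrative names and parameters only, not the ED formalism itself): a parameter alpha rescales the "Brownian" fluctuations relative to the smooth drift, and alpha = 0 recovers the smooth Bohmian-limit flow line, while the ensemble-averaged displacement is the same in either case:

```python
import random

def trajectory(x0, drift, dt, steps, alpha, rng):
    # One sub-quantum trajectory: smooth drift plus Gaussian fluctuations
    # whose strength is rescaled by alpha (alpha = 0: fluctuations off).
    x = x0
    for _ in range(steps):
        x += drift * dt + (alpha * dt) ** 0.5 * rng.gauss(0.0, 1.0)
    return x

rng = random.Random(0)
# Fluctuations fully suppressed: the smooth line of probability flow.
smooth = trajectory(0.0, drift=1.0, dt=0.01, steps=100, alpha=0.0, rng=rng)
# Fluctuations on: individual paths are rough, but the ensemble mean
# follows the same drift.
noisy = [trajectory(0.0, 1.0, 0.01, 100, alpha=1.0, rng=rng)
         for _ in range(2000)]
mean_noisy = sum(noisy) / len(noisy)
```

This is the sense in which the models form a universality class: varying alpha changes the microscopic paths but not the emergent drift-level behavior.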

Exact predictions for most quantum systems are computationally inaccessible. This is the so-called many-body problem, which is present in most common interpretations of quantum mechanics. Predictions of natural quantum phenomena therefore have to rely on approximations (assumptions or simplifications). In the literature there are different types of approximations, ranging from those justified mainly by theoretical developments to those whose justification lies in agreement with experiments. The latter type of approximation can convert a quantum theory into an "unfalsifiable" quantum theory, true by construction. On the practical side, converting some part of a quantum theory into an "unfalsifiable" one ensures a successful modeling (i.e., compatible with experiments) for quantum engineering applications. An example of including irreversibility and dissipation in the Bohmian modeling of open systems is presented. On the ontological level, however, the present-day foundational problems related to controversial quantum phenomena have to avoid (if possible) being contaminated by the unfalsifiability originating from the many-body problem. An original attempt is presented to show how the Bohmian theory itself (minimizing the role of many-body approximations) explains the transition from a microscopic quantum system towards a macroscopic classical one.

What is the physical agent behind the antisymmetry of the electron state vectors? With the purpose of finding an answer to this key question, we analyze the stationary states of a system containing two noninteracting electrons, using the tools of stochastic electrodynamics. In previous work, the resonant response of two particles to common modes of the random zero-point field has been shown to lead to the nonfactorizability of the composite state vector. Here we extend the analysis to particles with spin. When two electrons constitute a single system, a correlation is established between their dynamical variables through the common relevant modes of the zero-point field, which acts as a mediator. An analysis of the exchange properties of the bipartite state vectors obtained is shown to lead to the connection between spin and symmetry. The conclusion is that due consideration of the vacuum field in first quantization leads to the corresponding statistics for an assembly of electrons.

The Leggett-Garg inequality is a widely used test of the "quantumness" of a system, and involves correlations between measurements realized at different times. According to its widespread interpretation, a violation of the Leggett-Garg inequality disproves macroscopic realism and non-invasiveness. Nevertheless, recent results point out that macroscopic realism is a model-dependent notion and that one should always be able to attribute a violation of a Leggett-Garg inequality to invasiveness. This opens some natural questions: how can such an attribution be provided in a systematic way? How can an apparent violation of macroscopic realism be recast into a dimension-independent invasiveness model? The present work answers these questions by introducing an operational model where the effects of invasiveness are controllable through a parameter associated with what is called the measurability of the physical system. Such a parameter leads to different generalized measurements that can be associated with the dimensionality of a system, with measurement errors, or with back action.

The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set-valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics.

We review major appearances of the functional expression $\pm \Delta \rho^{1/2}/\rho^{1/2}$ in the theory of diffusion-type processes and in quantum mechanically supported dynamical scenarios. Attention is paid to various manifestations of "pressure" terms and their meanings therein.
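The expression reviewed here is most familiar from the de Broglie-Bohm quantum potential, where (a standard fact, not specific to this paper) for a probability density $\rho$ one has:

```latex
Q \;=\; -\,\frac{\hbar^{2}}{2m}\,\frac{\Delta \rho^{1/2}}{\rho^{1/2}}.
```

The $\pm$ in the abstract reflects that the same functional appears with opposite signs in the Schrödinger-type and diffusion-type (heat-equation) settings, which is precisely the contrast the review draws.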
