The Foundations of Quantum Mechanics — Historical Analysis and Open Questions: Lecce, 1993
Abstract
In The Foundations of Quantum Mechanics - Historical Analysis and Open Questions, leading Italian researchers involved in different aspects of the foundations and history of quantum mechanics are brought together in an interdisciplinary debate. The book therefore presents an invaluable overview of the state of Italian work in the field at this moment, and of the open problems that still exist in the foundations of the theory.
Audience: Physicists, logicians, mathematicians and epistemologists whose research concerns the historical analysis of quantum mechanics.
Chapters (35)
We consider the conservation of angular momentum in the measurement theories of Bohr (apparatus described classically), von Neumann (standard quantum theory) and Wigner, Araki and Yanase.
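For orientation (this schematic statement is ours, not the chapter's notation), the Wigner-Araki-Yanase theorem asserts that an observable \( M \) of the system admits an exact, repeatable measurement compatible with an additive conservation law only if it commutes with the system part of the conserved quantity:
\[
[\,U,\; L_S \otimes I + I \otimes L_A\,] = 0 \quad \Longrightarrow \quad [\,M,\, L_S\,] = 0,
\]
where \( U \) is the unitary evolution of the measurement interaction and \( L_S \), \( L_A \) are the conserved quantities of system and apparatus.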
The problem of the intrinsic characterization of classical probabilities is considered. A theorem is given which specifies when an inequality of the form 0 ≤ L ≤ 1, where L is a linear combination of the observed probabilities, is a condition for classical representability. An algorithm, based on Boole’s theory, that generates Bell-type inequalities for given empirical situations is illustrated.
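As an illustration of an inequality of this form (our example; the chapter's algorithm generates such inequalities systematically), the Clauser-Horne inequality for two settings per side can be written as
\[
0 \;\le\; p(a') + p(b) + p(a,b') - p(a,b) - p(a',b) - p(a',b') \;\le\; 1,
\]
where \( p(a,b) \) denotes the joint detection probability for settings \( a, b \) and \( p(a) \), \( p(b) \) the single detection probabilities; violation of either bound rules out a classical (Kolmogorovian) representation of the observed probabilities.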
We argue that pursuing the aim of building satisfactory “views of quantum reality”, or “quantum ontologies”, is epistemologically legitimate and not subject to no-go theorems. The question of the extent to which existing interpretations may be considered satisfactory from this point of view is dealt with on the basis of criteria elaborated within the post-Kantian philosophical tradition. The aim may also prove worth the effort from a purely scientific point of view, although it may imply such radical changes in ordinary ways of thinking as to require a very long time.
We discuss a quantum measurement model which makes it possible to test, in a quantitative framework, the concept of non-invasivity introduced by Leggett and Garg. A new technique that speeds up the computations is described and applied to a bistable potential schematizing consecutive measurements of magnetic flux in radio-frequency superconducting quantum interference devices (rf SQUIDs).
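For reference (the chapter's quantitative analysis is model-specific; this is the generic form), the simplest Leggett-Garg inequality for a dichotomic macroscopic variable \( Q = \pm 1 \) measured at times \( t_1 < t_2 < t_3 \) reads
\[
K \;=\; C_{12} + C_{23} - C_{13} \;\le\; 1, \qquad C_{ij} = \langle Q(t_i)\, Q(t_j) \rangle,
\]
which follows from macroscopic realism together with the non-invasivity of the intermediate measurement.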
Historians of science have generally studied Planck’s theory of black-body radiation (1900–1906) by focusing their analysis on the question of whether the resonator energy is continuous or discontinuous. Following an alternative historiographical approach, we regard this question as ill-posed and secondary, and we hold that the central point for understanding Planck’s theoretical work lies in the evolution of his thinking about irreversibility. The main steps of this evolution are essentially the hypothesis of natural radiation, the principle of elementary disorder, the combinatorial entropy and the phase-plane subdivision. By focusing attention on this conceptual development, it becomes possible to understand how the theory can exhibit certain quantum aspects without strict incompatibility with classical physics, in particular the appearance of energy elements, elementary phase-space regions and microscopic indeterminacy.
We report on some recent studies of the classical electrodynamics of point particles, as described by the Abraham-Lorentz-Dirac equation. Such studies turn the well-known existence of generic runaway solutions to positive account. Indeed, the additional requirement that has to be imposed, namely the restriction to initial data giving rise to nonrunaway behaviour, turns out to allow for unexpected phenomena, for example a behaviour qualitatively similar to that occurring in the quantum tunnel effect. It is pointed out how this fact might be relevant to the problem of hidden parameters.
This paper analyses the role of the dispersion formula in relation to the development of modern physics. After proving to be a decisive factor in the debate on the nature of X-rays, the dispersion problem highlighted the limits of the old quantum theory and played a vital role in the conceptual process that led to the formulation of Matrix Mechanics. The historical account presented in this paper enables us to comment on the meaning of Quantum Mechanics as it is normally interpreted.
Turning from classical to quantum physics, new problems arise with regard to the traditional philosophical question of what a ‘physical object’ is. A recent ‘group-theoretical’ approach to the question of whether it makes sense to speak of ‘quantum objects’ is illustrated, and the connection it affords with the traditional problem of the ‘objectivity’ of physical knowledge is investigated. The individuality issue for quantum particles is also taken into account.
The standard logico-algebraic approach to quantum mechanics singled out a class of structures, called state-event-probability structures, underlying the orthodox Hilbert-space approach. These structures constituted the so-called logic of quantum mechanics. With the development of the unsharp formulation, the corresponding structures underlying generalized quantum mechanics, called state-effect-probability structures, have been introduced. Two possible axiomatic formulations of the logic of unsharp (i.e., generalized) quantum mechanics are presented; their mutual relationships are investigated, and some steps are taken in the direction of proving their equivalence.
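In the standard Hilbert-space model (recalled here for orientation; the chapter's treatment is axiomatic), the effects are the operators between the null and the identity operator, and the probability function is given by the trace formula:
\[
\mathcal{E}(\mathcal{H}) = \{\, E : 0 \le E \le I \,\}, \qquad p(\rho, E) = \operatorname{Tr}(\rho E),
\]
where \( \rho \) ranges over the density operators representing states; the sharp events, i.e. the projections, form the subset with \( E^2 = E \).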
The principle which states that non-commuting quantum observables cannot be measured together gives rise to interpretative questions in quantum mechanics. In this article we derive this principle following the von Neumann approach, and some of these interpretative questions are discussed. In particular, we clarify the sense in which the principle may be “outflanked” by means of correlations between outcomes of different observables, as in the Einstein, Podolsky and Rosen conceptual experiment.
The heart of Kuhn’s book Black-Body Theory and the Quantum Discontinuity, 1894–1912 is the demonstration that Planck never intended, in 1900–1901, to introduce the quantum discontinuity into physics with his distribution law, as many authoritative historians maintain. Kuhn shows that Planck was very far from conceiving the necessity of quanta when he followed, ever more closely, Boltzmann’s statistical concepts in order to find the distribution law he was seeking. We have analyzed the evidence for Kuhn’s thesis and discovered that the various statistical concepts “inherited” by Planck have their common foundation in Boltzmann’s conception of the infinite. So profound a historian as Kuhn could not have overlooked this fact. Yet Kuhn did not stress this conceptual continuity, either to give a deeper explanation of the events surrounding the birth of quanta, or to use it as a paradigm and thereby save his work from the accusation that, in the black-body case, he was unable to apply his famous interpretative scheme of the history of science. Our conclusion on this matter is that Kuhn behaves like most physicists, who regard mathematics as a constant of history, not as a variable that plays a decisive role in the evolution of physics.
The proposal of using Macroscopic Quantum Coherence in a SQUID as a test of the validity of Quantum Mechanics (QM) for macroscopic systems (Leggett A., 1980; Leggett A. and Garg A., 1985) is considered. We note that if only Macroscopic Realism (MR) is assumed, while the requirement that the flux measurement be non-invasive (NIM) is dropped, only a measurement of the charge would discriminate between QM and MR. This discrimination, however, depends critically on the experimental parameters: there is a threshold above which QM is consistent with MR, but the measurement is then invasive, as in QM.
In this paper some quantum correlations are determined using a classical probability function. These results are obtained by giving up independence and taking into account suitable conditions of dependence among quantum particles.
Although Einstein frequently discussed the problem of the theory-experiment relationship, it has gone almost unnoticed by Einstein scholars that, in his papers, this problem is presented in the form of a search for criteria for the attribution of physical significance to the concepts of Relativity and Unified-Field theories, i.e. in the form of a stipulation of meaning. Einstein confronted this problem beginning with his early approaches to Special Relativity in 1905 and, until his last years, he never ceased to search for possible solutions. Thus, problems concerning the “stipulation of meaning” run through all of Einstein’s methodological discussions of General Relativity and its generalisation into the Unified-Field theories. In these discussions, Einstein often engaged with the methodological views of mathematical physicists such as Weyl, Eddington, Levi-Civita, et al.
Łukasiewicz quantum logic is semantically characterized by the class of all quantum MV algebras. The standard model of this logic is based on the effects in a Hilbert space. We discuss the physical interpretation of the different kinds of conjunction that arise in this framework.
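To make the contrast concrete (a standard example, not necessarily the chapter's own notation): on effects one can take the truncated sum \( E \oplus F = E + F \) if \( E + F \le I \), and \( E \oplus F = I \) otherwise, with \( E' = I - E \); the Łukasiewicz conjunction is then
\[
E \odot F = (E' \oplus F')',
\]
which on the fuzzy-set model \( [0,1] \) reduces to \( a \odot b = \max(a + b - 1,\, 0) \), a conjunction quite different from, e.g., the product \( a \cdot b \) or the lattice meet \( \min(a, b) \).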
The subject of the paper is a path-integral representation for the semigroup \( \{ e^{-tH_1} \}_{t \geqslant 0} \) generated by the quantum Hamiltonian \( H_1 \) of a relativistic spinless particle in an external electromagnetic field. The result is compared with the “Feynman-Kac” formula which holds for relativistic Schrödinger operators.
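In a common convention (ours; the chapter's normalization may differ), the Hamiltonian in question is the relativistic square-root operator with minimal coupling,
\[
H_1 \;=\; \sqrt{\big(-i\hbar\nabla - \tfrac{e}{c}A(x)\big)^2 c^2 + m^2 c^4} \;-\; mc^2 \;+\; e\Phi(x),
\]
and the associated “Feynman-Kac” formula represents \( e^{-tH_1} \) as an expectation over the paths of a jump (subordinated Brownian) process rather than over Brownian paths, as in the non-relativistic case.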
The conceptual tension between the special theory of relativity and quantum mechanics is examined from the point of view of the ontological determinateness of events in Minkowski spacetime. While the standard interpretation of quantum mechanics requires an “open”, indeterminate future in which events have “fuzzy” attributes before measurement, special relativity has often been thought to require that the future lobes of light-cones at each point contain only fully determinate events. Recent attempts at introducing indeterminateness in Minkowski spacetime are discussed in light of the philosophical implications of experiments violating Bell’s inequalities.
In the 1970s, research on the foundations of science recognized two basic options, concerning respectively the kind of mathematics and the kind of logic adopted. By means of these options, the foundations of quantum mechanics are scrutinized anew. The result is an intrinsic dualism, owing to which measurement theory makes different choices on these options from the theory of the unperturbed evolution of the system. Moreover, the two past debates, on wave-corpuscle dualism and on incompleteness, are reduced to two definite problems. The latter obtains a definite positive answer when it is analysed within the mathematics that is the alternative to the classical one, i.e., constructive mathematics.
Here we present a formulation of Quantum Mechanics founded on the fundamental choices (only potential infinity and problematic organization) that, according to previous results by A. Drago, are characteristic of the theories alternative to the Newtonian one; hence a formulation based on symmetries and without differential equations. In this new formulation, the fundamental problem of the theory is recognized in Heisenberg’s uncertainty relations. We state the “Uncertainty Principle” by a doubly negated sentence that is not equivalent to an affirmative one, that is to say a sentence characteristic of non-classical logic, which is in this way introduced from the very beginning of the theory. We then state the commutation relations by means of the classical symmetries, essentially following Jordan’s new version of Heisenberg’s formulation.
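For reference, the relations at issue, in their standard form (the chapter restates them within its alternative, symmetry-based framework):
\[
[\hat q, \hat p] = i\hbar, \qquad \Delta q\, \Delta p \ge \frac{\hbar}{2}.
\]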
The present work explores the possibility that the standard formulation of quantum mechanics might be viewed within a gnoseological perspective of Kantian type. In particular, it seems possible to maintain that the theory of knowledge described in the second edition of the Critique of Pure Reason is not essentially jeopardised by quantum theory, except for what concerns the structure of the a priori as defined by Kant. On a number of grounds, however, one might argue, albeit not fully conclusively, that even the schemes chosen by Kant are justified by quantum physics.
It is shown that the correspondence between non-relativistic Quantum Mechanics and electromagnetic Wave Optics can be extended, in the paraxial approximation, to Particle Beam Transport (optics and dynamics). This is done by introducing the recently proposed Thermal Wave Model (TWM), which assumes that the evolution of a charged-particle beam is governed by a Schrödinger-like equation for a complex function, the so-called beam wave function, whose squared modulus is proportional to the particle number density. This extended correspondence suggests, at least in the paraxial approximation, developing a formal unified framework capable of describing different optical and dynamical phenomena together, so that each subject can be better understood physically by using the language and concepts developed in the others. In addition, this unified framework would be useful for transferring quantum computational techniques into the other quantum-like theories in order to solve concrete physical problems. As an example, a recent and relevant application of the TWM to accelerator physics is presented.
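In the notation commonly used for the TWM (our transcription; conventions vary), the beam wave function \( \Psi(x, s) \) evolves along the propagation coordinate \( s \) according to
\[
i\epsilon\, \frac{\partial \Psi}{\partial s} \;=\; -\frac{\epsilon^2}{2}\, \frac{\partial^2 \Psi}{\partial x^2} \;+\; U(x, s)\, \Psi, \qquad |\Psi(x,s)|^2 \propto n(x,s),
\]
where the transverse emittance \( \epsilon \) plays the role that \( \hbar \) plays in quantum mechanics, \( U \) is the potential describing the focusing elements, and \( n \) is the particle number density.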
Bell’s theorem is shown to rest on a metatheoretical assumption (MCP) regarding the validity of physical laws that is compatible with the worldview of Classical Physics, but not with the worldview of Quantum Physics (QP). A new general principle (MGP) is stated here that is consistent with the basic operational philosophy of QP. By using MGP, which does not modify the observational content of QP, some sample proofs of Bell’s theorem are invalidated. We conclude that the adoption of a more rigorous quantum attitude leads one to give up some features, such as nonlocality and incompatibility with any form of realism, that are usually held to be unavoidable (and somewhat paradoxical) consequences of QP.
In order to give a more complete description of an actual experiment on the EPR paradox, the upper limits for the Bell and Clauser-Horne-Shimony-Holt inequalities are deduced in the case of three-valued observables. This limit turns out to be a function of the supplementary assumptions. The different cases of “random no-detection processes” and “parameter-independent selection” are discussed. A new, stronger limit η = 0.811 is deduced for the quantum detection efficiency required to perform a “loophole-free” experiment.
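For orientation (the standard two-valued case, from which the chapter's three-valued analysis departs), the Clauser-Horne-Shimony-Holt inequality for dichotomic \( \pm 1 \) observables reads
\[
|E(a,b) + E(a,b') + E(a',b) - E(a',b')| \;\le\; 2,
\]
where \( E(a,b) \) is the correlation of the outcomes for settings \( a, b \); counting “no detection” as a third outcome is what leads to the three-valued observables and the efficiency-dependent bounds studied here.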
We analyze the consequences of the nonlocal aspects of quantum mechanics as far as the possibility of attributing objective properties to individual physical systems is concerned. We reconsider the argument of the celebrated EPR paper with reference to a relativistic context and we show that, while the conclusion that quantum strict correlations and locality imply incompleteness is correct, the common further conclusion that requiring completeness implies the acceptance of some spooky action at a distance is inappropriate.
Starting from Rothstein’s analysis of the EPR paradox, the logical incompleteness of quantum mechanics and the non-separability which implies “the death of atomism” (Primas) are discussed from both physical and logico-epistemological points of view. Indeed, the indeterminacy of the microphysical, simple-elementary entities as the fundamental ground of physis involves the breakdown of any kind of reductionism. Reductionism presupposes the possibility of such a determinacy, and so it is nothing other than a particular form of “determinism”. The indeterminate universe of physical processes is non-separable, and a physics of the universe as a whole seems to be needed. Furthermore, from a historical and philosophical point of view, the principle of separability is shown to be related to the metaphysical project of technical dominion over nature, and the emergence of non-separability in physical theories is analysed.
We prove that any unsharp orthoalgebra gives rise to a quantum MV algebra (QMV algebra) and that any QMV algebra determines, in a natural way, an unsharp orthoalgebra. Some properties of the QMV algebra of all effects of a Hilbert space are also investigated.
We give a short presentation of Nelson stochastic mechanics, as a generalization of classical mechanics based on the theory of stochastic processes and stochastic variational principles. Stochastic mechanics can be connected to quantum mechanics through a very simple physical interpretation scheme. From this point of view, stochastic mechanics can be seen as a quantization procedure for mechanical systems, different from, but physically equivalent to, the usual operator quantization. We then deal with the problems related to the possibility of considering stochastic mechanics as a complete physical theory. Through a discrete generalization, we show how the main features of the postulated underlying Brownian motion, at the origin of quantum fluctuations, can also be derived as a consequence of stochastic variational principles. We also discuss the problem of formulating stochastic mechanics in representations different from the configuration representation, and show how the different representations, related by unitary transformations in ordinary quantum mechanics, are connected in stochastic mechanics through stochastic measure-preserving transformations. We then show how the basic aspects of the measurement problem, in particular the wave-packet reduction, can be interpreted in the framework of generalized stochastic mechanics. Here the wave-function collapse is not instantaneous, but is ruled by a well-defined dynamical scheme, with a time-asymptotic relaxation behaviour; moreover, the relaxation is slower when the result of the measurement is more uncertain. Finally, we discuss the possibility of embedding stochastic mechanics into a generalized scheme of Schrödinger stochastic processes.
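In its standard form (recalled here for orientation), Nelson's scheme lets the configuration perform a diffusion
\[
dX(t) = b(X(t), t)\, dt + dW(t), \qquad \mathbb{E}\,[\,dW_i\, dW_j\,] = \frac{\hbar}{m}\, \delta_{ij}\, dt,
\]
with forward drift \( b = v + u \), where \( v = \nabla S / m \) is the current velocity and \( u = (\hbar / 2m)\, \nabla \ln \rho \) the osmotic velocity; writing \( \psi = \sqrt{\rho}\, e^{iS/\hbar} \), the dynamical equations for \( \rho \) and \( S \) are equivalent to the Schrödinger equation.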
The very different approaches used by E. Fermi and P. A. M. Dirac to derive the Fermi-Dirac particle statistics are analyzed in detail. We suggest that the stability of theoretical conclusions through conceptual change, as exemplified in this historical case study, plays a role in justifying our belief in theoretical statements.
In 1952 David Bohm suggested a new interpretation of Quantum Mechanics, showing explicitly how the indeterministic description of non-relativistic wave mechanics could be transformed into a deterministic one. The essential idea of this interpretation had been advanced by de Broglie in 1927. We present the basic tenets of Bohm’s theory in the light of the Bohmian-mechanics approach of D. Dürr, S. Goldstein and N. Zanghì.
This paper is a series of intertwining observations about the connection between logic-theoretic noncommutativity and a logical foundation of quantum mechanics. We analyze noncommutativity, from both an algebraic and a proof-theoretic point of view, with respect to the quantum-mechanical notion that the order in which observations are made is central to the description of physical systems. To this end, we present the sequential conjunction ⊗: A ⊗ B means “A at time t₁ and then B at time t₂”.
The thread running through our discourse is given by quantales, i.e. algebraic structures introduced by Mulvey as models for the logic of quantum mechanics, which offer an appropriate algebraic (and topological) tool for describing noncommutativity.
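For the reader's convenience (the standard definition, as in Mulvey's work), a quantale is a complete lattice \( Q \) equipped with an associative product \( \otimes \) distributing over arbitrary joins on both sides:
\[
a \otimes \bigvee_i b_i = \bigvee_i\, (a \otimes b_i), \qquad \Big(\bigvee_i b_i\Big) \otimes a = \bigvee_i\, (b_i \otimes a);
\]
since \( \otimes \) need not be commutative, \( A \otimes B \) and \( B \otimes A \) can differ, which is precisely what makes quantales suitable for modelling the order-dependence of observations.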
The philosophical issues involved in establishing the conceptual foundations of quantum theory are first summarised. Then the models historically advanced by quantum physicists to meet those conceptual demands are briefly analysed. The fathers of quantum mechanics had to reckon with two constraints: on one side, the need to refer to some definite philosophical framework; on the other, the sweeping consequence of the quantum revolution, which makes all philosophical systems advanced before the quantum development unable to satisfy quantum interpretative exigencies. Considering both this goal and the above-mentioned limits, the lack of any plausible quantum model derived from philosophical premises becomes manifest. Finally, a new model, in our opinion more powerful, based on hermeneutical philosophy, is presented. It is noteworthy that the authors of the quantum ‘revolution’ never contemplated this kind of model, even though hermeneutics played a significant part in the cultural movement which also gave rise to quantum theory. A careful examination of the hermeneutic model is then required, to show its capability of disentangling the complex knot of unsolved conceptual questions that the quantum perspective involves.
We offer a concise account of the Random Path Quantization (RPQ), whose motivation comes from the fact that quantum amplitudes satisfy (almost) the same calculus that probabilities obey in the theory of classical stochastic diffusion processes. Indeed, as a consequence of this structural analogy, a new approach to quantum mechanics naturally emerges as the quantum counterpart of the Langevin description of classical stochastic diffusion processes: this is precisely the RPQ. The starting point is classical mechanics formulated à la Hamilton-Jacobi. Quantum fluctuations enter the game through a certain white noise added to the first-order equation that yields the configuration-space trajectories controlled by the solutions of the (classical) Hamilton-Jacobi equation. A Langevin equation arises in this way and provides the quantum random paths. The quantum-mechanical propagator is finally given by a noise average involving the quantum random paths (in complete analogy with what happens for the transition probability of a classical stochastic diffusion process within the conventional Langevin treatment). The general structure of the RPQ is discussed, along with a suggested intuitive picture of the quantum theory.
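Schematically (our transcription of the construction just described; the precise form and normalization of the noise are the chapter's), the quantum random paths solve a first-order Langevin equation driven by the classical Hamilton-Jacobi flow,
\[
\dot q(t) \;=\; \frac{1}{m}\, \nabla S\big(q(t), t\big) \;+\; \eta(t),
\]
where \( S \) solves the classical Hamilton-Jacobi equation and \( \eta \) is a white noise whose strength is fixed by \( \hbar \); the propagator is then recovered as a noise average over these paths.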
Quantum Mechanics does not confine itself, as Relativity Theory does, to making some properties of bodies variable and dependent on their state of motion; it also calls into question the very concept of an individual object. At variance with Classical Statistical Mechanics, which admits an indistinguishability in fact but not in principle among individual objects, Quantum Mechanics does not distinguish at all among apparently identical atomic particles. However, though the different quantum statistics introduce different correlations among atomic objects, they do not automatically verify specific theoretical models of those objects, such as their supposed loss of identity. This conclusion contradicts the opinion of E. Fermi, the author, with P. A. M. Dirac, of the second quantum statistics, who did not want to question this point seriously in the philosophically disengaged Italian scientific milieu of his time.
Ever since its birth following the introduction of Planck’s constant h in 1900, quantum theory has been legitimated only by the amazing capacity it has exhibited in accounting quantitatively for a huge variety of physical phenomena. The so-called Copenhagen interpretation has made it possible to accommodate the tenets of quantum mechanics in a coherent frame, but at the cost of demanding the abandonment, at the microscopic level, of causality, determinism, and physical reality. The results of the ingenious experiments of Aspect and coworkers give further support to the belief in the unlimited validity of quantum theory. But, for all its successes, the theory remains basically incomprehensible.
Four empirical formulations of the causality principle, due respectively to Laplace, Kant, John S. Mill and Hume, are discussed in relation to the philosophy of quantum mechanics. It is shown that they are all violated by the basic principles of the Copenhagen interpretation, whereas they are perfectly compatible, with the sole exception of Laplacean determinism, with a realistic reinterpretation of the present theory.
It is a commonplace that twentieth-century physics has produced powerful new theories, such as Relativity and quantum mechanics, that upset the world view provided by nineteenth-century physics. But every physicist knows how difficult it may be to explain the basic aspects of these theories to people with a non-physical professional training. The main reason for this is that both Relativity and quantum mechanics are based on fundamental ideas that are not hard to grasp in themselves, but that deeply conflict with the primary categories on which our everyday thinking is based, so that it is impossible to place relativistic and quantum results within the framework suggested by ordinary intuition and common sense. Yet, despite this similarity, there are some relevant differences between the difficulties arising in Relativity and in quantum mechanics. In order to understand this point better, let us focus our attention on Special Relativity first (analogous arguments can be put forward for General Relativity). Here, the strange links between space and time that follow from the even stranger assumption that the velocity of light is independent of the motion of the observer conflict with the very simple conception of space and time implicit in our daily practice (and explicitly stated in classical Physics: think of Newton’s “absolute space” and “absolute time”); but this conflict concerns geometrical space-time models, not the very roots of our language, and hence of our thought. Then, let us consider quantum mechanics. Here it is a basic notion that the properties of physical systems are nonobjective, in the sense that a property cannot be thought of as existing if a measurement of it is not performed. As Mermin [30] writes,
“it is a fundamental quantum doctrine that a measurement does not, in general, reveal a preexisting value of the measured property”.
It is well known that the fathers of Quantum Physics (QP) maintained that a new epistemological attitude, not only a new physical theory, was needed in order to overcome the crisis of Classical Physics (CP). But it is also well known that universal agreement on the epistemological conception that must support QP has not yet been attained: this has deep consequences for the proposed interpretations of the theory, which have multiplied beyond any reasonable limit.