QUANTUM THEORY: Reconsideration of Foundations - 3
810 (2006); http://dx.doi.org/10.1063/1.2158707
The present conference takes place in the year that celebrates the centenary of Albert Einstein's annus mirabilis. Hence it is a good occasion to reflect on those problems which were at the core of Einstein's intellectual activity. Undoubtedly the foundation of quantum mechanics (QM) is one of these problems. It is known that Einstein was never convinced by the interpretation of quantum mechanics accepted, in his time and still now, by the majority of physicists. The fact that he shared this skepticism with people like Schrödinger and, most of all, the fact that no convincing answer to their doubts has emerged in more than half a century of debate, has helped keep the attention of a growing number of people on this problem.
The crucial issue is that the standard interpretation of QM has some physical implications which are experimentally verifiable and which, for several years, were thought to be incompatible with relativity theory (the so-called "quantum nonlocality"). On the other hand, alternative, more intuitive interpretations (such as the ensemble interpretation) seemed to be ruled out by very well confirmed experimental data.
The way out of this impasse has required a deep analysis of the connections between mathematics and physics, as well as the emergence of new ideas both in mathematics (non-Kolmogorovian probabilities) and in physics (the theory of adaptive systems).
The Einstein centenary is a good occasion for a short survey of these developments with the goal of answering the intriguing question posed in the title of the present paper.
810 (2006); http://dx.doi.org/10.1063/1.2158708
Non‐commutative diagrams, where X → Y → Z is allowed and X → Z → Y is not, may equally well apply to Malusian experiments with photons traversing polarizers, and to sequences of elementary chemical reactions. This is why non‐commutative probabilistic, logical, and dynamical structures necessarily occur in chemical or biological dynamics. We discuss several explicit examples of such systems and propose an exactly solvable nonlinear toy model of a “brain‐heart” system. The model involves non‐Kolmogorovian probability calculus and soliton kinetic equations integrable by Darboux transformations.
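The non-commutativity of sequential Malusian measurements can be sketched numerically. The following toy computation (illustrative only, not from the paper; it assumes ideal polarizers, Malus's law at each step, and collapse onto each polarizer's axis) shows that the order X → Y → Z versus X → Z → Y changes the transmission probability:

```python
import numpy as np

def pass_prob(angles, psi_angle=0.0):
    """Probability that a photon polarized at psi_angle passes a
    sequence of ideal polarizers (Malus's law applied stepwise)."""
    p, current = 1.0, psi_angle
    for a in angles:
        p *= np.cos(a - current) ** 2  # Malus: cos^2 of the relative angle
        current = a                    # state collapses onto the polarizer axis
    return p

# X -> Y -> Z versus X -> Z -> Y: reordering the polarizers changes the outcome
p_via_45 = pass_prob([np.pi / 4, np.pi / 2])  # 0 -> 45 -> 90 degrees: 0.25
p_direct = pass_prob([np.pi / 2, np.pi / 4])  # 0 -> 90 -> 45 degrees: 0.0
```

Inserting the intermediate 45° polarizer lets a quarter of the light through a crossed pair that would otherwise block everything, the elementary signature of a non-commutative sequence of operations.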
810 (2006); http://dx.doi.org/10.1063/1.2158709
Two key concepts of quantum theory, complementarity and entanglement, are considered with respect to their significance in and beyond physics. An axiomatically formalized, weak version of quantum theory, more general than the ordinary quantum theory of physical systems, is described. Its mathematical structure generalizes the algebraic approach to ordinary quantum theory. The crucial formal feature leading to complementarity and entanglement is the non‐commutativity of observables.
The ordinary Hilbert space quantum mechanics can be recovered by stepwise adding the necessary features. This provides a hierarchy of formal frameworks of decreasing generality and increasing specificity. Two concrete applications, more specific than weak quantum theory and more general than ordinary quantum theory, are discussed: (i) complementarity and entanglement in classical dynamical systems, and (ii) complementarity and entanglement in the bistable perception of ambiguous stimuli.
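The role of non-commutativity of observables can be made concrete with the most elementary example, a pair of Pauli matrices (an illustrative sketch, not part of the paper's formalism):

```python
import numpy as np

# Pauli matrices: the textbook example of non-commuting observables
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# [sx, sz] = -2i*sy is nonzero: sx and sz are complementary observables,
# with no common eigenbasis and hence no joint sharp values
commutator = sx @ sz - sz @ sx
noncommuting = not np.allclose(commutator, 0)
```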
810 (2006); http://dx.doi.org/10.1063/1.2158710
In a quantum measurement, a coupling g between the system S and the apparatus A triggers the establishment of correlations, which provide statistical information about S. Robust registration requires A to be macroscopic, and a dynamical symmetry breaking of A, governed by S, ensures the absence of any bias. Phase transitions are thus a paradigm for quantum measurement apparatuses, with the order parameter as pointer variable; the coupling g behaves as the source of symmetry breaking. The exact solution of a model where S is a single spin and A a magnetic dot (consisting of N interacting spins and a phonon thermal bath) exhibits the reduction of the state as a relaxation process of the off-diagonal elements of S + A, rapid owing to the large value of N. The registration of the diagonal elements involves a slower relaxation from the initial paramagnetic state of A to one of its ferromagnetic states. If g is too weak, the measurement fails due to a "Buridan's ass" effect: the probability distribution for the magnetization then develops not one but two narrow peaks at the ferromagnetic values, and during its evolution it passes through wide shapes extending between these values.
810 (2006); http://dx.doi.org/10.1063/1.2158711
This is a review of the ideas behind the Fisher‐Rao metric on classical probability distributions, and how they generalize to metrics on density matrices. As is well known, the unique Fisher‐Rao metric then becomes a large family of monotone metrics. Finally I focus on the Bures‐Uhlmann metric, and discuss a recent result that connects the geometric operator mean to a geodesic billiard on the set of density matrices.
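For orientation, the classical Fisher-Rao line element on probability distributions, and the Bures distance on density matrices that generalizes it, may be written as follows (these are the standard definitions, stated here for the reader's convenience rather than quoted from the paper):

```latex
ds^2_{\mathrm{FR}} = \sum_k \frac{(dp_k)^2}{p_k},
\qquad
d_B(\rho,\sigma)^2
  = 2\left(1 - \operatorname{tr}\sqrt{\sqrt{\rho}\,\sigma\,\sqrt{\rho}}\,\right),
```

where the Fisher-Rao metric is the unique monotone metric on classical distributions, while on density matrices monotonicity singles out a whole family of metrics, of which the Bures-Uhlmann metric is one distinguished member.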
810 (2006); http://dx.doi.org/10.1063/1.2158712
In these notes we associate a natural Heisenberg group bundle Ha with a singularity-free smooth vector field X = (id, a) on a submanifold M of Euclidean three-space. This bundle naturally yields an infinite-dimensional Heisenberg group, and a representation of the C*-group algebra of this group is a quantization. It induces a natural Weyl deformation quantization of X. The influence of the topological structure of M on this quantization is encoded in the Chern class of a canonical complex line bundle inside Ha.
810 (2006); http://dx.doi.org/10.1063/1.2158713
These lecture notes survey some joint work with Samson Abramsky as it was presented by me at several conferences in the summer of 2005. It concerns 'doing quantum mechanics using only pictures of lines, squares, triangles and diamonds'. This picture calculus can be seen as a very substantial extension of Dirac's notation, and has a purely algebraic counterpart in terms of so-called Strongly Compact Closed Categories (introduced by Abramsky and me in [3, 4]), which subsumes my Logic of Entanglement. For a survey of the 'what', the 'why' and the 'how' I refer to a previous set of lecture notes [12, 13]. In the last section we provide some pointers to the body of technical literature on the subject.
810 (2006); http://dx.doi.org/10.1063/1.2158714
Stochastic electrodynamics (SED) is a classical theory of nature advanced significantly in the 1960s by Trevor Marshall and Timothy Boyer. Since then, SED has continued to be investigated by a very small group of physicists. Early investigations seemed promising, as SED was shown to agree with quantum mechanics (QM) and quantum electrodynamics (QED) for a few linear systems. In particular, agreement was found for the simple harmonic electric dipole oscillator, for physical systems composed of such oscillators interacting electromagnetically, and for free electromagnetic fields with boundary conditions imposed, such as those entering Casimir-type force calculations. These results were found to hold for both zero-point and nonzero-temperature conditions. However, by the late 1970s and into the early 1980s, researchers found that for nonlinear systems SED did not appear to agree with the predictions of QM and QED. Boyer and Cole proposed that the reason for this disagreement is that such nonlinear systems are not sufficiently realistic for describing atomic and molecular physical systems, which should be fundamentally based on the Coulombic binding potential. Analytic treatments of these systems have proven most difficult. Consequently, in recent years more attention has been placed on numerically simulating the interaction of a classical electron in a Coulombic binding potential with classical electromagnetic radiation acting on the electron. Good agreement was found between this numerical simulation work and the predictions of QM. Here this work is reviewed and possible directions are discussed. Recent simulation work involving subharmonic resonances for the classical hydrogen atom is also discussed; some of the properties of these subharmonic resonances seem quite interesting and unusual.
810 (2006); http://dx.doi.org/10.1063/1.2158715
The debate on the nature of quantum probabilities in relation to quantum nonlocality has elevated Quantum Mechanics to the level of an Operational Epistemic Theory. In this context the quantum superposition principle has an extraneous, non-epistemic nature. This leads us to seek purely operational foundations for Quantum Mechanics, from which to derive the current mathematical axiomatization based on Hilbert spaces.
In the present work I present a set of axioms of purely operational nature, based on a general definition of “the experiment”, the operational/epistemic archetype of information retrieval from reality. As we will see, this starting point logically entails a series of notions [state, conditional state, local state, pure state, faithful state, instrument, propensity (i.e. “effect”), dynamical and informational equivalence, dynamical and informational compatibility, predictability, discriminability, programmability, locality, a‐causality, rank of the state, maximally chaotic state, maximally entangled state, informationally complete propensity, etc.], along with a set of rules (addition, convex combination, partial orderings, … ), which, far from being of quantum origin as often considered, instead constitute the universal syntactic manual of the operational/epistemic approach. The missing ingredient is, of course, the quantum superposition axiom for probability amplitudes: for this I propose some substitute candidates of purely operational/epistemic nature.
810 (2006); http://dx.doi.org/10.1063/1.2158716
A detailed analysis of stochastic electrodynamics (SED) as a foundation for quantum mechanics has shown that the reasons for its failure in the case of nonlinear forces are not to be ascribed to the founding principles of the theory, but to the approximation methods introduced, particularly the use of the Fokker-Planck approximation and perturbation theory. To recover the intrinsic possibilities of SED a new, nonperturbative approach has been developed, namely linear stochastic electrodynamics (LSED). We here present the basic principles on which LSED is constructed. The demand that the solutions of the SED problem comply with just three principles, each of which is shown to have a clear physical meaning, leads in a natural way to the quantum mechanical description in its Heisenberg form. We briefly re-examine some of the most often discussed conceptual problems of quantum mechanics from the point of view offered by the new theory and show that it gives well-defined and clear physical answers to them, within a realist and causal perspective. To conclude, we add brief comments on a couple of predictions of the theory, the testing of which could eventually lead to its validation or refutation.
810 (2006); http://dx.doi.org/10.1063/1.2158717
Neutron optical experiments are presented which exhibit quantum contextual phenomena. Entanglement is achieved not between particles but between degrees of freedom of a single particle. Appropriate combinations of the direction of spin analysis and the position of the phase shifter allow an experimental verification of the violation of a Bell-like inequality. Our experiments manifest the fact that manipulation of the wavefunction in one Hilbert space influences the result of the measurement in the other Hilbert space: manipulation without touch! Next, we report another experiment which exhibits a further peculiarity of quantum contextuality, originally intended to show a Kochen-Specker-like phenomenon. We have introduced inequalities for a quantitative analysis of the experiments; the values obtained clearly show violations of the predictions of non-contextual theories. Finally, we have accomplished a tomographic determination of an entangled quantum state of single neutrons. There, the characteristics of the Bell state are confirmed: four poles in the real part of the density matrix are clearly seen.
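The algebra behind such a Bell-like inequality test can be illustrated numerically. The following sketch uses the textbook CHSH settings and a maximally entangled state, not the actual measurement settings of the neutron experiments; in the single-neutron case the two "qubits" are the spin and path degrees of freedom rather than two particles:

```python
import numpy as np

# Maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def obs(theta):
    """Dichotomic observable cos(theta)*Z + sin(theta)*X (eigenvalues +/-1)."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]], dtype=complex)

def corr(a, b):
    """Correlation <Phi+| A(a) (x) B(b) |Phi+>, which equals cos(a - b)."""
    return float(np.real(phi.conj() @ np.kron(obs(a), obs(b)) @ phi))

# Standard CHSH settings give S = 2*sqrt(2), violating the classical bound 2
S = (corr(0, np.pi / 4) + corr(0, -np.pi / 4)
     + corr(np.pi / 2, np.pi / 4) - corr(np.pi / 2, -np.pi / 4))
```

Any non-contextual (or local hidden-variable) assignment of pre-existing values bounds S by 2, so S = 2√2 makes the violation quantitative.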
810 (2006); http://dx.doi.org/10.1063/1.2158718
We re-examine the claim made by Englert, Scully, Süssmann and Walther that in certain 'Welcher Weg' (Which Way) interference experiments, the Bohm trajectories behave in such a bizarre and unacceptable way that they must be considered unreliable and even 'surreal'. We show that this claim cannot be correct and is based on an incorrect use of the Bohm approach.
810 (2006); http://dx.doi.org/10.1063/1.2158719
The development of practical optical technology has been essential to empirical explorations of foundational questions in quantum theory, such as tests of Bell inequalities. Here, some current and potential uses of entangled multi-photon states, which have long lain at the heart of foundational considerations in quantum theory, are explored. The use of quantum decoherence mitigation techniques such as decoherence-free subspaces (DFSs), which involve such entangled quantum states in non-trivial Hilbert spaces and which were until recently merely theoretical constructs, is shown capable of playing a role in furthering practical optical QKD and other communication tasks using quantum information resources. As examples, elements of a recently introduced basis of four-photon entangled states are shown to lie in a decoherence-free subspace and to result from the concatenation of quantum codes.
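Why a decoherence-free subspace protects quantum information can be seen in the simplest two-qubit case. The sketch below is the standard textbook construction for collective dephasing, not the four-photon basis of the paper: a logical qubit encoded in the span of |01⟩ and |10⟩ acquires only a global phase when both physical qubits see the same unknown phase:

```python
import numpy as np

def collective_dephasing(phase):
    """Both qubits see the same unknown phase: |1> -> exp(i*phase)|1>."""
    u = np.diag([1.0, np.exp(1j * phase)])
    return np.kron(u, u)

# Logical qubit encoded in the DFS spanned by |01> and |10>
psi = np.zeros(4, dtype=complex)
psi[1], psi[2] = 1 / np.sqrt(2), 1j / np.sqrt(2)  # (|01> + i|10>)/sqrt(2)

out = collective_dephasing(0.7) @ psi
# |01> and |10> each pick up exp(i*phase), so the encoded state changes
# only by a global phase; the fidelity with the input stays exactly 1
overlap = abs(psi.conj() @ out)
```

The same logic, scaled up, is what lets the four-photon DFS states survive collective noise in the optical channels used for QKD.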
810 (2006); http://dx.doi.org/10.1063/1.2158720
In relativity, two simultaneous events at two different places are not simultaneous for observers in different Lorentz frames. In the Einstein-Podolsky-Rosen experiment, two simultaneous measurements are made at two different places. Would they still be simultaneous to observers in moving frames? This is a difficult question, but it is possible to study the problem in the microscopic world. In the hydrogen atom, the uncertainty can be considered to be entirely associated with the ground state. However, is there an uncertainty associated with the time-separation variable between the proton and electron? This time-separation variable is a forgotten, if not hidden, variable in the present form of quantum mechanics. The first step toward the simultaneity problem is to study the role of this time-separation variable in the Lorentz-covariant world. It is shown that this problem can be studied using harmonic oscillators applicable to hadrons, which are bound states of quarks. It is also possible to derive consequences that can be tested experimentally.
810 (2006); http://dx.doi.org/10.1063/1.2158721
We show that quantum mechanics can be represented as an asymptotic projection of the statistical mechanics of classical fields. Our approach does not contradict the rather common opinion that quantum mechanics cannot be reduced to the statistical mechanics of classical particles. Notions of system and causality can be re-established on the prequantum level, but the price is high: the infinite dimension of the phase space. In our approach quantum observables, symmetric operators in Hilbert space, are obtained as second-order derivatives of functionals of classical fields: f(ψ) → A_quant = f″(0). Statistical states are given by Gaussian ensembles of classical fields with zero mean value (so these are vacuum fluctuations) and dispersion α, which plays the role of a small parameter of the model (so these are small vacuum fluctuations). Our approach can be called Prequantum Classical Statistical Field Theory (PCSFT).
810 (2006); http://dx.doi.org/10.1063/1.2158722
Quantum mechanics is considered to arise from an underlying classical structure (“hidden variable theory”, “sub‐quantum mechanics”), where quantum fluctuations follow from a physical noise mechanism. The stability of the hydrogen ground state can then arise from a balance between Lorentz damping and energy absorption from the noise. Since the damping is weak, the ground state phase space density should predominantly be a function of the conserved quantities, energy and angular momentum.
A candidate for this phase space density is constructed for the ground state of the relativistic hydrogen problem of a spinless particle. The first excited states and their spherical harmonics are also considered in this framework.
The analytic expression for the ground state energy can be reproduced, provided averages of certain products are replaced by products of averages. This analysis suggests that quantum mechanics may arise from an underlying classical level as a slow-variable theory, where each new quantum operator relates to a new, well-separated time interval.
810 (2006); http://dx.doi.org/10.1063/1.2158723
Quantum teleportation has been discussed in several articles using several different schemes. In most models, complete teleportation can occur only if the entangled state shared by Alice and Bob is maximal. In earlier work, we reformulated teleportation and showed, in our scheme, that complete teleportation is possible even for a non-maximally entangled state.
810 (2006); http://dx.doi.org/10.1063/1.2158724
General relativity and quantum mechanics are not consistent with each other. This conflict stems from the very fundamental principles on which these theories are grounded. General relativity, on one hand, is based on the equivalence principle, whose strong version establishes the local equivalence between gravitation and inertia. Quantum mechanics, on the other hand, is fundamentally based on the uncertainty principle, which is essentially nonlocal. This difference precludes the existence of a quantum version of the strong equivalence principle, and consequently of a quantum version of general relativity. Furthermore, there is compelling experimental evidence that a quantum object in the presence of a gravitational field violates the weak equivalence principle. Now it so happens that, in addition to general relativity, gravitation has an alternative, though equivalent, description, given by teleparallel gravity, a gauge theory for the translation group. In this theory torsion, instead of curvature, is assumed to represent the gravitational field. These two descriptions lead to the same classical results, but are conceptually different. In general relativity, curvature geometrizes the interaction, while in teleparallel gravity torsion acts as a force, similar to the Lorentz force of electrodynamics. Because of this peculiar property, teleparallel gravity describes the gravitational interaction without requiring any of the versions of the equivalence principle. The replacement of general relativity by teleparallel gravity may, in consequence, lead to a conceptual reconciliation of gravitation with quantum mechanics.
810 (2006); http://dx.doi.org/10.1063/1.2158725
The article offers a rereading of Bohr’s reply to the argument of A. Einstein, B. Podolsky, and N. Rosen (EPR) concerning the incompleteness, or else nonlocality, of quantum mechanics. Bohr shows EPR’s argument to be deficient on their own terms, by virtue of an ambiguity found in their application of the criterion of reality they propose to the phenomena in question. He also offers an alternative interpretation of the EPR experiment itself, the thought experiment proposed by EPR, as part of his overall interpretation of quantum mechanics as complementarity.