Complex numbers are an intrinsic part of the mathematical formalism of quantum theory, and are perhaps its most mysterious feature. But what is their physical origin? In this talk, I show how it is possible to trace the complex nature of the quantum formalism directly to the symmetries associated with the basic operations by which elementary experiments are combined into more elaborate ones. In particular, I show that, by harnessing these symmetries, the Feynman rules of quantum theory can be derived from the assumption that a pair of real numbers is associated with each sequence of measurement outcomes, and that the probability of this sequence is a real-valued function of this number pair. The derivation has numerous intriguing implications, such as pointing to a deep connection between the foundations of quantum theory and the foundations of number systems. It also demonstrates that, contrary to the rather prevalent working hypothesis that the structure of the quantum formalism has something essentially to do with nonlocality, the core of the quantum formalism in fact does not depend in any essential way on the properties of space. Reference: "Origin of Complex Quantum Amplitudes and Feynman's Rules", Phys. Rev. A 81, 022109 (2010). Full text available at www.philipgoyal.org
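As a concrete, if schematic, illustration of the rules referred to above (this is a sketch of the standard Feynman rules written for real-number pairs, with illustrative names of my own, not a reproduction of the paper's derivation):

```python
# Minimal illustrative sketch (not a reproduction of the derivation in the talk):
# the Feynman rules written for pairs of real numbers (a1, a2), i.e. complex
# amplitudes in disguise. Function names are illustrative choices.
import math

def combine_in_series(u, v):
    """Pair for two sub-experiments performed in sequence:
    complex multiplication written out for real pairs."""
    return (u[0] * v[0] - u[1] * v[1], u[0] * v[1] + u[1] * v[0])

def combine_in_parallel(u, v):
    """Pair for two indistinguishable alternatives: component-wise addition."""
    return (u[0] + v[0], u[1] + v[1])

def probability(u):
    """Probability as a real-valued function of the pair: the squared modulus."""
    return u[0] ** 2 + u[1] ** 2

# Two-path interference: equal-magnitude pairs with a relative phase of pi/3.
path_a = (0.5, 0.0)
path_b = (0.5 * math.cos(math.pi / 3), 0.5 * math.sin(math.pi / 3))
print(probability(combine_in_parallel(path_a, path_b)))  # 0.25 + 0.25 + interference term
```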
Peter Evans The extent to which Julian Barbour's Machian formulation of general relativity and his interpretation of canonical quantum gravity can be called timeless is addressed. We differentiate two types of timelessness in Barbour's work (1994a, 1994b and 1999) and attempt to refine Barbour's metaphysical claim by providing an account of the essential features of time through considerations of the representation of time in physical theory. We argue that Barbour's claim of timelessness is dubious with respect to his Machian formulation of general relativity but warranted with respect to his interpretation of canonical quantum gravity. We conclude by discussing some of the implications of Barbour's view.
The aim of this talk is to review and discuss some aspects of quantum entanglement in the quantum field theoretic (QFT) domain. The discussion takes place in the algebraic approach to QFT, the motivation for which is briefly discussed. We consider in what sense this approach is sometimes called 'local quantum theory'. We discuss a possible 'realist' understanding of quantum entanglement within this framework, addressing some conceptual and methodological worries raised by Einstein (among others).
The de Broglie-Bohm pilot-wave program is an attempt to formulate quantum theory (including quantum field theory) as a theory without observers, by assuming that the wave-function is not the complete description of a system, but must be supplemented by additional variables (beables). Although much progress has been made in extending the pilot-wave theory to quantum field theory, a compelling ontology for quantum field theory is still lacking, and the choice of beable is likely to be relevant for the study of quantum non-equilibrium systems and their relaxation properties (Valentini). The present work is rooted in the fact that in the standard model of particle physics, all fermions are fundamentally massless and acquire their bare mass when the Higgs field condenses. In our attempt to build a pilot-wave model for quantum field theory in which beables are attributed to massless fermions, we are naturally led to Weyl spinors and to Penrose's zig-zag picture of the electron. In my talk, I will sketch this attempt and emphasize some of its remarkable properties: namely, that a positive-energy massive Dirac electron can be thought of as a superposition of positive- and negative-energy Weyl spinors of the same helicity, and that the massive Dirac electron can in principle move luminally at all times. Based on joint work with H. Wiseman.
The effects of closed timelike curves (CTCs) in quantum dynamics, and their consequences for information processing, have recently become the subject of a heated debate. Deutsch introduced a formalism for treating CTCs in a quantum computational framework. He postulated a consistency condition on the chronology-violating systems which led to a nonlinear evolution of the systems that come to interact with the CTC. This has been shown to allow tasks which are impossible in ordinary linear quantum evolution, such as computational speed-ups over (linear) quantum computers and perfectly distinguishing non-orthogonal quantum states. Bennett and co-authors have argued, on the other hand, that nonlinear evolution allows no such exotic effects. They argued that all proofs of exotic effects due to nonlinear evolutions suffer from a fallacy they called the "linearity trap". Here we review the argument of Bennett and co-authors and show that there is no inconsistency in assuming linearity at the level of a classical ensemble, even in the presence of nonlinear quantum evolution. In fact, this is required for the very existence of empirically verifiable nonlinear evolution. The arguments for exotic quantum effects are thus seen to rest on the necessity of a fundamental distinction between proper and improper mixtures in the presence of nonlinear evolutions. We show how this leads to an operationally well-defined version of the measurement problem that we call the "preparation problem".
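For readers unfamiliar with Deutsch's consistency condition mentioned above, the following sketch shows it for a single-qubit CTC, with an illustrative CNOT interaction and a simple iterate-to-a-fixed-point strategy (both choices are my own, not taken from the talk):

```python
# Hedged sketch of Deutsch's consistency condition for a single-qubit CTC:
# rho_CTC must satisfy rho_CTC = Tr_CR[ U (rho_CR (x) rho_CTC) U^dag ], where CR
# is the chronology-respecting system. The CNOT interaction and the iteration
# to a fixed point below are illustrative choices.
import numpy as np

# CNOT with the CR qubit as control and the CTC qubit as target.
U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=complex)

def deutsch_map(rho_ctc, rho_cr):
    """One application of the map rho_CTC -> Tr_CR[U (rho_CR (x) rho_CTC) U^dag]."""
    joint = U @ np.kron(rho_cr, rho_ctc) @ U.conj().T
    return np.einsum('aiaj->ij', joint.reshape(2, 2, 2, 2))  # trace out the CR factor

rho_cr = 0.5 * np.ones((2, 2), dtype=complex)        # CR qubit prepared in |+><+|
rho_ctc = np.array([[1, 0], [0, 0]], dtype=complex)  # arbitrary initial guess
for _ in range(100):                                 # iterate until self-consistent
    rho_ctc = deutsch_map(rho_ctc, rho_cr)
print(np.round(rho_ctc, 3))  # a fixed point: here the maximally mixed state
```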
Tim Ralph We consider quantum mechanical particles that traverse general relativistic wormholes in such a way that they can interact with their own past, thus forming closed timelike curves. Using a simple geometric argument we reproduce the solutions proposed by Deutsch for such systems. Deutsch's solutions have attracted considerable interest because they do not contain paradoxes; however, as originally posed, they do contain ambiguities. We show that these ambiguities are removed by following our geometric derivation.
Can a density matrix be regarded as a description of the physically real properties of an individual system? If so, it may be possible to attribute the same objective significance to statistical mechanical properties, such as entropy or temperature, as to properties such as mass or energy. Non-linear modifications to the evolution of a density matrix can be proposed, based upon this idea, to account for thermodynamic irreversibility. Traditional approaches to interpreting quantum phenomena assume that an individual system is described by a pure state, with density matrices arising only through a statistical mixture or through tracing out entangled degrees of freedom. Treating the density matrix as fundamental can affect the viability of some of these interpretations, and introducing thermodynamically motivated non-linearities will not, by itself, help in solving the quantum measurement problem.
Feynman showed that the path of least action is determined by quantum interference. The interference may be viewed as part of a quantum algorithm for minimising the action. In fact, Lloyd describes the Universe as a giant quantum computer whose purpose is to calculate its own state. Could the direction of time that the universe is apparently following be determined by a quantum algorithm? The answer lies in the violation of time reversal (T) invariance that is being observed in an increasing number of particle accelerator experiments. The violation signifies a fundamental asymmetry between the past and future and calls for a major shift in the way we think about time. Here we show that processes which violate T invariance induce destructive interference between different paths that the universe can take through time. The interference eliminates all paths except for two that represent continuously forwards and continuously backwards time evolution. This suggests that quantum interference from T-violating processes gives rise to the phenomenological unidirectional nature of time. A path consisting exclusively of forward steps is the shortest path to a point lying in the forwards direction. The quantum interference, therefore, underlies a quantum algorithm that determines the shortest path through time.
In an ontological model of quantum theory that is Bell-local, one can assume without loss of generality that the outcomes of measurements are determined deterministically by the ontic states (i.e. the values of the local hidden variables). The question I address in this talk is whether such determinism can always be assumed in a noncontextual ontological model of quantum theory, in particular whether it can be assumed for nonprojective measurements. While it is true that one can always represent a measurement by a deterministic response function by incorporating ancillary degrees of freedom into one's description (for instance those of the apparatus), I show that in moving to such a representation, one typically loses the warrant to apply the assumption of measurement noncontextuality. The implications for experimental tests of measurement noncontextuality will be discussed.
Traditionally, the search for characteristic properties of quantum mechanics has focused on properties such as entanglement. However, entanglement is a property of multiple systems. Another interesting question is to ask what properties are characteristic of single quantum systems. Two answers to this question are: 1. There is a continuous path of pure quantum states connecting any two quantum states [1], and 2. Quantum mechanics is preparation noncontextual [2]. In this talk, I will discuss a link between these two answers. In particular, I will establish some strict upper bounds on the maximum size of the set of quantum states that can be modelled in a preparation noncontextual, nonnegative theory, and show that this set contains pure states that cannot be connected to any other pure state in the set. I will also discuss a common example of a preparation noncontextual model that allows negative values, namely a discrete Wigner function, and establish necessary and sufficient conditions for bases of an arbitrary-dimensional Hilbert space to have nonnegative Wigner functions, i.e., to admit a classical model. I will conclude with a discussion of some open problems. [1] L. Hardy, quant-ph/0101012v4 (2001). [2] R. W. Spekkens, Phys. Rev. A 71, 052108 (2005).
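As a small, self-contained example of the kind of model mentioned above, the following sketch computes the single-qubit discrete Wigner function in Wootters' phase-point-operator form and shows how negativity is diagnosed; the talk's general, arbitrary-dimension conditions are not reproduced here:

```python
# Hedged sketch: the single-qubit discrete Wigner function built from Wootters'
# phase-point operators, used only to illustrate how negativity is detected.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def phase_point_operator(q, p):
    """A(q,p) = (1/2)[I + (-1)^q Z + (-1)^p X + (-1)^(q+p) Y]."""
    return 0.5 * (I + (-1) ** q * Z + (-1) ** p * X + (-1) ** (q + p) * Y)

def discrete_wigner(rho):
    """W(q,p) = (1/2) Tr[rho A(q,p)]; the four values sum to 1."""
    return np.array([[np.real(np.trace(rho @ phase_point_operator(q, p)))
                      for p in range(2)] for q in range(2)]) / 2

# A stabilizer state (|0><0|) is nonnegative; a "magic" state is not.
rho_zero = np.array([[1, 0], [0, 0]], dtype=complex)
magic = np.array([1, np.exp(1j * np.pi / 4)]) / np.sqrt(2)
rho_magic = np.outer(magic, magic.conj())
print(discrete_wigner(rho_zero))   # all entries >= 0
print(discrete_wigner(rho_magic))  # contains a negative entry
```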
Matt Palmer An explicit description of a physical system is necessarily written with respect to a particular reference frame. It is important to know how to adapt the description when a different, equally valid, reference frame is chosen. In the case of classical frames there is a well-defined covariance of the description. The question we want to address is: how can we extend this description of a change of reference frame to the case where the frames are quantum objects? We study this problem within specific toy models, and approach it operationally. We define a procedure that changes the quantum reference frame with which a quantum system is described. We find that this procedure induces decoherence in the system and is described by a non-unitary CP map, in interesting contrast to the reversible nature of the classical change-of-frame procedures.
Information theory provides a novel approach to the study of the consequences of symmetry of dynamics, one which goes far beyond the traditional conservation laws and Noether's theorem. Conservation laws are not applicable to dissipative and open systems. In fact, as we will show, even in the case of closed-system dynamics, if the state of the system is not pure, the conservation laws do not capture all the consequences of symmetry. Using an information-theoretic approach to this problem, we introduce new quantities, called asymmetry monotones, which are constants of the motion if the system is closed and are always non-increasing if the system is open. We also explain how different results in quantum information theory can have non-trivial consequences for the symmetric dynamics of quantum systems.
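By way of illustration, one standard asymmetry monotone that can be computed directly is the relative entropy of asymmetry, the entropy gained under twirling over the symmetry group; whether this particular monotone is among those introduced in the talk is an assumption on my part:

```python
# Hedged sketch: the relative entropy of asymmetry A(rho) = S(G[rho]) - S(rho),
# computed for a qubit under U(1) phase rotations generated by Z (twirling then
# reduces to dephasing in the Z basis). Illustrative example only.
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log rho], in nats, ignoring zero eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def twirl_u1(rho):
    """Group average over exp(-i Z t): removes the off-diagonal (coherent) terms."""
    return np.diag(np.diag(rho))

def relative_entropy_of_asymmetry(rho):
    return von_neumann_entropy(twirl_u1(rho)) - von_neumann_entropy(rho)

plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|: asymmetric
mixed = np.eye(2, dtype=complex) / 2                      # symmetric state
print(relative_entropy_of_asymmetry(plus))   # log 2 ~ 0.693
print(relative_entropy_of_asymmetry(mixed))  # 0.0
```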
Ken Wharton An analysis of the path-integral approach to quantum theory motivates the hypothesis that two experiments with the same classical action should have dual ontological descriptions. If correct, this hypothesis would not only constrain realistic interpretations of quantum theory, but would also act as a constructive principle, allowing any realistic model of one experiment to generate a corresponding model for its action-dual. Two pairs of action-dual experiments will be presented, including one experiment that violates the Bell inequality and yet is action-dual to a single particle. Demanding a consistent, realistic ontology leads to a highly restricted parameter space of possible interpretations.
Wheeler's delayed choice (WDC) is one of the "standard experiments in foundations". It addresses the puzzle of a photon simultaneously behaving as wave and particle. The Bohr-Einstein debate on wave-particle duality prompted the introduction of Bohr's principle of complementarity: "... the study of complementary phenomena demands mutually exclusive experimental arrangements". In the WDC experiment the mutually exclusive setups correspond to the presence or absence of a second beamsplitter in a Mach-Zehnder interferometer (MZI). A choice of the setup determines the observed behaviour. The delay ensures that the behaviour cannot be adapted before the photon enters the MZI. Using WDC as an example, we show how the replacement of classical selectors by quantum gates streamlines experiments and impacts on foundational questions. We demonstrate measurements of complementary phenomena with a single setup, where the observed behaviour of the photon is chosen after it has already been detected. Spacelike separation of the setup components becomes redundant. The complementarity principle has to be reformulated: instead of complementarity of experimental setups we now have complementarity of measurement results. Finally, we present a quantum-controlled scheme for Bell-type experiments.
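A minimal numerical sketch of a quantum-controlled second beamsplitter is given below; the modelling choices (a Hadamard standing in for a 50/50 beamsplitter, an ancilla prepared in cos(a)|0> + sin(a)|1>) are illustrative and not taken from the experiment described above:

```python
# Hedged sketch: a Mach-Zehnder interferometer whose second "beamsplitter"
# (modelled as a Hadamard) is applied conditioned on a control qubit.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)

def output_probs(phase, alpha):
    """Detection probabilities at the two output ports, control traced out."""
    # Photon path qubit after the first beamsplitter and a relative phase.
    path = np.array([1, np.exp(1j * phase)]) / np.sqrt(2)
    control = np.array([np.cos(alpha), np.sin(alpha)])
    state = np.kron(control, path)                     # |control> (x) |path>
    # Controlled-Hadamard: apply the second beamsplitter only if control = |1>.
    CH = np.block([[I2, np.zeros((2, 2))],
                   [np.zeros((2, 2)), H]]).astype(complex)
    state = CH @ state
    probs = np.abs(state.reshape(2, 2)) ** 2           # rows: control, cols: path
    return probs.sum(axis=0)

print(output_probs(phase=0.0, alpha=np.pi / 2))  # wave behaviour: full interference
print(output_probs(phase=0.0, alpha=0.0))        # particle behaviour: 50/50
```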
Quantum correlations cannot be given any classical explanation that would satisfy Bell's local causality assumption. This quite intriguing feature of quantum theory, known as quantum non-locality, has fascinated physicists for years, and has more recently been proven to have interesting applications in quantum information processing. To properly understand the power of quantum non-locality, it is important to be able to quantify it. One way to do so is to compare it to other "non-local resources", such as classical communication or "non-local Popescu-Rohrlich (PR) boxes", and to try to use these alternative resources to reproduce the quantum correlations. I will review known results on this subject, and present new simulations of multipartite non-local correlations.
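For reference, the PR box invoked above is fully specified by one line of arithmetic: on inputs x, y in {0,1} it returns locally uniform outputs a, b with a XOR b = x AND y, winning the CHSH game with certainty (correlator value 4, beyond the quantum bound of 2*sqrt(2)). A minimal sketch:

```python
# Hedged sketch: one use of a Popescu-Rohrlich (PR) box as a nonlocal resource.
import random

def pr_box(x, y):
    """Uniformly random output a, with b fixed by the constraint a ^ b = x & y."""
    a = random.randint(0, 1)
    b = a ^ (x & y)
    return a, b

# Empirical check of the defining correlation.
trials = [(x, y, *pr_box(x, y)) for x in (0, 1) for y in (0, 1) for _ in range(1000)]
assert all((a ^ b) == (x & y) for x, y, a, b in trials)
print("PR-box condition a XOR b = x AND y holds on all samples")
```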
To model statistical correlations that violate Bell inequalities (such as singlet-state correlations), one must relax at least one of three physically plausible postulates: measurement independence (experimenters can freely choose measurement settings independently of any underlying variables describing the system); no-signalling (underlying marginal distributions for one observer cannot depend on the measurement setting of a distant observer); and determinism (all outcomes can be fully determined by the values of underlying variables). It will be shown that, for any given model, one may quantify the degrees of measurement dependence, signalling and indeterminism by three numbers M, S and I. It will further be shown how the Bell-CHSH inequality may be generalised to a "relaxed" Bell inequality, of the form ⟨XY⟩ + ⟨XY'⟩ + ⟨X'Y⟩ - ⟨X'Y'⟩ <= B(I,S,M), where the upper bound is tight and ranges between 2 and 4. The usual Bell-CHSH inequality corresponds to I=S=M=0. More generally, the bound B(I,S,M) quantifies the necessary mutual tradeoff between I, S and M that is required to model a given violation of the Bell-CHSH inequality. Some information-theoretic implications will be briefly described, as well as a no-signalling deterministic model of the singlet state that allows up to 86% experimental free will.
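As a quick numerical check of the standard (I = S = M = 0) case of the inequality above, the following sketch evaluates the CHSH combination for the singlet state at the usual optimal settings; the general bound B(I,S,M) derived in the talk is not reproduced here:

```python
# Hedged sketch: the singlet-state CHSH value at optimal settings, compared
# with the local bound B(0,0,0) = 2.
import numpy as np

def correlator(theta_a, theta_b):
    """Singlet correlator E(a,b) = -cos(theta_a - theta_b) for measurements
    in a common plane at angles theta_a, theta_b."""
    return -np.cos(theta_a - theta_b)

# Optimal CHSH settings for the singlet.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, -np.pi / 4

chsh = correlator(a, b) + correlator(a, b2) + correlator(a2, b) - correlator(a2, b2)
print(abs(chsh))  # 2*sqrt(2) ~ 2.828, exceeding the local bound of 2
```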
Karim Thebault Canonical quantization techniques are generally considered to provide one of the most rigorous methodologies for passing from a classical to a quantum description of reality. For classical Hamiltonian systems with constraints a number of such techniques are available (i.e. gauge fixing, Dirac constraint quantization, BRST quantization and geometric quantization), but all are arguably equivalent to the quantization of an underlying reduced phase space that parameterizes the "true degrees of freedom" and displays a symplectic geometric structure. The philosophical coherence of making any ontological investment in such a space for the case of canonical general relativity will be questioned here. Further to this, the particular example of Dirac quantization will be critically examined. Under the Dirac scheme the classical constraint functions are interpreted as quantum constraint operators restricting the allowed state vectors. For canonical general relativity this leads to the Wheeler-DeWitt equation and the infamous problem of time but, prima facie, seems to rely on our interpretation of the classical Poisson bracket algebra of constraints as the phase space realization of the theory's local symmetries (i.e. the group of space-time diffeomorphisms). As with the construction of an interpretively viable symplectic reduced phase space, this straightforward connection between constraints and local symmetry will be questioned for the case of GR. These issues cast doubt on the basis behind the derivation of the so-called wave function of the universe and give us some grounds for re-examining the entire canonical quantum gravity program as currently constituted.
Shan Gao We investigate the validity of the field explanation of the wave function by analyzing the mass and charge density distributions of a quantum system. According to protective measurement, a charged quantum system has effective mass and charge density distributed in space, proportional to the square of the absolute value of its wave function. If the wave function is a description of a physical field, then the mass and charge density will be distributed in space simultaneously for a charged quantum system, and thus there will exist a remarkable electrostatic self-interaction of its wave function, though the gravitational self-interaction is too weak to be detected presently. This not only violates the superposition principle of quantum mechanics but also contradicts experimental observations.