Preprint

Deriving robust noncontextuality inequalities from algebraic proofs of the Kochen-Specker theorem: the Peres-Mermin square


Abstract

When a measurement is compatible with each of two other measurements that are incompatible with one another, these define distinct contexts for the given measurement. The Kochen-Specker theorem rules out models of quantum theory that satisfy a particular assumption of context-independence: that sharp measurements are assigned outcomes both deterministically and independently of their context. This notion of noncontextuality is not suited to a direct experimental test because realistic measurements always have some degree of unsharpness due to noise. However, a generalized notion of noncontextuality has been proposed that is applicable to any experimental procedure, not only unsharp measurements but preparations as well, and for which a quantum no-go result still holds. According to this notion, the model need only specify a probability distribution over the outcomes of a measurement in a context-independent way, rather than specifying a particular outcome. It also implies novel constraints of context-independence on the representation of preparations. In this article, we describe a general technique for translating proofs of the Kochen-Specker theorem into inequality constraints on realistic experimental statistics, the violation of which witnesses the impossibility of a noncontextual model. We focus on algebraic state-independent proofs, using the Peres-Mermin square as our illustrative example. Our technique yields the necessary and sufficient conditions for a particular set of correlations (between the preparations and the measurements) to admit a noncontextual model. The inequalities thus derived are demonstrably robust to noise. We specify how experimental data must be processed in order to achieve a test of these inequalities. We also provide a criticism of prior proposals for experimental tests of noncontextuality based on the Peres-Mermin square.
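The algebraic contradiction at the heart of the Peres-Mermin square can be verified directly. The following sketch uses the standard assignment of two-qubit Pauli observables to the square: every row of observables multiplies to +I and every column to +I except the third, which multiplies to -I, so no context-independent assignment of ±1 outcomes can satisfy all six product constraints.

```python
import itertools
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

kron = np.kron

# The Peres-Mermin square: nine two-qubit observables arranged so that
# the three observables in each row and in each column commute.
square = [
    [kron(X, I2), kron(I2, X), kron(X, X)],
    [kron(I2, Y), kron(Y, I2), kron(Y, Y)],
    [kron(X, Y), kron(Y, X), kron(Z, Z)],
]

I4 = np.eye(4, dtype=complex)

# Row products are all +I; column products are +I, +I, and -I.
for r in range(3):
    assert np.allclose(square[r][0] @ square[r][1] @ square[r][2], I4)
for c in range(3):
    prod = square[0][c] @ square[1][c] @ square[2][c]
    assert np.allclose(prod, -I4 if c == 2 else I4)

# A deterministic noncontextual value assignment gives each observable a
# fixed value +1 or -1, independent of its row/column context.  Brute force
# over all 2^9 assignments shows none reproduces the six product constraints.
consistent = 0
for v in itertools.product([1, -1], repeat=9):
    rows_ok = all(v[3*r] * v[3*r+1] * v[3*r+2] == 1 for r in range(3))
    cols_ok = all(v[c] * v[c+3] * v[c+6] == (-1 if c == 2 else 1)
                  for c in range(3))
    consistent += rows_ok and cols_ok
print(consistent)  # → 0: no noncontextual deterministic assignment exists
```

The parity argument behind the brute force: multiplying all six constraints together, each of the nine values appears twice on the left (giving +1), while the right-hand side is -1.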


Article
Full-text available
Many experiments in the field of quantum foundations seek to adjudicate between quantum theory and speculative alternatives to it. To do so, one must analyse the experimental data in a manner that does not presume the correctness of the quantum formalism. The mathematical framework of generalized probabilistic theories (GPTs) provides a means of doing so. We present a scheme for determining what GPTs are consistent with a given set of experimental data. It proceeds by performing tomography on the preparations and measurements in a self-consistent manner, i.e., without presuming a prior characterization of either. We illustrate the scheme by analyzing experimental data for a large set of preparations and measurements on the polarization degree of freedom of a single photon. We find that the smallest and largest GPT state spaces consistent with our data are a pair of polytopes, each approximating the shape of the Bloch sphere and having a volume ratio of 0.977 ± 0.001, which provides a quantitative bound on the scope for deviations from quantum theory. We also demonstrate how our scheme can be used to bound the extent to which nature might be more nonlocal than quantum theory predicts, as well as the extent to which it might be more or less contextual. Specifically, we find that the maximal violation of the CHSH inequality can be at most 1.3% ± 0.1% greater than the quantum prediction, and the maximal violation of a particular noncontextuality inequality cannot differ from the quantum prediction by more than this factor on either side.
Article
Full-text available
We provide a scheme for inferring causal relations from uncontrolled statistical data based on tools from computational algebraic geometry, in particular, the computation of Groebner bases. We focus on causal structures containing just two observed variables, each of which is binary. We consider the consequences of imposing different restrictions on the number and cardinality of latent variables and of assuming different functional dependences of the observed variables on the latent ones (in particular, the noise need not be additive). We provide an inductive scheme for classifying functional causal structures into distinct observational equivalence classes. For each observational equivalence class, we provide a procedure for deriving constraints on the joint distribution that are necessary and sufficient conditions for it to arise from a model in that class. We also demonstrate how this sort of approach provides a means of determining which causal parameters are identifiable and how to solve for these. Prospects for expanding the scope of our scheme, in particular to the problem of quantum causal inference, are also discussed.
Article
Full-text available
We experimentally show that nonlocality can be produced from single-particle contextuality by using two-particle correlations which do not violate any Bell inequality by themselves. This demonstrates that nonlocality can arise from an a priori different and simpler phenomenon, and it connects contextuality and nonlocality, the two critical resources for quantum computation and secure communication, respectively. From the perspective of quantum information, our experiment constitutes a proof of principle that quantum systems can be used simultaneously for both quantum computation and secure communication.
Article
Full-text available
Random access coding is an information task that has been extensively studied and found many applications in quantum information. In this scenario, Alice receives an n-bit string x and wishes to encode x into a quantum state ρ_x, such that Bob, when receiving the state ρ_x, can choose any bit i ∈ [n] and recover the input bit x_i with high probability. Here we study two variants: parity-oblivious random access codes (RACs), where we impose the cryptographic property that Bob cannot infer any information about the parity of any subset of bits of the input apart from the single bits x_i; and even-parity-oblivious RACs, where Bob cannot infer any information about the parity of any even-size subset of bits of the input. In this paper, we provide the optimal bounds for parity-oblivious quantum RACs and show that they are asymptotically better than the optimal classical ones. Our results provide a large non-contextuality inequality violation and resolve the main open problem in a work of Spekkens et al (2009 Phys. Rev. Lett. 102 010401). Second, we provide the optimal bounds for even-parity-oblivious RACs by proving their equivalence to a non-local game and by providing tight bounds for the success probability of the non-local game via semidefinite programming. In the case of even-parity-oblivious RACs, the cryptographic property holds also in the device-independent model.
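The gap between classical and quantum RACs can be made concrete in the simplest case. The following sketch brute-forces the ordinary (not parity-oblivious) classical 2-to-1 RAC with a one-bit message, confirming the well-known optimal deterministic success probability of 3/4; the quantum protocol achieves cos²(π/8) ≈ 0.854.

```python
import itertools

# Brute-force the optimal classical 2-to-1 random access code with a
# one-bit message: Alice encodes x = (x0, x1) into one bit m = E(x);
# Bob, asked for bit i, outputs D_i(m).  We average success over the
# four inputs and two questions, maximized over deterministic strategies
# (shared randomness cannot beat the best deterministic strategy).
inputs = list(itertools.product([0, 1], repeat=2))

best = 0.0
for E in itertools.product([0, 1], repeat=4):        # 16 encodings {0,1}^2 -> {0,1}
    for D0 in itertools.product([0, 1], repeat=2):   # 4 decoders for question i=0
        for D1 in itertools.product([0, 1], repeat=2):
            wins = 0
            for k, (x0, x1) in enumerate(inputs):
                m = E[k]
                wins += (D0[m] == x0) + (D1[m] == x1)
            best = max(best, wins / 8)
print(best)  # → 0.75, the classical bound; quantum achieves cos^2(pi/8) ≈ 0.854
```

With one message bit, Bob's two answers are fixed functions of m, so at most two of the four inputs can score perfectly, which is what caps the average at 6/8.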
Article
Full-text available
A central question in quantum computation is to identify the resources that are responsible for quantum speed-up. Quantum contextuality has been recently shown to be a resource for quantum computation with magic states for odd-prime dimensional qudits and two-dimensional systems with real wave functions. The phenomenon of state-independent contextuality poses a priori an obstruction to characterizing the case of regular qubits, the fundamental building block of quantum computation. Here, we establish contextuality of magic states as a necessary resource for a large class of quantum computation schemes on qubits. We illustrate our result with a concrete scheme related to measurement-based quantum computation.
Article
Full-text available
The problem of causal inference is to determine if a given probability distribution on observed variables is compatible with some causal structure. The difficult case is when the structure includes latent variables. We here introduce the inflation technique for tackling this problem. An inflation of a causal structure is a new causal structure that can contain multiple copies of each of the original variables, but where the ancestry of each copy mirrors that of the original. For every distribution compatible with the original causal structure we identify a corresponding family of distributions, over certain subsets of inflation variables, which is compatible with the inflation structure. It follows that compatibility constraints at the inflation level can be translated to compatibility constraints at the level of the original causal structure; even if the former are weak, such as observable statistical independences implied by disjoint causal ancestry, the translated constraints can be strong. In particular, we can derive inequalities whose violation by a distribution witnesses that distribution's incompatibility with the causal structure (of which Bell inequalities and Pearl's instrumental inequality are prominent examples). We describe an algorithm for deriving all of the inequalities for the original causal structure that follow from ancestral independences in the inflation. Applied to an inflation of the Triangle scenario with binary variables, it yields inequalities that are stronger in at least some aspects than those obtainable by existing methods. We also describe an algorithm that derives a weaker set of inequalities but is much more efficient. Finally, we discuss which inflations
Article
Full-text available
One of the fundamental results in quantum foundations is the Kochen-Specker no-go theorem. For quantum theory, the no-go theorem excludes the possibility of a class of hidden variable models where value attribution is context-independent. Recently, the notion of contextuality has been generalized to different operational procedures, and it has been shown that preparation contextuality of mixed quantum states can be a useful resource in an information-processing task called parity-oblivious multiplexing. Here, we introduce a new class of information-processing tasks, namely d-level parity-oblivious random access codes, and obtain bounds on the success probabilities of performing such tasks in any preparation-noncontextual theory. These bounds constitute noncontextuality inequalities for any value of d. For d = 3, using a set of mutually asymmetric biased bases, we show that the corresponding noncontextual bound is violated by quantum theory. We also show quantum violation of the inequalities for some other higher values of d. This reveals the operational usefulness of preparation contextuality of higher-level quantum systems.
Article
Full-text available
To make precise the sense in which nature fails to respect classical physics, one requires a formal notion of classicality. Ideally, such a notion should be defined operationally, so that it can be subject to direct experimental test, and it should be applicable in a wide variety of experimental scenarios so that it can cover the breadth of phenomena thought to defy classical understanding. Bell's notion of local causality fulfils the first criterion but not the second. The notion of noncontextuality fulfils the second criterion, but it is a long-standing question whether it can be made to fulfil the first. Previous attempts to test noncontextuality have all assumed idealizations that real experiments cannot achieve, namely noiseless measurements and exact operational equivalences. Here we show how to devise tests that are free of these idealizations. We perform a photonic implementation of one such test, ruling out noncontextual models with high confidence.
Article
Full-text available
Does the quantum state represent reality or our knowledge of reality? In making this distinction precise, we are led to a novel classification of hidden variable models of quantum theory. We show that representatives of each class can be found among existing constructions for two-dimensional Hilbert spaces. Our approach also provides a fruitful new perspective on arguments for the nonlocality and incompleteness of quantum theory. Specifically, we show that for models wherein the quantum state has the status of something real, the failure of locality can be established through an argument considerably more straightforward than Bell's theorem. The historical significance of this result becomes evident when one recognizes that the same reasoning is present in Einstein's preferred argument for incompleteness, which dates back to 1935. This fact suggests that Einstein was seeking not just any completion of quantum theory, but one wherein quantum states are solely representative of our knowledge. Our hypothesis is supported by an analysis of Einstein's attempts to clarify his views on quantum theory and the circumstance of his otherwise puzzling abandonment of an even simpler argument for incompleteness from 1927.
Article
Full-text available
It is a recent realization that many of the concepts and tools of causal discovery in machine learning are highly relevant to problems in quantum information, in particular quantum nonlocality. The crucial ingredient in the connection between both fields is the tool of Bayesian networks, a graphical model used to reason about probabilistic causation. Indeed, Bell's theorem concerns a particular kind of Bayesian network, and Bell inequalities are a special case of linear constraints following from such models. It is thus natural to look for generalized Bell scenarios involving more complex Bayesian networks. The problem, however, lies in the fact that such generalized scenarios are characterized by polynomial Bell inequalities, and no current method is available to derive them beyond very simple cases. In this work, we make a significant step in that direction, providing a general and practical method for the derivation of polynomial Bell inequalities in a wide class of scenarios, applying it to a few cases of interest. We also show how our construction naturally gives rise to a notion of non-signalling in generalized networks.
Article
Full-text available
The Kochen-Specker theorem demonstrates that it is not possible to reproduce the predictions of quantum theory in terms of a hidden variable model where the hidden variables assign a value to every projector deterministically and noncontextually. A noncontextual value-assignment to a projector is one that does not depend on which other projectors - the context - are measured together with it. Using a generalization of the notion of noncontextuality that applies to both measurements and preparations, we propose a scheme for deriving inequalities that test whether a given set of experimental statistics is consistent with a noncontextual model. Unlike previous inequalities inspired by the Kochen-Specker theorem, we do not assume that the value-assignments are deterministic and therefore in the face of a violation of our inequality, the possibility of salvaging noncontextuality by abandoning determinism is no longer an option. Our approach is operational in the sense that it does not presume quantum theory: a violation of our inequality implies the impossibility of a noncontextual model for any operational theory that can account for the experimental observations, including any successor to quantum theory.
Article
Full-text available
In order to claim that one has experimentally tested whether a noncontextual ontological model could underlie certain measurement statistics in quantum theory, it is necessary to have a notion of noncontextuality that applies to unsharp measurements, i.e., those that can only be represented by positive operator-valued measures rather than projection-valued measures. This is because any realistic measurement necessarily has some nonvanishing amount of noise and therefore never achieves the ideal of sharpness. Assuming a generalized notion of noncontextuality that applies to arbitrary experimental procedures, it is shown that the outcome of a measurement depends deterministically on the ontic state of the system being measured if and only if the measurement is sharp. Hence for every unsharp measurement, its outcome necessarily has an indeterministic dependence on the ontic state. We defend this proposal against alternatives. In particular, we demonstrate why considerations parallel to Fine’s theorem do not challenge this conclusion.
Article
Full-text available
In many a traditional physics textbook, a quantum measurement is defined as a projective measurement represented by a Hermitian operator. In quantum information theory, however, the concept of a measurement is dealt with in complete generality and we are therefore forced to confront the more general notion of positive-operator valued measures (POVMs), which suffice to describe all measurements that can be implemented in quantum experiments. We study the (in)compatibility of such POVMs and show that quantum theory realizes all possible (in)compatibility relations among sets of POVMs. This is in contrast to the restricted case of projective measurements for which commutativity is essentially equivalent to compatibility. Our result therefore points out a fundamental feature with respect to the (in)compatibility of quantum observables that has no analog in the case of projective measurements.
Article
Full-text available
The fields of quantum non-locality in physics, and causal discovery in machine learning, both face the problem of deciding whether observed data is compatible with a presumed causal relationship between the variables (for example a local hidden variable model). Traditionally, Bell inequalities have been used to describe the restrictions imposed by causal structures on marginal distributions. However, some structures give rise to non-convex constraints on the accessible data, and it has recently been noted that linear inequalities on the observable entropies capture these situations more naturally. In this paper, we show the versatility of the entropic approach by greatly expanding the set of scenarios for which entropic constraints are known. For the first time, we treat Bell scenarios involving multiple parties and multiple observables per party. Going beyond the usual Bell setup, we exhibit inequalities for scenarios with extra conditional independence assumptions, as well as a limited amount of shared randomness between the parties. Many of our results are based on a geometric observation: Bell polytopes for two-outcome measurements can be naturally embedded into the convex cone of attainable marginal entropies. Thus, any entropic inequality can be translated into one valid for probabilities. In some situations the converse also holds, which provides us with a rich source of candidate entropic inequalities.
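As a toy illustration of the "convex cone of attainable marginal entropies", the following sketch (an assumption for concreteness: two binary observed variables, entropies in bits) samples random joint distributions and checks that every entropy vector (H(A), H(B), H(AB)) obeys the elementary Shannon-cone inequalities of monotonicity and subadditivity.

```python
import math
import random

def H(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def entropy_vector(joint):
    """joint[a][b] -> (H(A), H(B), H(AB)) for two binary variables."""
    pa = [sum(row) for row in joint]            # marginal of A
    pb = [sum(col) for col in zip(*joint)]      # marginal of B
    pab = [q for row in joint for q in row]     # joint distribution, flattened
    return H(pa), H(pb), H(pab)

random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(4)]
    s = sum(w)
    joint = [[w[0] / s, w[1] / s], [w[2] / s, w[3] / s]]
    hA, hB, hAB = entropy_vector(joint)
    # Monotonicity (H(AB) >= H(A), H(AB) >= H(B)) and subadditivity
    # (H(AB) <= H(A) + H(B)): the Shannon-cone constraints that every
    # attainable marginal-entropy vector must satisfy.
    assert hAB >= hA - 1e-12 and hAB >= hB - 1e-12
    assert hAB <= hA + hB + 1e-12
print("all sampled entropy vectors lie in the Shannon cone")
```

Entropic Bell inequalities are linear inequalities on such entropy vectors, which is what makes them convex even when the underlying constraints on probabilities are not.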
Article
Full-text available
We present a number of observables-based proofs of the Kochen-Specker (KS) theorem based on the N-qubit Pauli group for N >= 4, thus adding to the proofs that have been presented earlier for the two- and three-qubit groups. These proofs have the attractive feature that they can be presented in the form of diagrams from which they are obvious by inspection. They are also irreducible in the sense that they cannot be reduced to smaller proofs by ignoring some subset of qubits and/or observables in them. A simple algorithm is given for transforming any observables-based KS proof into a large number of projectors-based KS proofs; if the observables-based proof has O observables, with each observable occurring in exactly two commuting sets and any two commuting sets having at most one observable in common, the number of associated projectors-based parity proofs is 2^O. We introduce symbols for the observables- and projectors-based KS proofs that capture their important features and also convey a feeling for the enormous variety of both these types of proofs within the N-qubit Pauli group. We discuss an infinite family of observables-based proofs, whose members apply to all numbers of qubits from two up, and show how it can be used to generate projectors-based KS proofs involving only nine bases (or experimental contexts) in any dimension of the form 2^N for N >= 2. Some implications of our results are discussed.
Article
Full-text available
So far, most of the literature on (quantum) contextuality and the Kochen-Specker theorem seems either to concern particular examples of contextuality, or be considered as quantum logic. Here, we develop a general formalism for contextuality scenarios based on the combinatorics of hypergraphs which significantly refines a similar recent approach by Cabello, Severini and Winter (CSW). In contrast to CSW, we explicitly include the normalization of probabilities, which gives us a much finer control over the various sets of probabilistic models like classical, quantum and generalized probabilistic. In particular, our framework specializes to (quantum) nonlocality in the case of Bell scenarios, which arise very naturally from the Foulis-Randall product. In the spirit of CSW, we find close relationships to various invariants studied in combinatorics. The recently proposed Local Orthogonality Principle turns out to be a special case of a general principle for contextuality scenarios related to the Shannon capacity of graphs. Our results imply that it is dominated by a low level of the Navascués-Pironio-Acín hierarchy of semidefinite programs, which we apply to contextuality scenarios. We hope that our approach may also serve as an introduction for combinatorialists to the subject of nonlocality and contextuality. Our conjectures on graphs whose Shannon capacity coincides with their independence number may be of particular interest.
Article
Full-text available
We present an experimental state-independent violation of an inequality for noncontextual theories on single particles. We show that 20 different single-photon states violate an inequality which involves correlations between results of sequential compatible measurements by at least 419 standard deviations. Our results show that, for any physical system, even for a single system, and independent of its state, there is a universal set of tests whose results do not admit a noncontextual interpretation. This sheds new light on the role of quantum mechanics in quantum information processing.
Article
Full-text available
The question of whether quantum phenomena can be explained by classical models with hidden variables is the subject of a long-lasting debate. In 1964, Bell showed that certain types of classical models cannot explain the quantum mechanical predictions for specific states of distant particles. Along this line, some types of hidden variable models have been experimentally ruled out. An intuitive feature for classical models is non-contextuality: the property that any measurement has a value which is independent of other compatible measurements being carried out at the same time. However, the results of Kochen, Specker, and Bell show that non-contextuality is in conflict with quantum mechanics. The conflict resides in the structure of the theory and is independent of the properties of special states. It has been debated whether the Kochen-Specker theorem could be experimentally tested at all. Only recently, first tests of quantum contextuality have been proposed and undertaken with photons and neutrons. Yet these tests required the generation of special quantum states and left various loopholes open. Here, using trapped ions, we experimentally demonstrate a state-independent conflict with non-contextuality. The experiment is not subject to the detection loophole and we show that, despite imperfections and possible measurement disturbances, our results cannot be explained in non-contextual terms.
Article
Full-text available
In a noncontextual hidden variable model of quantum theory, hidden variables determine the outcomes of every measurement in a manner that is independent of how the measurement is implemented. Using a generalization of this notion to arbitrary operational theories and to preparation procedures, we demonstrate that a particular two-party information-processing task, "parity-oblivious multiplexing," is powered by contextuality in the sense that there is a limit to how well any theory described by a noncontextual hidden variable model can perform. This bound constitutes a "noncontextuality inequality" that is violated by quantum theory. We report an experimental violation of this inequality in good agreement with the quantum predictions. The experimental results also provide the first demonstration of 2-to-1 and 3-to-1 quantum random access codes.
Article
Full-text available
General Terms: Algorithms, Reliability. Additional Key Words and Phrases: convex hull, Delaunay triangulation, Voronoi diagram, halfspace intersection. The convex hull of a set of points is the smallest convex set that contains the points. The convex hull is a fundamental construction for mathematics and computational geometry. For example, Boardman uses the convex h...
Article
Full-text available
An operational definition of contextuality is introduced which generalizes the standard notion in three ways: (1) it applies to arbitrary operational theories rather than just quantum theory, (2) it applies to arbitrary experimental procedures, rather than just sharp measurements, and (3) it applies to a broad class of ontological models of quantum theory, rather than just deterministic hidden variable models. We derive three no-go theorems for ontological models, each based on an assumption of noncontextuality for a different sort of experimental procedure; one for preparation procedures, another for unsharp measurement procedures (that is, measurement procedures associated with positive-operator valued measures), and a third for transformation procedures. All three proofs apply to two-dimensional Hilbert spaces, and are therefore stronger than traditional proofs of contextuality.
Article
Correlations in Bell and noncontextuality inequalities can be expressed as a positive linear combination of probabilities of events. Exclusive events can be represented as adjacent vertices of a graph, so correlations can be associated with a subgraph. We show that the maximum value of the correlations for classical, quantum, and more general theories is the independence number, the Lovász number, and the fractional packing number of this subgraph, respectively. We also show that, for any graph, there is always a correlation experiment such that the set of quantum probabilities is exactly the Grötschel-Lovász-Schrijver theta body. This identifies these combinatorial notions as fundamental physical objects and provides a method for singling out experiments with quantum correlations on demand.
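For the 5-cycle, the exclusivity graph of the well-known KCBS scenario, these graph invariants can be checked directly. The following brute-force sketch (assuming the standard convention that adjacent vertices represent exclusive events) computes the independence number, which gives the classical noncontextual bound of 2; the Lovász number of the 5-cycle is the known value √5 ≈ 2.236, the quantum maximum.

```python
import itertools
import math

def independence_number(n_vertices, edges):
    """Largest set of pairwise non-adjacent vertices, by brute force."""
    for r in range(n_vertices, 0, -1):
        for subset in itertools.combinations(range(n_vertices), r):
            s = set(subset)
            # An independent set contains no edge of the graph.
            if all(not (u in s and v in s) for u, v in edges):
                return r
    return 0

# The 5-cycle: exclusivity graph of the KCBS scenario.
c5 = [(i, (i + 1) % 5) for i in range(5)]
alpha = independence_number(5, c5)
print(alpha)         # → 2: the classical (noncontextual) bound
print(math.sqrt(5))  # Lovász number of the 5-cycle, the quantum maximum ≈ 2.236
```

Brute force is exponential in the number of vertices, consistent with the abstract's remark that computing the independence number is an instance of an NP-hard problem; the Lovász number, by contrast, is computable via semidefinite programming.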
Article
Quantum information enables dramatic new advantages for computation, such as Shor's factoring algorithm and quantum simulation algorithms. This naturally raises the fundamental question: what unique resources of the quantum world enable the advantages of quantum information? There have been many attempts to answer this question, with proposals including the hypothetical "quantum parallelism" some associate with quantum superposition, the necessity of large amounts of entanglement, and much ado about quantum discord. Unfortunately none of these proposals have proven satisfactory, and, in particular, none have helped resolve outstanding challenges confronting the field. For example, on the theoretical side, the most general classes of problems for which quantum algorithms might offer an exponential speed-up over classical algorithms are poorly understood. On the experimental side, there remain significant challenges to designing robust, large-scale quantum computers, and an important open problem is to determine the minimal physical requirements of a useful quantum computer. A framework identifying relevant resources for quantum computation should help clarify these issues, for example, by identifying new efficient simulation schemes for classes of quantum algorithms and by clarifying the trade-offs between the distinct physical requirements for achieving robust quantum computation. Here we establish that quantum contextuality, a generalization of nonlocality identified by Bell and Kochen-Specker almost 50 years ago, is a critical resource for quantum speed-up within the leading model for fault-tolerant quantum computation, magic state distillation. We prove our results by finding the exact value of the independence number in an infinite family of graphs, which is a particular instance of an NP-hard problem.
Article
A new modification of the double description method is proposed for constructing the skeleton of a polyhedral cone. Theoretical results and a numerical experiment show that the modification is considerably superior to the original algorithm in terms of speed.
Article
Although skeptical of the prohibitive power of no-hidden-variables theorems, John Bell was himself responsible for the two most important ones. I describe some recent versions of the lesser known of the two (familiar to experts as the "Kochen-Specker theorem") which have transparently simple proofs. One of the new versions can be converted without additional analysis into a powerful form of the very much better known "Bell's Theorem," thereby clarifying the conceptual link between these two results of Bell.
Article
In 1960, the mathematician Ernst Specker described a simple example of nonclassical correlations which he dramatized using a parable about a seer who sets an impossible prediction task to his daughter's suitors. We revisit this example here, using it as an entree to three central concepts in quantum foundations: contextuality, Bell-nonlocality, and complementarity. Specifically, we show that Specker's parable offers a narrative thread that weaves together a large number of results, including: the impossibility of measurement-noncontextual and outcome-deterministic ontological models of quantum theory (the Kochen-Specker theorem), in particular the proof of Klyachko; the impossibility of Bell-local models of quantum theory (Bell's theorem), especially the proofs by Mermin and Hardy; the impossibility of a preparation-noncontextual ontological model of quantum theory; and the existence of triples of positive operator valued measures (POVMs) that can be measured jointly pairwise but not triplewise. Along the way, several novel results are presented, including: a generalization of a theorem by Fine connecting the existence of a joint distribution over outcomes of counterfactual measurements to the existence of a noncontextual model; a generalization of Klyachko's proof of the Kochen-Specker theorem; a proof of the Kochen-Specker theorem in the style of Hardy's proof of Bell's theorem; a categorization of contextual and Bell-nonlocal correlations in terms of frustrated networks; a new inequality testing preparation noncontextuality; and lastly, some results on the joint measurability of POVMs and the question of whether these can be modeled noncontextually. Finally, we emphasize that Specker's parable provides a novel type of foil to quantum theory, challenging us to explain why the particular sort of contextuality and complementarity embodied therein does not arise in a quantum world.
Article
The demonstrations of von Neumann and others, that quantum mechanics does not permit a hidden variable interpretation, are reconsidered. It is shown that their essential axioms are unreasonable. It is urged that in further examination of this problem an interesting axiom would be that mutually distant systems are independent of one another.
Article
We present an experimental test of quantum contextuality using two-photon product states. The experimental results show that noncontextual hidden-variable theories are violated by nonentangled states, regardless of whether local hidden-variable theories are violated. We find that a Hong-Ou-Mandel-type quantum interference effect gives rise to the quantum contextuality.
Article
A new proof of the Kochen-Specker theorem uses 33 rays, instead of 117 in the original proof. If the number of dimensions is increased from 3 to 4, only 24 rays are needed.
Article
This paper presents a degenerate extreme point strategy for active set algorithms which classify linear constraints as either redundant or necessary. The strategy makes use of an efficient method for classifying constraints active at degenerate extreme points. Numerical results indicate that significant savings in the computational effort required to classify the constraints can be achieved.
Article
A number of new proofs of the Kochen-Specker theorem are given based on the observables of the three-qubit Pauli group. Each proof is presented in the form of a diagram from which it is obvious by inspection. Each of our observable-based proofs leads to a system of projectors and bases that generally yields a large number of "parity proofs" of the Kochen-Specker theorem. Some examples of such proofs are given and some of their applications are discussed.
Article
A convex polytope P can be specified in two ways: as the convex hull of the vertex set V of P, or as the intersection of the set H of its facet-inducing halfspaces. The vertex enumeration problem is to compute V from H. The facet enumeration problem is to compute H from V. These two problems are essentially equivalent under point/hyperplane duality. They are among the central computational problems in the theory of polytopes. It is open whether they can be solved in time polynomial in |H| + |V| and the dimension. In this paper we consider the main known classes of algorithms for solving these problems. We argue that they all have at least one of two weaknesses: inability to deal well with "degeneracies", or inability to control the sizes of intermediate results. We then introduce families of polytopes that exercise those weaknesses. Roughly speaking, fat-lattice or intricate polytopes cause algorithms with bad degeneracy handling to perform badly; dwarfed polytopes cause algorithms with bad intermediate size control to perform badly. We also present computational experience with trying to solve these problems on these hard polytopes, using various implementations of the main algorithms.
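The vertex enumeration problem described above has a straightforward, if inefficient, brute-force solution in low dimension: intersect pairs of boundary lines and keep the intersection points that satisfy every constraint. A minimal Python sketch for the 2-D case (the function name and data layout are illustrative assumptions, not taken from the paper):

```python
from itertools import combinations

def vertices_2d(halfspaces, eps=1e-9):
    """Enumerate the vertices of a 2-D polytope given as halfspaces a·x <= b.

    Brute force: intersect every pair of boundary lines (Cramer's rule) and
    keep points feasible for all constraints. O(|H|^3) overall — fine for
    illustration; practical solvers use pivoting or double description.
    """
    verts = []
    for (a1, b1), (a2, b2) in combinations(halfspaces, 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < eps:
            continue  # parallel boundary lines: no unique intersection
        x = (b1 * a2[1] - b2 * a1[1]) / det
        y = (a1[0] * b2 - a2[0] * b1) / det
        if all(a[0] * x + a[1] * y <= b + eps for a, b in halfspaces):
            # deduplicate points found from several active constraint pairs
            if not any(abs(x - vx) < eps and abs(y - vy) < eps
                       for vx, vy in verts):
                verts.append((x, y))
    return verts

# The unit square as four halfspaces: -x <= 0, -y <= 0, x <= 1, y <= 1
square = [((-1.0, 0.0), 0.0), ((0.0, -1.0), 0.0),
          ((1.0, 0.0), 1.0), ((0.0, 1.0), 1.0)]
# vertices_2d(square) → the four corners (0,0), (0,1), (1,0), (1,1)
```

The degeneracy weakness discussed in the abstract shows up even here: at a vertex where more than two constraints are active, several pairs produce the same point, which is why the deduplication step is needed.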
Conference Paper
Let Γ0 be a set of n halfspaces in E^d (where the dimension d is fixed) and let m be a parameter, n ≤ m ≤ n^⌊d/2⌋. We show that Γ0 can be preprocessed in time and space O(m^(1+δ)) (for any fixed δ > 0) so that, given a vector c ∈ E^d and another set Γq of additional halfspaces, the function c · x can be optimized over the intersection of the halfspaces of Γ0 ∪ Γq in time O((n/m^(1/⌊d/2⌋) + |Γq|) log^(4d+3) n). The algorithm uses a multidimensional version of Megiddo's parametric search technique and recent results on halfspace range reporting. Applications include an improved algorithm for computing the extreme points of an n-point set P in E^d, improved output-sensitive computation of convex hulls and Voronoi diagrams, and a Monte Carlo algorithm for estimating the volume of a convex polyhedron given by the set of its vertices (in a fixed dimension).
Article
Tight Bell inequalities are facets of Pitowsky's correlation polytope and are usually obtained from its extreme points by solving the hull problem. Here we present an alternative method based on a combination of algebraic results on extensions of measures and variable elimination methods, e.g., the Fourier-Motzkin method. Our method is shown to overcome some of the computational difficulties associated with the hull problem in some non-trivial cases. Moreover, it provides an explanation for the arising of only a finite number of families of Bell inequalities in measurement scenarios where one experimenter can choose between an arbitrary number of different measurements.
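The Fourier-Motzkin elimination mentioned here has a very concrete core step: a variable is removed by pairing every inequality that bounds it from below with every inequality that bounds it from above. A minimal Python sketch of one elimination step (names and the inequality encoding are illustrative assumptions, not the authors' code):

```python
def fourier_motzkin(ineqs, var):
    """Eliminate variable `var` from a system of linear inequalities.

    Each inequality is a pair (coeffs, b) encoding sum_i coeffs[i]*x_i <= b.
    Returns an equivalent system in which x_var no longer appears (its
    coefficient is zero in every returned inequality). The number of
    inequalities can grow as |lower|*|upper|, which is the method's
    well-known computational difficulty.
    """
    lower, upper, rest = [], [], []
    for coeffs, b in ineqs:
        c = coeffs[var]
        if c > 0:
            upper.append((coeffs, b))   # bounds x_var from above
        elif c < 0:
            lower.append((coeffs, b))   # bounds x_var from below
        else:
            rest.append((coeffs, b))
    out = list(rest)
    # Each (lower, upper) pair yields one inequality with x_var cancelled:
    # both multipliers p = cu[var] > 0 and -a = -cl[var] > 0 are positive,
    # so the combination is a valid consequence of the pair.
    for cl, bl in lower:
        for cu, bu in upper:
            p, a = cu[var], cl[var]
            coeffs = [p * cl[i] - a * cu[i] for i in range(len(cl))]
            out.append((coeffs, p * bl - a * bu))
    return out

# Example: x + y <= 4, -x <= 0, -y <= 0; eliminating x leaves -y <= 0, y <= 4
reduced = fourier_motzkin([([1, 1], 4), ([-1, 0], 0), ([0, -1], 0)], var=0)
# reduced → [([0, -1], 0), ([0, 1], 4)]
```

Repeating this step for each hidden-variable coordinate is, in spirit, how one projects a high-dimensional polytope of noncontextual (or local) models down to inequalities on observable probabilities.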
Article
Two distant systems can exhibit quantum nonlocality even though the correlations between them admit a local model. This nonlocality can be revealed by testing extra correlations between successive measurements on one of the systems, which do not admit a noncontextual model whatever the reduced state of this system is. This shows that quantum contextuality plays a fundamental role in quantum nonlocality, and allows an experimental test of the Kochen-Specker theorem with locality.
Article
We show that there are Bell-type inequalities for noncontextual theories that are violated by any quantum state. One of these inequalities between the correlations of compatible measurements is particularly suitable for testing this state-independent violation in an experiment.
Article
In theory, quantum computers offer a means of solving problems that would be intractable on conventional computers. Assuming that a quantum computer could be constructed, it would in practice be required to function with noisy devices called 'gates'. These gates cause decoherence of the fragile quantum states that are central to the computer's operation. The goal of so-called 'fault-tolerant quantum computing' is therefore to compute accurately even when the error probability per gate (EPG) is high. Here we report a simple architecture for fault-tolerant quantum computing, providing evidence that accurate quantum computing is possible for EPGs as high as three per cent. Such EPGs have been experimentally demonstrated, but to avoid excessive resource overheads required by the necessary architecture, lower EPGs are needed. Assuming the availability of quantum resources comparable to the digital resources available in today's computers, we show that non-trivial quantum computations at EPGs of as high as one per cent could be implemented.
Article
The double description method is a simple and useful algorithm for enumerating all extreme rays of a general polyhedral cone in ℝ^d, despite the fact that we can hardly state any interesting theorems on its time and space complexities. In this paper, we reinvestigate this method, introduce some new ideas for efficient implementations, and show some empirical results indicating its practicality in solving highly degenerate problems. A pair (A, R) of real matrices A and R is said to be a double description pair, or simply a DD pair, if the relationship Ax ≥ 0 if and only if x = Rλ for some λ ≥ 0 holds. Clearly, for a pair (A, R) to be a DD pair, it is necessary that the column size of A equal the row size of R, say d. The term "double description" was introduced by Motzkin et al. [MRTT53], and it is quite natural in the sense that such a pair contains two different descriptions of the same object, namely the set P(A) represented by A as P(A) = {x ∈ ℝ^d : Ax ≥ 0}.
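The heart of the double description method is an incremental update: given generators (extreme rays) of the cone cut out by the constraints processed so far, a new constraint splits the rays into positive, negative, and incident sets, and fresh rays are formed on the new constraint's hyperplane from positive/negative pairs. A minimal Python sketch of this update (the adjacency test that prunes redundant rays is omitted, and the names are illustrative):

```python
def dd_add_constraint(rays, a, eps=1e-9):
    """One incremental step of the double description method.

    `rays` generate the cone {x : Ax >= 0} for the constraints seen so far;
    this returns generators of the cone after adding the extra constraint
    a·x >= 0. The full algorithm also checks adjacency of the paired rays
    to avoid emitting redundant generators.
    """
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    pos = [r for r in rays if dot(a, r) > eps]        # strictly feasible rays
    neg = [r for r in rays if dot(a, r) < -eps]       # rays cut off by `a`
    zero = [r for r in rays if abs(dot(a, r)) <= eps]  # rays on the hyperplane
    new = []
    for rp in pos:
        for rn in neg:
            # (a·rp) rn - (a·rn) rp lies exactly on the hyperplane a·x = 0,
            # and is a nonnegative combination of rp and rn since a·rn < 0.
            new.append(tuple(dot(a, rp) * x - dot(a, rn) * y
                             for x, y in zip(rn, rp)))
    return pos + zero + new

# Start from the nonnegative quadrant, generated by (1,0) and (0,1),
# then add the constraint x - y >= 0:
result = dd_add_constraint([(1.0, 0.0), (0.0, 1.0)], (1.0, -1.0))
# result → [(1.0, 0.0), (1.0, 1.0)], the extreme rays of {x >= y >= 0}
```

Without the adjacency test this sketch can emit redundant rays on larger inputs, which is exactly the degeneracy issue the efficient implementations in the paper address.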
Article
We consider a model of quantum computation in which the set of elementary operations is limited to Clifford unitaries, the creation of the state |0⟩, and qubit measurement in the computational basis. In addition, we allow the creation of a one-qubit ancilla in a mixed state ρ, which should be regarded as a parameter of the model. Our goal is to determine for which ρ universal quantum computation (UQC) can be efficiently simulated. To answer this question, we construct purification protocols that consume several copies of ρ and produce a single output qubit with higher polarization. The protocols allow one to increase the polarization only along certain "magic" directions. If the polarization of ρ along a magic direction exceeds a threshold value (about 65%), the purification asymptotically yields a pure state, which we call a magic state. We show that the Clifford group operations combined with magic-state preparation are sufficient for UQC. The connection of our results with the Gottesman-Knill theorem is discussed.
S. Kochen and E. Specker, "The problem of hidden variables in quantum mechanics," J. Math. Mech. 17, 59 (1967).
C. Mann and R. Crease, "Interview: John Bell, Particle Physicist," Omni Magazine 10, 84 (1988).
M. F. Pusey, "The robust noncontextuality inequalities in the simplest scenario," arXiv:1506.04178 (2015).
V. D'Ambrosio, I. Herbauts, E. Amselem, E. Nagali, M. Bourennane, F. Sciarrino, and A. Cabello, "Experimental Implementation of a Kochen-Specker Set of Quantum Tests," Phys. Rev. X 3, 011012 (2013).