Article

Linear Program for Testing Nonclassicality and an Open-Source Implementation


Abstract

A well-motivated method for demonstrating that an experiment resists any classical explanation is to show that its statistics violate generalized noncontextuality. We here formulate this problem as a linear program and provide an open-source implementation that tests whether or not any given prepare-measure experiment is classically explainable in this sense. The input to the program is simply an arbitrary set of quantum states and an arbitrary set of quantum effects; the program then determines whether the Born-rule statistics generated by all pairs of these can be explained by a classical (noncontextual) model. If a classical model exists, the program provides an explicit such model. If it does not, the program computes the minimal amount of noise that must be added for a model to exist, and then provides this model. We generalize all these results to arbitrary generalized probabilistic theories (and accessible fragments thereof) as well; indeed, our linear program is a test of simplex embeddability as introduced in Schmid et al. [PRX Quantum 2, 010331 (2021)] and generalized in Selby et al. [Phys. Rev. A 107, 062203 (2023)].
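To illustrate the kind of test described in the abstract, here is a minimal, self-contained Python sketch. It is our own illustration under stated assumptions, not the paper's released package: the function names and the qubit example are ours. It casts simplex embeddability as a feasibility linear program: compute generators of the duals of the state and effect cones, then ask for a nonnegative combination of their outer products that reconstructs the identity map. It assumes the GPT vectors span the space, that the cones are pointed and full-dimensional, and that the effect set is closed under complementation with the unit effect.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.spatial import ConvexHull

def dual_cone_generators(rays, tol=1e-9):
    """Generators of the dual cone {y : y.v >= 0 for every ray v}.

    For a pointed, full-dimensional cone, the inward normals of the facets
    through the origin of conv({0} U rays) generate the dual cone.
    """
    pts = np.vstack([np.zeros(rays.shape[1]), rays])
    hull = ConvexHull(pts)
    gens = []
    for eq in hull.equations:        # eq = (n, b): points inside obey n.x + b <= 0
        n, b = eq[:-1], eq[-1]
        if abs(b) < tol:             # facet passes through the origin
            gens.append(-n)          # inward normal lies in the dual cone
    return np.array(gens)

def simplex_embeddable(states, effects):
    """Feasibility LP: is there a c >= 0 with sum_{a,b} c[a,b] f_b g_a^T = I,
    where the g_a / f_b generate the duals of the state / effect cones?
    This is the identity-decomposition characterization of a simplex
    embedding (see the paper for the precise statement and assumptions)."""
    d = states.shape[1]
    G = dual_cone_generators(states)
    F = dual_cone_generators(effects)
    cols = [np.outer(f, g).ravel() for g in G for f in F]
    M = np.array(cols).T             # (d*d) x (|G|*|F|) equality system
    res = linprog(c=np.zeros(M.shape[1]), A_eq=M, b_eq=np.eye(d).ravel(),
                  bounds=[(0, None)] * M.shape[1], method="highs")
    return bool(res.success)

# Example: the 2-to-1 parity-oblivious-multiplexing qubit fragment.
# GPT vectors (1, x, z): states rho = (I + x X + z Z)/2; an effect
# E = e0 I + ex X + ez Z is written as (e0, ex, ez), so p = e . s.
r = 1 / np.sqrt(2)
states = np.array([[1, r, r], [1, r, -r], [1, -r, r], [1, -r, -r]])
effects = 0.5 * np.array([[1, 1, 0], [1, -1, 0], [1, 0, 1], [1, 0, -1]])
print(simplex_embeddable(states, effects))   # expected: False (contextual)
```

The printed example is the familiar 2-to-1 parity-oblivious-multiplexing fragment (four qubit states at 45° between the X and Z axes, measured along X and Z), which is expected to fail the test; mixing sufficient depolarizing noise into the states makes it pass.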


... This is motivated by the fact that generalized noncontextuality in its standard form appears to be concerned with a fixed set of operational equivalences [14,16,25]. We first discuss the notion of in-principle indistinguishability [16,26,27] in Sec. IV B, and we argue in favor of replacing it, in this context, with a notion of indistinguishability with respect to a set of procedures, or indistinguishability with respect to all procedures on a system: indeed, we argue that in-principle indistinguishability is always fundamentally of this form. ...
... For example, two preparation procedures are only operationally equivalent if they give the same statistics for the outcomes of all possible measurements" (p. 20, [26]). Such a statement only makes sense with respect to an underlying notion of system, and we wish to make this assumption explicit in the absence of a consensus on the status of this assumption. ...
... This work did not discuss transformations, although these can also be described in a noncontextual ontological model [14]: it would be interesting to reflect further on possible insights drawn from the case of transformation noncontextuality. Practically speaking, it could be of interest to adapt existing algorithms for testing noncontextuality [25,26] to the general case of relative noncontextuality. With the exception of operationally noncontextual ontological models (Sec. ...
Article
Full-text available
Generalized noncontextuality is a well-studied notion of classicality that is applicable to a single system, as opposed to Bell locality. It relies on representing operationally indistinguishable procedures identically in an ontological model. However, operational indistinguishability depends on the set of operations that one may use to distinguish two procedures: we refer to this set as the reference of indistinguishability. Thus, whether or not a given experiment is noncontextual depends on the choice of reference. The choices of references appearing in the literature are seldom discussed, but typically relate to an implicit notion of a system underlying the experiment. This shift in perspective then raises the question: how should one define the extent of the system underlying an experiment? This question depends in part on one's beliefs about whether the universe is fundamentally continuous or fundamentally composite. To draw a coherent picture of the possible approaches one may use, we start by formulating a notion of noncontextuality for prepare-and-measure scenarios with respect to an explicit reference of indistinguishability. We investigate how verdicts of noncontextuality depend on this choice of reference, and in the process we introduce the concept of the noncontextuality graph of a prepare-and-measure scenario. We then discuss several proposals that one may appeal to in order to fix the reference to a specific choice, and we relate these proposals to different conceptions of what a system really is. With this discussion, we advocate that whether or not an experiment is noncontextual is not as absolute as often perceived.
... While the left side of the arrow depicts arbitrarily shaped state and effect spaces, the right side shows how the state and effect spaces must be transformed in order to embed them into a simplex and a hypercube, respectively. The no-fill arrows show the result obtained in [65,66], namely that suitable noise on the states renders simplex-embeddability viable, while the gray arrow shows the result obtained in this paper, namely that unsharpness in the measurements is another viable route to simplex-embeddability. ...
... We demonstrated that any set of Alice's observables that is used to remotely prepare Bob's states is compatible if and only if Bob's local statistics can be described in terms of a simplex-embeddable GPT. In [65,66], it is shown that any preparation whose corresponding statistics cannot be described in terms of a simplex-embeddable GPT can be made simplex-embeddable by adding dephasing noise to the states. In contrast to references [65,66], where a prepare-measure scenario was considered, we argued that in our bipartite Bell scenario, Alice's unsharp measurements remotely prepare noisy states so that a simplex-embeddable GPT explanation exists (where the noiseless counterpart is not simplex-embeddable). Obviously, the degree of the unsharpness parameter needs to be bounded. ...
Article
Full-text available
In a bipartite Bell scenario involving two local measurements per party and two outcomes per measurement, the measurement incompatibility in one wing is both necessary and sufficient to reveal nonlocality. However, such a one-to-one correspondence fails when one of the observers performs more than two measurements. In such a scenario, measurement incompatibility is necessary but not sufficient to reveal nonlocality. In this work, within the formalism of general probabilistic theories (GPTs), we demonstrate that, unlike nonlocality, the incompatibility of N arbitrary measurements in one wing is both necessary and sufficient for revealing generalised contextuality for the subsystem in the other wing. Further, we formulate an elegant inequality for any GPT that is necessary for N-wise compatibility of N arbitrary observables. Moreover, we argue that any theory that violates the proposed inequality possesses a degree of incompatibility that can be quantified through the amount of violation. We claim that it is generalised contextuality that restricts the allowed degree of measurement incompatibility of any viable theory of nature and thereby super-selects quantum theory. Finally, we discuss the geometrical implications of our results.
... If it does admit of such a model, we say that the physical system is noncontextual. This can be checked with a linear program [27,28]. ...
... Now that we have a description of the different S τ and E τ , we can check whether the corresponding GPT systems are noncontextual. To do so, we use the linear program of [28] to determine whether the GPT systems are simplex-embeddable. Every system becomes noncontextual if a sufficient amount of noise is added. ...
... It was shown in [27] that non-contextuality can be established via a linear program which takes as input a set of states and a set of effects, and determines whether their statistics, via a specified probability rule, can be reproduced by a classical (noncontextual) model. We make use of an open-source version of this program [28], which, moreover, in the case that the GPT is contextual, computes the amount of noise that must be added such that a noncontextual model can be fitted. More specifically, the linear program of [28] is a test of simplex embeddability: the property that a GPT's state space can be embedded in a classical probability simplex, with its effect space embedded in the dual of that simplex. ...
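The noise quantification mentioned here can be mimicked with the sketch given after the abstract above. Since depolarizing noise only shrinks the state cone, a bisection over the noise rate finds the threshold at which the embedding LP first becomes feasible. This is again an illustration, not the packaged code of [28]; `critical_depolarizing_noise` is our name, and it reuses `simplex_embeddable`, `states`, and `effects` from the earlier sketch.

```python
def critical_depolarizing_noise(states, effects, tol=1e-4):
    """Least r in [0,1] such that the states depolarized at rate r,
    s -> (1 - r) * s + r * m   (m = maximally mixed state),
    pass the simplex-embedding LP. Assumes feasibility is monotone in r,
    which holds for depolarizing noise (r = 1 is trivially classical)."""
    m = np.array([1.0, 0.0, 0.0])    # GPT vector of the maximally mixed state
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        r = (lo + hi) / 2
        if simplex_embeddable((1 - r) * states + r * m, effects):
            hi = r
        else:
            lo = r
    return hi

# For the parity-oblivious-multiplexing fragment above, this is expected
# to converge to about 0.293, i.e. 1 - 1/sqrt(2).
```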
Preprint
Characterizing the nonclassicality of quantum systems under minimal assumptions is an important challenge for quantum foundations and technology. Here we introduce a theory-independent method of process tomography and perform it on a superconducting qubit. We demonstrate its decoherence without assuming quantum theory or trusting the devices by modelling the system as a general probabilistic theory. We show that the superconducting system is initially well-described as a quantum bit, but that its realized state space contracts over time, which in quantum terminology indicates its loss of coherence. The system is initially nonclassical in the sense of generalized contextuality: it does not admit of a hidden-variable model where statistically indistinguishable preparations are represented by identical hidden-variable distributions. In finite time, the system becomes noncontextual and hence loses its nonclassicality. Moreover, we demonstrate in a theory-independent way that the system undergoes non-Markovian evolution at late times. Our results extend theory-independent tomography to time-evolving systems, and show how important dynamical physical phenomena can be experimentally monitored without assuming quantum theory.
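For reference, the hidden-variable notion invoked here is the standard one of generalized noncontextuality. A compact restatement of the definition used across these works, with Λ the ontic state space, μ_P the distribution assigned to preparation P, and ξ the response function assigned to outcome j of measurement M:

```latex
p(j \mid P, M) \;=\; \sum_{\lambda \in \Lambda} \xi(j \mid \lambda, M)\, \mu_{P}(\lambda),
\qquad
P \simeq P' \;\Rightarrow\; \mu_{P} = \mu_{P'},
\qquad
[j \mid M] \simeq [j' \mid M'] \;\Rightarrow\; \xi(j \mid \cdot\,, M) = \xi(j' \mid \cdot\,, M'),
```

where ≃ denotes operational equivalence: identical statistics in combination with every procedure in the theory (or reference set) under consideration.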
... In earlier works [17,56], two related concepts were introduced: fragments of a GPT and accessible GPT fragments. The distinction arises because the states and effects in a fragment of a given GPT need not span the vector space of the GPT, in which case one has a choice: one can either represent the processes in the fragment as living within the space they span, or one can represent them in the (larger) vector space of the original GPT. ...
... We illustrate this theorem by a simple example: by demonstrating how the gbit is the shadow of a fragment of a simplicial theory. This example has also been discussed previously in the literature [45,56,76]. (Here, we give some extra details relative to those earlier presentations.) ...
... As one can check explicitly (e.g., via a linear program [56]), the original GPT fragment in this example is simplex-embeddable. So this constitutes an example where the shadow map takes a simplex-embeddable fragment to a simplex-embeddable GPT, despite dramatically altering the fragment. ...
Preprint
Full-text available
It is commonly believed that failures of tomographic completeness undermine assessments of nonclassicality in noncontextuality experiments. In this work, we study how such failures can indeed lead to mistaken assessments of nonclassicality. We then show that proofs of the failure of noncontextuality are robust to a very broad class of failures of tomographic completeness, including the kinds of failures that are likely to occur in real experiments. We do so by showing that such proofs actually rely on a much weaker assumption that we term relative tomographic completeness: namely, that one's experimental procedures are tomographic for each other. Thus, the failure of noncontextuality can be established even with coarse-grained, effective, emergent, or virtual degrees of freedom. This also implies that the existence of a deeper theory of nature (beyond that being probed in one's experiment) does not in and of itself pose any challenge to proofs of nonclassicality. To prove these results, we first introduce a number of useful new concepts within the framework of generalized probabilistic theories (GPTs). Most notably, we introduce the notion of a GPT subsystem, generalizing a range of preexisting notions of subsystems (including those arising from tensor products, direct sums, decoherence processes, virtual encodings, and more). We also introduce the notion of a shadow of a GPT fragment, which captures the information lost when one's states and effects are unknowingly not tomographic for one another.
... Proving whether a general operational scenario is classical is not straightforward, but Ref. [39] presents a linear program for testing this in any arbitrary prepare-and-measure scenario, together with a ready-to-use implementation available in Mathematica and Python. Formally, to check for the existence of such a noncontextual model, the code instead checks for the condition of simplex-embeddability, an equivalent notion of nonclassicality devised for GPTs [40]. ...
... Since we have good and operationally motivated measures for contextuality -robustness against different kinds of noise -and we know that contextuality powers up the advantage behind parity-oblivious multiplexing tasks, it is natural to ask how these contextuality measures relate to the natural quantifier in POM tasks, i.e., the success probability itself. In this work, we turn to this problem by exploring the generality allowed by the code in Ref. [39] to define different kinds of noise and compare its behaviour to the success probability in POM tasks. We find a general expression relating the robustness of contextuality to depolarisation and the success rate for n-to-1 POM tasks and show how it can be used to recover a well-known bound on the number of bits optimally encoded in a qubit in this task. ...
... Often, assessing noncontextuality of an operational scenario requires the characterisation of a large ontic space and of the possible epistemic states and response functions over it that can explain the statistical data and conform to the assumption of noncontextuality for all equivalence relations. To avoid these issues when assessing the nonclassicality of the examples investigated in this work, we will employ the linear program introduced in Ref. [39], which relies on results from the framework of generalised probabilistic theories (GPTs) [23-25]. We will not enter into the details of how the operational scenario translates to the GPT description, but a summary of the linear program and the reasoning behind it is provided in Appendix A. The relevant aspect for the present work is that this linear program decides the existence of a simplex embedding for the GPT fragment (see Fig. 1), which in turn is equivalent to the existence of a noncontextual ontological model for the associated operational scenario [40]. ...
Preprint
Full-text available
Generalised contextuality is a well-motivated notion of nonclassicality powering up a myriad of quantum tasks, among which is the celebrated case of a two-party information processing task where classical information is compressed in a quantum channel, the parity-oblivious multiplexing (POM) task. The success rate is the standard quantifier of resourcefulness for this task, while robustness-based quantifiers are as operationally motivated and have known general properties. In this work, we leverage analytical and numerical tools to estimate robustness of contextuality in POM scenarios under different types of noise. We conclude that for the 3-to-1 case the robustness of contextuality to depolarisation, as well as a minimisation of the robustness of contextuality to dephasing over all bases, are good quantifiers for the nonclassical advantage of this scenario. Moreover, we obtain a general relation between robustness of contextuality to depolarisation and the success rate in any n-to-1 POM scenario and show how it can be used to bound the number of bits encoded in this task.
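To make the n-to-1 relation concrete, a back-of-the-envelope version can be stated with standard POM values. This is a hedged illustration: the noncontextual bound (n+1)/2n is from the original parity-oblivious multiplexing work of Spekkens et al., the quantum value from the optimal quantum RAC analysis, and the depolarizing arithmetic below is an elementary consequence, not a quotation from this preprint:

```latex
s_{\mathrm{NC}}(n) \;=\; \frac{n+1}{2n},
\qquad
s_{Q}(n) \;=\; \frac{1}{2}\Bigl(1 + \frac{1}{\sqrt{n}}\Bigr),
\qquad
s_{Q}^{(r)}(n) \;=\; \frac{1}{2}\Bigl(1 + \frac{1-r}{\sqrt{n}}\Bigr),
```

so the depolarized quantum strategy loses its contextual advantage, i.e. s_Q^{(r)}(n) ≤ s_NC(n), exactly when r ≥ 1 − 1/√n, reproducing the ≈ 0.293 threshold of the 2-to-1 example discussed earlier on this page.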
... In practice, this is achieved employing the notion of simulations of GPT systems by other GPT systems, as defined in [32]. We dedicate Section 5.2 to the discussion of the relation with other works that study contextuality in GPTs, namely [32,44,47-52]. ...
... On simplex embeddability. Other influential sources of inspiration for our work are [47,51,52]. As we already mentioned in the introduction, [47] introduces the notion of simplex embeddability as the geometrical criterion to assess whether a GPT system is noncontextual. ...
... The criterion is extended to accessible GPT fragments in [51], where the latter correspond to more general mathematical objects than GPTs and characterize generic prepare-and-measure experimental setups. The work of [52] provides an algorithm for testing contextuality in any prepare-and-measure scenario. In particular, if a noncontextual model for the scenario exists, the algorithm returns one explicitly; if not, it provides the minimum amount of (depolarizing) noise that would be required before a noncontextual model becomes possible. ...
Preprint
Full-text available
In this work we present a hierarchy of generalized contextuality. It refines the traditional binary distinction between contextual and noncontextual theories, and facilitates their comparison based on how contextual they are. Our approach focuses on the contextuality of prepare-and-measure scenarios, described by general probabilistic theories (GPTs). To motivate the hierarchy, we define it as the resource ordering of a novel resource theory of GPT-contextuality. The building blocks of its free operations are classical systems and GPT-embeddings. The latter are simulations of one GPT by another, which preserve the operational equivalences and thus cannot generate contextuality. Noncontextual theories can be recovered as least elements in the hierarchy. We then define a new contextuality monotone, called classical excess, given by the minimal error of embedding a GPT within an infinite classical system. In addition, we show that the optimal success probability in the parity oblivious multiplexing game also defines a monotone in our resource theory. We end with a discussion of a potential interpretation of the non-free operations of the resource theory of GPT-contextuality as expressing a kind of information erasure.
... This simple geometric characterization of the notion of noncontextuality within the framework of GPTs has been useful for exploring the relationship between contextuality and incompatibility [26,27]. It has also been employed in the development of an open-source code for testing whether a given prepare-and-measure scenario constitutes a proof of contextuality, and, moreover, for providing a quantification of how robust to depolarizing noise this proof is [28]. We will leverage this tool here in our study of the relationship between contextuality and coherence. ...
... In this work, we show that there are proofs of contextuality that can be obtained with any non-zero amount of coherence. We then modify the open-source linear program from Ref. [28] and use this to investigate the robustness of contextuality to the action of dephasing noise with respect to a fixed basis in a collection of prepare-and-measure scenarios. Finally, we find a proof of contextuality that is maximally robust to dephasing noise, in the sense that the experiment remains a proof of contextuality for any amount of decoherence apart from total decoherence. ...
... The existence of a simplex-embedding can be tested using the linear program introduced in Ref. [28]. In the case of quantum theory (which is the case we study here) the linear program simply takes as input a set of density matrices (representing the preparations) and a set of POVM elements (representing the measurement-outcome pairs), and checks whether or not these are simplex-embeddable, and consequently, whether the quantum scenario admits of a noncontextual ontological model. ...
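Connecting to the sketches given earlier on this page, a dephasing variant can be probed by swapping the noise model. This is an illustrative adaptation of our hypothetical code, not the modified program of Ref. [28]:

```python
def dephase(states, r):
    """Partially dephase GPT vectors (1, x, z) toward the Z basis:
    the coherence (x) component shrinks by (1 - r); r = 1 is total
    decoherence onto the Z axis. Effects are left untouched."""
    out = states.copy()
    out[:, 1] *= (1 - r)
    return out

# Sweep the dephasing rate on the parity-oblivious-multiplexing fragment
# defined earlier; the LP verdict is expected to flip from contextual
# (False) to classically explainable (True) at an intermediate rate.
for r in (0.0, 0.25, 0.5, 0.75):
    print(r, simplex_embeddable(dephase(states, r), effects))
```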
Preprint
Full-text available
Generalized contextuality is a resource for a wide range of communication and information processing protocols. However, contextuality is not possible without coherence, and so can be destroyed by dephasing noise. Here, we explore the robustness of contextuality to partially dephasing noise in a scenario related to state discrimination (for which contextuality is a resource). We find that a vanishing amount of coherence is sufficient to demonstrate the failure of noncontextuality in this scenario, and we give a proof of contextuality that is robust to arbitrary amounts of partially dephasing noise. This is in stark contrast to partially depolarizing noise, which is always sufficient to destroy contextuality.
... These are constraints on the operational statistics that are satisfied if and only if a noncontextual ontological model of the statistics exists. (Another route is via searching for simplex-embeddings [4,52].) Such tests also provide a means of defining the classical-nonclassical boundary for the statistics in an arbitrary operational theory, including alternatives to quantum theory [53,54]. ...
Preprint
Full-text available
We provide the first systematic technique for deriving witnesses of contextuality in prepare-transform-measure scenarios. More specifically, we show how linear quantifier elimination can be used to compute a polytope of correlations consistent with generalized noncontextuality in such scenarios. This polytope is specified as a set of noncontextuality inequalities that are necessary and sufficient conditions for observed data in the scenario to admit of a classical explanation relative to any linear operational identities, if one ignores some constraints from diagram preservation. While including these latter constraints generally leads to tighter inequalities, it seems that nonlinear quantifier elimination would be required to systematically include them. We also provide a linear program which can certify the nonclassicality of a set of numerical data arising in a prepare-transform-measure experiment. We apply our results to get a robust noncontextuality inequality for transformations that can be violated within the stabilizer subtheory. Finally, we give a simple algorithm for computing all the linear operational identities holding among a given set of states, of transformations, or of measurements.
... A], or nonnegative quasiprobability representations [28-30]. The failure of noncontextual models to explain data can be robustly analysed [31-34], experimentally tested [35-37], and quantified [38-40]. ...
Preprint
Full-text available
In classical thermodynamics, heat must spontaneously flow from hot to cold systems. In quantum thermodynamics, the same law applies when considering multipartite product thermal states evolving unitarily. If initial correlations are present, anomalous heat flow can happen, temporarily making cold thermal states colder and hot thermal states hotter. Such an effect can happen due to entanglement, but also because of classical randomness, hence lacking a direct connection with nonclassicality. In this work, we introduce scenarios where anomalous heat flow does have a direct link to nonclassicality, defined to be the failure of noncontextual models to explain experimental data. We start by extending known noncontextuality inequalities to a setup where sequential transformations are considered. We then show a class of quantum prepare-transform-measure protocols, characterized by time intervals (0, τ_c) for a given critical time τ_c, where anomalous heat flow happens only if a noncontextuality inequality is violated. We also analyze a recent experiment from Micadei et al. [Nat. Commun. 10, 2456 (2019)] and find the critical time τ_c based on their experimental parameters. We conclude by investigating heat flow in the evolution of two-qutrit systems, showing that our findings are not an artifact of using two-qubit systems.
... This test can be cast as an LP; see, e.g., Selby et al. (2022). The set of preparation-contextual quantum correlations can be bounded by hierarchies of SDPs. ...
Preprint
Semidefinite programs are convex optimisation problems involving a linear objective function and a domain of positive semidefinite matrices. Over the last two decades, they have become an indispensable tool in quantum information science. Many otherwise intractable fundamental and applied problems can be successfully approached by means of relaxation to a semidefinite program. Here, we review such methodology in the context of quantum correlations. We discuss how the core idea of semidefinite relaxations can be adapted for a variety of research topics in quantum correlations, including nonlocality, quantum communication, quantum networks, entanglement, and quantum cryptography.
... We can theoretically and experimentally analyse generalized contextuality using two approaches: (i) via the characterization of operational-probabilistic prepare-and-measure contextuality scenarios with finite operational equivalences [37,51-55], and (ii) via the characterization of a prepare-and-measure fragment of general probabilistic theories (GPTs) [56-59]. In Ref. [50], Spekkens defined generalized noncontextuality as a property of ontological models [60,61] explaining an operational-probabilistic theory (OPT). ...
Preprint
Full-text available
We analyse nonclassical resources in interference phenomena using generalized noncontextuality inequalities and basis-independent coherence witnesses. We use recently proposed inequalities that witness both resources within the same framework. We also propose, in view of previous contextual advantage results, a systematic way of applying these tools to characterize advantage provided by coherence and contextuality in quantum information protocols. We instantiate this methodology for the task of quantum interrogation, famously introduced by the paradigmatic bomb-testing interferometric experiment, showing contextual quantum advantage for such a task.
Article
Operational contextuality forms a rapidly developing subfield of quantum information theory. However, the characterization of the quantum mechanical entities that fuel the phenomenon has remained an open problem, with many partial results. Here, we present a resolution of this problem by connecting operational contextuality one-to-one with the no-broadcasting theorem. The connection works both on the level of full quantum theory and for subtheories thereof. We demonstrate the connection in various relevant cases, showing especially that for quantum states the possibility of demonstrating contextuality is exactly characterized by non-commutativity, and for measurements this is done by a norm-1 property closely related to repeatability. Moreover, we show how techniques from broadcasting can be used to simplify known foundational results in contextuality.
Article
When should a given operational phenomenology be deemed to admit of a classical explanation? When it can be realized in a generalized-noncontextual ontological model. The case for answering the question in this fashion has been made in many previous works and motivates research on the notion of generalized noncontextuality. Many criticisms and concerns have been raised, however, regarding the definition of this notion and of the possibility of testing it experimentally. In this work, we respond to some of the most common of these objections. One such objection is that the existence of a classical record of which laboratory procedure was actually performed in each run of an experiment implies that the operational equivalence relations that are a necessary ingredient of any proof of the failure of noncontextuality do not hold, and consequently that conclusions of nonclassicality based on these equivalences are mistaken. We explain why this concern is unfounded. Our response affords the opportunity for us to clarify certain facts about generalized noncontextuality, such as the possibility of having proofs of its failure based on a consideration of the subsystem structure of composite systems. Similarly, through our responses to each of the other objections, we elucidate some under-appreciated facts about the notion of generalized noncontextuality and experimental tests thereof.
Article
Full-text available
Quantum superposition of high-dimensional states enables both computational speed-up and security in cryptographic protocols. However, the exponential complexity of tomographic processes makes certification of these properties a challenging task. In this work, we experimentally certify coherence witnesses tailored for quantum systems of increasing dimension using pairwise overlap measurements enabled by a six-mode universal photonic processor fabricated with a femtosecond laser writing technology. In particular, we show the effectiveness of the proposed coherence and dimension witnesses for qudits of dimensions up to 5. We also demonstrate advantage in a quantum interrogation task and show it is fueled by quantum contextuality. Our experimental results testify to the efficiency of this approach for the certification of quantum properties in programmable integrated photonic platforms.
Article
Full-text available
It is a fundamental prediction of quantum theory that states of physical systems are described by complex vectors or density operators on a Hilbert space. However, many experiments admit effective descriptions in terms of other state spaces, such as classical probability distributions or quantum systems with superselection rules. Which kind of effective statistics would allow us to experimentally falsify quantum theory as a fundamental description of nature? Here, we address this question by introducing a methodological principle that generalizes Spekkens’s notion of noncontextuality: Processes that are statistically indistinguishable in an effective theory should not require explanation by multiple distinguishable processes in a more fundamental theory. We formulate this principle in terms of linear embeddings and simulations of one probabilistic theory by another, show how this concept subsumes standard notions of contextuality, and prove a multitude of fundamental results on the exact and approximate embedding of theories (in particular, into quantum theory). We prove that only Jordan-algebraic state spaces are exactly embeddable into quantum theory, and show how results on Bell inequalities can be used for the certification of nonapproximate embeddability. From this, we propose an experimental test of quantum theory by probing single physical systems without assuming access to a tomographically complete set of procedures or calibration of the devices, arguably avoiding a significant loophole of earlier approaches.
Article
Full-text available
Quantum interference phenomena are widely viewed as posing a challenge to the classical worldview. Feynman even went so far as to proclaim that they are the only mystery and the basic peculiarity of quantum mechanics. Many have also argued that basic interference phenomena force us to accept a number of radical interpretational conclusions, including: that a photon is neither a particle nor a wave but rather a Jekyll-and-Hyde sort of entity that toggles between the two possibilities, that reality is observer-dependent, and that systems either do not have properties prior to measurements or else have properties that are subject to nonlocal or backwards-in-time causal influences. In this work, we show that such conclusions are not, in fact, forced on us by basic interference phenomena. We do so by describing an alternative to quantum theory, a statistical theory of a classical discrete field (the `toy field theory') that reproduces the relevant phenomenology of quantum interference while rejecting these radical interpretational claims. It also reproduces a number of related interference experiments that are thought to support these interpretational claims, such as the Elitzur-Vaidman bomb tester, Wheeler's delayed-choice experiment, and the quantum eraser experiment. The systems in the toy field theory are field modes, each of which possesses, at all times, both a particle-like property (a discrete occupation number) and a wave-like property (a discrete phase). Although these two properties are jointly possessed, the theory stipulates that they cannot be jointly known. The phenomenology that is generally cited in favour of nonlocal or backwards-in-time causal influences ends up being explained in terms of inferences about distant or past systems, and all that is observer-dependent is the observer's knowledge of reality, not reality itself.
Article
Full-text available
A central problem in the study of resource theories is to find functions that are nonincreasing under resource conversions — termed monotones — in order to quantify resourcefulness. Various constructions of monotones appear in many different concrete resource theories. How general are these constructions? What are the necessary conditions on a resource theory for a given construction to be applicable? To answer these questions, we introduce a broad scheme for constructing monotones. It involves finding an order-preserving map from the preorder of resources of interest to a distinct preorder for which nontrivial monotones are previously known or can be more easily constructed; these monotones are then pulled back through the map. In one of the two main classes we study, the preorder of resources is mapped to a preorder of sets of resources, where the order relation is set inclusion, such that monotones can be defined via maximizing or minimizing the value of a function within these sets. In the other class, the preorder of resources is mapped to a preorder of tuples of resources, and one pulls back monotones that measure the amount of distinguishability of the different elements of the tuple (hence its information content). Monotones based on contractions arise naturally in the latter class, and, more surprisingly, so do weight and robustness measures. In addition to capturing many standard monotone constructions, our scheme also suggests significant generalizations of these. In order to properly capture the breadth of applicability of our results, we present them within a novel abstract framework for resource theories in which the notion of composition is independent of the types of the resources involved (i.e., whether they are states, channels, combs, etc.).
Article
Full-text available
We consider the problem of entanglement-assisted one-shot classical communication. In the zero-error regime, entanglement can increase the one-shot zero-error capacity of a family of classical channels following the strategy of Cubitt et al., Phys. Rev. Lett. 104, 230503 (2010). This strategy uses the Kochen-Specker theorem, which is applicable only to projective measurements. As such, in the regime of noisy states and/or measurements, this strategy cannot increase the capacity. To accommodate generically noisy situations, we examine the one-shot success probability of sending a fixed number of classical messages. We show that preparation contextuality powers the quantum advantage in this task, increasing the one-shot success probability beyond its classical maximum. Our treatment extends beyond Cubitt et al. and includes, for example, the experimentally implemented protocol of Prevedel et al., Phys. Rev. Lett. 106, 110505 (2011). We then show a mapping between this communication task and a corresponding nonlocal game. This mapping generalizes the connection with pseudotelepathy games previously noted in the zero-error case. Finally, after motivating a constraint we term context-independent guessing, we show that contextuality witnessed by noise-robust noncontextuality inequalities obtained in R. Kunjwal, Quantum 4, 219 (2020), is sufficient for enhancing the one-shot success probability. This provides an operational meaning to these inequalities and the associated hypergraph invariant, the weighted max-predictability, introduced in R. Kunjwal, Quantum 3, 184 (2019). Our results show that the task of entanglement-assisted one-shot classical communication provides a fertile ground to study the interplay of the Kochen-Specker theorem, Spekkens contextuality, and Bell nonlocality.
Article
Full-text available
One of the most fundamental results in quantum information theory is that no measurement can perfectly discriminate between nonorthogonal quantum states. In this work, we investigate quantum advantages for discrimination tasks over noncontextual theories by considering a maximum-confidence measurement that unifies different strategies of quantum state discrimination, including minimum-error and unambiguous discrimination. We first show that maximum-confidence discrimination, as well as unambiguous discrimination, contains contextual advantages. We then consider a semi-device-independent scenario of certifying maximum-confidence measurement. The scenario naturally contains undetected events, making it a natural setting to explore maximum-confidence measurements. We show that the certified maximum confidence in quantum theory also contains contextual advantages. Our results establish how the advantages of quantum theory over a classical model may appear in a realistic scenario of a discrimination task.
Article
Full-text available
The generalized notion of noncontextuality provides an avenue to explore the fundamental departure of quantum theory from a classical explanation. Recently, extracting different forms of quantum advantages in various information processing tasks has received an upsurge of attention. In a recent work [Schmid and Spekkens, Phys. Rev. X 8, 011015 (2018)] it has been demonstrated that minimum error discrimination of two nonorthogonal pure quantum states entails contextual advantage when the states are supplied with equal prior probabilities. We generalize their work for arbitrary prior probabilities and extend the investigation for three arbitrary mirror-symmetric states. We show that the contextual advantage can be obtained for any value of prior probability when only two quantum states are present in the task. But surprisingly, for the case of three mirror-symmetric states, the contextual advantage can be revealed only for a restrictive range of prior probabilities with which the states are supplied. Further, we extend our study to examine the contextual advantage for maximum confidence state discrimination. We demonstrate that the prior probabilities of state preparation play a similar role in exploiting the quantum advantage in maximum confidence discrimination.
Article
Full-text available
Starting from arbitrary sets of quantum states and measurements, referred to as the prepare-and-measure scenario, an operationally noncontextual ontological model of the quantum statistics associated with the prepare-and-measure scenario is constructed. The operationally noncontextual ontological model coincides with standard Spekkens noncontextual ontological models for tomographically complete scenarios, while covering the non-tomographically complete case with a new notion of a reduced space, which we motivate following the guiding principles of noncontextuality. A mathematical criterion, called unit separability, is formulated as the relevant classicality criterion; the name is inspired by the usual notion of quantum state separability. Using this criterion, we derive a new upper bound on the cardinality of the ontic space. Then, we recast the unit separability criterion as a (possibly infinite) set of linear constraints, from which we obtain two separate hierarchies of algorithmic tests to witness the non-classicality or certify the classicality of a scenario. Finally, we reformulate our results in the framework of generalized probabilistic theories and discuss the implications for simplex-embeddability in such theories.
Article
Full-text available
Recently, Schmid and Spekkens studied quantum contextuality in terms of state discrimination. By dealing with the minimum error discrimination of two quantum states with identical prior probabilities, they reported that a quantum contextual advantage exists. Meanwhile, given the striking observation that the selection of prior probabilities can affect the quantum properties of the system, it is necessary to verify whether the quantum contextual advantage depends on the prior probabilities of the given states. In this paper, we consider the minimum error discrimination of two states with arbitrary prior probabilities, in which both states are pure or mixed. We show that the quantum contextual advantage in state discrimination may depend on the prior probabilities of the given states. In particular, even though the quantum contextual advantage always exists in the state discrimination of two nonorthogonal pure states with nonzero prior probabilities, the quantum contextual advantage depends on prior probabilities in the state discrimination of two mixed states.
Article
Full-text available
Quantum Darwinism proposes that the proliferation of redundant information plays a major role in the emergence of objectivity out of the quantum world. Is this kind of objectivity necessarily classical? We show that if one takes Spekkens's notion of noncontextuality as the notion of classicality and the approach of Brandão, Piani, and Horodecki to quantum Darwinism, the answer to the above question is "yes," if the environment encodes the proliferated information sufficiently well. Moreover, we propose a threshold on this encoding, above which one can unambiguously say that classical objectivity has emerged under quantum Darwinism.
Article
Full-text available
The predictions of quantum theory resist generalised noncontextual explanations. In addition to the foundational relevance of this fact, the particular extent to which quantum theory violates noncontextuality limits available quantum advantage in communication and information processing. In the first part of this work, we formally define contextuality scenarios via prepare-and-measure experiments, along with the polytope of general contextual behaviours containing the set of quantum contextual behaviours. This framework allows us to recover several properties of the set of quantum behaviours in these scenarios, including contextuality scenarios and associated noncontextuality inequalities that require for their violation the individual quantum preparation and measurement procedures to be mixed states and unsharp measurements. With the framework in place, we formulate novel semidefinite programming relaxations for bounding these sets of quantum contextual behaviours. Most significantly, to circumvent the inadequacy of pure states and projective measurements in contextuality scenarios, we present a novel unitary-operator-based semidefinite relaxation technique. We demonstrate the efficacy of these relaxations by obtaining tight upper bounds on the quantum violation of several noncontextuality inequalities and identifying novel maximally contextual quantum strategies. To further illustrate the versatility of these relaxations, we demonstrate monogamy of preparation contextuality in a tripartite setting, and present a secure semi-device-independent quantum key distribution scheme powered by quantum advantage in parity-oblivious random access codes.
Article
Full-text available
These lecture notes provide a basic introduction to the framework of generalized probabilistic theories (GPTs) and a sketch of a reconstruction of quantum theory (QT) from simple operational principles. To build some intuition for how physics could be even more general than quantum, I present two conceivable phenomena beyond QT: superstrong nonlocality and higher-order interference. Then I introduce the framework of GPTs, generalizing both quantum and classical probability theory. Finally, I summarize a reconstruction of QT from the principles of Tomographic Locality, Continuous Reversibility, and the Subspace Axiom. In particular, I show why a quantum bit is described by a Bloch ball, why it is three-dimensional, and how one obtains the complex numbers and operators of the usual representation of QT.
Article
Full-text available
Generalized contextuality refers to our inability to explain measurement statistics using a context-independent probabilistic and ontological model. On the other hand, measurement statistics can also be modeled using the framework of general probabilistic theories (GPTs). Here, starting from a construction of GPTs based on a Gleason-type theorem, we fully characterize these structures with respect to whether they admit or reject generalized (non)contextual ontological models. It follows that, in any GPT construction, the three requirements of (i) the no-restriction hypothesis, (ii) ontological noncontextuality, and (iii) multiple nonrefinable measurements for any fixed number of outcomes are jointly incompatible. Hence, any GPT satisfying the no-restriction hypothesis is ontologically noncontextual if and only if it is simplicial. We give a detailed discussion of GPTs for which the no-restriction hypothesis is violated, and show that they can always be considered as subtheories (subGPTs) of GPTs satisfying the hypothesis. It is shown that subGPTs are ontologically noncontextual if and only if they are subtheories of simplicial GPTs of the same dimensionality. Finally, we establish as a corollary the necessary and sufficient condition for a single resourceful measurement or state to promote an ontologically noncontextual (i.e., classical) general probabilistic theory to an ontologically contextual (i.e., nonclassical) one under the no-restriction hypothesis.
Article
Full-text available
To make precise the sense in which the operational predictions of quantum theory conflict with a classical worldview, it is necessary to articulate a notion of classicality within an operational framework. A widely applicable notion of classicality of this sort is whether or not the predictions of a given operational theory can be explained by a generalized-noncontextual ontological model. We here explore what notion of classicality this implies for the generalized probabilistic theory (GPT) that arises from a given operational theory, focusing on prepare-measure scenarios. We first show that, when mapping an operational theory to a GPT by quotienting relative to operational equivalences, the constraint of explainability by a generalized-noncontextual ontological model is mapped to the constraint of explainability by an ontological model. We then show that, under the additional assumption that the ontic state space is of finite cardinality, this constraint on the GPT can be expressed as a geometric condition which we term simplex embeddability. Whereas the traditional notion of classicality for a GPT is that its state space be a simplex and its effect space be the dual of this simplex, simplex embeddability merely requires that its state space be embeddable in a simplex and its effect space in the dual of that simplex. We argue that simplex embeddability constitutes an intuitive and freestanding notion of classicality for GPTs. Our result also has applications to witnessing nonclassicality in prepare-measure experiments.
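In symbols, restating the verbal definition just given (here Ω is the GPT state space living in a real vector space V, 𝓔 its effect space, Δ_m the m-outcome probability simplex, and Δ*_m its dual set of classical effects), simplex embeddability asks for

```latex
\exists\, m \in \mathbb{N} \ \text{and linear maps}\ 
\iota : V \to \mathbb{R}^{m},\ \kappa : V \to \mathbb{R}^{m}
\ \text{such that}\quad
\iota(\Omega) \subseteq \Delta_{m},\quad
\kappa(\mathcal{E}) \subseteq \Delta^{*}_{m},\quad
\kappa(e) \cdot \iota(s) = e(s)\ \ \forall\, s \in \Omega,\ e \in \mathcal{E}.
```

The traditional notion of classicality is recovered as the special case in which ι and κ are isomorphisms onto Δ_m and Δ*_m themselves.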
Article
Full-text available
A number of noncontextual models exist which reproduce different subsets of quantum theory and admit a no-cloning theorem. Therefore, if one chooses noncontextuality as one's notion of classicality, no-cloning cannot be regarded as a nonclassical phenomenon. In this work, however, we show that there are aspects of the phenomenology of quantum state cloning which are indeed nonclassical according to this principle. Specifically, we focus on the task of state-dependent cloning and prove that the optimal cloning fidelity predicted by quantum theory cannot be explained by any noncontextual model. We derive a noise-robust noncontextuality inequality whose violation by quantum theory not only implies a quantum advantage for the task of state-dependent cloning relative to noncontextual models, but also provides an experimental witness of noncontextuality.
Article
Full-text available
Within the framework of generalized noncontextuality, we introduce a general technique for systematically deriving noncontextuality inequalities for any experiment involving finitely many preparations and finitely many measurements, each of which has a finite number of outcomes. Given any fixed sets of operational equivalences among the preparations and among the measurements as input, the algorithm returns a set of noncontextuality inequalities whose satisfaction is necessary and sufficient for a set of operational data to admit of a noncontextual model. Additionally, we show that the space of noncontextual data tables always defines a polytope. Finally, we provide a computationally efficient means for testing whether any set of numerical data admits of a noncontextual model, with respect to any fixed operational equivalences. Together, these techniques provide complete methods for characterizing arbitrary noncontextuality scenarios, both in theory and in practice.
Article
Full-text available
Many experiments in the field of quantum foundations seek to adjudicate between quantum theory and speculative alternatives to it. To do so, one must analyse the experimental data in a manner that does not presume the correctness of the quantum formalism. The mathematical framework of generalized probabilistic theories (GPTs) provides a means of doing so. We present a scheme for determining what GPTs are consistent with a given set of experimental data. It proceeds by performing tomography on the preparations and measurements in a self-consistent manner, i.e., without presuming a prior characterization of either. We illustrate the scheme by analyzing experimental data for a large set of preparations and measurements on the polarization degree of freedom of a single photon. We find that the smallest and largest GPT state spaces consistent with our data are a pair of polytopes, each approximating the shape of the Bloch sphere and having a volume ratio of 0.977 ± 0.001, which provides a quantitative bound on the scope for deviations from quantum theory. We also demonstrate how our scheme can be used to bound the extent to which nature might be more nonlocal than quantum theory predicts, as well as the extent to which it might be more or less contextual. Specifically, we find that the maximal violation of the CHSH inequality can be at most 1.3% ± 0.1% greater than the quantum prediction, and the maximal violation of a particular noncontextuality inequality cannot differ from the quantum prediction by more than this factor on either side.
Article
Full-text available
The Kochen-Specker theorem rules out models of quantum theory wherein sharp measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly sharp. For unsharp measurements, therefore, one must drop the requirement that an outcome is assigned deterministically in the model and merely require that the distribution over outcomes that is assigned in the model is context-independent. By demanding context-independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring a notion of "sharpness" of measurements in any operational theory describing the experiment. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It extends significantly previous techniques, which worked only for logical proofs (based on uncolourable orthogonality graphs), to the case of statistical proofs (where the graphs are colourable, but the colourings cannot explain the quantum statistics). The derived inequalities are robust to noise.
Article
Full-text available
We study a family of one-way communication problems based on orthogonality graphs of state-independent contextuality (SIC) sets of vectors. First, we reveal that if the dimension of the communicated system is bounded, sending quantum systems corresponding to the vectors of each SIC set is advantageous over classical communication. As the main result, we propose a general framework of oblivious communication and show that, under a certain oblivious condition, for the same communication task the quantum strategy corresponding to each Kochen-Specker set outperforms classical communication of arbitrarily large dimension. The quantum protocol allows the parties to accomplish the communication task perfectly. In the case of classical communication, we present a general method to obtain the best possible strategy. The optimal classical protocols for the simplest SIC sets in dimensions three and four are explicitly derived. Our results provide a fully operational significance to single-system SIC and open up the possibility of quantum information processing based on it.
Article
Full-text available
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum error state discrimination. Namely, we identify quantitative limits on the success probability for minimum error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios, and demonstrate a tight connection between our minimum error state discrimination scenario and a Bell scenario.
Article
Full-text available
When a measurement is compatible with each of two other measurements that are incompatible with one another, these define distinct contexts for the given measurement. The Kochen-Specker theorem rules out models of quantum theory that satisfy a particular assumption of context-independence: that sharp measurements are assigned outcomes both deterministically and independently of their context. This notion of noncontextuality is not suited to a direct experimental test because realistic measurements always have some degree of unsharpness due to noise. However, a generalized notion of noncontextuality has been proposed that is applicable to any experimental procedure, including unsharp measurements, but also preparations as well, and for which a quantum no-go result still holds. According to this notion, the model need only specify a probability distribution over the outcomes of a measurement in a context-independent way, rather than specifying a particular outcome. It also implies novel constraints of context-independence for the representation of preparations. In this article, we describe a general technique for translating proofs of the Kochen-Specker theorem into inequality constraints on realistic experimental statistics, the violation of which witnesses the impossibility of a noncontextual model. We focus on algebraic state-independent proofs, using the Peres-Mermin square as our illustrative example. Our technique yields the necessary and sufficient conditions for a particular set of correlations (between the preparations and the measurements) to admit a noncontextual model. The inequalities thus derived are demonstrably robust to noise. We specify how experimental data must be processed in order to achieve a test of these inequalities. We also provide a criticism of prior proposals for experimental tests of noncontextuality based on the Peres-Mermin square.
Article
Full-text available
Random access coding is an information task that has been extensively studied and has found many applications in quantum information. In this scenario, Alice receives an n-bit string x and wishes to encode x into a quantum state ρ_x such that Bob, upon receiving the state ρ_x, can choose any bit i ∈ [n] and recover the input bit x_i with high probability. Here we study two variants: parity-oblivious random access codes (RACs), where we impose the cryptographic property that Bob cannot infer any information about the parity of any subset of bits of the input apart from the single bits x_i; and even-parity-oblivious RACs, where Bob cannot infer any information about the parity of any even-size subset of bits of the input. In this paper, we provide the optimal bounds for parity-oblivious quantum RACs and show that they are asymptotically better than the optimal classical ones. Our results provide a large non-contextuality inequality violation and resolve the main open problem in a work of Spekkens et al (2009 Phys. Rev. Lett. 102 010401). Second, we provide the optimal bounds for even-parity-oblivious RACs by proving their equivalence to a non-local game and by providing tight bounds for the success probability of the non-local game via semidefinite programming. In the case of even-parity-oblivious RACs, the cryptographic property holds also in the device-independent model.
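For intuition, the simplest instance is the 2-to-1 quantum RAC. The sketch below (the standard textbook encoding, not this paper's general construction) encodes two bits in a qubit whose Bloch vector lies in the X-Z plane; decoding either bit succeeds with probability (1 + 1/√2)/2 ≈ 0.854, beating the optimal classical value of 3/4:

```python
import numpy as np
from itertools import product

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
I = np.eye(2)

def bloch_state(nx, nz):
    """Pure qubit state with Bloch vector (nx, 0, nz)."""
    return (I + nx * X + nz * Z) / 2

# Encode the 2-bit string x0 x1 as Bloch vector ((-1)^x1, 0, (-1)^x0)/sqrt(2);
# Bob measures Z to guess x0 and X to guess x1.
probs = []
for x0, x1 in product([0, 1], repeat=2):
    rho = bloch_state((-1)**x1 / np.sqrt(2), (-1)**x0 / np.sqrt(2))
    for obs, bit in [(Z, x0), (X, x1)]:
        proj = (I + (-1)**bit * obs) / 2  # projector onto the correct outcome
        probs.append(np.real(np.trace(proj @ rho)))

print(np.mean(probs))  # ~0.8536 = (1 + 1/sqrt(2))/2, vs 0.75 classically
```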
Article
Full-text available
One of the fundamental results in quantum foundations is the Kochen-Specker no-go theorem. For quantum theory, the no-go theorem excludes the possibility of a class of hidden variable models where value attribution is context independent. Recently, the notion of contextuality has been generalized to different operational procedures, and it has been shown that preparation contextuality of mixed quantum states can be a useful resource in an information-processing task called parity-oblivious multiplexing. Here, we introduce a new class of information-processing tasks, namely d-level parity-oblivious random access codes, and obtain bounds on the success probabilities of performing such tasks in any preparation noncontextual theory. These bounds constitute noncontextuality inequalities for any value of d. For d=3, using a set of mutually asymmetric biased bases, we show that the corresponding noncontextual bound is violated by quantum theory. We also show quantum violation of the inequalities for some other higher values of d. This reveals the operational usefulness of preparation contextuality of higher-level quantum systems.
Article
Full-text available
To make precise the sense in which nature fails to respect classical physics, one requires a formal notion of classicality. Ideally, such a notion should be defined operationally, so that it can be subject to direct experimental test, and it should be applicable in a wide variety of experimental scenarios so that it can cover the breadth of phenomena thought to defy classical understanding. Bell's notion of local causality fulfils the first criterion but not the second. The notion of noncontextuality fulfils the second criterion, but it is a long-standing question whether it can be made to fulfil the first. Previous attempts to test noncontextuality have all assumed idealizations that real experiments cannot achieve, namely noiseless measurements and exact operational equivalences. Here we show how to devise tests that are free of these idealizations. We perform a photonic implementation of one such test, ruling out noncontextual models with high confidence.
Article
Full-text available
The Kochen-Specker theorem demonstrates that it is not possible to reproduce the predictions of quantum theory in terms of a hidden variable model where the hidden variables assign a value to every projector deterministically and noncontextually. A noncontextual value-assignment to a projector is one that does not depend on which other projectors - the context - are measured together with it. Using a generalization of the notion of noncontextuality that applies to both measurements and preparations, we propose a scheme for deriving inequalities that test whether a given set of experimental statistics is consistent with a noncontextual model. Unlike previous inequalities inspired by the Kochen-Specker theorem, we do not assume that the value-assignments are deterministic and therefore in the face of a violation of our inequality, the possibility of salvaging noncontextuality by abandoning determinism is no longer an option. Our approach is operational in the sense that it does not presume quantum theory: a violation of our inequality implies the impossibility of a noncontextual model for any operational theory that can account for the experimental observations, including any successor to quantum theory.
Article
Full-text available
We present a toy theory that is based on a simple principle: the number of questions about the physical state of a system that are answered must always be equal to the number that are unanswered in a state of maximal knowledge. Many quantum phenomena are found to have analogues within this toy theory. These include the noncommutativity of measurements, interference, the multiplicity of convex decompositions of a mixed state, the impossibility of discriminating nonorthogonal states, the impossibility of a universal state inverter, the distinction between bipartite and tripartite entanglement, the monogamy of pure entanglement, no cloning, no broadcasting, remote steering, teleportation, entanglement swapping, dense coding, mutually unbiased bases, and many others. The diversity and quality of these analogies is taken as evidence for the view that quantum states are states of incomplete knowledge rather than states of reality. A consideration of the phenomena that the toy theory fails to reproduce, notably, violations of Bell inequalities and the existence of a Kochen-Specker theorem, provides clues for how to proceed with this research program.
Article
Full-text available
How would the world appear to us if its ontology was that of classical mechanics but every agent faced a restriction on how much they could come to know about the classical state? We show that in most respects, it would appear to us as quantum. The statistical theory of classical mechanics, which specifies how probability distributions over phase space evolve under Hamiltonian evolution and under measurements, is typically called Liouville mechanics, so the theory we explore here is Liouville mechanics with an epistemic restriction. The particular epistemic restriction we posit as our foundational postulate specifies two constraints. The first constraint is a classical analogue of Heisenberg's uncertainty principle -- the second-order moments of position and momentum defined by the phase-space distribution that characterizes an agent's knowledge are required to satisfy the same constraints as are satisfied by the moments of position and momentum observables for a quantum state. The second constraint is that the distribution should have maximal entropy for the given moments. Starting from this postulate, we derive the allowed preparations, measurements and transformations and demonstrate that they are isomorphic to those allowed in Gaussian quantum mechanics and generate the same experimental statistics. We argue that this reconstruction of Gaussian quantum mechanics constitutes additional evidence in favour of a research program wherein quantum states are interpreted as states of incomplete knowledge, and that the phenomena that do not arise in Gaussian quantum mechanics provide the best clues for how one might reconstruct the full quantum theory.
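For a single degree of freedom, the first constraint mentioned above amounts to imposing the Schrödinger-Robertson uncertainty relation on classical statistical moments (our paraphrase of the postulate, stated here in its standard single-mode form):

$$ \sigma_q^2 \, \sigma_p^2 - \sigma_{qp}^2 \ \ge\ \frac{\hbar^2}{4}, $$

where σ_q², σ_p², and σ_{qp} are the variances and covariance of the phase-space distribution characterizing the agent's knowledge.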
Article
Full-text available
We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, namely that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all the structural properties of the Choi-Jamiolkowski isomorphism in quantum mechanics. Such an isomorphism allows one to prove most of the basic features of quantum mechanics, such as the existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.
Article
Full-text available
In a noncontextual hidden variable model of quantum theory, hidden variables determine the outcomes of every measurement in a manner that is independent of how the measurement is implemented. Using a generalization of this notion to arbitrary operational theories and to preparation procedures, we demonstrate that a particular two-party information-processing task, "parity-oblivious multiplexing," is powered by contextuality in the sense that there is a limit to how well any theory described by a noncontextual hidden variable model can perform. This bound constitutes a "noncontextuality inequality" that is violated by quantum theory. We report an experimental violation of this inequality in good agreement with the quantum predictions. The experimental results also provide the first demonstration of 2-to-1 and 3-to-1 quantum random access codes.
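For n-bit parity-oblivious multiplexing, the bounds in question take a compact form; as best recalled from this line of work (our summary, not a quotation), the noncontextual and optimal quantum success probabilities are

$$ p_{\mathrm{succ}}^{\mathrm{NC}} \le \frac{1}{2}\left(1 + \frac{1}{n}\right), \qquad p_{\mathrm{succ}}^{\mathrm{Q}} = \frac{1}{2}\left(1 + \frac{1}{\sqrt{n}}\right), $$

so already for n = 2 the quantum value cos²(π/8) ≈ 0.854 violates the noncontextual bound of 3/4.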
Article
Full-text available
Two notions of nonclassicality that have been investigated intensively are: (i) negativity, that is, the need to posit negative values when representing quantum states by quasiprobability distributions such as the Wigner representation, and (ii) contextuality, that is, the impossibility of a noncontextual hidden variable model of quantum theory. Although both of these notions were meant to characterize the conditions under which a classical explanation cannot be provided, we demonstrate that they prove inadequate to the task and we argue for a particular way of generalizing and revising them. With the refined version of each in hand, it becomes apparent that they are in fact one and the same. We also demonstrate the impossibility of noncontextuality or non-negativity in quantum theory with a novel proof that is symmetric in its treatment of measurements and preparations.
Article
Full-text available
An operational definition of contextuality is introduced which generalizes the standard notion in three ways: (1) it applies to arbitrary operational theories rather than just quantum theory, (2) it applies to arbitrary experimental procedures, rather than just sharp measurements, and (3) it applies to a broad class of ontological models of quantum theory, rather than just deterministic hidden variable models. We derive three no-go theorems for ontological models, each based on an assumption of noncontextuality for a different sort of experimental procedure; one for preparation procedures, another for unsharp measurement procedures (that is, measurement procedures associated with positive-operator valued measures), and a third for transformation procedures. All three proofs apply to two-dimensional Hilbert spaces, and are therefore stronger than traditional proofs of contextuality.
Article
Generalized contextuality is a resource for a wide range of communication and information processing protocols. However, contextuality is not possible without coherence, and so can be destroyed by dephasing noise. Here, we explore the robustness of contextuality to partially dephasing noise in a scenario related to state discrimination (for which contextuality is a resource). We find that a vanishing amount of coherence is sufficient to demonstrate the failure of noncontextuality in this scenario, and we give a proof of contextuality that is robust to arbitrary amounts of partially dephasing noise. This is in stark contrast to partially depolarizing noise, which is always sufficient to destroy contextuality.
Article
We introduce the framework of general probabilistic theories (GPTs for short). GPTs are a class of operational theories that generalize both finite-dimensional classical and quantum theory, but they also include other, more exotic theories, such as the boxworld theory containing Popescu-Rohrlich boxes. We provide in-depth explanations of the basic concepts and elements of the framework of GPTs, and we also prove several well-known results. The review is self-contained and is meant to provide the reader with a consistent introduction to GPTs. Our tools mainly include convex geometry, but we also introduce diagrammatic notation and we often express equations via diagrams.
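A minimal illustration of the GPT probability rule, with a qubit recast in GPT form (a standard example, assuming the usual Pauli parametrization): states and effects are real vectors, and probabilities arise from a bilinear pairing, reproducing the Born rule Tr[Eρ] = e₀ + a·r:

```python
import numpy as np

# A qubit in GPT form: states s = (1, r) with |r| <= 1 the Bloch vector,
# effects e = (e0, a) from E = e0*I + a.sigma; probability p = <e, s>.
def prob(effect, state):
    return float(np.dot(effect, state))

state = np.array([1.0, 0.0, 0.0, 1.0])    # rho = |0><0|, Bloch r = (0, 0, 1)
effect = np.array([0.5, 0.0, 0.0, 0.5])   # E = |0><0| = (I + sigma_z)/2
print(prob(effect, state))                # 1.0, matching Tr[E rho]
```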
Article
The existence of incompatible measurements is often believed to be a feature of quantum theory which signals its inconsistency with any classical worldview. To prove the failure of classicality in the sense of Kochen-Specker noncontextuality, one does indeed require sets of incompatible measurements. However, a more broadly applicable notion of classicality is the existence of a generalized-noncontextual ontological model. In particular, this notion can imply constraints on the representation of outcomes even within a single nonprojective measurement. We leverage this fact to demonstrate that measurement incompatibility is neither necessary nor sufficient for proofs of the failure of generalized noncontextuality. Furthermore, we show that every proof of the failure of generalized noncontextuality in a quantum prepare-measure scenario can be converted into a proof of the failure of generalized noncontextuality in a corresponding scenario with no incompatible measurements.
Article
The formalism of generalized probabilistic theories (GPTs) was originally developed as a way to characterize the landscape of conceivable physical theories. Thus, the GPT describing a given physical theory necessarily includes all physically possible processes. We here consider the question of how to provide a GPT-like characterization of a particular experimental setup within a given physical theory. We show that the resulting characterization is not generally a GPT in and of itself, rather, it is described by a more general mathematical object that we introduce and term an accessible GPT fragment. We then introduce an equivalence relation, termed cone equivalence, between accessible GPT fragments (and, as a special case, between standard GPTs). We give a number of examples of experimental scenarios that are best described using accessible GPT fragments, and where moreover, cone-equivalence arises naturally. We then prove that an accessible GPT fragment admits of a classical explanation if and only if every other fragment that is cone equivalent to it also admits of a classical explanation. Finally, we leverage this result to prove several fundamental results regarding the experimental requirements for witnessing the failure of generalized noncontextuality. In particular, we prove that neither incompatibility among measurements nor the assumption of freedom of choice is necessary for witnessing failures of generalized noncontextuality, and moreover, that such failures can be witnessed even when using arbitrarily inefficient detectors.
Article
We give a complete characterization of the (non)classicality of all stabilizer subtheories. First, we prove that there is a unique nonnegative and diagram-preserving quasiprobability representation of the stabilizer subtheory in all odd dimensions, namely Gross’s discrete Wigner function. This representation is equivalent to Spekkens’ epistemically restricted toy theory, which is consequently singled out as the unique noncontextual ontological model for the stabilizer subtheory. Strikingly, the principle of noncontextuality is powerful enough (at least in this setting) to single out one particular classical realist interpretation. Our result explains the practical utility of Gross’s representation by showing that (in the setting of the stabilizer subtheory) negativity in this particular representation implies generalized contextuality. Since negativity of this particular representation is a necessary resource for universal quantum computation in the state injection model, it follows that generalized contextuality is also a necessary resource for universal quantum computation in this model. In all even dimensions, we prove that there does not exist any nonnegative and diagram-preserving quasiprobability representation of the stabilizer subtheory, and, hence, that the stabilizer subtheory is contextual in all even dimensions.
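A quick numerical check of the claim in odd dimensions is possible using a parity-operator construction of the phase-point operators, which (our assumption here) agrees with Gross's Wigner function up to a relabeling of phase space: the Wigner function of a qutrit stabilizer state comes out nonnegative, as the theorem requires:

```python
import numpy as np
from itertools import product

d = 3  # odd prime dimension (qutrit)
w = np.exp(2j * np.pi / d)

X = np.roll(np.eye(d), 1, axis=0)               # shift: X|j> = |j+1 mod d>
Z = np.diag([w**j for j in range(d)])           # clock: Z|j> = w^j |j>
P = np.eye(d)[:, [(-j) % d for j in range(d)]]  # parity: P|j> = |-j mod d>

def wigner(rho):
    """W(q,p) = Tr[A(q,p) rho] / d with A(q,p) = X^q Z^p P Z^-p X^-q."""
    W = np.zeros((d, d))
    for q, p in product(range(d), repeat=2):
        D = np.linalg.matrix_power(X, q) @ np.linalg.matrix_power(Z, p)
        A = D @ P @ D.conj().T
        W[q, p] = np.real(np.trace(A @ rho)) / d
    return W

# Stabilizer state |+> = (|0>+|1>+|2>)/sqrt(3): all entries are nonnegative.
plus = np.ones(d) / np.sqrt(d)
print(wigner(np.outer(plus, plus.conj())))
```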
Article
If one seeks to test quantum theory against many alternatives in a landscape of possible physical theories, then it is crucial to be able to analyze experimental data in a theory-agnostic way. This can be achieved using the framework of generalized probabilistic theories (GPTs). Here we implement GPT tomography on a three-level system corresponding to a single photon shared among three modes. This scheme achieves a GPT characterization of each of the preparations and measurements implemented in the experiment without requiring any prior characterization of either. Assuming that the sets of realized preparations and measurements are tomographically complete, our analysis identifies the most likely dimension of the GPT vector space describing the three-level system to be nine, in agreement with the value predicted by quantum theory. Relative to this dimension, we infer the scope of GPTs that are consistent with our experimental data by identifying polytopes that provide inner and outer bounds for the state and effect spaces of the true GPT. From these, we are able to determine quantitative bounds on possible deviations from quantum theory. In particular, we bound the degree to which the no-restriction hypothesis might be violated for our three-level system.
Article
I identify a fundamental difference between classical and quantum dynamics in the linear response regime by showing that the latter is, in general, contextual. This allows me to provide an example of a quantum engine whose favorable power output scaling unavoidably requires nonclassical effects in the form of contextuality. Furthermore, I describe contextual advantages for local metrology. Given the ubiquity of linear response theory, I anticipate that these tools will allow one to certify the nonclassicality of a wide array of quantum phenomena.
Article
Weak values are quantities accessed through quantum experiments involving weak measurements and postselection. It has been shown that “anomalous” weak values (those lying beyond the eigenvalue range of the corresponding operator) defy classical explanation in the sense of requiring contextuality [M. F. Pusey, Phys. Rev. Lett. 113, 200401 (2014)]. Here we elaborate on and extend that result in several directions. First, the original theorem requires certain perfect correlations that can never be realized in any actual experiment. Hence, we provide theorems that allow for a noise-robust experimental verification of contextuality from anomalous weak values, and compare with a recent experiment. Second, the original theorem connects the anomaly to contextuality only in the presence of a whole set of extra operational constraints. Here we clarify the debate surrounding anomalous weak values by showing that these conditions are tight: if any one of them is dropped, the anomaly can be reproduced classically. Third, whereas the original result required the real part of the weak value to be anomalous, we also give a version for any weak value with nonzero imaginary part. Finally, we show that similar results hold if the weak measurement is performed through qubit pointers, rather than the traditional continuous system. In summary, we provide inequalities for witnessing nonclassicality using experimentally realistic measurements of any anomalous weak value, and clarify what ingredients of the quantum experiment must be missing in any classical model that can reproduce the anomaly.
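For a concrete anomaly, consider A = σ_z with nearly orthogonal pre- and post-selected states; this toy calculation (ours, not the paper's) produces a weak value well outside the eigenvalue range [-1, 1]:

```python
import numpy as np

# Weak value A_w = <phi|A|psi> / <phi|psi> for A = sigma_z
Z = np.diag([1.0, -1.0])
alpha, beta = np.deg2rad(80), np.deg2rad(-5)      # nearly orthogonal states
psi = np.array([np.cos(alpha), np.sin(alpha)])    # pre-selection
phi = np.array([np.cos(beta), np.sin(beta)])      # post-selection

A_w = (phi @ Z @ psi) / (phi @ psi)
print(A_w)  # ~2.97, far outside the eigenvalue range [-1, 1] of sigma_z
```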
Article
Within the context of semiquantum nonlocal games, the trust can be removed from the measurement devices in an entanglement-detection procedure. Here we show that a similar approach can be taken to quantify the amount of entanglement. To be specific, first, we prove that in this context a single arbitrary game from a suitable subclass of semiquantum nonlocal games is necessary and sufficient for entanglement detection in the LOCC paradigm. Second, we prove that the pay-off of this game is a universal measure of entanglement which is convex and continuous. Importantly, our measure is operationally accessible in a measurement-device independent way by construction. Finally, we show that our approach can be simply extended to quantify the entanglement within any partitioning of multipartite quantum states.
Article
Contextuality is the leading notion of non-classicality for a single system. However, experimental demonstrations of contextuality have generally been limited by conceptual difficulties. Here I focus on the simplest non-trivial case, four preparations and two tomographically complete binary measurements. Exploiting a subtle connection to the CHSH scenario gives eight non-linear inequalities which are together necessary and sufficient for the experimental statistics to admit a preparation noncontextual model. No fixed "operational equivalences" are required. Violating such an inequality is therefore arguably the most direct route to an experimental refutation of noncontextuality.
Article
In many different fields of science, it is useful to characterize physical states and processes as resources. Chemistry, thermodynamics, Shannon's theory of communication channels, and the theory of quantum entanglement are prominent examples. Questions addressed by a theory of resources include: Which resources can be converted into which other ones? What is the rate at which arbitrarily many copies of one resource can be converted into arbitrarily many copies of another? Can a catalyst help in making an impossible transformation possible? How does one quantify the resource? Here, we propose a general mathematical definition of what constitutes a resource theory. We prove some general theorems about how resource theories can be constructed from theories of processes wherein there is a special class of processes that are implementable at no cost and which define the means by which the costly states and processes can be interconverted one to another. We outline how various existing resource theories fit into our framework. Our abstract characterization of resource theories is a first step in a larger project of identifying universal features and principles of resource theories. In this vein, we identify a few general results concerning resource convertibility.
Article
The average result of a weak measurement of some observable A can, under post-selection of the measured quantum system, exceed the largest eigenvalue of A. The nature of weak measurements, as well as the presence of post-selection and hence possible contribution of measurement-disturbance, has led to a long-running debate about whether or not this is surprising. Here, it is shown that such "anomalous weak values" are non-classical in a precise sense: a sufficiently weak measurement of one constitutes a proof of contextuality. This clarifies, for example, which features must be present (and in an experiment, verified) to demonstrate an effect with no satisfying classical explanation.
Article
It is well known that measurements performed on spatially separated entangled quantum systems can give rise to correlations that are nonlocal, in the sense that a Bell inequality is violated. They cannot, however, be used for superluminal signaling. It is also known that it is possible to write down sets of “superquantum” correlations that are more nonlocal than is allowed by quantum mechanics, yet are still nonsignaling. Viewed as an information-theoretic resource, superquantum correlations are very powerful at reducing the amount of communication needed for distributed computational tasks. An intriguing question is why quantum mechanics does not allow these more powerful correlations. We aim to shed light on the range of quantum possibilities by placing them within a wider context. With this in mind, we investigate the set of correlations that are constrained only by the no-signaling principle. These correlations form a polytope, which contains the quantum correlations as a (proper) subset. We determine the vertices of the no-signaling polytope in the case that two observers each choose from two possible measurements with d outcomes. We then consider how interconversions between different sorts of correlations may be achieved. Finally, we consider some multipartite examples.
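The extremal example of such superquantum correlations is the Popescu-Rohrlich box. The sketch below evaluates its CHSH value of 4, to be compared with Tsirelson's quantum bound 2√2 ≈ 2.83 and the classical bound 2:

```python
# PR-box correlations: P(a,b|x,y) = 1/2 if a XOR b == x*y, else 0
def P(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

# CHSH value S = sum over settings of (-1)^(x*y) * E(x,y)
S = 0.0
for x in (0, 1):
    for y in (0, 1):
        E = sum((-1)**(a ^ b) * P(a, b, x, y)
                for a in (0, 1) for b in (0, 1))
        S += (-1)**(x * y) * E
print(S)  # 4.0: above the quantum bound 2*sqrt(2) and the classical bound 2
```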
Article
In this paper we give a proof of the long-standing Upper-bound Conjecture for convex polytopes, which states that, for 1 ≤ j < d < v, the maximum possible number of j-faces of a d-polytope with v vertices is achieved by a cyclic polytope C(v, d).
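In the special case of facets (j = d - 1), the bound admits closed-form expressions; here is a small sketch, assuming the standard facet-count formulas for cyclic polytopes:

```python
from math import comb

def max_facets(v, d):
    """Max number of facets of a d-polytope with v vertices (Upper Bound
    Theorem), attained by the cyclic polytope C(v, d)."""
    m = d // 2
    if d % 2 == 0:
        return comb(v - m, m) + comb(v - m - 1, m - 1)
    return 2 * comb(v - m - 1, m)

print(max_facets(8, 3))  # 12 = 2v - 4, the simplicial 3-polytope maximum
print(max_facets(6, 4))  # 9 = v(v-3)/2, the neighborly 4-polytope count
```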
Article
We determine the set of Bloch vectors for N-level systems, generalizing the familiar Bloch ball of 2-level systems. The origin of the structural difference from the Bloch ball in 2-level systems is clarified.
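The following sketch (our illustration, assuming the conventional generalized Gell-Mann parametrization ρ = I/N + (1/2) Σ_i b_i g_i with Tr[g_i g_j] = 2δ_ij) shows the structural difference: every pure N-level state has a Bloch vector of length √(2(N-1)/N), yet, unlike the qubit case, not every vector of that length corresponds to a valid state:

```python
import numpy as np

def gell_mann_basis(N):
    """Generalized Gell-Mann matrices: traceless Hermitian, Tr[g_i g_j] = 2 d_ij."""
    basis = []
    for j in range(N):
        for k in range(j + 1, N):
            s = np.zeros((N, N), complex); s[j, k] = s[k, j] = 1
            basis.append(s)                                  # symmetric
            a = np.zeros((N, N), complex); a[j, k] = -1j; a[k, j] = 1j
            basis.append(a)                                  # antisymmetric
    for l in range(1, N):
        dmat = np.zeros((N, N), complex)
        dmat[:l, :l] = np.eye(l); dmat[l, l] = -l
        basis.append(np.sqrt(2 / (l * (l + 1))) * dmat)      # diagonal
    return basis

N = 3
g = gell_mann_basis(N)
rho = np.diag([1.0, 0.0, 0.0])                               # pure qutrit state
b = np.array([np.real(np.trace(rho @ gi)) for gi in g])
print(np.linalg.norm(b))   # sqrt(2(N-1)/N) ~ 1.155, same for any pure state

# Flipping the sign of the Bloch vector gives a matrix with a negative
# eigenvalue, so the Bloch set for N = 3 is a proper subset of the ball.
rho_flip = np.eye(N) / N - 0.5 * sum(bi * gi for bi, gi in zip(b, g))
print(np.round(np.linalg.eigvalsh(rho_flip), 3))             # contains -1/3
```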
Article
I introduce a framework in which a variety of probabilistic theories can be defined, including classical and quantum theories, and many others. From two simple assumptions, a tensor product rule for combining separate systems can be derived. Certain features, usually thought of as specifically quantum, turn out to be generic in this framework, meaning that they are present in all except classical theories. These include the non-unique decomposition of a mixed state into pure states, a theorem involving disturbance of a system on measurement (suggesting that the possibility of secure key distribution is generic), and a no-cloning theorem. Two particular theories are then investigated in detail, for the sake of comparison with the classical and quantum cases. One of these includes states that can give rise to arbitrary non-signalling correlations, including the super-quantum correlations that have become known in the literature as Nonlocal Machines or Popescu-Rohrlich boxes. By investigating these correlations in the context of a theory with well-defined dynamics, I hope to make further progress with a question raised by Popescu and Rohrlich, which is, why does quantum theory not allow these strongly nonlocal correlations? The existence of such correlations forces much of the dynamics in this theory to be, in a certain sense, classical, with consequences for teleportation, cryptography and computation. I also investigate another theory in which all states are local. Finally, I raise the question of what further axiom(s) could be added to the framework in order uniquely to identify quantum theory, and hypothesize that quantum theory is optimal for computation.