
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
Copyright © 2015 Elsevier Ltd. All rights reserved.
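The departure from classical probability described in this review can be made concrete with a small numerical sketch (a generic illustration of the principle, not code from the paper): when two questions are incompatible, resolving one before the other changes the answer probabilities, violating the classical law of total probability.

```python
import numpy as np

# A cognitive state as a unit vector in a 2-D "belief" space (illustrative).
psi = np.array([np.cos(0.3), np.sin(0.3)])

# Question A is measured in the standard basis.
a0 = np.array([1.0, 0.0])
a1 = np.array([0.0, 1.0])

# Question B is measured in a rotated basis, so A and B are incompatible.
b0 = np.array([np.cos(0.8), np.sin(0.8)])

# Probability of outcome b0 when B is asked directly.
p_b_direct = np.dot(b0, psi) ** 2

# Classical law of total probability: resolve A first, then B.
p_b_via_a = (np.dot(a0, psi) ** 2 * np.dot(b0, a0) ** 2
             + np.dot(a1, psi) ** 2 * np.dot(b0, a1) ** 2)

interference = p_b_direct - p_b_via_a  # nonzero: the classical law fails
print(p_b_direct, p_b_via_a, interference)
```

If A and B were measured in the same basis (compatible questions), the interference term would vanish and the classical law would be recovered.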


... Decomposing the disaster into different segments does not support using the Gaussian distribution for statistics and probabilities. The inability to combine different probability distributions into a joint probability distribution is known as contextuality in quantum theory (40). ...

... When an action occurs has a significant effect on the outcome, making some assumptions or questions incompatible. This is not insurmountable, as incompatible questions provide different perspectives of an event, perspectives we need to understand the world (40). ...

... We cannot process both perspectives simultaneously. We cannot decide a matter from more than one perspective at once: to decide from one perspective is to make the cognitive state dispersed (indefinite) with respect to the other (40). ...
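The quoted idea, that deciding from one perspective makes the state indefinite for the other, can be sketched numerically; the bases and angle below are assumptions for illustration, not taken from the cited work.

```python
import numpy as np

theta = np.pi / 4  # assumed angle between the two "perspective" bases

# An initially undecided cognitive state (unit vector).
psi = np.array([0.6, 0.8])

# Perspective A: the standard basis.
a0, a1 = np.eye(2)

# Perspective B: a basis rotated by theta, hence incompatible with A.
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
b0, b1 = R[:, 0], R[:, 1]

# Before deciding, both perspectives are merely probabilistic.
print(np.dot(a0, psi) ** 2, np.dot(a1, psi) ** 2)  # approx. 0.36 and 0.64

# "Deciding" from perspective A collapses the state onto one basis vector.
collapsed = a0

# From perspective B, the decided state is now maximally dispersed.
p_b0 = np.dot(b0, collapsed) ** 2
p_b1 = np.dot(b1, collapsed) ** 2
print(p_b0, p_b1)  # approx. 0.5 each at 45 degrees
```

The farther apart the two bases, the more a definite answer under one perspective disperses the other; at 45 degrees the dispersion is maximal.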

This abstract summarizes critical insights from the manuscript, focusing on lessons learned in neonatal disaster response. The manuscript emphasizes the multifaceted challenges in such scenarios, including environmental issues, clinical care considerations, staffing concerns, communication difficulties, simulation needs, government agency collaboration, and comprehensive planning. The value of adopting a broader perspective and collaborating with experts from various fields to enhance neonatal disaster preparedness is highlighted. The structured process of lessons learned is emphasized, especially the analysis of observed data, the formulation of corrective actions and recommendations, and the integration of diverse perspectives and expertise. Ultimately, the goal is to reduce potential failures and improve the survivability of neonates during disasters.

... Quantum cognitive methods integrate heuristic-based approaches [22], in which the decision-maker is bound by limited rationality, and rational approaches, in which basic axioms of a probabilistic theory are defined to let the decision-maker perform inferences. The key point is that classic statistics and probability are based on Bayesian probability, which is the source of conflicts. ...

... The key point is that classic statistics and probability are based on Bayesian probability, which is the source of conflicts. Thus, in this new method, the axioms are defined under the quantum probability framework [22]. ...

... Then, the document that hits the peak of the defined metric, based on the Bell inequality applied to its state vector, is considered the document most highly correlated with the query within the minimum window size. It is worth mentioning that the emergence of new concepts through combining two concepts is defined based on quantum interference and quantum entanglement in the existing literature [22], [23], [28], [29]. ...

... Generally, it posits that human cognitive information processing takes place in a quantum-theoretical way, supported by increasing literature from cognitive science and empirical psychology findings. Accordingly, it advocates the use of mathematical principles from QT and QP as a new conceptual framework and a coherent set of formal tools to help formalize and understand cognitive systems and processes and to better explain and model nonclassical, quantum-like cognitive behaviors [19,21,22]. See Section 3.1 for a more detailed description. ...

... The ANNs, to some extent, are also cognition-inspired, in the sense that they were designed to mimic human brain functioning in perception and cognition [81]. Human decision-making and cognition might arguably follow a quantum or quantum-like fashion [19,20,46]. Furthermore, QT (and QP) is more general than classical and statistical physics (and classical probability). ...

... To better describe decision vagueness, in [12], the phenomenon of borderline contradictions is explained as a manifestation of quantum interference. Additionally, QP has been used to address human cognition and decision-making phenomena that are considered paradoxical, to generate nonreductive understandings of human conceptual processing, and to provide new understandings of perception and human memory [19,20]. User studies are also a common method for verifying nonclassical phenomena. ...

Quantum theory, originally proposed as a physical theory to describe the motions of microscopic particles, has been applied to various non-physics domains involving human cognition and decision-making that are inherently uncertain and exhibit certain non-classical, quantum-like characteristics. Sentiment analysis is a typical example of such domains. In the last few years, by leveraging the modeling power of quantum probability (a non-classical probability stemming from quantum mechanics methodology) and deep neural networks, a range of novel quantum-cognitively inspired models for sentiment analysis have emerged and performed well. This survey presents a timely overview of the latest developments in this fascinating cross-disciplinary area. We first provide a background of quantum probability and quantum cognition at a theoretical level, analyzing their advantages over classical theories in modeling the cognitive aspects of sentiment analysis. Then, recent quantum-cognitively inspired models are introduced and discussed in detail, focusing on how they approach the key challenges of the sentiment analysis task. Finally, we discuss the limitations of the current research and highlight future research directions.

... More recently, these models have been developed using the framework of quantum probability theory [12,13,14]. Such quantum cognitive models are fast gaining prominence since they fit human behavioral data better, and have native mechanisms to represent decision-making features like uncertainty, sequential-effects, and more [12,13,14,15,16,17,18]. Due to their computational demands, cognitive models are attracting the attention of computer systems practitioners [19], making them a natural candidate for quantum systems research. ...

... In the recent past, researchers have also begun using quantum probability theory to develop models of cognition [12,54]. These methods have been progressing rapidly [12,13,14,15,16,17,18,55,56]. Critically, uncertainty is an intrinsic property of quantum models, that does not require additional assumptions or fitting, as required by classical models, to account for the ubiquitous observation of stochasticity in mental function [12,18,54,57]. ...

Expanding the benefits of quantum computing to new domains remains a challenging task. Quantum applications are concentrated in only a few domains, and driven by these few, the quantum stack is limited in supporting the development or execution demands of new applications. In this work, we address this problem by identifying both a new application domain, and new directions to shape the quantum stack. We introduce computational cognitive models as a new class of quantum applications. Such models have been crucial in understanding and replicating human intelligence, and our work connects them with quantum computing for the first time. Next, we analyze these applications to make the case for redesigning the quantum stack for programmability and better performance. Among the research opportunities we uncover, we study two simple ideas of quantum cloud scheduling using data from gate-based and annealing-based quantum computers. On the respective systems, these ideas can enable parallel execution, and improve throughput. Our work is a contribution towards realizing versatile quantum systems that can broaden the impact of quantum computing on science and society.

... This is an argument widely developed, for example, by Gigerenzer and Murray (2015), in addition to authors of the Minskyian and resource-rational approach, as well as other less popular ones (see Lieder & Griffiths, 2020, or Millroth & Collsiöö, 2020, for further discussion). Among these alternatives to prospect theory, quantum models of information processing are the most innovative and contemporary (Bruza et al., 2015). Such models are characterized by using the mathematical language developed in quantum physics to address the theoretical issues of psychology. ...

... The transitivity assumption is fundamental for prospect theory, as it depends on the classical logic that underlies traditional models of rational choice. In the mathematical theory that underlies the quantum models, the expected consequence is that people's preferences are not fixed and depend, for example, on the order in which certain events occur (Bruza et al., 2015). Thus, quantum models make extra predictions compared with prospect theory models, depending on what limitations the research design imposes on the decision process. ...
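The order dependence described in this snippet corresponds mathematically to sequences of non-commuting projective measurements. A minimal sketch (the angles and initial state are illustrative assumptions, not taken from the cited work) shows that the probability of answering "yes" to two questions depends on the order in which they are posed:

```python
import numpy as np

# Initial cognitive state.
psi = np.array([1.0, 0.0])

def projector(angle):
    """Rank-1 projector onto the 'yes' subspace of a question."""
    v = np.array([np.cos(angle), np.sin(angle)])
    return np.outer(v, v)

P_A = projector(np.pi / 6)  # "yes" answer to question A
P_B = projector(np.pi / 3)  # "yes" answer to question B

# Probability of "yes to A, then yes to B" (Lueders sequential rule).
p_ab = np.linalg.norm(P_B @ P_A @ psi) ** 2
# Same questions, reverse order.
p_ba = np.linalg.norm(P_A @ P_B @ psi) ** 2
print(p_ab, p_ba)  # the two orders give different probabilities
```

Because `P_A` and `P_B` do not commute, swapping the order of the two questions changes the joint probability, which is the formal counterpart of the order effects mentioned above.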

In most areas, psychological phenomena tend to be explained only through textual constructions. Several authors, however, point to the need for theories of a more formal nature, based on mathematical reasoning. In order to encourage broader access to its applications, we present the models and advantages of a mathematical psychology approach to the study of behavior. We review the limitations of verbal theorizing, then present a common taxonomy in mathematical psychology that classifies formal models as descriptive, process characterization, and explanatory. As successful cases, we examine the mathematical psychology of decision making, of helping behavior, of memory, and of romantic relationships. Finally, we discuss the potential benefits and uses of this approach. Welcome to mathematical psychology.
Keywords:
mathematical psychology; formal theorizing; quantitative modeling


... There is yet another method. To describe the decision-making process using quantum probabilities, quantum specialists have developed a concept called quantum cognition, based on numerous cognitive processes outlined by the quantum theory of information [29]. This concept is based on modeling the human brain, languages, memories, and other factors. ...

... We use Welch's method to depict theta, alpha, beta, and gamma spectral power for the given data for each electrode. The frequency bands used are theta (4-8 Hz), alpha (8-12 Hz), beta (12-30 Hz), and gamma (30-64 Hz). Research has shown that different emotional states are associated with different patterns of brain activity in different frequency bands. ...
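The band-power computation described in the snippet can be sketched as follows; the sampling rate, window length, and synthetic signal are assumptions for illustration, not details taken from the study.

```python
import numpy as np
from scipy.signal import welch

fs = 128  # assumed sampling rate in Hz
rng = np.random.default_rng(0)

# Synthetic 10 s signal: a 10 Hz alpha rhythm plus noise (stand-in for real EEG).
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Welch's method: averaged periodograms over overlapping segments.
freqs, psd = welch(x, fs=fs, nperseg=256)

bands = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30), "gamma": (30, 64)}

def band_power(lo, hi):
    """Integrate the PSD over [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

powers = {name: band_power(lo, hi) for name, (lo, hi) in bands.items()}
print(powers)  # alpha dominates, since the synthetic rhythm sits at 10 Hz
```

On real EEG, this per-band power vector (one per electrode) is the kind of feature that would feed the classifier described in the abstract below.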

Recognizing emotions is crucial for the development of artificial intelligence in various fields. This study explores the application of quantum support vector machines (SVMs) on emotion recognition from electroencephalogram (EEG) signals and compares its performance to traditional SVMs. SVMs are a popular machine-learning algorithm for this task due to their ability to handle high-dimensional data and non-linear relationships between input features. This study uses a quantum SVM to generate distinct solutions based on quantum principles. We applied this method to the DEAP benchmark dataset for binary class classification and gained new insights into the quantum nature of emotions. The algorithm has trained on D-Wave quantum annealer using various samples, achieving accuracies of 65.6% and 75.0% for valence and arousal dimensions, respectively, with 22 × 40 × 32 (subjects × trials × channels) data points, demonstrating the potential of quantum machine learning for EEG-based emotion recognition. However, there are methodological challenges due to the quantum arbitrariness of current annealers and the sensitivity of quantum-based machines to initial values. To address this, we conducted multiple investigations under similar circumstances and successfully recognized emotions using our proposed method.

... In recent years, quantum probability (QP), as a mathematical framework of quantum physics that proposes two assumptions of both compatibility and incompatibility, has been adopted for describing elusive human cognitive and emotional activities, where a new research community, viz. quantum cognition, has been emerging [5]. An increasing body of theoretical and empirical evidence has shown the effectiveness and advantages of QP in modeling various AI tasks involving human cognition, e.g., semantic analysis, question answering and sentiment classification. ...

... We can notice a violation of the commutation law, i.e., [M_γ^sar, M_δ^sen] ≠ 0 for all pairs, implying sentiment and sarcasm are incompatible. To further validate this observation, we introduce the quantum relative entropy, D(σ‖ρ) = Tr σ log σ − Tr σ log ρ, a kind of "distance" measure between quantum states: the smaller it is, the closer the states. ...
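Both quantities in this snippet, the commutator test for incompatibility and the quantum relative entropy, are short computations. The operators and states below are illustrative stand-ins, not the paper's learned measurement operators:

```python
import numpy as np

def _log_dm(rho):
    """Matrix logarithm of a density matrix via eigendecomposition."""
    w, V = np.linalg.eigh(rho)
    return V @ np.diag(np.log(w)) @ V.conj().T

def rel_entropy(sigma, rho):
    """Quantum relative entropy D(sigma || rho), natural log."""
    return (np.trace(sigma @ _log_dm(sigma))
            - np.trace(sigma @ _log_dm(rho))).real

# Two hypothetical measurement projectors in different bases, mirroring
# the claimed incompatibility of sarcasm and sentiment judgments.
M_sar = np.array([[1.0, 0.0], [0.0, 0.0]])  # projector in the standard basis
v = np.array([np.cos(0.6), np.sin(0.6)])
M_sen = np.outer(v, v)                       # projector in a rotated basis

commutator = M_sar @ M_sen - M_sen @ M_sar
print(np.allclose(commutator, 0))  # False: the two judgments are incompatible

# Relative entropy between two nearby (illustrative) density matrices.
sigma = np.array([[0.7, 0.0], [0.0, 0.3]])
rho = np.array([[0.5, 0.0], [0.0, 0.5]])
d = rel_entropy(sigma, rho)
print(d)
```

By Klein's inequality D(σ‖ρ) ≥ 0, with equality only when the two states coincide, which is what makes it usable as the "distance" the snippet describes.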

Sarcasm, sentiment, and emotion are three typical kinds of spontaneous affective responses of humans to external events and they are tightly intertwined with each other. Such events may be expressed in multiple modalities (e.g., linguistic, visual and acoustic), e.g., multi-modal conversations. Joint analysis of humans' multi-modal sarcasm, sentiment, and emotion is an important yet challenging topic, as it is a complex cognitive process involving both cross-modality interaction and cross-affection correlation. From the probability theory perspective, cross-affection correlation also means that the judgments on sarcasm, sentiment, and emotion are incompatible. However, this phenomenon cannot be sufficiently modelled by classical probability theory due to its assumption of compatibility, nor do the existing approaches take it into consideration. In view of the recent success of quantum probability (QP) in modeling human cognition, particularly contextual incompatible decision making, we take the first step towards introducing QP into joint multi-modal sarcasm, sentiment, and emotion analysis. Specifically, we propose a QUantum probabIlity driven multi-modal sarcasm, sEntiment and emoTion analysis framework, termed QUIET. Extensive experiments on two datasets show the effectiveness and advantages of QUIET in comparison with a wide range of state-of-the-art baselines. We also show the great potential of QP in multi-affect analysis.

... See, for example, monographs [9,11,12,20,42,43,51,58], the Palgrave handbook [44], and recent reviews [57,82]. See also, for example, papers [1,2,13,16,17,19,23,24,30-33,36,41,46,53,61,75-77,94,95]. ...

The recent years are characterized by intensive applications of the methodology and mathematical apparatus of quantum theory, quantum-like modeling, in cognition, psychology, and decision making. In spite of the successful applications of this approach to a variety of psychological effects, e.g., the order, conjunction, disjunction, and response replicability effects, one may (but need not) feel dissatisfaction due to the absence of clear coupling to the neurophysiological processes in the brain. For the moment, this is just a phenomenological approach. In this paper we construct the quantum-like representation of the networks of communicating neurons. It is based not on standard quantum theory, but on generalized probability theory (GPT) with the emphasis of the operational measurement approach. We employ GPT's version which is based on ordered linear state space (instead of complex Hilbert space). A network of communicating neurons is described as a weighted ordered graph that in turn is encoded by its weight matrix. The state space of weight matrices is embedded in GPT with effect-observables and state updates within measurement instruments theory. The latter plays the crucial role. This GPT based model shows the basic quantum-like effects, as e.g. the order, non-repeatability, and disjunction effects; the latter is also known as interference of decisions. This GPT coupling also supports quantum-like modeling in medical diagnostic for neurological diseases, as depression and epilepsy. Although the paper is concentrated on cognition and neuronal networks, the formalism and methodology can be straightforwardly applied to a variety of biological and social networks.

... Since causes and consequences are not only connected but, in fact, inseparable, courses of events are intractable, non-computable, indeterminate. Thus, future thoughts are not fully foreseeable because consequences give rise to new causes, not because of the complexity of cognition or the randomness of its processes (Chater et al. 2006), let alone because of quantum mechanics' uncertainty principle (Bruza et al. 2015). ...

To explain why cognition evolved requires, first and foremost, an analysis of what qualifies as an explanation. In terms of physics, causes are forces and consequences are changes in states of substance. Accordingly, any sequence of events, from photon absorption to focused awareness, chemical reactions to collective behavior, or from neuronal avalanches to niche adaptation, is understood as an evolution from one state to another toward thermodynamic balance where all forces finally tally each other. From this scale-free physics perspective, energy flows through those means and mechanisms, as if naturally selecting them, that bring about balance in the least time. Then, cognitive machinery is also understood to have emerged from the universal drive toward a free energy minimum, equivalent to an entropy maximum. The least-time nature of thermodynamic processes results in the ubiquitous patterns in data, also characteristic of cognitive processes, i.e., skewed distributions that accumulate sigmoidally and, therefore, follow mostly power laws. In this vein, thermodynamics derived from the statistical physics of open systems explains how evolution led to cognition and provides insight, for instance, into cognitive ease, biases, dissonance, development, plasticity, and subjectivity.

... Such complementarities are consistent with the concept of complementarity, introduced first by Niels Bohr to characterize quantum phenomena but soon generalized to other areas, including biology, psychology, and philosophy [13,82,116]. It has been suggested that Bohr may have first encountered the notion of complementarity in the psychological writings of William James [74], and without a doubt it is reemerging in contemporary cognitive science through recent developments in quantum cognition research [10,117,144]. ...

... It is always tempting to look for new physics to explain consciousness, especially when there is evidence that human cognition displays evidence of quantum peculiarities [69,70]. But that approach is usually a mistake. ...

Without proven causal power, consciousness cannot be integrated with physics except as an epiphenomenon, hence the term ‘hard problem’. Integrated Information Theory (IIT) side-steps the issue by stating that subjective experience must be identical to informational physical structures whose cause-and-effect power is greater than the sum of their parts. But the focus on spatially oriented structures rather than events in time introduces a deep conceptual flaw throughout its entire structure, including the measure of integrated information, known as Φ (phi). However, the problem can be corrected by incorporating the temporal feature of consciousness responsible for the hard problem, which can ultimately resolve it, namely, that experiencer and experienced are not separated in time but exist simultaneously. Simultaneous causation is not possible in physics, hence the hard problem, and yet it can be proven deductively that consciousness does have causal power because of this phenomenological simultaneity. Experiencing presence makes some facts logically possible that would otherwise be illogical. Bypassing the hard problem has caused much of the criticism that IIT has attracted, but by returning to its roots in complexity theory, it can repurpose its model to measure causal connections that are temporally rather than spatially related.

... Besides, as particles in quantum mechanics collapse to defined states upon observation, language users assign specific meanings to words or sentences based on context. Meanwhile, studies conducted within the realm of cognitive science (Bruza et al., 2015) have demonstrated that human cognition, encompassing sentiment analysis tasks in NLP, often exhibits quantum-like behaviors, which can be more precisely elucidated through quantum theory. ...

... They referred to the theory of quantum mechanics applied to optical flow as the initial stimulus in the driver decision process, to explain the motion-related perceptual phenomena that arise while vehicles approach the incident site in adjacent lanes. Bruza et al. utilized quantum cognition [76] in place of classical cognition models for analyzing the behavioral intentions of estimated traffic participants and their interactions [77]. ...

Quantum computing, a field utilizing the principles of quantum mechanics, promises great advancements across various industries. This survey paper is focused on the burgeoning intersection of quantum computing and intelligent transportation systems, exploring its potential to transform areas such as traffic optimization, logistics, routing, and autonomous vehicles. By examining current research efforts, challenges, and future directions, this survey aims to provide a comprehensive overview of how quantum computing could affect the future of transportation.

... In the same way, quantum-like logic has been used with success in the field of experimental psychology in a new approach named quantum cognition. Cognitive processes such as decision making, judgment, memory, reasoning, language, or perception, not adequately described by "classical" probabilities, are modelled with mathematical quantum-like tools, in particular probability amplitudes, thus better fitting experimental data [23,24]. In quantum cognition, combining abstract concepts is associated with quantum-like phenomena such as superposition and interference. ...

Benveniste’s experiments – known in the lay press as the “water memory” phenomenon – are generally considered to be a closed case. However, the amount of data generated by twenty years of well-conducted experiments prevents closing the file so simply. One issue, little highlighted so far, merits emphasis. Indeed, if Benveniste failed to persuade his peers of the value of his experiments, it was mainly because of a stumbling block, namely the difficulty of convincingly proving the causal relationship between the supposed cause (“informed water”) and the experimental outcomes in different biological models. To progress in the understanding of this phenomenon, we abandon the idea of any role of water in these experiments (“water memory” and its avatars). In other words, we assume that “control” and “test” conditions to be evaluated were physically identical and differed only by their respective designations. We show in this article how simple probabilistic considerations allow one to build a model that accounts for all aspects of Benveniste’s experiments. In this model based on probability amplitudes, constructive and destructive interference emerges – or not – according to the experimental context. The recording of a statistical regularity shapes the intertwined whole constituted by the probability amplitudes of the states of the experimenter’s cognitive structures and of the experimental system. This model provides an alternative explanation to Benveniste’s experiments where water plays no role and where the place of the experimenter is central.
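The core mechanism invoked here, probability amplitudes whose relative phase produces constructive or destructive interference, reduces to a one-line calculation. This sketch is a generic illustration of amplitude addition, not the article's model:

```python
import numpy as np

# Two equal-magnitude probability amplitudes for indistinguishable paths
# to the same outcome; the relative phase is set by the context.
a1 = 0.5
results = {}
for phase in (0.0, np.pi / 2, np.pi):
    a2 = 0.5 * np.exp(1j * phase)
    p_quantum = abs(a1 + a2) ** 2               # amplitudes add, then square
    p_classical = abs(a1) ** 2 + abs(a2) ** 2   # probabilities add directly
    results[phase] = (p_quantum, p_classical)

# phase 0: constructive (p = 1); pi/2: no interference (p = 0.5);
# pi: destructive (p = 0), while the classical sum stays at 0.5 throughout.
print(results)
```

The classical sum never changes with phase; only the amplitude-level description can produce the context-dependent appearance and disappearance of an effect.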

... In contrast, in classical probability theory, it is assumed that a system is in a defined state at any given moment [23]. ...

Inspired by the principles of quantum mechanics, we constructed a model of students’ misconceptions about heat and temperature, conceptualized as a quantum system represented by a density matrix. Within this framework, the presence or absence of misconceptions is delineated as pure states, while the probability of mixed states is also considered, providing valuable insights into students’ cognition based on the mental models they employ when holding misconceptions. Using the analysis model previously employed by Lei Bao and Edward Redish, we represented these results in a density matrix. In our research, we administered the Yeo and Zadnik Thermal Concept Evaluation to 282 students at a private university in Northeast Mexico. Our objective was to extract information from the analysis of multiple-choice questions designed to explore preconceptions, offering valuable educational insights beyond the typical Correct–Incorrect binary analysis of classical systems. Our findings reveal a probability of 0.72 for the appearance of misconceptions, 0.28 for their absence, and 0.43 for mixed states. While no significant disparities were observed based on gender or scholarship status, a notable difference was observed among programs (p < 0.05). These results are consistent with the previous literature, confirming a prevalence of misconceptions within the student population.
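The density-matrix construction this study builds on (Bao and Redish's model analysis) can be sketched with made-up answer counts; the real study used the Thermal Concept Evaluation with 282 students, so the numbers below are purely illustrative.

```python
import numpy as np

# Hypothetical counts: each row is one student's answers classified as
# [misconception, correct] across 5 questions (illustrative data only).
counts = np.array([[4, 1],
                   [1, 4],
                   [3, 2],
                   [5, 0]])

# Bao-Redish model analysis: per-student model state vector with components
# sqrt(n_k / N); the class density matrix is the average outer product.
states = np.sqrt(counts / counts.sum(axis=1, keepdims=True))
rho = np.mean([np.outer(u, u) for u in states], axis=0)

p_misconception = rho[0, 0]  # diagonal: probability of each mental model
p_correct = rho[1, 1]
coherence = rho[0, 1]        # off-diagonal: degree of mixed model use
print(np.round(rho, 3))
```

The diagonal entries play the role of the 0.72 / 0.28 probabilities reported above, while a large off-diagonal entry signals students who mix both models, the analogue of the reported mixed-state probability.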

... However, the geometry is projective in this case, sometimes even building on a complex manifold rather than on a Euclidean plane. In a feature review, Bruza et al. (2015) characterized such an effort as a new theoretical approach to a systematic understanding of human thinking and decision-making processes. Mathematically, they highlighted John von Neumann's projective geometric structure of vector spaces as a representation of probability rather than Andrey Kolmogorov's set theories. ...

Mathematical symbols, such as those embodying quantum concepts, are indispensable for conveying complex ideas and relationships in academic writing. However, some education researchers and students keep a distance from anything mathematical: algebraic equations, geometrical reasoning, or statistical symbols. How to lower the access threshold for this type of mathematical narrative and reveal the meanings of a range of quantum conceptions to modern educators thus becomes a real problem. Using the pendulum motion equation as a reference point, I argue in this article for the advantages of academic English or French writing genres that fuse a range of mathematical symbols of quantum concepts and conceptual change. Such writings help demonstrate how incorporating the idea of probability (a) refines the debate among conceptual, verbal, and mathematical academic writing; (b) allows new conceptions that draw on the insights from quantum cognition-supported theories; (c) helps explain students’ understanding of mathematical symbols; and (d) offers a new taxonomy for categorizing academic writings.

... Quantum theory has recently entered organization science as a new lens to explain human cognition and judgment (Bruza et al., 2015; Busemeyer & Bruza, 2012; Busemeyer et al., 2011) and to understand organizational phenomena such as paradox (Li, 2021), organizational change (Lord et al., 2015), and sustainability (Dyck & Greidanus, 2017). Quantum theory fundamentally altered the classical positivist premise that any characterization (i.e., empirical measurement) corresponds to the preexisting physical reality of Newtonian physics, which has served as an underpinning premise across the social sciences (Barad, 2007). ...

Plain English Summary
Rethinking Entrepreneurial Opportunity: A Quantum view! Discover how opportunities are both found and made. #QuantumOpportunity. Entrepreneurial opportunities have long been debated: are they discovered or created? Inspired by quantum mechanics, this article introduces a fresh perspective called the “Quantum view.” Instead of seeing opportunities as either discovered or created, the Quantum view suggests that they are both created and discovered. It urges us to rethink conventional wisdom. Instead of just “finding” or “making” opportunities, the Quantum view suggests that opportunities exist in potential states and are realized through entrepreneurial enactment. Entrepreneurs can gather information about opportunities, but they can never fully specify them because every time they interact with an opportunity, they change it. Thus, the Quantum view emphasizes that we can learn about, but never fully specify, opportunities. This helps researchers think differently about opportunities and avoid extreme positions such as that entrepreneurs create something from nothing or that some entrepreneurs can accurately predict the future.

... In the field of experimental psychology, quantum-like logic has been used with success in a new approach named quantum cognition. Cognitive processes such as decision making, judgment, memory, reasoning, language, or perception, not adequately described by "classical" probabilities, are modelled with mathematical quantum-like tools, thus better fitting experimental data [23,24]. In quantum cognition, combining abstract concepts is associated with quantum-like phenomena such as superposition and interference, as is the case in our modelling. ...

Benveniste’s experiments – known in the lay press as the “water memory” phenomenon – are generally considered to be a closed case. However, the amount of data generated by twenty years of well-conducted experiments prevents closing the file so simply. One issue, little highlighted so far, merits emphasis. Indeed, if Benveniste failed to persuade his peers of the value of his experiments, it was mainly because of a stumbling block, namely the difficulty of convincingly proving the causal relationship between the supposed cause (“informed water”) and the experimental outcomes in different biological models. To progress in the understanding of this phenomenon, we abandon the idea of any role of water in these experiments (“water memory” and its avatars). In other words, we assume that the control and test conditions evaluated were all physically identical; only their respective designations (labels) differentiated them. As a consequence, labels (“controls” vs. “tests”) and the corresponding states of the biological system (no change vs. change) are independent variables. We show in this article how simple considerations based on probability theory allow us to build a probability model in which the order of measurements matters. This model provides an alternative explanation of Benveniste’s experiments in which water plays no role and the place of the experimenter is central.
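A probability model in which measurement order matters has a standard quantum-cognition illustration: represent two yes/no questions as projectors onto different rays of a Hilbert space, so that answering them in different orders yields different probabilities. A minimal NumPy sketch (the state and angles are arbitrary choices for illustration, not parameters from the paper):

```python
import numpy as np

def ray_projector(theta):
    """Projector onto the ray at angle theta in a 2-D real Hilbert space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])          # initial cognitive state (illustrative)
PA = ray_projector(np.pi / 6)       # "yes" subspace of question A
PB = ray_projector(np.pi / 3)       # "yes" subspace of question B

p_ab = np.linalg.norm(PB @ PA @ psi) ** 2   # answer A first, then B
p_ba = np.linalg.norm(PA @ PB @ psi) ** 2   # answer B first, then A
print(p_ab, p_ba)
```

Because PA and PB do not commute, p_ab = 9/16 while p_ba = 3/16; commuting (classical) events would give identical values in both orders.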

... A recently developed theory of decision making, which formalizes the concept of uncertainty and other effects that are particularly manifest in cognitive processes, is quantum decision theory [Favre et al., 2016]. Quantum cognition refers to a probabilistic model of cognition that uses the quantum formalism in order to explain some seemingly distinct and puzzling cognitive phenomena through a common set of underlying principles from quantum theory [Bruza et al., 2015]. A dynamic model that applies quantum decision modeling to binary choices is presented in [Dal Forno et al., 2021]. ...

We consider a discrete-time model of a population of agents participating in a minority game, using a quantum-cognition approach with binary choices. As the agents make decisions based on both their present and past states, the model is inherently two-dimensional, but it can be reduced to a one-dimensional system governed by a bi-valued function. Through this reduction, we show how the complex bifurcation structure in the model’s 2D parameter space can be explained by a few codimension-2 bifurcation points of a type not yet reported in the literature. These points act as organizing centers for period-adding structures that partially overlap, leading to bistability.

... verbal theories in psychology, and better explain and predict many phenomena puzzling to classical models, leading to highly testable models (e.g. [38][39][40]). [33] (p. ...

The argument of this article is grounded in the irreducible interference of observational instruments in our interactions with nature in quantum physics and, thus, in the constitution of quantum phenomena versus classical physics, where this interference can, in principle, be disregarded. The irreducible character of this interference was seen by N. Bohr as the principal distinction between classical and quantum physics and grounded his interpretation of quantum phenomena and quantum theory. Bohr saw complementarity as a generalization of the classical ideal of causality, which defined classical physics and relativity. While intimated by Bohr, the relationships among observational technology, complementarity, causality and the arrow of events (a new concept that replaces the arrow of time commonly used in this context) were not addressed by him either. The article introduces another new concept, that of quantum causality, as a form of probabilistic causality. The argument of the article is based on a particular interpretation of quantum phenomena and quantum theory, defined by the concept of ‘reality without realism (RWR)’. This interpretation follows Bohr's interpretation but contains certain additional features, in particular the Dirac postulate. The article also considers quantum-like (Q-L) theories (based in the mathematics of QM) from the perspective it develops.
This article is part of the theme issue ‘Thermodynamics 2.0: Bridging the natural and social sciences (Part 2)’.

... The VCFST was developed as part of a research project aimed at exploring the role of Verbovisual factors in enhancing cognitive functions. Quantum cognition, a relatively new field of study, applies quantum principles to cognitive processes and decision-making [1]. By incorporating this framework into the assessment model, the VCFST expands our understanding of cognitive functioning beyond classical models. ...

The Verbovisual Cognitive Function Screening Tool (VCFST) is a cognitive screening tool that aims to evaluate cognitive abilities by drawing upon established theories such as dual-coding theory, common coding theory, propositional theory, and ideomotor theory. However, recent developments in quantum cognitive science offer promising applications to enhance the assessment model. This article explores the utilization of quantum cognition concepts within the VCFST and discusses its implications for cognitive assessment. Quantum cognitive science is an emerging interdisciplinary field that applies principles from quantum physics to the study of human cognition. It suggests that cognitive processes may not adhere strictly to classical logic and linear thinking but can exhibit characteristics of quantum phenomena such as superposition, entanglement, and contextuality. By incorporating quantum concepts into cognitive assessment, a more nuanced understanding of complex cognitive processes can be achieved. The VCFST, with its focus on verbovisual factors and comprehensive assessment of cognitive domains, can benefit from the integration of quantum cognitive science principles. Superposition, borrowed from quantum physics, implies that individuals can simultaneously perceive and attend to multiple stimuli. By considering the superposition of verbovisual stimuli within the VCFST, a more accurate representation of cognitive abilities can be obtained, capturing the simultaneous processing of different types of information. Entanglement, another key principle of quantum cognition, can be applied to evaluate the interconnectedness of cognitive domains within the VCFST. Analogous to entangled particles, changes in one cognitive domain may influence others. By assessing the impact of changes in one domain on performance in other domains, the VCFST can provide a holistic understanding of cognitive functioning.
Contextuality, a fundamental concept in quantum cognition, emphasizes the role of context in shaping cognitive processes. By incorporating contextuality into the VCFST, contextual cues can be introduced, and stimulus presentation can be varied to reflect real-world cognitive demands. This approach enables a more ecologically valid measure of cognitive abilities, considering the influence of context on cognitive processing. Embracing quantum cognitive science principles in the VCFST offers several advantages, including capturing the complexity of cognitive processes, accounting for non-linear dynamics, and exploring potential quantum-like effects within human cognition. Future research can further refine the VCFST by conducting quantitative studies to assess the efficacy of incorporating quantum concepts into cognitive assessment and developing computational models inspired by quantum cognitive science. The utilization of quantum cognitive science concepts within the VCFST shows promise for advancing cognitive assessment practices. By incorporating principles such as superposition, entanglement, and contextuality, the VCFST can provide a comprehensive and nuanced evaluation of cognitive abilities. Further research in quantum cognitive science and the refinement of assessment methodologies can contribute to the development of more advanced and accurate cognitive assessment tools, facilitating personalized diagnostic and therapeutic strategies for cognitive disorders and neurodegenerative conditions. The integration of quantum cognitive science into cognitive assessment represents an exciting frontier that can revolutionize our understanding of human cognition.

... Quantum theory is believed to be able to reveal human cognitive behavior (Busemeyer and Bruza 2012; Bruza, Wang, and Busemeyer 2015), and the mathematical principles of quantum mechanics (namely, quantum probability theory) have been extensively studied because of their superiority (Li et al. 2018; Liu, Hou, and Song 2021; Zhang et al. 2020, 2022b). ...

Data imbalance, also known as the long-tail distribution of data, is an important challenge for data-driven models. In the Word Sense Disambiguation (WSD) task, the long-tail phenomenon of word sense distribution is especially common, making it difficult to effectively represent and identify Long-Tail Senses (LTSs). Therefore, exploring representation methods that do not rely heavily on the training sample size is an important way to combat LTSs. Considering that many new states, namely superposition states, can be constructed from several known states in quantum mechanics, superposition states provide the possibility of obtaining more accurate representations from inferior representations learned from a small sample size. Inspired by quantum superposition states, a representation method in Hilbert space is proposed to reduce the dependence on large sample sizes and thus combat LTSs. We theoretically prove the correctness of the method, verify its effectiveness under the standard WSD evaluation framework, and obtain state-of-the-art performance. Furthermore, we also test on a constructed LTS dataset and the latest cross-lingual datasets, and achieve promising results.
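The core representational idea, building a new state as a normalized linear combination of known states, can be sketched in a few lines. This is a hypothetical toy with 3-D stand-ins for learned sense embeddings, and the weights are illustrative, not the paper's method:

```python
import numpy as np

def superpose(states, weights):
    """Normalized linear combination of known state vectors (a 'superposition')."""
    psi = sum(w * s for w, s in zip(weights, states))
    return psi / np.linalg.norm(psi)

# Two well-trained sense vectors (toy stand-ins for learned embeddings)
s1 = np.array([1.0, 0.0, 0.0])
s2 = np.array([0.0, 1.0, 0.0])

# A rare ("long-tail") sense represented as a superposition of the two
psi = superpose([s1, s2], [np.sqrt(0.7), np.sqrt(0.3)])

# Squared projections act as probabilities of recovering each base sense
print(abs(psi @ s1) ** 2, abs(psi @ s2) ** 2)
```

The appeal for long-tail senses is that psi is well-defined even when no reliable embedding can be learned directly for the rare sense: it inherits structure from the better-trained base states.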

... Also, similar to quantum computing, spinors may store information about events and produce a memory. This may be a type of quantum cognition within molecules [15]. ...

By injecting a string of spinors within a membrane, it becomes sensitive to external magnetic fields. Without external magnetic fields, half of the spinors in this string have spins opposite to those of the other half and become paired with them within membranes. However, an external magnetic field has a direct effect on this system because it can make all the spinors parallel. According to the exclusion principle, parallel spinors repel each other and move apart. Consequently, they force the molecular membrane to grow. By removing the external fields, the molecule or membrane returns to its initial size. An injected string of spinors could be designed so that the molecule or membrane is sensitive only to certain frequencies. In particular, membranes could be designed to respond to low frequencies, below 60 Hz; in some conditions, frequencies should be lower than 20 Hz. Higher frequencies may destroy the structure of membranes, although, by using more complicated mechanisms, some membranes could be designed to respond to higher frequencies. Thus, a type of intelligence could be induced into a molecule or membrane such that it becomes able to recognize and respond to particular wave frequencies. We tested the model on milk molecules such as fat, vesicles, and microbial ones under a 1000x microscope and observed that it works. Thus, this technique could be used to design intelligent drug molecules. This model may also explain some reported signatures of water memory through the physical properties of spinors.

... Many other models make too few mechanistic assumptions regarding noise to make any kind of formal predictions. For example, the Inductive Confirmation model (Tentori et al., 2013) does not describe how probability judgments that are not directly associated with an implicit or explicit context are produced, and the Quantum Cognition account (Bruza et al., 2015;Pothos & Busemeyer, 2022) has so far not supplied a definitive account of noise and stochasticity in human cognition, and therefore the model makes no predictions concerning the mean/variance relationship. However, the Bounded Log Odds model (BLO; Zhang et al., 2020) and other associated models working on the same principle (Khaw et al., 2021) constitute an interesting exception. ...

Human probability judgments are both variable and subject to systematic biases. Most probability judgment models treat variability and bias separately: a deterministic model explains the origin of bias, to which a noise process is added to generate variability. But these accounts do not explain the characteristic inverse U-shaped signature linking mean and variance in probability judgments. By contrast, models based on sampling generate the mean and variance of judgments in a unified way: the variability in the response is an inevitable consequence of basing probability judgments on a small sample of remembered or simulated instances of events. We consider two recent sampling models, in which biases are explained either by the sample accumulation being further corrupted by retrieval noise (the Probability Theory + Noise account) or as a Bayesian adjustment to the uncertainty implicit in small samples (the Bayesian sampler). While the mean predictions of these accounts closely mimic one another, they differ regarding the predicted relationship between mean and variance. We show that these models can be distinguished by a novel linear regression method that analyses this crucial mean–variance signature. First, the efficacy of the method is established using model recovery, demonstrating that it more accurately recovers parameters than complex approaches. Second, the method is applied to the mean and variance of both existing and new probability judgment data, confirming that judgments are based on a small number of samples that are adjusted by a prior, as predicted by the Bayesian sampler.
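The inverse-U mean-variance signature follows directly from small-sample judgment: with N samples the binomial variance p(1-p)/N is zero at the extremes and maximal at p = 0.5. Below is a minimal sketch of the closed-form moments, assuming the Bayesian Sampler's response rule response = (successes + β)/(N + 2β), a symmetric-Beta-prior adjustment; the values of N and β are arbitrary illustrative choices:

```python
import numpy as np

def bayesian_sampler_moments(p, N=10, beta=1.0):
    """Mean and variance of a Bayesian-sampler judgment of probability p:
    response = (successes + beta) / (N + 2*beta), successes ~ Binomial(N, p)."""
    mean = (N * p + beta) / (N + 2 * beta)
    var = N * p * (1 - p) / (N + 2 * beta) ** 2
    return mean, var

ps = np.linspace(0.0, 1.0, 11)
moments = [bayesian_sampler_moments(p) for p in ps]
# Variance traces an inverse-U in the mean: zero at p = 0 and 1, maximal at p = 0.5
for m, v in moments:
    print(f"mean={m:.3f}  var={v:.4f}")
```

The prior adjustment pulls the mean toward 0.5 (a true probability of 0 is judged as β/(N + 2β) on average) while shrinking the variance by the factor (N/(N + 2β))² relative to the raw sample proportion; disentangling such alternatives is exactly what the mean-variance regression method above targets.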

... Perhaps more interestingly, this quality of being affected by observation is not exclusively a property of quantum systems: it is recognized and important in several fields outside physics such as human cognition, psychology and sociology, where the act of gathering information about a system clearly influences said system. In the case of human cognition, the case has actually been made for the use of a quantum-like formalism [2,3,4,5] in order to account for the effect of ordering in repeated questions. ...

We propose a novel methodological framework based on the emerging field of quantum cognition and illustrate its application to a central problem in management and strategy research: causal ambiguity. The current literature often assumes that causal ambiguity —the difficulty managers face in understanding the causal link between resources and outcomes at the basis of a firm's performance—can be reduced through learning. This literature overlooks the fact that causal ambiguity reduction is impossible when causal systems exhibit so‐called complementary properties. Building upon quantum cognition —specifically the idea of complementarity as an alternative to causality—we illustrate that causal ambiguity is only a special case of ambiguity and we offer a novel methodological framework to model what we label as ‘acausal ambiguity’, which refers to the insurmountable limit managers face in achieving causal ambiguity reduction. Managers can, of course, cognize some causal links. However, this comes at the price of being agnostic about complementary ones. The implications of this novel methodological framework applied to causal ambiguity are twofold: while complementarity opacifies the attentional faculties of managers, it also accounts for the cognitive origins of novelty.

Abundant experimental evidence illustrates violations of Bayesian models across various cognitive processes. Quantum cognition capitalizes on the limitations of Bayesian models, providing a compelling alternative. We suggest that a generalized quantum approach in meta-learning is simultaneously more robust and flexible, as it retains all the advantages of the Bayesian framework while avoiding its limitations.

One of the most important challenges in decision theory has been how to reconcile the normative expectations from Bayesian theory with the apparent fallacies that are common in probabilistic reasoning. Recently, Bayesian models have been driven by the insight that apparent fallacies are due to sampling errors or biases in estimating (Bayesian) probabilities. An alternative way to explain apparent fallacies is by invoking different probability rules, specifically the probability rules from quantum theory. Arguably, quantum cognitive models offer a more unified explanation for a large body of findings that are problematic from a baseline classical perspective. This work addresses two major corresponding theoretical challenges: first, a framework is needed which incorporates both Bayesian and quantum influences, recognizing the fact that there is evidence for both in human behavior. Second, there is empirical evidence which goes beyond any current Bayesian and quantum model. We develop a model for probabilistic reasoning, seamlessly integrating both Bayesian and quantum models of reasoning and augmented by a sequential sampling process, which maps subjective probabilistic estimates to observable responses. Our model, called the Quantum Sequential Sampler, is compared to the currently leading Bayesian model, the Bayesian Sampler (J. Zhu et al., 2020), using a new experiment, producing one of the largest data sets in probabilistic reasoning to date. The Quantum Sequential Sampler embodies several new components, which we argue offer a more theoretically accurate approach to probabilistic reasoning. Moreover, our empirical tests revealed a new, surprising systematic overestimation of probabilities.

According to the principles of quantum mechanics, individuals are unable to accurately predict the precise outcome of a measurement or observation. Despite the significant impact of quantum thinking on science, there is a lack of understanding regarding the psychological consequences associated with adopting such a mindset. This research investigates how engaging in quantum thinking, which accepts the universe’s inherent complexities and uncertainties, influences one’s tolerance for ambiguity. To test our hypothesis, we conducted three complementary studies involving diverse populations (students and community adults), multiple measures of tolerance of ambiguity (self-report data and behavioral indicators), and different priming procedures (text reading and sentence scrambling tasks). Study 1 demonstrated that university students exposed to quantum thinking principles exhibited greater tolerance for ambiguity within an English as a Foreign Language (EFL) setting. Moving beyond the educational setting, Study 2 corroborated these observations by evaluating an individual’s ease with uncertainty and unpredictability across different everyday scenarios. Addressing potential self-report biases, Study 3 incorporated a behavioral measure to objectively validate the observed effect. Together, these findings suggest that the thinking mindset prevalent in physics significantly impacts individuals’ cognitive flexibility and behavior, highlighting the broad relevance of quantum thinking beyond its scientific origins.

Background
Historically, although researchers in the science of complex systems proposed the idea of the edge of chaos and/or self-organized criticality as the essential feature of complex organization, they were not able to generalize this concept. Complex organization is regarded as sitting at the edge of chaos, between the order phase and the chaos phase, and as a very rare case. Additionally, in cellular automata, the critical property is class IV, which is also rarely found. Therefore, the role of natural selection can be overestimated. More recently, developments in cognitive and brain science have led to the free energy principle based on Bayesian inference, while quantum cognition has been established to explain various cognitive phenomena. Since Bayesian inference results in the perspective of a steady state, it can be described in Boolean logic. Considering that quantum logic consists of multiple Boolean logics in terms of lattice theory, the perspective of the free energy principle is the perspective of order, and the perspective of quantum logic might be the perspective of multiple worlds, which is strongly relevant to the edge of chaos.
Problem
The next question arises whether the perspective derived from quantum logic can be generalized for the complex behavior consisting of both order and chaos, instead of the edge of chaos or self-organized criticality, to reveal the property of critical behavior such as a power-law distribution.
Solution
In this study, we define quantum logic automata, which entail quantum logic (orthomodular lattice) in terms of lattice theory and have the features of a dynamical system. Because quantum logic automata are applied to a binary sequence, one can estimate the behavior of those automata with respect to patterns and a time series. Here, we show that most of a group of quantum logic automata display class IV-like behavior, in which oscillatory traveling waves collide with each other, leading to complex behavior; moreover, a time series of binary sequences displays 1 / f noise. Therefore, one can see that quantum logic automata generalize and expand the idea of the edge of chaos.

We aim to describe the history of Quantum Brain Dynamics (QBD), the hypothesis of memory and consciousness in the brain, and discuss recent perspectives in this article. First, we introduce several features of memory in the brain that distinguish it from computer memory. Next, we introduce the holographic brain theory proposed by Pribram, which describes the mechanism of memory. Furthermore, we show the quantum field theoretical approach to memory proposed by Umezawa et al. We consider concrete degrees of freedom in QBD, namely water electric dipole fields and photon fields, and also discuss the feasibility of spontaneous breakdown of the rotational symmetry of electric dipoles, where dipoles are aligned in the same direction, in order to describe memory. Next, we introduce the dissipative quantum model, regarding the brain as an open system, and attempt the integration of QBD and holographic theory. Moreover, we consider consciousness emerging as a Bose-Einstein condensate and estimate its critical temperature. Furthermore, we respond to criticisms concerning decoherence phenomena. Finally, we refer to perspectives on QBD toward solving hard problems in conventional neuroscience.

Media is obtaining a leading role in contemporary strategic communication, a development especially visible in the context of social-political and geopolitical conflicts. This study introduces the interdisciplinary theory of the Social Laser and applies it to the analysis of social mobilization around the current militarized situation in Eastern Europe. We quantitatively analyze the content of three Swedish newspapers and qualitatively analyze the related Facebook discussions. As demonstrated, social lasing has been employed as an instrument in information warfare. An “information tsunami” was generated by the Swedish national mass media at the outbreak of military events in March-May 2022, shaping social dynamics on the related social media platform. The intensity of dissident actions increased when the information pressure subsided.

Quantum-like modeling is a new but well received paradigm in social science that draws from various mathematical tools used in quantum science, such as information theory. However, we argue that there are deeper meta-principles, such as Contextuality-complementarity, Uncertainty, and Non-locality, that give meaning to these models. These meta-principles are equally applicable in both the physical and cognitive domains, but with different specific measures. It is important to exercise caution and recognize that economic theory should not be equated with quantum theory per se. In this paper, we demonstrate a simple application of quantum-like modelling in financial economics, specifically in the much-debated portfolio diversification theory given radical uncertainty. Our findings suggest that quantum-like modelling can provide a useful framework for understanding complex financial systems and making informed investment decisions.

Since the late 90s a paradigm shift began in decision research that has implications for leadership research. Due to the limitations of standard decision theory (based on Kolmogorovian/Bayesian decision theory), scholars began to build a new theory based on the ontological and epistemological foundations of quantum mechanics. The last decade has witnessed a surge in quantum-like modeling in the social sciences beyond decision-making, with notable success. Many anomalies in human behavior, viz., order effects, failure of the sure-thing principle, and conjunction and disjunction effects, are now more thoroughly explained through quantum modeling. The focus of this paper is, therefore, to link leadership with quantum modeling and theory. We believe a new paradigm can emerge through this wedding of ideas which would facilitate better understandings of leadership. This article introduces readers to the mathematical analytical processes that quantum research has developed that can create new insights in the social scientific study of leadership.

Computational models and metrics, including measures like surprisal and semantic relevance, have been developed to accurately predict and explain language comprehension and processing. However, their efficacy is hindered by their inadequate integration of contextual information. Drawing inspiration from the attention mechanism in transformers and the human forgetting mechanism, this study introduces an attention-aware method that thoroughly incorporates contextual information, updating surprisal and semantic relevance into attention-aware metrics respectively. Furthermore, by employing the quantum superposition principle, the study proposes an enhanced approach for integrating and encoding diverse information sources based on the two attention-aware metrics. Both attention-aware and enhanced metrics demonstrate superior effectiveness in comparison to the currently available metrics, leading to improved predictions of eye-movements during naturalistic discourse reading across 13 languages. The proposed approaches are fairly capable of facilitating simulation and evaluation of existing reading models and language processing theories. The metrics computed by the proposed approaches are highly interpretable and exhibit cross-language generalizations in predicting language comprehension. The innovative computational methods proposed in this study hold great potential to enhance our understanding of human working memory mechanisms, human reading behavior and cognitive modeling in language processing. Moreover, they have the capacity to revolutionize ongoing research in computational cognition for language processing, offering valuable insights for computational neuroscience, quantum cognition and optimizing the design of AI systems.

Quantum Natural Language Processing (QNLP) encodes text in a semantic space using a combined semantic distribution classification model with tensor operations. Its theoretical results on quantum circuit mapping and quantum semantic coding of text have been tested in practice with the recent development of quantum back-end equipment. Given the small scale of current quantum natural language processing tasks, the single-sentence structure of quantum semantic coding, low text coverage, and the lack of applications, this paper proposes combining syntactic structure to extract text and extend the sentence components of quantum semantic coding, which improves the utilization of quantum computing resources in text processing tasks. Since quantum natural language processing has few concrete applications, this paper also studies the possibility of extending it to quantum text matching and question answering applications. The development path of classical natural language processing is used as a reference to enhance the usefulness of QNLP and explore its practical capabilities under current resource-constrained conditions.

Traditional participatory approaches to public policy design and analysis use the policy cycle, which includes the following steps: (a) problem identification, (b) agenda setting, (c) consideration of policy options, (d) decision making, (e) implementation, and (f) evaluation. This essentially linear approach, even when used in a cyclical mode, is only useful for deterministic problems where the context is relatively fixed, the outcome can be defined with high certainty, and evidence-based approaches might be feasible. In the case of most wicked policy problems, evidence-based approaches cannot work for several reasons, including the factors summarized by Brian Head (2022). Wicked problems are characterized by complex interactions, gaps in reliable knowledge, and enduring differences in values, interests and perspectives. Unfortunately, 'more science' cannot resolve these conflicting views, and therefore more data cannot directly help to de-politicize the partisan divide. In this paper we use Ken Wilber's Integral Meta Theory (shown in Figure 1 on page 12) to propose a new participatory approach to dealing with wicked policy problems as defined by Rittel and Webber (1973) and Brian Head (2022). The most common advances which seek to address wicked policy problems are based on systems approaches (bottom right quadrant, BR), and especially complex adaptive systems approaches. Using participatory approaches within this box can be helpful, but we believe it will be quite limited in scope. While we agree that such systems approaches are a significant advance over the policy cycle approach, or even integrated policy approaches which put people, evidence and outcomes at the centre of the cycle, we posit that we need to go much further. We propose consciousness studies, quantum-like frameworks and applied spirituality as frameworks that have immense potential to contribute to inclusive, holistic approaches to policy analysis and design.
We argue that this underlying transdisciplinary framework can provide the basis for the range of necessary disciplines and their practitioners to work together to address wicked policy problems. Most importantly, the underlying meta-theory provides the basis to go beyond multidisciplinary approaches to a true transdisciplinary participatory approach. In the paper we elaborate on tools relevant to each quadrant that can be used in an integrative and self-consistent manner.

Over the last 10 years there has been growing interest in the application of quantum probability rules in the modelling of many aspects of human behaviour, including decision making, similarity, and categorization. We review the quantum tools behavioural scientists are most likely to employ and consider the special challenges in corresponding modelling work.
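As a taste of the modelling work such reviews discuss, the well-known quantum account of conjunction-style errors can be sketched with sequential projection: judging "A and B" as "A then B" can yield a higher probability than judging B alone when the two questions are incompatible. A minimal sketch; the state and subspace angles are invented for illustration, loosely echoing the Linda problem:

```python
import numpy as np

def proj(theta):
    """Projector onto the ray at angle theta in a 2-D real Hilbert space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])   # state after the story (illustrative)
PA = proj(np.pi / 4)         # "feminist" subspace, close to the state
PB = proj(np.pi / 2)         # "bank teller" subspace, nearly orthogonal to it

p_b = np.linalg.norm(PB @ psi) ** 2               # direct judgment of B alone
p_a_then_b = np.linalg.norm(PB @ PA @ psi) ** 2   # conjunction as a sequence

print(p_b, p_a_then_b)
```

Here PB is orthogonal to the initial state, so the direct judgment of B is 0, yet passing through the plausible-feeling question A first gives 0.25; with commuting (classically compatible) projectors the sequential probability could never exceed the single-event one.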

Quantum theory, originally proposed as a physical theory to describe the motions of microscopic particles, has been applied to various non-physics domains involving human cognition and decision-making that are inherently uncertain and exhibit certain non-classical, quantum-like characteristics. Sentiment analysis is a typical example of such domains. In the last few years, by leveraging the modeling power of quantum probability (a non-classical probability stemming from quantum mechanics methodology) and deep neural networks, a range of novel quantum-cognitively inspired models for sentiment analysis have emerged and performed well. This survey presents a timely overview of the latest developments in this fascinating cross-disciplinary area. We first provide a background of quantum probability and quantum cognition at a theoretical level, analyzing their advantages over classical theories in modeling the cognitive aspects of sentiment analysis. Then, recent quantum-cognitively inspired models are introduced and discussed in detail, focusing on how they approach the key challenges of the sentiment analysis task. Finally, we discuss the limitations of the current research and highlight future research directions.

Sarcasm, sentiment, and emotion are three typical kinds of spontaneous affective responses of humans to external events and they are tightly intertwined with each other. Such events may be expressed in multiple modalities (e.g., linguistic, visual and acoustic), e.g., multi-modal conversations. Joint analysis of humans' multi-modal sarcasm, sentiment, and emotion is an important yet challenging topic, as it is a complex cognitive process involving both cross-modality interaction and cross-affection correlation. From the probability theory perspective, cross-affection correlation also means that the judgments on sarcasm, sentiment, and emotion are incompatible. However, this exposed phenomenon cannot be sufficiently modelled by classical probability theory due to its assumption of compatibility. Neither do the existing approaches take it into consideration. In view of the recent success of quantum probability (QP) in modeling human cognition, particularly contextual incompatible decision making, we take the first step towards introducing QP into joint multi-modal sarcasm, sentiment, and emotion analysis. Specifically, we propose a QUantum probabIlity driven multi-modal sarcasm, sEntiment and emoTion analysis framework, termed QUIET. Extensive experiments on two datasets show the effectiveness and advantages of QUIET in comparison with a wide range of state-of-the-art baselines. We also show the great potential of QP in multi-affect analysis.

This paper argues that noninteger dimensionality creates a hitherto unexplored source of nonlocal noise, called ND noise or NDN, that is likely to play a role in measurements at very small distances. New analysis on scale invariant probability distributions arising from noninteger dimensionality is presented. The significance of this when considering performance and noise-correction coding in models of quantum computing is indicated. It is argued that NDN noise will lead to decoherence even without interaction with the environment and it is likely to be relevant also in models of cognition.

Many paradoxical findings in decision-making that have resisted explanations by standard decision theories have accumulated over the past 50 years. Recent advances based on quantum probability theory have successfully accounted for many of these puzzling findings. Critics, however, claim that quantum probability theory is less constrained than standard probability theory, and hence quantum models only fit better because they are more complex than standard decision models. In this article, for the first time, a Bayesian method was used to quantitatively compare the 2 types of decision models, which is a method that evaluates models with respect to accuracy, parsimony, and robustness. A large experiment was used to compare the best-known models of each type, matching in their numbers of parameters, but possibly differing in the complexity of their functional forms. Surprisingly, the Bayesian model comparison overwhelmingly favored the quantum model, indicating that its success is due to its robust ability to make accurate predictions rather than accidental fits afforded by increased complexity. (PsycINFO Database Record (c) 2015 APA, all rights reserved)

In this paper, we test the type indeterminacy hypothesis by analyzing an experiment that examines the stability of preferences in a Prisoner's Dilemma with respect to decisions made in a context that is both payoff-wise and informationally unrelated to that Prisoner's Dilemma. More precisely, we carried out an experiment in which participants were permitted to make promises to cooperate to agents they saw, followed by playing a Prisoner's Dilemma game with another, independent agent. It was found that, after making a promise to the first agent, participants exhibited higher rates of cooperation with other agents. We show that a classical model does not account for this effect, while a type indeterminacy model which uses elements of the formalism of quantum mechanics is able to capture the observed effects reasonably well.

The conjunction fallacy refers to situations when a person judges a conjunction to be more likely than one of the individual conjuncts, which is a violation of a key property of classical probability theory. Recently, quantum probability (QP) theory has been proposed as a coherent account of these and many other findings on probability judgment "errors" that violate classical probability rules, including the conjunction fallacy. Tentori, Crupi, and Russo (2013) presented an alternative account of the conjunction fallacy based on the concept of inductive confirmation. They presented new empirical findings consistent with their account, and they also claimed that these results were inconsistent with the QP theory account. This comment proves that our QP model for the conjunction fallacy is completely consistent with the main empirical results from Tentori et al. (2013). Furthermore, we discuss experimental tests that can distinguish the 2 alternative accounts. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
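The order dependence at the heart of the QP account of the conjunction fallacy can be illustrated with a minimal two-dimensional sketch. The angles below are hypothetical, chosen only for illustration: when "feminist" and "bank teller" correspond to non-commuting projectors, evaluating "feminist" first can raise the probability subsequently assigned to "bank teller" above its direct value.

```python
import math

def ray(theta):
    # Unit vector (a one-dimensional subspace, or "ray") at angle theta.
    return (math.cos(theta), math.sin(theta))

def project(state, axis):
    # Orthogonal projection of `state` onto the ray `axis`.
    c = state[0] * axis[0] + state[1] * axis[1]
    return (c * axis[0], c * axis[1])

def prob(state):
    # Squared length of the projected state = probability of that outcome.
    return state[0] ** 2 + state[1] ** 2

# Hypothetical geometry: the initial impression of Linda is close to
# "feminist" and nearly orthogonal to "bank teller".
psi = (1.0, 0.0)
feminist = ray(math.radians(20))
bank_teller = ray(math.radians(85))

p_bank = prob(project(psi, bank_teller))  # direct judgment of "bank teller"
p_fem_then_bank = prob(project(project(psi, feminist), bank_teller))

print(round(p_bank, 3))           # ≈ 0.008
print(round(p_fem_then_bank, 3))  # ≈ 0.158, exceeding the single conjunct
```

Because the two projectors do not commute, the sequential evaluation "feminist, then bank teller" is not bound by the classical conjunction rule, which is the qualitative signature the QP account exploits.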

There are formal arguments to support the notion that CP theory provides a correct association of probabilities to uncertain events. The Dutch Book Theorem (DBT; e.g., Howson and Urbach, 1993) shows that if one assigns probabilities to events in a way inconsistent with the axioms of CP theory, then it is possible to identify a combination of stakes (money to be won or lost, depending on whether the events occur or not), which guarantees a loss (or gain, depending on the sign of the stakes). That is, according to the DBT, when failing to follow the rules of CP theory, you may be vulnerable to a sure loss (extensions to the DBT, such as the Converse DBT, have been presented too; Vineberg, 2011). Note that the DBT is based on value maximization, but it is well established that reasoners are, for example, typically risk averse (Kahneman and Tversky, 1979). Wakker (2010) showed that risk averse decision makers are subject to a Dutch Book, which provides an interesting conundrum, since expected utility theory, which allows for a risk averse utility function, is considered the rational theory of risky decision making. Nevertheless, the utility of the DBT, in relation to a theory of rationality based on CP theory, is that it provides a formal justification for why CP theory provides the normative prescription for decision making. In other words, currently, if one is interested in whether a probabilistic decision is correct or not, then one needs to explore its consistency with the prescription from CP theory.
This discussion leads to our second point: exactly what is the evidence that probabilistic inference on the basis of CP theory is as accurate as possible? An a priori argument is the DBT. The consistency in probabilistic inference, which is demonstrated with the DBT, perhaps implies accuracy as well (i.e., do CP theory probabilities match empirical data?). Is it possible to prove a version of the DBT for QP theory as well? Superficially, this may appear not to be the case. First, the axioms of CP theory (on the basis of which the DBT is proved) are very different from those of QP theory. Second, it can be verified (e.g., Gilio and Over, 2012) that a classical decision maker committing the conjunction fallacy in Tversky and Kahneman's (1983) experiment is subject to a Dutch Book; that is, it is possible to specify a combination of stakes for the various hypotheses (Linda is a bank teller; Linda is a feminist; Linda is a bank teller and a feminist) that leads to a sure loss (or gain). However, it is possible to express the requirements for the DBT in terms of the fundamental principles of QP theory. Moreover, it is certainly true that if the questions about Linda are compatible (i.e., if we assume all events can be placed within the same sample space), then a Dutch Book is possible. But if they are incompatible, this is no longer necessarily the case, because the probabilities involved are based on different conditions (orders of evaluation). With work in progress, we are formalizing the relevant intuitions, but the idea is that accepting one incompatible outcome for Linda (e.g., that she is feminist) creates a separate sample space for another (e.g., that she is a bank teller).

We analyze different aspects of our quantum modeling approach of human concepts and, more specifically, focus on the quantum effects of contextuality, interference, entanglement, and emergence, illustrating how each of them makes its appearance in specific situations of the dynamics of human concepts and their combinations. We point out the relation of our approach, which is based on an ontology of a concept as an entity in a state changing under influence of a context, with the main traditional concept theories, that is, prototype theory, exemplar theory, and theory theory. We ponder the question of why quantum theory performs so well in its modeling of human concepts, and we shed light on this question by analyzing the role of complex amplitudes, showing how they make it possible to describe interference in the statistics of measurement outcomes, while in the traditional theories statistics of outcomes originates in classical probability weights, without the possibility of interference. The relevance of complex numbers, the appearance of entanglement, and the role of Fock space in explaining contextual emergence, all as unique features of the quantum modeling, are explicitly revealed in this article by analyzing human concepts and their dynamics.

Quantum cognition research applies abstract, mathematical principles of quantum theory to inquiries in cognitive science. It differs fundamentally from alternative speculations about quantum brain processes. This topic presents new developments within this research program. In the introduction to this topic, we try to answer three questions: Why apply quantum concepts to human cognition? How is quantum cognitive modeling different from traditional cognitive modeling? What cognitive processes have been modeled using a quantum account? In addition, a brief introduction to quantum probability theory and a concrete example is provided to illustrate how a quantum cognitive model can be developed to explain paradoxical empirical findings in psychological literature.

The Type Indeterminacy model is a theoretical framework that uses some elements of quantum formalism to model the constructive preference perspective suggested by Kahneman and Tversky. In a dynamic decision context, type indeterminacy induces a game with multiple selves associated with a state transition process. We define a Markov perfect equilibrium among the selves with individual identity (preferences) as the state variable. The approach allows us to characterize generic personality types and to derive some comparative statics results.

Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).

Classical (Bayesian) probability (CP) theory has led to an influential research tradition for modeling cognitive processes. Cognitive scientists have been trained to work with CP principles for so long that it is hard even to imagine alternative ways to formalize probabilities. However, in physics, quantum probability (QP) theory has been the dominant probabilistic approach for nearly 100 years. Could QP theory provide us with any advantages in cognitive modeling as well? Note first that both CP and QP theory share the fundamental assumption that it is possible to model cognition on the basis of formal, probabilistic principles. But why consider a QP approach? The answers are that (1) there are many well-established empirical findings (e.g., from the influential Tversky, Kahneman research tradition) that are hard to reconcile with CP principles; and (2) these same findings have natural and straightforward explanations with quantum principles. In QP theory, probabilistic assessment is often strongly context- and order-dependent, individual states can be superposition states (that are impossible to associate with specific values), and composite systems can be entangled (they cannot be decomposed into their subsystems). All these characteristics appear perplexing from a classical perspective. However, our thesis is that they provide a more accurate and powerful account of certain cognitive processes. We first introduce QP theory and illustrate its application with psychological examples. We then review empirical findings that motivate the use of quantum theory in cognitive theory, but also discuss ways in which QP and CP theories converge. Finally, we consider the implications of a QP theory approach to cognition for human rationality.

From behavioral sciences to biology to quantum mechanics, one encounters situations where (i) a system outputs several random variables in response to several inputs, (ii) for each of these responses only some of the inputs may "directly" influence them, but (iii) other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible) values of inputs.

Free-association norms indicate that words are organized into semantic/associative neighborhoods within a larger network of words and links that bind the net together. We present evidence indicating that memory for a recent word event can depend on implicitly and simultaneously activating related words in its neighborhood. Processing a word during encoding primes its network representation as a function of the density of the links in its neighborhood. Such priming increases recall and recognition and can have long-lasting effects when the word is processed in working memory. Evidence for this phenomenon is reviewed in extralist-cuing, primed free-association, intralist-cuing, and single-item recognition tasks. The findings also show that when a related word is presented in order to cue the recall of a studied word, the cue activates the target in an array of related words that distract and reduce the probability of the target's selection. The activation of the semantic network produces priming benefits during encoding, and search costs during retrieval. In extralist cuing, recall is a negative function of cue-to-distractor strength, and a positive function of neighborhood density, cue-to-target strength, and target-to-cue strength. We show how these four measures derived from the network can be combined and used to predict memory performance. These measures play different roles in different tasks, indicating that the contribution of the semantic network varies with the context provided by the task. Finally, we evaluate spreading-activation and quantum-like entanglement explanations for the priming effects produced by neighborhood density.

Perhaps the simplest and the most basic qualitative law of probability is the conjunction rule: the probability of a conjunction, P(A&B), cannot exceed the probabilities of its constituents, P(A) and P(B), because the extension (or the possibility set) of the conjunction is included in the extension of its constituents. Judgments under uncertainty, however, are often mediated by intuitive heuristics that are not bound by the conjunction rule. A conjunction can be more representative than one of its constituents, and instances of a specific category can be easier to imagine or to retrieve than instances of a more inclusive category. The representativeness and availability heuristics therefore can make a conjunction appear more probable than one of its constituents. This phenomenon is demonstrated in a variety of contexts, including estimation of word frequency, personality judgment, medical prognosis, decision under risk, suspicion of criminal acts, and political forecasting. Systematic violations of the conjunction rule are observed in judgments of lay people and of experts in both between- and within-Ss comparisons. Alternative interpretations of the conjunction fallacy are discussed, and attempts to combat it are explored. (48 ref) (PsycINFO Database Record (c) 2012 APA, all rights reserved)

Presents a new theory of subjective probability according to which different descriptions of the same event can give rise to different judgments. The experimental evidence confirms the major predictions of the theory. First, judged probability increases by unpacking the focal hypothesis and decreases by unpacking the alternative hypothesis. Second, judged probabilities are complementary in the binary case and subadditive in the general case, contrary to both classical and revisionist models of belief. Third, subadditivity is more pronounced for probability judgments than for frequency judgments and is enhanced by compatible evidence. The theory provides a unified treatment of a wide range of empirical findings. It is extended to ordinal judgments and to the assessment of upper and lower probabilities. (PsycINFO Database Record (c) 2012 APA, all rights reserved)

Five alternative information processing models that relate memory for evidence to judgments based on the evidence are identified in the current social cognition literature: independent processing, availability, biased retrieval, biased encoding, and incongruity-biased encoding. A distinction between 2 types of judgment tasks, memory-based vs online, is introduced and is related to the 5 process models. In 3 experiments, using memory-based tasks where the availability model described Ss' thinking, direct correlations between memory and judgment measures were obtained. In a 4th experiment, using online tasks where any of the remaining 4 process models may apply, prediction of the memory–judgment relationship was equivocal but usually followed the independence model prediction of zero correlation. It is concluded that memory and judgment will be directly related when the judgment was based directly on the retrieval of evidence information in memory-based judgment tasks. (61 ref) (PsycINFO Database Record (c) 2012 APA, all rights reserved)

This article introduces a "pseudo classical" notion of modelling non-separability. This form of non-separability can be viewed as lying between separability and quantum-like non-separability. Non-separability is formalized in terms of the non-factorizability of the underlying joint probability distribution. A decision criterion for determining the non-factorizability of the joint distribution is related to determining the rank of a matrix, as well as another approach based on the chi-square goodness-of-fit test. This pseudo-classical notion of non-separability is discussed in terms of quantum games and concept combinations in human cognition.
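The rank-based decision criterion mentioned above can be sketched for the 2x2 case: a joint distribution factorizes into its marginals exactly when its matrix of probabilities has rank 1, i.e., zero determinant. The two example joints below are hypothetical:

```python
def is_separable(p, tol=1e-12):
    # p is a 2x2 matrix of joint probabilities p(x, y).
    # Rank 1 (factorizability as p(x) * p(y)) <=> vanishing determinant.
    det = p[0][0] * p[1][1] - p[0][1] * p[1][0]
    return abs(det) < tol

separable = [[0.24, 0.36], [0.16, 0.24]]   # product of marginals (0.6, 0.4) x (0.4, 0.6)
correlated = [[0.5, 0.0], [0.0, 0.5]]      # perfectly correlated outcomes

print(is_separable(separable))   # True
print(is_separable(correlated))  # False
```

For larger outcome spaces the same idea generalizes: the joint-probability matrix factorizes into marginals exactly when all of its 2x2 minors vanish, which is the rank-of-a-matrix test the abstract refers to.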

In this article, Savage’s theory of decision-making under uncertainty is extended from a classical environment into a non-classical one. The Boolean lattice of events is replaced by an arbitrary ortho-complemented poset. We formulate the corresponding axioms and provide representation theorems for qualitative measures and expected utility. Then, we discuss the issue of belief updating and investigate a transition probability model. An application to a simple game context is proposed.

The concept of complementarity, originally defined for non-commuting observables of quantum systems with states of non-vanishing dispersion, is extended to classical dynamical systems with a partitioned phase space. Interpreting partitions in terms of ensembles of epistemic states (symbols) with corresponding classical observables, it is shown that such observables are complementary to each other with respect to particular partitions unless those partitions are generating. This explains why symbolic descriptions based on an ad hoc partition of an underlying phase space description should generally be expected to be incompatible. Related approaches with different background and different objectives are discussed.

People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

Quantum entanglement relies on the fact that pure quantum states are dispersive and often inseparable. Since pure classical states are dispersion-free, they are always separable and cannot be entangled. However, entanglement is possible for epistemic, dispersive classical states. We show how such epistemic entanglement arises for epistemic states of classical dynamical systems based on phase space partitions that are not generating. We compute epistemically entangled states for two coupled harmonic oscillators.

The broader scope of our investigations is the search for the way in which concepts and their combinations carry and influence meaning and what this implies for human thought. More specifically, we examine the use of the mathematical formalism of quantum mechanics as a modeling instrument and propose a general mathematical modeling scheme for the combinations of concepts. We point out that quantum mechanical principles, such as superposition and interference, are at the origin of specific effects in cognition related to concept combinations, such as the guppy effect and the overextension and underextension of membership weights of items. We work out a concrete quantum mechanical model for a large set of experimental data of membership weights with overextension and underextension of items with respect to the conjunction and disjunction of pairs of concepts, and show that no classical model is possible for these data. We put forward an explanation by linking the presence of quantum aspects that model concept combinations to the basic process of concept formation. We investigate the implications of our quantum modeling scheme for the structure of human thought, and show the presence of a two-layer structure consisting of a classical logical layer and a quantum conceptual layer. We consider connections between our findings and phenomena such as the disjunction effect and the conjunction fallacy in decision theory, violations of the sure thing principle, and the Allais and Ellsberg paradoxes in economics.

A quantum dynamic model of decision-making is presented, and it is compared with a previously established Markov model. Both the quantum and the Markov models are formulated as random walk decision processes, but the probabilistic principles differ between the two approaches. Quantum dynamics describe the evolution of complex valued probability amplitudes over time, whereas Markov models describe the evolution of real valued probabilities over time. Quantum dynamics generate interference effects, which are not possible with Markov models. An interference effect occurs when the probability of the union of two possible paths is smaller than each individual path alone. The choice probabilities and distribution of choice response time for the quantum model are derived, and the predictions are contrasted with the Markov model.
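The interference effect described above can be illustrated in a few lines. The amplitudes and phases below are hypothetical: in a Markov model the probabilities of two paths add, whereas in the quantum model their complex amplitudes add before squaring, so opposite phases can make the combined probability smaller than either path alone.

```python
import cmath

# Two paths leading to the same final state.
a1 = cmath.rect(0.6, 0.0)        # amplitude of path 1 (magnitude 0.6, phase 0)
a2 = cmath.rect(0.5, cmath.pi)   # amplitude of path 2, opposite phase

# Markov: real probabilities of the paths simply add.
markov = abs(a1) ** 2 + abs(a2) ** 2   # 0.36 + 0.25 = 0.61

# Quantum: amplitudes add first, then the modulus is squared.
quantum = abs(a1 + a2) ** 2            # |0.6 - 0.5|^2 = 0.01

print(round(markov, 2), round(quantum, 2))  # 0.61 0.01
```

The quantum value (0.01) is smaller than the probability of either path taken alone (0.36 or 0.25), which is exactly the interference signature a Markov random walk cannot produce.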

Motivated by several classic decision-theoretic paradoxes, and by analogies with the paradoxes which in physics motivated the development of quantum mechanics, we introduce a projective generalization of expected utility along the lines of the quantum-mechanical generalization of probability theory. The resulting decision theory accommodates the dominant paradoxes, while retaining significant simplicity and tractability. In particular, every finite game within this larger class of preferences still has an equilibrium.

Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, 'contextuality', is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, 'quantum entanglement', allows cognitive phenomena to be modeled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light. Introducing the basic principles in an easy-to-follow way, this book does not assume a physics background or a quantum brain and comes complete with a tutorial and fully worked-out applications in important areas of cognition and decision.

Examines the psychological processes involved in answering different types of survey questions. The book proposes a theory about how respondents answer questions in surveys, reviews the relevant psychological and survey literatures, and traces out the implications of the theories and findings for survey practice. Individual chapters cover the comprehension of questions, recall of autobiographical memories, event dating, questions about behavioral frequency, retrieval and judgment for attitude questions, the translation of judgments into responses, special processes relevant to the questions about sensitive topics, and models of data collection. The text is intended for: (1) social psychologists, political scientists, and others who study public opinion or who use data from public opinion surveys; (2) cognitive psychologists and other researchers who are interested in everyday memory and judgment processes; and (3) survey researchers, methodologists, and statisticians who are involved in designing and carrying out surveys. (PsycINFO Database Record (c) 2012 APA, all rights reserved)

Significance
In recent years, quantum probability theory has been used to explain a range of seemingly irrational human decision-making behaviors. The quantum models generally outperform traditional models in fitting human data, but both modeling approaches require optimizing parameter values. However, quantum theory makes a universal, nonparametric prediction for differing outcomes when two successive questions (e.g., attitude judgments) are asked in different orders. Quite remarkably, this prediction was strongly upheld in 70 national surveys carried out over the last decade (and in two laboratory experiments) and is not one derivable by any known cognitive constraints. The findings lend strong support to the idea that human decision making may be based on quantum probability.

We use the system of p-adic numbers for the description of information processes. Basic objects of our models are so-called transformers of information, basic processes are information processes and statistics are information statistics (thus we present a model of information reality). The classical and quantum mechanical formalisms on information p-adic spaces are developed. It seems that classical and quantum mechanical models on p-adic information spaces can be applied for the investigation of flows of information in cognitive and social systems, since a p-adic metric gives a quite natural description of the ability to form associations.

Decisions can sometimes have a constructive role, so that the act of, for example, choosing one option over another creates a preference for that option (e.g., Ariely & Norton, 2008; Payne, Bettman, & Johnson, 1993; Sharot, Velasquez, & Dolan, 2010; Sherman, 1980). In this work we explore the constructive role of just articulating an impression, for a presented visual stimulus, as opposed to making a choice (specifically, the judgments we employ are affective evaluations). Using quantum probability theory, we outline a cognitive model formalizing such a constructive process. We predict a simple interaction, in relation to how a second image is evaluated, following the presentation of a first image, depending on whether there is a rating for the first image or not. The interaction predicted by the quantum model was confirmed across three experiments and a variety of control manipulations. The advantages of using quantum probability theory to model the present results, compared with existing models of sequence order effects in judgment (e.g., Hogarth & Einhorn, 1992) or other theories of constructive processes when a choice is made (e.g., Festinger, 1957; Sharot et al., 2010) are discussed.

A key central tenet of decision theory is that decomposing an uncertain event into sub-events should not change the overall probability assigned to that uncertain event. As we show, both quantum physics and behavioral decision theory appear to systematically violate this principle in very similar ways. These results suggest that the structuring phase of decision analysis, which specifies how various events are decomposed, helps shape the subjective probabilities which will ultimately be assigned to those events.

Quantum-like structure is present practically everywhere. Quantum-like (QL) models, i.e. models based on the mathematical formalism of quantum mechanics and their generalizations, can be successfully applied to cognitive science, psychology, genetics, economics, finance, and game theory. This book is not about quantum mechanics as a physical theory. The short review of quantum postulates is therefore mainly of historical value: quantum mechanics is just the first example of the successful application of non-Kolmogorov probabilities, the first step towards a contextual probabilistic description of natural, biological, psychological, social, economical or financial phenomena. A general contextual probabilistic model (V model) is presented. It can be used for describing probabilities in both quantum and classical (statistical) mechanics as well as in the above mentioned phenomena. This model can be represented in a quantum-like way, namely, in complex and more general Hilbert spaces. In this way quantum probability is totally demystified: Born's representation of quantum probabilities by complex probability amplitudes, wave functions, is simply a special representation of this type. © Springer-Verlag Berlin Heidelberg 2010. All rights are reserved.

In this paper we develop a general quantum-like model of decision making. Here updating of probability is based on linear algebra, the von Neumann–Lüders projection postulate, Born's rule, and the quantum representation of the state space of a composite system by the tensor product. This quantum-like model generalizes classical Bayesian inference in a natural way. In our approach the latter appears as a special case corresponding to the absence of relative phases in the mental state. By taking into account the possibility of correlations encoded in relative phases, we develop a more general scheme of decision making. We discuss natural situations that induce deviations from the classical Bayesian scheme in decision making by cognitive systems: situations characterized by objective and subjective mental uncertainties. Further, we discuss the problem of the base rate fallacy. In our formalism, these "irrational" (non-Bayesian) inferences are represented by quantum-like bias operations acting on the mental state.
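A minimal numerical sketch of this idea (my own toy parameterization, not the paper's exact operators): with zero relative phases, an amplitude-based update reproduces Bayes' rule exactly, while a coherent sum over hypothesis paths makes the predicted evidence probability phase-dependent, deviating from the classical law of total probability:

```python
import numpy as np

priors = np.array([0.7, 0.3])          # illustrative prior probabilities
likelihoods = np.array([0.2, 0.9])     # illustrative P(evidence | H_i)

def quantum_update(priors, likelihoods, phases):
    """Multiply prior amplitudes by likelihood amplitudes, renormalize (Born rule)."""
    amps = np.sqrt(priors) * np.exp(1j * np.asarray(phases))
    amps = amps * np.sqrt(likelihoods)
    probs = np.abs(amps) ** 2
    return probs / probs.sum()

def evidence_prob_coherent(priors, likelihoods, phases):
    """Evidence probability from a coherent (interfering) sum over hypothesis paths."""
    amps = np.sqrt(priors * likelihoods) * np.exp(1j * np.asarray(phases))
    return float(np.abs(amps.sum()) ** 2)

classical = priors * likelihoods / (priors * likelihoods).sum()

# Zero relative phase: the quantum rule reduces to classical Bayes.
print(np.allclose(quantum_update(priors, likelihoods, [0.0, 0.0]), classical))

# Classical total probability of the evidence vs. a phase-dependent quantum value:
print(round(float((priors * likelihoods).sum()), 3),
      round(evidence_prob_coherent(priors, likelihoods, [0.0, np.pi]), 3))
```

The phase argument acts here like the paper's "bias operation": it leaves each marginal intact but shifts the interference term.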

Question order effects are commonly observed in self-report measures of judgment and attitude. This article develops a quantum question order model (the QQ model) to account for four types of question order effects observed in the literature. First, the postulates of the QQ model are presented. Second, an a priori, parameter-free, and precise prediction, called the QQ equality, is derived from these mathematical principles, and six empirical data sets are used to test the prediction. Third, a new index is derived from the model to measure similarity between questions. Fourth, we show that in contrast to the QQ model, Bayesian and Markov models do not generally satisfy the QQ equality and thus cannot account for the reported empirical data that support this equality. Finally, we describe the conditions under which order effects are predicted to occur, and we review a broader range of findings that are encompassed by these very same quantum principles. We conclude that quantum probability theory, initially invented to explain order effects on measurements in physics, appears to be a powerful natural explanation for order effects of self-report measures in the social and behavioral sciences, too.
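The QQ equality states that the summed probability of giving discordant answers, p(A-yes, B-no) + p(A-no, B-yes), is the same whichever question is asked first. A small sketch (state and question angles are my own arbitrary choices) shows that projective sequential measurements satisfy it automatically:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random pure state and two non-commuting yes/no questions (2-D toy model).
psi = rng.normal(size=2)
psi /= np.linalg.norm(psi)
theta = 0.7
a  = np.array([np.cos(theta), np.sin(theta)])        # A = yes
a_ = np.array([-np.sin(theta), np.cos(theta)])       # A = no
b  = np.array([1.0, 0.0])                            # B = yes
b_ = np.array([0.0, 1.0])                            # B = no

def seq(first_ans, second_ans, state):
    """P(give `first_ans`, then `second_ans`) via two Born-rule collapses."""
    return np.dot(first_ans, state) ** 2 * np.dot(second_ans, first_ans) ** 2

# QQ equality: p(Ay,Bn) + p(An,By) is identical in both question orders.
ab = seq(a, b_, psi) + seq(a_, b, psi)   # A asked first
ba = seq(b, a_, psi) + seq(b_, a, psi)   # B asked first
print(abs(ab - ba) < 1e-12)              # True
```

The equality holds for any state and any pair of projective questions, which is what makes it a parameter-free test of the model.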

Memory exhibits episodic superposition, an analog of the quantum superposition of physical states: Before a cue for a presented or unpresented item is administered on a memory test, the item has the simultaneous potential to occupy all members of a mutually exclusive set of episodic states, though it occupies only one of those states after the cue is administered. This phenomenon can be modeled with a nonadditive probability model called overdistribution (OD), which implements fuzzy-trace theory's distinction between verbatim and gist representations. We show that it can also be modeled based on quantum probability theory. A quantum episodic memory (QEM) model is developed, which is derived from quantum probability theory but also implements the process conceptions of global matching memory models. OD and QEM have different strengths, and the current challenge is to identify contrasting empirical predictions that can be used to pit them against each other.

Potential features of quantum computation could explain enigmatic aspects of consciousness. The Penrose-Hameroff model (orchestrated objective reduction: 'Orch OR') suggests that quantum superposition and a form of quantum computation occur in microtubules, cylindrical protein lattices of the cell cytoskeleton within the brain's neurons. Microtubules couple to and regulate neural-level synaptic functions, and they may be ideal quantum computers because of dynamical lattice structure, quantum-level subunit states, and intermittent isolation from environmental interactions. In addition to its biological setting, the Orch OR proposal differs in an essential way from technologically envisioned quantum computers, in which collapse, or reduction to classical output states, is caused by environmental decoherence (hence introducing randomness). In the Orch OR proposal, reduction of microtubule quantum superposition to classical output states occurs by an objective factor: Roger Penrose's quantum gravity threshold stemming from instability in Planck-scale separations (superpositions) in spacetime geometry. Output states following Penrose's objective reduction are neither totally deterministic nor random, but influenced by a non-computable factor ingrained in fundamental spacetime. Taking a modern pan-psychist view in which protoconscious experience and Platonic values are embedded in Planck-scale spin networks, the Orch OR model portrays consciousness as brain activities linked to fundamental ripples in spacetime geometry.

The aim of this paper is simple. I want to state as clearly as possible, without a long discursion into technical questions, what I consider to be the single most powerful argument for use of a nonclassical logic in quantum mechanics. There is a very large mathematical and philosophical literature on the logic of quantum mechanics, but almost without exception, this literature provides a very poor intuitive justification for considering a nonclassical logic in the first place. A classical example in the mathematical literature is the famous article by Birkhoff and von Neumann (1936). Although Birkhoff and von Neumann pursue in depth development of properties of lattices and projective geometries that are relevant to the logic of quantum mechanics, they devote less than a third of a page (p. 831) to the physical reasons for considering such lattices. Moreover, the few lines they do devote are far from clear. The philosophical literature is just as bad on this point. One of the better known philosophical discussions on these matters is that found in the last chapter of Reichenbach’s book (1944) on the foundations of quantum mechanics. Reichenbach offers a three-valued truth-functional logic which seems to have little relevance to quantum-mechanical statements of either a theoretical or experimental nature.

The "Orch OR" theory suggests that quantum computations in brain neuronal dendritic-somatic microtubules regulate axonal firings to control conscious behavior. Within microtubule subunit proteins, collective dipoles in arrays of contiguous amino acid electron clouds enable "quantum channels" suitable for topological dipole "qubits" able to physically represent cognitive values, for example, those portrayed by Pothos & Busemeyer (P&B) as projections in abstract Hilbert space.

Much literature attests to the existence of order effects in the updating of beliefs. However, under what conditions do primacy, recency, or no order effects occur? This paper presents a theory of belief updating that explicitly accounts for order-effect phenomena as arising from the interaction of information-processing strategies and task characteristics. Key task variables identified are complexity of the stimuli, length of the series of evidence items, and response mode (Step-by-Step or End-of-Sequence). A general anchoring-and-adjustment model of belief updating is proposed. This has two forms depending on whether information is processed in a Step-by-Step or End-of-Sequence manner. In addition, the model specifies that evidence can be encoded in two ways, either as a deviation relative to the size of the preceding anchor or as positive or negative vis-à-vis the hypothesis under consideration. Whereas the former (labeled estimation mode) results in data consistent with averaging models of judgment, the latter (labeled evaluation mode) implies adding models. Conditions are specified under which (a) evidence is encoded in estimation or evaluation modes and (b) use is made of the Step-by-Step or End-of-Sequence processing strategies. The theory is shown both to account for much existing data and to make novel predictions for combinations of task characteristics where current data are sparse. Some of these predictions are examined and validated in a series of five experiments. Finally, both the theory and the experimental results are discussed with respect to the structure of models of updating processes, limitations and extensions of the present work, and the importance of developing a procedural theory of judgment.
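The anchoring-and-adjustment update at the heart of this theory can be sketched in a few lines. The form below, S_k = S_(k-1) + w_k (s_k - R), with R = 0 in evaluation (adding) mode and R = S_(k-1) in estimation (averaging) mode, follows the paper's general scheme, but the sensitivity parameters and the evidence values are my own illustrative choices:

```python
# Hedged sketch of the Hogarth-Einhorn anchoring-and-adjustment model.
# alpha/beta (sensitivity to negative/positive evidence) and the evidence
# sequence are illustrative assumptions, not fitted values.

def update(belief, evidence, mode="evaluation", alpha=0.8, beta=0.8):
    """One Step-by-Step belief update; `belief` stays in [0, 1]."""
    ref = 0.0 if mode == "evaluation" else belief
    # Adjustment weight scales with how much room is left for belief change.
    weight = alpha * belief if evidence <= ref else beta * (1.0 - belief)
    return belief + weight * (evidence - ref)

belief = 0.5
for s in [0.6, -0.4, 0.3]:          # mixed positive and negative evidence
    belief = update(belief, s)      # Step-by-Step (recency-prone) processing
print(round(belief, 3))
```

Because each item is weighted against the current anchor, later items in a mixed sequence move the belief more, which is how the model produces the recency effects reported for Step-by-Step processing.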

In this paper we discuss the use of quantum mechanics to model psychological experiments, starting by sharply contrasting the roles of quantum-mechanical nonlocality and contextuality in these models. We argue that contextuality, in the form of quantum interference, is the only relevant quantum feature used; nonlocality does not play a role in those models. Since contextuality is also present in classical models, we propose that classical systems be used to reproduce the quantum models. We also discuss how classical interference in the brain may lead to contextual processes, and what neural mechanisms may account for it.

In mathematical modeling of cognition, it is important to have well-justified criteria for choosing among differing explanations (i.e., models) of observed data. This paper introduces a Bayesian model selection approach that formalizes Occam's razor, choosing the simplest model that describes the data well. The choice of a model is carried out by taking into account not only the traditional model selection criteria (i.e., a model's fit to the data and the number of parameters) but also the extension of the parameter space and, most importantly, the functional form of the model (i.e., the way in which the parameters are combined in the model's equation). An advantage of the approach is that it can be applied to the comparison of non-nested models as well as nested ones. Application examples are presented, and implications of the results for evaluating models of cognition are discussed.
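The Occam's-razor intuition behind Bayesian model selection can be illustrated with a toy example (my own, not one from the paper): marginal likelihood penalizes a flexible model automatically, because its prior spreads probability over predictions the data do not support.

```python
import numpy as np
from math import comb

# Toy data: 6 heads in 10 coin flips. Compare a zero-parameter fair-coin
# model against a free-bias model with a uniform prior over the bias p.

n, k = 10, 6

m_fair = comb(n, k) * 0.5 ** n                  # P(data | fair coin)

# Marginal likelihood of the flexible model: average the likelihood over
# the uniform prior (a dense grid approximates the integral over [0, 1]).
p = np.linspace(0.0, 1.0, 100001)
m_bias = float(np.mean(comb(n, k) * p ** k * (1 - p) ** (n - k)))

print(m_fair > m_bias)   # the simpler model has higher evidence here
```

On mildly fair-looking data the rigid model wins even though the flexible model fits better at its best-fitting parameter value; that trade-off is what the paper's functional-form-sensitive criterion generalizes.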

Neural activity patterns related to behavior occur at many scales in time and space from the atomic and molecular to the whole brain. Here we explore the feasibility of interpreting neurophysiological data in the context of many-body physics by using tools that physicists have devised to analyze comparable hierarchies in other fields of science. We focus on a mesoscopic level that offers a multi-step pathway between the microscopic functions of neurons and the macroscopic functions of brain systems revealed by hemodynamic imaging. We use electroencephalographic (EEG) records collected from high-density electrode arrays fixed on the epidural surfaces of primary sensory and limbic areas in rabbits and cats trained to discriminate conditioned stimuli (CS) in the various modalities. High temporal resolution of EEG signals with the Hilbert transform gives evidence for diverse intermittent spatial patterns of amplitude (AM) and phase modulations (PM) of carrier waves that repeatedly re-synchronize in the beta and gamma ranges at near zero time lags over long distances. The dominant mechanism for neural interactions by axodendritic synaptic transmission should impose distance-dependent delays on the EEG oscillations owing to finite propagation velocities. It does not. EEGs instead show evidence for anomalous dispersion: the existence in neural populations of a low velocity range of information and energy transfers, and a high velocity range of the spread of phase transitions. This distinction labels the phenomenon but does not explain it. In this report we explore the analysis of these phenomena using concepts of energy dissipation, the maintenance by cortex of multiple ground states corresponding to AM patterns, and the exclusive selection by spontaneous breakdown of symmetry (SBS) of single states in sequences.
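The Hilbert-transform step described here, extracting amplitude (AM) and phase (PM) modulations of a carrier wave, can be sketched on a synthetic signal. The "EEG" below is an illustrative stand-in (a gamma-band carrier with a slow amplitude envelope), and the sampling rate is an assumption:

```python
import numpy as np
from scipy.signal import hilbert

fs = 500.0                                      # assumed sampling rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / fs)

# Synthetic stand-in for an EEG trace: 40 Hz (gamma) carrier, 1 Hz AM.
carrier = (1.0 + 0.5 * np.sin(2 * np.pi * 1.0 * t)) * np.sin(2 * np.pi * 40.0 * t)

analytic = hilbert(carrier)                     # analytic signal
amplitude = np.abs(analytic)                    # AM envelope
phase = np.unwrap(np.angle(analytic))           # PM trace
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency (Hz)

print(round(float(np.median(inst_freq)), 1))    # recovers the ~40 Hz carrier
```

Applied channel-by-channel to an electrode array, the same amplitude and phase traces are what reveal the spatial AM/PM patterns and near-zero-lag re-synchronization the authors report.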

There are at least two general theories for building probabilistic-dynamical systems: one is Markov theory and the other is quantum theory. These two mathematical frameworks share many fundamental ideas, but they also differ in some key properties. On the one hand, Markov theory obeys the law of total probability, but quantum theory does not; on the other hand, quantum theory obeys the doubly stochastic law, but Markov theory does not. Therefore, the decision about whether to use a Markov or a quantum system depends on which of these laws is empirically obeyed in an application. This article derives two general methods for testing these theories that are parameter free, and presents a new experimental test. The article concludes with a review of experimental findings from cognitive psychology that evaluate these two properties.
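The doubly stochastic contrast is easy to see numerically. In this sketch (the matrices are arbitrary illustrations), a Markov transition matrix has columns that sum to one but generally not rows, whereas the squared magnitudes of a unitary matrix sum to one along both rows and columns:

```python
import numpy as np

# Markov transition matrix: each column sums to 1 (stochastic),
# but the rows need not (not doubly stochastic).
T = np.array([[0.9, 0.3],
              [0.1, 0.7]])

# Quantum transition probabilities: squared magnitudes of a unitary
# are doubly stochastic (rows AND columns sum to 1).
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Q = np.abs(U) ** 2

print(np.allclose(T.sum(axis=0), 1.0),   # Markov: stochastic
      np.allclose(T.sum(axis=1), 1.0),   # ...but not doubly so
      np.allclose(Q.sum(axis=0), 1.0) and np.allclose(Q.sum(axis=1), 1.0))
```

This is the parameter-free lever the article exploits: observed choice frequencies that violate double stochasticity rule out the quantum account, while violations of total probability rule out the Markov account.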

Sequential measurements of non-commuting observables produce order effects that are well known in quantum physics. But their conceptual basis, a significant measurement interaction, is relevant for far more general situations. We argue that non-commutativity is ubiquitous in psychology, where almost every interaction with a mental system changes that system in an uncontrollable fashion. Psychological order effects for sequential measurements are therefore to be expected as a rule. In this paper we focus on the theoretical basis of such effects. We classify several families of order effects theoretically, relate them to psychological observations, and predict effects yet to be discovered empirically. We assess the complexity, related to the predictive power, of particular (Hilbert space) models of order effects and discuss possible limitations of such models.

This talk proceeds from the premise that IR should engage in a more substantial dialogue with cognitive science. After all, how users decide relevance, or how they choose terms to modify a query, are processes rooted in human cognition. Recently, there has been a growing literature applying quantum theory (QT) to model cognitive phenomena ranging from human memory to decision making. Two aspects will be highlighted. The first will show how concept combinations can be modelled in a way analogous to quantum entangled twin-state photons. Details will be presented of cognitive experiments to test for the presence of "entanglement" in cognition via an analysis of bi-ambiguous concept combinations. The second aspect of the talk will show how quantum interference effects currently being used to fit models of human decision making may be applied to model interference between different dimensions of relevance.
The underlying theme of this talk is that QT can potentially provide the theoretical basis for a new genre of information-processing models more aligned with human cognition.

One of the aspects of quantum theory which has attracted the most general attention is the novelty of the logical notions it presupposes. It asserts that even a complete mathematical description of a physical system S does not in general enable one to predict with certainty the result of an experiment on S, and that in particular one can never predict with certainty both the position and the momentum of S (Heisenberg's Uncertainty Principle). It further asserts that most pairs of observations are incompatible and cannot be made on S simultaneously (Principle of Non-commutativity of Observations).