Bob Coecke and Robert W Spekkens, Picturing classical and quantum Bayesian inference, arXiv.org, quant-ph (2011) 651-696.
We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference, wherein one considers mixed quantum states rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. In the case of quantum-like calculi, the latter will be non-commutative. We identify a graphical property that characterizes classical Bayesian inference. The abstract classical Bayesian graphical calculi also allow one to model relations among classical entropies and to reason about these. We generalize conditional independence to this very general setting.
(web, pdf)
Issei Sato et al., Quantum Annealing for Variational Bayes Inference, arXiv.org, 2014 1408.2037v1, cs.LG.
This paper presents studies on a deterministic annealing algorithm based on quantum annealing for variational Bayes (QAVB) inference, which can be seen as an extension of simulated annealing for variational Bayes (SAVB) inference. QAVB is as easy to implement as SAVB. Experiments revealed that QAVB finds a better local optimum than SAVB in terms of the variational free energy in latent Dirichlet allocation (LDA).
(web, pdf)
James M Yearsley et al., When are representations of causal events quantum versus classical?, 2016 pp. 1-6.
Throughout our lives, we are faced with a variety of causal reasoning problems. Arguably the most successful models of causal reasoning, Causal Graphical Models (CGMs), perform well in some situations, but there is considerable variation in how well they are able to account for data, both across scenarios and between individuals. We propose a model of causal reasoning based on quantum probability (QP) theory that accounts for behavior in situations where CGMs fail. Whether QP or classical models are appropriate depends on the representation of events constructed by the reasoner. We describe an experiment which suggests that the representation of events can change with experience to become more classical, and that the representation constructed can vary between individuals, in a way that correlates with a simple measure of cognitive ability, the Cognitive Reflection Task.
(pdf)
Harald Atmanspacher and Edward Nelson, Quantum Interaction, Lecture Notes in Computer Science, 8369 (2014) 1-358.
At the time of writing this preface, more than six years have passed since the first Quantum Interaction conference took place at Stanford University. It is probably no exaggeration to say that the group of people who originally set up that conference barely anticipated that it would develop into a series. But it did, and we are proud to announce that this volume, published by Springer in the LNCS series, gathers the accepted papers of the seventh Quantum Interaction conference, which took place at the University of Leicester.
The themes of the Quantum Interaction conference series continue to revolve around four main subject pillars: (i) information processing/retrieval/semantic representation and logic; (ii) cognition and decision making; (iii) finance/economics and social structures; and (iv) biological systems.
The accepted and refereed papers published in this volume are ordered according to the above four themes.
The outcome of this year's refereeing process allocated nearly 60% of all accepted papers to oral presentations. The remaining accepted papers were presented in a poster session that ran throughout the full duration of the conference.
As with other Quantum Interaction conferences, this year we were again very fortunate to count on the contributions of outstanding keynote speakers: Professor Nelson (Department of Mathematics, Princeton University), Professor Abramsky (Department of Computer Science, University of Oxford), and Professor Hiley (Theoretical Physics Research Unit, Birkbeck, University of London).
It is impossible to realize an event like this without the input of many people. The speakers and keynote addresses are, of course, essential, and we thank all the speakers for their contributions. The input of both the Steering Committee and the Programme Committee, whose members were so active in refereeing the papers (on time!), was likewise essential. We want to thank in particular Lisa Brandt for support covering virtually all the administrative details involved in the organization of this conference, and Cheryl Hurkett for her infinite patience in setting up the website and adapting it to our ad-hoc wishes. The Centre for Interdisciplinary Science, the School of Management, and the University of Leicester conference services also deserve thanks for their valuable support.
(web, pdf)
Robert R Tucci, Quantum Bayesian Nets, arXiv.org, quant-ph (1997) 295-337.
We begin with a review of a well-known class of networks, Classical Bayesian (CB) nets (also called causal probabilistic nets by some). Given a situation which includes randomness, CB nets are used to calculate the probabilities of various hypotheses about the situation, conditioned on the available evidence. We introduce a new class of networks, which we call Quantum Bayesian (QB) nets, that generalize CB nets to the quantum mechanical regime. We explain how to use QB nets to calculate quantum mechanical conditional probabilities (in the case of either sharp or fuzzy observations), and discuss the connection of QB nets to Feynman path integrals. We give examples of QB nets that involve a single spin-half particle passing through a configuration of two or three Stern-Gerlach magnets. For the examples given, we present the numerical values of various conditional probabilities, as calculated by a general computer program especially written for this purpose.
Published in: Int.J.Mod.Phys. B9 (1995) 295-337
(web, pdf)
Giuseppe Vitiello, The Brain and its Mindful Double, Journal Of Consciousness Studies, 2018 pp. 1-26.
In the past decade Walter Freeman contributed to the development of the dissipative quantum model of the brain and its testing against laboratory observations. In this paper the model is briefly reviewed, with particular reference to the brain–mind relation and its quantum gauge field structure, which determines the macroscopic functional behaviour of the brain. Memory appears to be memory of meanings constructed by learning, which results from intentional actions. The act of consciousness finds its realization in the unavoidable adjustments (dialogues) in the brain/environment (brain/Double) relation, out of which the aesthetic experience is generated when the harmonious to-be-in-the-world is realized. Criticality, fractal self-similarity, and chaoticity are manifestations of the coherent gauge field dynamics characterized by the free energy minimization condition.
(pdf)
Diederik Aerts, Quantum Structure in Cognition, arXiv preprint, 2009 pp. 1-58.
The broader scope of our investigations is the search for the way in which concepts and their combinations carry and influence meaning and what this implies for human thought. More specifically, we examine the use of the mathematical formalism of quantum mechanics as a modeling instrument and propose a general mathematical modeling scheme for the combinations of concepts. We point out that quantum mechanical principles, such as superposition and interference, are at the origin of specific effects in cognition related to concept combinations, such as the guppy effect and the overextension and underextension of membership weights of items. We work out a concrete quantum mechanical model for a large set of experimental data of membership weights with overextension and underextension of items with respect to the conjunction and disjunction of pairs of concepts, and show that no classical model is possible for these data. We put forward an explanation by linking the presence of quantum aspects that model concept combinations to the basic process of concept formation. We investigate the implications of our quantum modeling scheme for the structure of human thought, and show the presence of a two-layer structure consisting of a classical logical layer and a quantum conceptual layer. We consider connections between our findings and phenomena such as the disjunction effect and the conjunction fallacy in decision theory, violations of the sure thing principle, and the Allais and Ellsberg paradoxes in economics.
(pdf)
Diederik Aerts, Interpreting Quantum Particles as Conceptual Entities, arXiv.org, quant-ph (2010) 2950-2970.
We elaborate an interpretation of quantum physics founded on the hypothesis that quantum particles are conceptual entities playing the role of communication vehicles between material entities composed of ordinary matter which function as memory structures for these quantum particles. We show in which way this new interpretation gives rise to a natural explanation for the quantum effects of interference and entanglement by analyzing how interference and entanglement emerge for the case of human concepts. We put forward a scheme to derive a metric based on similarity as a predecessor for the structure of 'space, time, momentum, energy' and 'quantum particles interacting with ordinary matter' underlying standard quantum physics, within the new interpretation, and making use of aspects of traditional quantum axiomatics. More specifically, we analyze how the effect of non-locality arises as a consequence of the confrontation of such an emerging metric type of structure and the remaining presence of the basic conceptual structure on the fundamental level, with the potential of being revealed in specific situations.
Published in: International Journal of Theoretical Physics, 49, pp. 2950-2970, (2010)
(web, pdf)
Jennifer S Trueblood and Percy K Mistry, Quantum Models of Human Causal Reasoning, 2015 pp. 1-27.
Throughout our lives, we are constantly faced with a variety of causal reasoning problems. A challenge for cognitive modelers is developing a comprehensive framework for modeling causal reasoning across different types of tasks and levels of causal complexity. Causal graphical models (CGMs), based on Bayes’ calculus, have perhaps been the most successful at explaining and predicting judgments of causal attribution. However, some recent empirical studies have reported violations of the predictions of these models, such as the local Markov condition. In this chapter, we suggest an alternative approach to modeling human causal reasoning using quantum Bayes nets. We show that our approach can account for a variety of behavioral phenomena including order effects, violations of the local Markov condition, anti-discounting behavior, and reciprocity.
(pdf)
Jerome R Busemeyer and Jennifer S Trueblood, Theoretical and empirical reasons for considering the application of quantum probability theory to human cognition, 2011 pp. 1-11.
This paper examines six theoretical reasons for considering the application of quantum theory to human cognition. It also presents six empirical examples that - while puzzling from a classic probability framework - are coherently explained by a quantum approach.
(pdf)
James T Townsend, General recognition theory and methodology for dimensional independence on simple cognitive manifolds, 2007 pp. 1-43.
(pdf)
Hamid R Eghbalnia and Amir Assadi, Quantum Neurocomputation and Signal Processing, 2000 pp. 1-11.
In this paper we consider a quantum computational algorithm that can be used to determine (probabilistically) how close a given signal is to one of a set of previously observed signals stored in the state of a quantum neurocomputational machine. The realization of a new quantum algorithm for the factorization of integers by Shor, and its implications for cryptography, has created a rapidly growing field of investigation. Although no physical realization of a quantum computer is available, a number of software systems simulating a quantum computation process exist. In light of the rapidly increasing power of desktop computers and their ability to carry out these simulations, it is worthwhile to investigate possible advantages as well as realizations of quantum algorithms in signal processing applications. The algorithm presented in this paper offers a glimpse of the potential of this approach. Neural Networks (NN) provide a natural paradigm for parallel and distributed processing of a wide class of signals. Neural networks within the context of classical computation have been used for approximation and classification tasks with some success. In this paper we propose a model for Quantum Neurocomputation (QN) and explore some of its properties and potential applications to signal processing in an information-theoretic context. A quantum computer can evolve a coherent superposition of many possible input states to an output state through a series of unitary transformations that simultaneously affect each element of the superposition. This construction generates a massively parallel data processing system existing within a single piece of hardware. Our model of QN consists of a set of quantum neurons and quantum interconnections. Quantum neurons represent a normalized element of the n-dimensional Hilbert space, i.e. a state of a finite-dimensional quantum mechanical system.
Quantum connections provide a realization of a probability distribution over the set of states that, combined with the quantum neurons, provides a density matrix representation of the system. A second layer with a similar architecture interrogates the system through a series of random state descriptions to obtain an average state description. We discuss the application of this paradigm to the quantum analog of independent states using the quantum version of the Kullback-Leibler distance.
(pdf)
Irina Basieva et al., Quantum probability updating from zero priors (by-passing Cromwell’s rule), Journal Of Mathematical Psychology, 2016 pp. 1-12.
Cromwell’s rule (also known as the zero priors paradox) refers to the constraint of classical probability theory that if one assigns a prior probability of 0 or 1 to a hypothesis, then the posterior has to be 0 or 1 as well (this is a straightforward implication of how Bayes’ rule works). Relatedly, hypotheses with a very low prior cannot be updated to have a very high posterior without a tremendous amount of new evidence to support them (or to make other possibilities highly improbable). Cromwell’s rule appears at odds with our intuition of how humans update probabilities. In this work, we report two simple decision making experiments, which seem to be inconsistent with Cromwell’s rule. Quantum probability theory, the rules for how to assign probabilities from the mathematical formalism of quantum mechanics, provides an alternative framework for probabilistic inference. An advantage of quantum probability theory is that it is not subject to Cromwell’s rule and it can accommodate changes from zero or very small priors to significant posteriors. We outline a model of decision making, based on quantum theory, which can accommodate the changes from priors to posteriors, observed in our experiments.
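The absorbing character of a zero prior that the abstract calls Cromwell's rule falls directly out of Bayes' rule. A minimal sketch with illustrative numbers (not taken from the paper's experiments):

```python
def bayes_update(prior, likelihood_h, likelihood_alt):
    """Posterior P(H|E) for a binary hypothesis via Bayes' rule."""
    numerator = prior * likelihood_h
    evidence = numerator + (1.0 - prior) * likelihood_alt
    return numerator / evidence if evidence > 0 else 0.0

# A zero prior is absorbing: no evidence, however strong, can move it.
print(bayes_update(0.0, likelihood_h=0.99, likelihood_alt=0.01))   # 0.0
# A tiny prior needs overwhelming evidence to reach a large posterior.
print(bayes_update(1e-6, likelihood_h=0.99, likelihood_alt=0.01))
```

Roughly, quantum updating can escape this because belief states are updated by projection rather than by reweighting the prior, so an intermediate, incompatible measurement can reintroduce components that previously had zero probability.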
(web, pdf)
Michel Bitbol, The Quantum Structure of Knowledge, Axiomathes, 21 (2010) 357-371.
This paper analyzes how conflicts of perspective are resolved in the field of the human sciences. Examples of such conflicts are the duality between the actor and spectator standpoints, or the duality of participancy between a form of social life and a socio-anthropological study of it. This type of duality looks irreducible, because the conflicting positions express incompatible interests. Yet the claim of "incommensurability" is excessive. There exists a level of mental activity at which dialogue and resolution are possible. Reaching this level only implies that one comes back to a state of indetermination between situations and interests, whose best model is a superposition of states in generalized quantum theory. Some applications of this strategy of going back below the point of "state reduction", from the psychology of perception to the history of civilization, are presented.
(web, pdf)
David Ellerman, Probability Theory with Superposition Events: A Classical Generalization in the Direction of Quantum Mechanics, arXiv.org, 2020 2006.09918v1, quant-ph.
In finite probability theory, events are subsets of the outcome set. Subsets can be represented by 1-dimensional column vectors. By extending the representation of events to two-dimensional matrices, we can introduce "superposition events." Probabilities are introduced for classical events, superposition events, and their mixtures by using density matrices. Then probabilities for experiments or "measurements" of all these events can be determined in a manner exactly like in quantum mechanics (QM) using density matrices. Moreover, the transformation of the density matrices induced by the experiments or "measurements" is the Lüders mixture operation, as in QM. Finally, by moving the machinery into the n-dimensional vector space over Z_2, different basis sets become different outcome sets. That "non-commutative" extension of finite probability theory yields a pedagogical model of quantum mechanics over Z_2 that can model many characteristic non-classical results of QM.
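The density-matrix machinery the abstract describes can be sketched in a few lines of numpy; the matrices below are illustrative choices, not taken from the paper. A classical event is a diagonal projector, a superposition event is a projector with off-diagonal terms, probabilities come from the trace formula, and the post-measurement state from the Lüders mixture operation:

```python
import numpy as np

# Outcome set {0, 1}. The classical event {0} is a diagonal projector;
# a superposition event projects onto a non-classical direction.
P_classical = np.diag([1.0, 0.0])
plus = np.array([[1.0], [1.0]]) / np.sqrt(2.0)
P_superpos = plus @ plus.T

# Uniform classical distribution represented as a (diagonal) density matrix.
rho = np.diag([0.5, 0.5])

p = np.trace(rho @ P_superpos)                 # probability of the superposition event
rho_after = P_superpos @ rho @ P_superpos / p  # Luders mixture: post-measurement state

print(p)           # 0.5
print(rho_after)   # equals the projector P_superpos itself
```

Replacing the real entries by vectors over Z_2, as the paper does, changes the arithmetic but not the shape of this calculation.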
(web, pdf)
Thomas Filk and Hartmann Römer, Generalized Quantum Theory: Overview and Latest Developments, Axiomathes, 21 (2010) 211-220.
The main formal structures of generalized quantum theory are summarized. Recent progress has sharpened some of the concepts, in particular the notion of an observable, the action of an observable on states (putting more emphasis on the role of proposition observables), and the concept of generalized entanglement. Furthermore, the active role of the observer in the structure of observables and the partitioning of systems is emphasized.
(web, pdf)
Thomas Filk, Non-Classical Correlations in Bistable Perception?, Axiomathes, 21 (2010) 221-232.
A violation of Bell’s inequalities is generally considered to be the Holy Grail of experimental proof that a specific natural phenomenon cannot be explained in a classical framework and is based on a non-boolean structure of predications. Generalized quantum theory allows for such non-boolean predications. We formulate temporal Bell’s inequalities for cognitive two-state systems and indicate how these inequalities can be tested. This will introduce the notion of temporally non-local measurements. The Necker-Zeno model for bistable perception predicts a violation of these temporal Bell’s inequalities.
(web, pdf)
Dieter Gernert, Distance and Similarity Measures in Generalised Quantum Theory, Axiomathes, 21 (2010) 303-313.
A summary of recent experimental results shows that entanglement can be generated more easily than before, and that there are improved chances for its persistence. An eminent finding of Generalised Quantum Theory is the insight that the notion of entanglement can be extended, such that, e.g., psychological or psychophysical problem areas can be included too. First, a general condition for entanglement to occur is given by the term 'common prearranged context'. A formalised treatment requires a quantitative definition of the similarity or dissimilarity between two complex structures which takes their internal structures into account. After some specific remarks on distance, metrics, and semi-metrics in mathematics, a procedure is described for setting up a similarity function with the required properties. This procedure is in analogy with the two-step character of measurement and with the well-known properties of perspective notions. A general methodology can be derived for handling perspective notions. Finally, these concepts supply heuristic clues towards a formalised treatment of the notions of 'meaning' and 'interpretation'.
(web, pdf)
Emmanuel Haven and Andrei Khrennikov, Statistical and subjective interpretations of probability in quantum-like models of cognition and decision making, Journal Of Mathematical Psychology, 2016 pp. 1-11.
The paper starts with an introduction to the basic mathematical model of classical probability (CP), i.e. the Kolmogorov (1933) measure-theoretic model. Its two basic interpretations are discussed: statistical and subjective. We then present the probabilistic structure of quantum mechanics (QM) and discuss the problem of interpretation of a quantum state and the corresponding probability given by Born’s rule. Applications of quantum probability (QP) to modeling of cognition and decision making (DM) suffer from the same interpretational problems as QM. Here the situation is even more complicated than in physics. We analyze advantages and disadvantages of the use of subjective and statistical interpretations of QP. The subjective approach to QP was formalized in the framework of Quantum Bayesianism (QBism) as the result of efforts from C. Fuchs and his collaborators. The statistical approach to QP was presented in a variety of interpretations of QM, both in nonrealistic interpretations, e.g., the Copenhagen interpretation (with the latest version due to A. Plotnitsky), and in realistic interpretations (e.g., the recent Växjö interpretation). At present, we cannot make a definite choice in favor of any of the interpretations. Thus, quantum-like DM confronts the same interpretational problem as quantum physics does.
(web, pdf)
Andrei Khrennikov et al., Quantum Models for Psychological Measurements: An Unsolved Problem, Plos One, 9 (2014) e110909-8.
There has been a strong recent interest in applying quantum theory (QT) outside physics, including in cognitive science. We analyze the applicability of QT to two basic properties in opinion polling. The first property (response replicability) is that, for a large class of questions, a response to a given question is expected to be repeated if the question is posed again, irrespective of whether another question is asked and answered in between. The second property (question order effect) is that the response probabilities frequently depend on the order in which the questions are asked. Whenever these two properties occur together, it poses a problem for QT. The conventional QT with Hermitian operators can handle response replicability, but only in a way that is incompatible with the question order effect. In the generalization of QT known as the theory of positive-operator-valued measures (POVMs), in order to account for response replicability, the POVMs involved must be conventional operators. Although these problems are not unique to QT and also challenge conventional cognitive theories, they stand out as important unresolved problems for the application of QT to cognition. Either some new principles are needed to determine the bounds of applicability of QT to cognition, or quantum formalisms more general than POVMs are needed.
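The clash between the two properties can be made concrete with standard projective (Lüders) updating: an answer is perfectly replicable when the same question is repeated immediately, but once an incompatible question intervenes the first answer is no longer certain, and incompatibility is exactly what order effects require. A small numpy sketch under illustrative choices of state and questions (none of it taken from the paper):

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the direction theta in the plane."""
    v = np.array([[np.cos(theta)], [np.sin(theta)]])
    return v @ v.T

A = projector(0.0)            # question A's "yes" subspace
B = projector(np.pi / 4)      # question B, incompatible with A (A @ B != B @ A)
psi = np.array([[np.cos(0.3)], [np.sin(0.3)]])   # belief state (unit vector)

# Answer "yes" to A: the state collapses onto A's subspace (Luders rule).
post_A = A @ psi / np.linalg.norm(A @ psi)
p_repeat = np.linalg.norm(A @ post_A) ** 2          # = 1.0: immediate replicability

# Interleave the incompatible question B, then repeat A.
post_B = B @ post_A / np.linalg.norm(B @ post_A)
p_repeat_after_B = np.linalg.norm(A @ post_B) ** 2  # = 0.5 here: replicability lost
```

With commuting projectors replicability across B would survive, but then the order effect would vanish, which is the tension the paper analyzes.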
(web, pdf)
Andrei Khrennikov et al., Quantum probability in decision making from quantum information representation of neuronal states, Scientific Reports, 2018 pp. 1-8.
The recent wave of interest in modeling the process of decision making with the aid of the quantum formalism gives rise to the following question: 'How can neurons generate quantum-like statistical data?' (There is plenty of such data in cognitive psychology and social science.) Our model is based on a quantum-like representation of uncertainty in the generation of action potentials. This uncertainty is a consequence of the complexity of electrochemical processes in the brain; in particular, uncertainty in the triggering of an action potential by the membrane potential. Quantum information state spaces can be considered as extensions of classical information spaces corresponding to neural codes, e.g., the 0/1, quiescent/firing neural code. The key point is that processing of information by the brain involves superpositions of such states. Another key point is that a neuronal group performing some psychological function F is an open quantum system. It interacts with the surrounding electrochemical environment. The process of decision making is described as decoherence in the basis of eigenstates of F. A decision state is a steady state. This is a linear representation of the complex nonlinear dynamics of electrochemical states. Linearity guarantees exponentially fast convergence to the decision state.
(web, pdf)
Anja Matschuck, Non-Local Correlations in Therapeutic Settings? A Qualitative Study on the Basis of Weak Quantum Theory and the Model of Pragmatic Information, Axiomathes, 21 (2011) 249-261.
Weak Quantum Theory (WQT) and the Model of Pragmatic Information (MPI) are two psychophysical concepts developed on the basis of quantum physics. The present study contributes to their empirical examination. The question addressed is whether WQT and MPI can not only explain 'psi' phenomena theoretically but also prove consistent with the empirical phenomenology of extrasensory perception (ESP). From the main statements of both models, 33 deductions for psychic readings are derived. Psychic readings are defined as settings in which psychics support or counsel clients by using information not mediated through the five senses. A qualitative approach is chosen to explore how the psychics experience extrasensory perceptions. Eight psychics were interviewed with a half-structured method. The reports are examined regarding deductive and inductive aspects, using a multi-level structured content analysis. The vast majority of deductions are clearly confirmed by the reports. Even though the study has to be seen as an explorative attempt with many aspects still to be specified, WQT and MPI prove to be coherent and helpful concepts for explaining ESP in psychic readings.
(web, pdf)
José Raúl Naranjo, Bridging the Gap: Does Closure to Efficient Causation Entail Quantum-Like Attributes?, Axiomathes, 21 (2011) 315-330.
This paper explores the similarities between the conceptual structure of quantum theory and relational biology as developed within the Rashevsky-Rosen-Louie school of theoretical biology. With this aim, generalized quantum theory and the abstract formalism of (M,R)-systems are briefly presented. In particular, the notions of organizational invariance and relational identity are formalized mathematically and a particular example is given. Several quantum-like attributes of Rosen's complex systems, such as complementarity and nonseparability, are discussed. Taken together, this work emphasizes the possible role of self-referentiality and impredicativity in quantum theory.
(web, pdf)
Stefano Siccardi, Spontaneous Anomalystic Phenomena, Pragmatic Information and Formal Representations of Uncertainty, Axiomathes, 21 (2010) 287-301.
I discuss the application of the Model of Pragmatic Information to the study of spontaneous anomalystic mental phenomena like telepathy, precognition, etc. In these phenomena the most important effects are related to anomalous information gain by the subjects. I consider the basic ideas of the Model, as they have been applied to experimental anomalystic phenomena and to spontaneous phenomena that have strong physical effects, like poltergeist cases, highlighting analogies and differences. Moreover, I point out that in such cases we cannot assign a probability of being accepted to every proposition, and so we cannot use standard formulas for pragmatic information and other relevant measures. To overcome the problem, I propose that qualitative possibility theory could be used to describe the situation. In such theory, the confidence in a proposition is expressed using a scale. Basic concepts like epistemic states, belief revision, information gain, pragmatic information etc. are discussed in this frame. Finally an application to some specific cases is sketched.
(web, pdf)
Jerome R Busemeyer and Jennifer S Trueblood, Comparison of Quantum and Bayesian Inference Models, 2009 pp. 1-15.
The mathematical principles of quantum theory provide a general foundation for assigning probabilities to events. This paper examines the application of these principles to the probabilistic inference problem in which hypotheses are evaluated on the basis of a sequence of evidence (observations). The probabilistic inference problem is usually addressed using Bayesian updating rules. Here we derive a quantum inference rule and compare it to the Bayesian rule. The primary difference between these two inference principles arises when evidence is provided by incompatible measures. Incompatibility refers to the case where one measure interferes with or disturbs another, so that the order of measurement affects the probability of the observations. It is argued that incompatibility often occurs when evidence is obtained from human judgments.
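The order dependence that separates the quantum rule from the Bayesian one is easy to exhibit: classical conditioning gives P(E1, E2) = P(E2, E1), whereas for non-commuting projectors the sequential probabilities differ. A minimal numpy sketch with illustrative projectors and state (not the paper's own example):

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the direction theta in the plane."""
    v = np.array([[np.cos(theta)], [np.sin(theta)]])
    return v @ v.T

A = projector(0.0)          # "yes" projector for the first measure
B = projector(np.pi / 4)    # "yes" projector for an incompatible measure
psi = np.array([[np.cos(0.3)], [np.sin(0.3)]])   # initial state (unit vector)

# Probability of observing "yes to A then yes to B", and the reverse order.
p_AB = float(np.linalg.norm(B @ A @ psi) ** 2)
p_BA = float(np.linalg.norm(A @ B @ psi) ** 2)

# For commuting (compatible) projectors these would be equal; here they differ,
# so the order in which the evidence arrives changes the inferred probability.
print(round(p_AB, 4), round(p_BA, 4))
```

When the projectors commute, both sequences reduce to a single joint event and the quantum rule reproduces the Bayesian answer, which is why the difference only shows up for incompatible measures.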
(pdf)
Johann Summhammer, Quantum Cooperation, Axiomathes, 21 (2010) 347-356.
In a theoretical simulation, the cooperation of two insects who share a large number of maximally entangled EPR pairs to correlate their probabilistic actions is investigated. Specifically, two distant butterflies must find each other. Each butterfly moves in a chaotic form of short flights, guided only by the weak scent emanating from the other butterfly. The flight directions result from classical random choices. Each such decision of an individual is followed by a read-out of an internal quantum measurement on a spin, the result of which decides whether the individual shall do a short flight or wait. These assumptions reflect the scarce environmental information and the small brains' limited computational capacity. The quantum model is contrasted with two other cases: in the classical case the coherence between the spin pairs gets lost and the two butterflies act independently; in the superclassical case the two butterflies read off their decisions of whether to fly or to wait from the same internal list, so that they always take the same decision, as if they were super correlated. The numerical simulation reveals that the quantum entangled butterflies find each other with a much shorter total flight path than in both classical models.
(web, pdf)
Harald Walach and Nikolaus von Stillfried, Generalised Quantum Theory—Basic Idea and General Intuition: A Background Story and Overview, Axiomathes, 21 (2011) 185-209.
Science is always presupposing some basic concepts that are held to be useful. These absolute presuppositions (Collingwood) are rarely debated and form the framework for what has been termed "paradigm" by Kuhn. Our currently accepted scientific model is predicated on a set of presuppositions that have difficulty accommodating holistic structures and relationships and are not geared towards incorporating non-local correlations. Since the theoretical models we hold also determine what we perceive and take as scientifically viable, it is important to look for an alternative model that can deal with holistic relationships. One approach is to generalise algebraic quantum theory, which is an inherently holistic framework, into a generic model. Relaxing some restrictions and definitions from quantum theory proper yields an axiomatic framework that can be applied to any type of system. Most importantly, it keeps the core of the quantum theoretical formalism. It is capable of handling complementary observables, i.e. descriptors which are non-commuting, incompatible and yet collectively required to fully describe certain situations. It also predicts a generalised form of non-local correlations that in quantum theory are known as entanglement. This generalised version is not quantum entanglement but an analogous form of holistic, non-local connectedness of elements within systems, predicted to occur whenever elements within systems are described by observables which are complementary to the description of the whole system. While a considerable body of circumstantial evidence supports the plausibility of the model, we are not yet in a position to use it for clear-cut predictions that could be experimentally falsified. The series of papers offered in this special issue are the beginning of what we hope will become a rich scientific debate.
(web, pdf)
James M Yearsley and Emmanuel M Pothos, Challenging the classical notion of time in cognition: a quantum perspective, Proceedings Of The Royal Society B: Biological Sciences, 281 (2014) 20133056-8.
All mental representations change with time. A baseline intuition is that mental representations have specific values at different time points, which may be more or less accessible, depending on noise, forgetting processes, etc. We present a radical alternative, motivated by recent research using the mathematics from quantum theory for cognitive modelling. Such cognitive models raise the possibility that certain possibilities or events may be incompatible, so that perfect knowledge of one necessitates uncertainty for the others. In the context of time-dependence, in physics, this issue is explored with the so-called temporal Bell (TB) or Leggett–Garg inequalities. We consider in detail the theoretical and empirical challenges involved in exploring the TB inequalities in the context of cognitive systems. One interesting conclusion is that we believe the study of the TB inequalities to be empirically more constrained in psychology than in physics. Specifically, we show how the TB inequalities, as applied to cognitive systems, can be derived from two simple assumptions: cognitive realism and cognitive completeness. We discuss possible implications of putative violations of the TB inequalities for cognitive models and our understanding of time in cognition in general. Overall, this paper provides a surprising, novel direction in relation to how time should be conceptualized in cognition.
(web, pdf)
Jerome R Busemeyer, Introduction to Quantum Probability for Social and Behavioral Scientists, 2008 pp. 1-35.
There are two related purposes of this chapter. One is to generate interest in a new and fascinating approach to understanding behavioral measures based on quantum probability principles. The second is to introduce and provide a tutorial of the basic ideas in a manner that is interesting and easy for social and behavioral scientists to understand.
(pdf)