Introduction
The mathematics of our current understanding of probability was formalized in the 18th century, with the emergence of actuarial insurance, risk management, and shifting economic and colonial power.1 As part of this development, Thomas Bayes (1702-1761) aimed to address the problems of induction formulated by Hume and other Enlightenment thinkers, and fashioned a proposal for determining the conditional probability of a hypothesis, allowing for correction or refinement given new evidence. The power of Bayes’ theorem was that it allowed one to infer the probability of causes given events, by looking for patterns in the conditional context. Bayes’ theorem was pivotal for solving “the inverse problem”: calculating the (posterior) probability of a hypothesis given evidence, by revising a (prior) degree of belief. This kind of abductive inverse reasoning shifted attention to a ‘rational’ agent’s credence or degree of belief in the occurrence of a hypothetical ‘chance event’, while modeling quasi-causal relationships under uncertain conditions.2 A full-fledged subjective Bayesian probability emerged in the 1930s, with the work of Frank Ramsey and Bruno de Finetti, and full-blown Bayesian neural networks have more recently evolved, which aim to manage the overfitting of neural network models, and better capture and communicate the uncertainty of parameter weights. In our current computational cultures, Bayesian statistics offers the perfect framework for continually updating prediction models, where real-time data feeds forward into networked digital capitalism, and the emphasis is on hypothesis generation. As Joque suggests, “in a world of cheap computation and massive amounts of digital data, we can continually update our beliefs about the world as new data arrives”.3
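For readers who want the calculation behind this inverse reasoning, a standard modern statement of Bayes’ theorem runs as follows (the notation is a contemporary convention added here for reference, not Bayes’ own formulation):

```latex
% Bayes' theorem in modern notation: the posterior P(H|E) revises the
% prior P(H) in light of evidence E.
P(H \mid E) \;=\; \frac{P(E \mid H)\, P(H)}{P(E)},
\qquad
P(E) \;=\; \sum_{i} P(E \mid H_i)\, P(H_i).
```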
This kind of plausible reasoning has now found its place in a new logic of algorithmic decision making where “the chain of contingencies becomes the driving force for decision-making actions”.4 Machine learning algorithms, including generative models, stumble over vast undulating error landscapes, constantly revising the linked measures and values for each node in the network, minimizing error through gradient descent. This paper asks: how might classical probability calculations about dependent variables – which are the engine of these models – be themselves problematic? Neural network algorithms iterate at inhuman speeds and produce what appears to be a kind of opaque judgment engendered by the allegedly wise algorithm. The widespread belief in the oracle-like capacity of algorithmic judgment is in part due to its capacity to simulate what historian of science Lorraine Daston calls a “thick” rule.5 Thick rules are responsive to particular cases, generalizing from particular to particular; in contrast, thin rules are “unencumbered by examples and exceptions” and are pared-down and brittle. In the case of current AI models, the opaque thickness is linked to how the error function is constantly broken and rewritten through its own instrumental structure, resulting in a lack of transparency and explainability. The ‘rules’ of deep learning algorithms run on constant updating, modulating the weights of parameters with each iteration, and interpolating their way incrementally from particular to particular. These are data-driven algorithms, compressing the noise and the aberration of particulars to achieve a general rule. This is exactly how rules can marshal power through the compression of such contingency. This is also how automated reason and machinic discernment extend social forms of control, by performing the role of the allegedly wise human adjudicating from case to case (like the Benedictine abbot that Daston uses to exemplify a thick rule(r)).
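To make the mechanics of this constant updating concrete, the following is a minimal sketch of gradient descent on a toy model; the data, the one-layer model, and the learning rate are illustrative assumptions rather than any particular production system.

```python
# Minimal sketch of gradient-descent weight updating (illustrative only).
# A one-layer model is fit to toy data; each iteration nudges the weights
# downhill on the error landscape, interpolating from particular to particular.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                        # toy inputs (100 cases, 3 features)
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)     # noisy targets

w = np.zeros(3)                                      # initial parameter weights
lr = 0.05                                            # learning rate (step size)

for step in range(200):
    pred = X @ w
    error = pred - y
    loss = np.mean(error ** 2)                       # the error function being minimized
    grad = 2 * X.T @ error / len(y)                  # gradient of the loss w.r.t. the weights
    w -= lr * grad                                   # incremental revision of the weights

print(w)   # the weights drift toward the pattern compressed from the data
```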
If the “Bayesian revolution” – as Joque names it – seems all too perfectly allied to information capitalism through its methods of reverse inference and incremental modulation of revised beliefs, what alternative kind of probabilistic reasoning might be sufficiently radical to trouble its dominance? No mathematical model of contingency will ever fully grasp the profound queerness of the chaos-monde, but some will do better at gathering diverse entangled tendencies. The aim of this paper is to expose some of the limitations of classical probability models which are integral to artificial neural network algorithms. This paper problematizes the epistemological principles that subtend our current reliance on a particular mathematics of plausible belief, and seeks alternative formulations of dependency and belief. I humbly venture into this exploit, in search of different onto-epistemic formalizations of contingency. I argue that dependency and doubt are mischaracterized in conventional formalizations of conditional probability and Bayesian updating. Contingency is simply not adequately captured by the kind of “procedural randomness” we see in current neural network algorithms.6 I offer this as a minor gesture in the midst of massive technocratic forces, as an attempt to pry open the ontological and epistemological cracks or “holes” in Bayesian approaches.7
One alternative approach is quantum probability, where conditional probabilities are differently conceived, and probability is differently formalized.8 Quantum probability (QP) was formulated by John von Neumann in the 1930s as a way of describing the nonclassical behavior of sub-atomic particles in quantum mechanics. There are significant technical and metaphysical differences between classical and quantum probability that might help us rethink the nature of chance and onto-epistemic dependency. My aim here is to show how alternative quantum-like mathematical formalizations of reasoning with uncertainty have recently surfaced in the social sciences. Quantum probability envisions a multi-dimensional state of potentiality where incompatible quantum futures are in superposition, problematizing the Bayesian reliance on calculating joint probability distributions.9
My interest is in the emerging field of quantum social science which interrogates tacit commitments to 19th century logics, ontologies, and socio-political frameworks, while exploring alternative formalizations of social-material processes, including mathematical formalizations of contingency.10 This significant shift towards quantum thinking in the social sciences draws on a century of practical methods for studying the nature of entanglement, locality, co/relationality, de/coherence, and scale, in natural phenomena.11 Applications across the social sciences, including the fields of international relations, philosophy of mind, economics, psychology, and sociology can be found at the quantum social science bootcamp (https://u.osu.edu/quantumbootcamp/) and project Q (https://projectqsydney.com/). These new research programs aim to address the persistent failure of the social sciences to adequately make sense of the complex entanglement of various couplets, such as mind/matter, subject/object, and individual/group, and have come to embrace new kinds of socio-ecological collectivism.12
Increasingly, researchers in cognitive science are exploring quantum cognition to account for the preponderance of human judgments that do not follow classical probability, including violations of the ‘sure thing’ principle of decision making,13 and of order and conjunction expectations, all of which are explained below.14 Quantum probability spreads doubt across a system differently, and captures the onto-epistemic nature of entanglement, de/coherence, and superposition. I tread cautiously here, drawing insight from this research, yet wary of the psychological warrants of cognitive science. My own philosophical framing of this kind of research, drawing on work with Nathalie Sinclair, aligns more closely with that of Karen Barad’s quantum thought, where “agential realism” emphasizes the materiality of contingency, rather than the subjective nature of uncertainty.15 For new materialism, indeterminacy is not reducible to epistemic uncertainty. The quantum paradigm demands a new materialist mathematics of chance and an image of reasoning that reckons with the precarity of onto-epistemic entanglement, addressing the material force of ‘objective’ chance and not simply the subjective modulation of doubt.
Classical probability
In 1933 Andrey Kolmogorov (1903-1987) offered an axiomatization of classical probability.16 Here I discuss the three main tenets of that systematization: (1) probability is a non-negative real number, (2) the probability of the entire sample space is 1, and (3) the probability of the union of a countably infinite set of mutually exclusive events is equivalent to the sum of the probabilities of each of the events. The first axiom simply limits the values of the function to non-negative real numbers, and leaves as meaningless the idea of a negative probability or a complex-valued probability. The second axiom points out that a probability distribution measures exhaustively across a set of possible outcomes, and normalizes that total value, and hence the total probability is 1. The third axiom pertains to the ways in which probability assumes individuated and mutually exclusive events, with many important consequences, such as the commutativity of probability judgments. This commutativity is crucial for classical probability, as are rules for calculating conditional probabilities like that of Bayes, which help formalize and manage dependency rather than independence.
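Stated in standard modern notation (added here for reference, and close in spirit to Kolmogorov’s 1933 formulation), the three axioms read:

```latex
% Kolmogorov's axioms for a probability measure P on a sample space \Omega
% (standard modern notation, added here for reference)
P(A) \geq 0 \quad \text{for every event } A,
\qquad\qquad % (1) non-negativity
P(\Omega) = 1,
\qquad\qquad % (2) normalization
P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i)
\quad \text{for mutually exclusive } A_1, A_2, \ldots % (3) countable additivity
```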
The rules of classical probability may serve speculative finance and machine learning algorithms fairly well, but they are woefully inadequate at describing human reasoning about uncertain matters. In the 1980s, various experiments showed how human judgment is at odds with the principles of classical probability, and that human reasoning rests on other kinds of inference.17 In particular, reasoning probabilistically about dependent and conditional events does not typically align with the Kolmogorov axioms of probability. Cognitive science research reveals how order effects, conjunction bias, and sure-thing inclinations overrule the classical rules for plausible reasoning in situated decision making.18 Alternative models of reasoning such as query theory, fuzzy trace theory and integration theory are all attempts to alter the rules of classical probability (CP) in order to better describe the way that people reason about contingency.19 Whilst CP sees only a fallacy when humans deviate from the rules of probabilistic reasoning, these other approaches aim to comprehend a different metaphysical and mathematical entanglement of belief and materiality.
Alexander Wendt highlights three aspects of decision-making that arise from the work of Kahneman and colleagues in psychology, to show how people reason about conditional probabilities.20 From the perspective of classical probability, these habits are all considered examples of fallacious reasoning,21 but I want to emphasize here that in each case there is also an ontological insight buried in the apparent contradiction. It is this ontological insight that betrays the particular metaphysics of chance packaged in our conventions about probability. The first issue involves the “order effect” in which the value or weight that people ascribe to two events depends on the order in which the two events are presented to them, contradicting the commutativity of probability (i.e. P(A∧B) ≠ P(B∧A)). In other words, the probability that Joan is an arsonist (A) and a banker (B) is theoretically equivalent to the probability that Joan is a banker (B) and an arsonist (A); however, people regularly evaluate these conjunction probabilities differently, because order matters to their judgement. Rather than dismiss this as simply subjectivist, and explain it in terms of prior knowledge impacting later judgements, it seems worth considering how event conjunctions are inherently spatio-temporal and entail more-than-human complex relational dependencies. Moreover, the observer is part of the event rather than detached or outside of it. The order error underscores how classical probability does not adequately capture the onto-epistemic impact of measurement or observation, despite being framed in terms of subjectivism. Non-commutative reasoning is often at work because people sense some intangible causality at work in the passing of time. It seems that people reason about events and mutually reciprocal processes differently because they are more conscious of being implicated in the state of affairs. If entanglement is such that extracting A from B ontologically is impossible, it may be that the commutative law does not adequately capture plausible reasoning.
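For reference, the classical identity that such order-sensitive judgments violate can be written out explicitly (standard notation, added here):

```latex
% In classical probability the conjunction is order-invariant:
P(A \wedge B) \;=\; P(A)\,P(B \mid A) \;=\; P(B)\,P(A \mid B) \;=\; P(B \wedge A)
% so a judgment that depends on which of A or B is considered first
% has no place in the Kolmogorov framework.
```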
The second aspect of decision making is the “conjunction error” in which people think a given event A is less likely than the combination of two events A and B. We tend to assume that the likelihood of Lucy being a feminist and a cyclist (A∧B) is higher than her being a feminist (A). We wrongly correlate feminism with cycling, which would be flawed probabilistic reasoning in most contexts. But it’s important to see how this fallacy is also motivated, not entirely wrongly, by how multiplicity of character is conceived as an intensive rather than merely extensive quality. The distinction between intensive and extensive has a long history in Western philosophy, and is developed in Deleuze, as he explores the power of thought in a monist metaphysics. The term extensive pertains to extended substance while the term intensive refers to a more elastic thinking substance. Extended substance can be cut up and separated, while intensive substance is simply folded into a more complex multiplicity. In our case, with Lucy and her distinct qualities, multiple affiliations are folded together, and intersectionality is not just a list of distinct attributes held together like an unordered set. The intensive dependency between attributes may involve undulating folds or multiplicative factors that aren’t well captured in linear or logarithmic growth. Indeed, intensive continuous processes of becoming are forever out of reach of incremental methods.22 The conjunction error indicates that dependency relations between multiple affiliations cannot be captured by classical set theoretic assumptions – they cannot be treated as isolated and overlapping sets in a Venn diagram or node-edge graph.
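Again for reference, the classical inequality that this judgment pattern violates is (standard notation, added here):

```latex
% A conjunction can never be more probable than either of its conjuncts:
P(A \wedge B) \;=\; P(A)\,P(B \mid A) \;\leq\; P(A),
% since P(B|A) <= 1; judging "feminist and cyclist" as more likely than
% "feminist" therefore breaks the classical calculus.
```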
Finally, the third aspect of decision making is called the “disjunction error” and concerns the way people reason about the consistency of belief in hypothetical contexts. This is known as the “sure thing” principle, described by Leonard Savage in 1954 in these terms: If under state of the world X, people prefer action A over action B and in state of the world ~X people prefer action A over B, then if the state of the world is unknown, a person should always prefer action A over B.23 According to classical probability, the union of X and ~X is equivalent to the total universe of possible worlds and it should be a ‘sure thing’ to persist in preferring action A over B. However, persistent evidence suggests that humans do not follow the principle in practice. This error suggests that belief and doubt are more elastic dispositions, squeezing their way out of the polarized binary of past convictions. Essentially, it suggests that most people do not subscribe to the totalizing gesture whereby X∪~X = U and, equivalently, do not follow the law of semantic bivalence when dealing with an imagined future action. This contradicts the basic axioms of classical probability in various ways, including the assumption that the probability of the entire sample space is 1. It seems that people are inclined to allow for a third space in future conditions and thereby create an opening where their preference or belief might alter. People seem to hold out for some other possible world not yet accommodated by the two claims, as though the force of the unknown was stronger than their preference or belief in the two given known situations.
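Savage’s principle can be stated schematically as follows (the notation is added here and is only a sketch of his formulation):

```latex
% Savage's sure-thing principle (schematic): if A is preferred to B when X
% obtains, and A is preferred to B when X does not obtain, then A should be
% preferred to B when it is unknown whether X obtains, since
% X \cup \neg X = U exhausts the possible states.
\big(A \succ_{X} B\big) \;\wedge\; \big(A \succ_{\neg X} B\big)
\;\Longrightarrow\; A \succ B
```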
Quantum Probability
Quantum probability recasts the onto-epistemic environment in terms of probabilistic outcomes that are literally mixed together in varying intensity.24 This approach emphasizes the non-classical nature of reasoning, where the virtual space of potentiality is characterized in terms of superposition, a holistic entanglement of “what if not?” futures. Classical probability characterizes this hypothetical future as a set of choices separated-out into paths, each with their own probabilistic weight, and then aggregated, like a bag of color marbles.25 The quantum superposition state, on the other hand, is not an average state obtained from an aggregate of possibilities, but an intensive mixing of futures with non-futures, honoring the spread of indeterminacy across the potential and the actual. Moreover, the non-commutative nature of quantum measurement means that disambiguating a superposition state requires new methods, rather than an act of revelation of previously determined prior states, thereby contesting the all too neat separation of priors and posteriors as found in classical Bayesian methods.26 The quantum world exhibits complex modes of relational dependency that are only poorly captured by conventional conditional probability formulations.27
I am interested in how this approach offers both a mathematics and a metaphysics, linking measures of subjective uncertainty to the existence of objective indeterminacy in ways that open up new onto-epistemologies, better attending to the situatedness of reasoning, and the limits of probabilistic intelligibility. Knowledge states in QP involve a superpositioning of different possible outcomes (hypotheses, decisions, actions) because such states are in fact dynamic mixtures of potentiality, which are not consistent with any one action or decision in the conventional sense.28 While classical probability represents knowledge as a discrete set of propositions correlated to a discrete set of outcomes, quantum probability can capture the ambivalence inherent to knowledge, pointing to the underlying indeterminacy of phenomena, and tapping into the metaphysics of quantum entanglement. On this account, quantum superposition is not merely Bayesian uncertainty, because entanglement points to a very different kind of dependency relation. The quantum ontology shifts the focus from updating subjective belief, as is the case for Bayesian epistemology, to a more complex mixture of subjective and objective factors.
In classical probability we can always form a meaningful conjunction of two propositions A and B; if we are able to determine the truth value of A and B independently, then we are able to determine the truth value of their conjunction. In quantum probability, however, there can be cases where the truth of such a conjunction is fundamentally indeterminate. Quantum metaphysics entails weird dependency relations which conjoin what is conventionally incompatible (a cat that is both dead and alive). Our reasoning about complex events often involves incompatible insights that cannot be cobbled together to create a well-formed probability claim. In such cases, it may be that knowing the truth value of A implies not knowing the truth value of B. We might characterize this logical relation as A → ¬(B ∨ ¬B). Another way of presenting this is the statement: “If A is true at time t1, then B is neither true nor false at time t1”. This logical formulation captures the temporality of any knowledge claim, and the ways in which decision making is never a simple choice between separate options. And it brings home the ways in which ‘unknowns’ are profoundly resistant to capture.
In classical probability, separability and incompatibility are defined in exclusive terms; two events are incompatible if they contradict each other (A and ~A are incompatible). The quantum concept of incompatibility is defined somewhat differently.29 Two events (A and B) are incompatible if there is a conditional entanglement between them that makes it impossible to pose well-formed questions about their conjunction at that time. The expression “well formed” simply means answerable. One of the major potential contributions of QP to social theory pertains to this revised concept of incompatibility. In quantum-like modeling of human judgment, there can be aspects about an event that are perceived as incompatible, that is, we pose a question about different variables for which knowledge is not coherently integrated. Incompatibles can be contemplated in a quantum state, before decoherence occurs and separates out the actions. For quantum cognitive scientists, who are attempting to formalize these complex relations, determining what kinds of questions are incompatible is extremely tricky, and is a crucial lynchpin in any theory of quantum mind. This concept of incompatibility is thus fundamental in any cultural applications of quantum-like models, and disagreements about what constitutes incompatible parameters will lead to very different models.30 But this issue of incompatibles is crucial for showing how quantum probability troubles common Bayesian techniques that routinely assume a joint distribution of any two parameters and routinely treat the act of measurement as a direct and unproblematic application of classical rules. In the quantum context, we can’t assume this: “Quantum superposition is not the same thing as probabilistic averaging over uncertainty.”31
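To see how order and incompatibility are handled formally in quantum probability, here is a minimal sketch, offered as my own illustration rather than any cited author’s model, of two sequential ‘yes/no’ questions posed to a single qubit; the two projectors stand in for incompatible questions, and the order of asking changes the computed probability.

```python
# Minimal sketch (illustrative only) of how quantum probability handles
# order and incompatibility for a single qubit.
import numpy as np

# Projectors for two incompatible yes/no questions:
# "A" asks about the computational basis, "B" about the rotated (+/-) basis.
P_A = np.array([[1, 0], [0, 0]], dtype=complex)      # projector onto |0>
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
P_B = np.outer(plus, plus.conj())                    # projector onto |+>

# A superposition state (not a classical mixture of |0> and |1>).
psi = np.array([np.cos(0.3), np.sin(0.3)], dtype=complex)

def seq_prob(first, second, state):
    """Probability of answering 'yes' to `first` and then 'yes' to `second`."""
    collapsed = first @ state                        # ask the first question
    return float(np.linalg.norm(second @ collapsed) ** 2)

print(seq_prob(P_A, P_B, psi))   # P(A then B)
print(seq_prob(P_B, P_A, psi))   # P(B then A) -- generally different:
# the projectors do not commute, so there is no well-formed joint
# distribution over A and B, unlike in the classical (Kolmogorov) case.
```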
Granted, the vector formalization of quantum states using the Bloch sphere, where incompatibles are conjoined, is still a simplification; the quantum Hilbert space freezes the event in order to grapple with its possible outcomes, and in so doing this approach still fails to really capture the dynamism of the event.32 However, and importantly, CP must freeze the event and treat outcomes as mutually distinct and meaningful within a single flat outcome space. All outcomes are compared within a space of complete knowledge in which being situated in one belief does not curtail one’s ability to determine the truth of another belief. Such an approach does not do justice to uncertainty. Correlational inference is what rules the Bayesian landscape, so that any incompatibles are flattened into a landscape of forced compatibility. See Guido Bacciagaluppi’s 2016 chapter on these issues, for an excellent introduction to the formalities of quantum probability.33
Doubt and paranoia
Louise Amoore states that machine learning algorithms operate in terms of a “teeming plenitude of doubtfulness” because “doubt becomes transformed into a malleable arrangement of weighted probabilities” and is ultimately condensed into a decision.34 She is critical of claims to algorithmic certainty that mask the uncertainty at the heart of probabilistic reasoning, and she demands a post-Cartesian and post-human revisioning of doubt, which distributes doubt across an indeterminate and uncertain ecology. Achille Mbembe also paints a disturbing picture of our current algorithmic capture of doubt, a time that is “far from being one of reason”. Alongside the growing reliance on neural network algorithms and their alliance with capital, he sees a renewed mytho-religious fervour and a simultaneous flourishing of conspiracy theory. The new “digital divine” is seen as “vital, visceral, and energetic” aligned with new forms of populism and faith.35 This delirium simultaneously results in large-scale refusals of any kind of healthy doubt, demanding instead a non-verifiable faith in automated intelligent systems. We see a growing desire for “mysteries and the return of a spirit of crusade … a time of paranoid dispositions …”.36 What emerges, he claims, is the security state which is premised on the insecurity of its virtual hordes, utterly isolated and yet digitally connected, like adjacent foam bubbles, proximal, linked, and yet alone. This state of affairs rests on and furthers the control society, achieved through software systems operating at the granular scale of infra-individual affective collectivity. Viral disinformation speeds across the social links of online platforms, broadcast without apparent limit, where belief accrues into the dogma of anointed opinion.
I wonder if these insights into the lack of healthy doubt, and its link to our socio-technical conditions, are also a reflection of the very probability models built into our current algorithms. Perhaps classical probability models reflect a kind of paranoid disposition, presuming pre-given masked a priori conditions yet to be unveiled, while quantum probability describes a multiple world where cognition is more holistically bound up with entangled localities. Perhaps neural network models built on CP actually rob us of all healthy doubt because they never actually encounter an incompatible. Doubt has no real metaphysical bite in this space – instead, epistemic wandering in the hypothesis space lends itself to a flattened combinatoric subjectivity.37 In quantum social science, disambiguating a superposition state is an inventive process rather than an act of revelation of previously hidden states, and doing so may reveal incompatibles which would require a new kind of risky diplomacy to navigate. The virtual space of superposition does not consist of a set of choices separated-out (and then mixed up like a bag of colour marbles), nor is it the same thing as the set of all possible outcomes (each with their own probabilistic weight), but is rather a space of holistic entanglement across radical difference. Quantum social science is still debating just how this differently distributed doubt might better characterize the credibility of any given truth claim. How might doubt be understood in the midst of entanglement? Can doubt be rescued as a positive mode of engaged reasoning, after being so utterly fragmented by neural network algorithms? Or is pure skepticism and nihilism the only response?38 What kind of mathematics of chance might better reckon with the precarity of entanglement and the fluidity of contingency? Is the deep learning capture of chance our best and worst effort, and likely to be our last? Or might Quantum Social Science offer new and alternative ways of exploring the situated and vertiginous nature of onto-epistemic doubt?
Quantum concerns
Recent attempts to develop “quantum cognition machine learning” underscore the continued investment in this alternative approach to reasoning about uncertainty.39 The arguments in favor of quantum ML concern the usual issues of efficiency and accuracy, but also the general aim of simulating human-like perception judgments.
In the fields of decision-making and QCT [quantum cognition theory], superposition enables a model to consider multiple states at once, similar to how human cognition holds several options in mind before making a decision … Just as human subconscious processing helps weigh options and select the best course of action, hybrid neural networks would cross-reference different models to reduce overconfidence and incorrect AI predictions.40
These developments, along with the continued investment in quantum computing and quantum encryption, suggest that the future of computing may accord with new quantum images of reasoning.41 Ever since Peter Shor showed in 1994 that quantum computing could in principle break standard encryption algorithms, research into the field has been increasingly funded at high levels by both governments and corporations.42 The field, however, remains highly theoretical, in part because the resources needed to sustain quantum de/coherence in a ‘quiet’ environment are exorbitant. Nonetheless, there is growing interest in quantum paradigms and the ways in which they might disrupt computational habits and standards of algorithmic intelligibility. Might this alternative way of calculating contingency open up better ways of understanding doubt and dependency? By the 1980s, quantum science was increasingly framed in terms of information theory, adopting algorithmic methods from computer science.43 A seminal paper by Stephen Wiesner in the 1970s introduced the idea of storing and transmitting two messages by encoding them as “conjugate observables” so that either, but not both, could be observed and decoded. Quantum cryptography has developed since then, introducing new methods into cyber security, where quantum nonlocality and Bell’s inequalities are used to encode secure key distribution. This is computing that plugs into the non-deterministic modes of quantum behavior and allows us to shift away from classical probability into studies of entanglement. In quantum states, there is no definite mutually exclusive condition to be revealed. Given current limits around computability, there is hope that quantum cryptography can encrypt data for longer periods than classical cryptography, offering possibly up to 100 years of protection.
Insofar as this serves the security state, we are hardly better off. In other words, the same logic of hiding secret keys from digital eavesdroppers fuels the research into quantum cryptography, where the compounded nature of uncertainty makes not-knowing a powerful tool to better protect data. Advocates claim that hackers cannot clone or copy quantum information, which means that “qubits” are not knowable in the classical sense. Quantum algorithms entail an encryption opportunity, where degrees of certainty are balanced with an absolute uncertainty (a kind of perfect unknowability). The strategy aligns with principles of information theory and mobilizes the vector space methods of formalized quantum probability, allowing for a kind of teleporting of information. In practice, quantum computing is currently unsustainable at any significant scale, because the tiniest amount of heat or material disturbance can destroy entanglement at the scale of qubits. Additional patches are also needed to address environmental noise effects and to moderate error, but nonetheless there is huge speculative investment in this direction. If cryptography is the one area where quantum computing is being explored with some success, this reminds us that the security industry dominates computer science research, and the guarded exchange of property (intellectual and capital) continues to shape the future of communication technologies.
Another concern is that quantum-like probability models aim to explain non-classical quasi-causal subconscious inference, and thus might seem to affirm an inherent computational propensity of mindedness. To the extent that quantum decision theory affirms a kind of infra-probability, operating at scales beneath human discernment, we are led to wonder whether quantum probability might be used to track the ebb and flow of intensive preconscious inclinations. We risk falling into a crude materialism regarding biophysical causes if we slip all too easily into a conviction that quantum computational configurations are realist expressions of matter. Posthuman ecological perspectives modeled on the premise of ontological indeterminacy, as we find in Quantum Social Science, will have to avoid the reductive naturalization of quantum probability, as though it were ‘one’ with a fundamental physical indeterminacy of nature. According to Parisi, this danger comes from failing to separate “the continuity of biophysical complexity from the discrete character of computational abstraction,”44 whether it be classical or quantum. Ironically, the continuous mathematics of gradient descent contrasts with the mathematics of quantum mechanics. But all of these mathematical formalizations are challenged by the discrete/continuous aporia.45 Calculus is only a rough-hewn and humble incremental tool for studying infinitary variance and the continuous intensive interval. And the very notion of ‘continuity’ has itself been reformulated according to the quantum paradigm shift, so that new ways of “cutting together/apart” capture a new mix of discrete and continuous.46 In order to avoid the traps of an automated reason ruled by viral affect and totalitarian motives, or the crude materialism of an infra-quantum probabilism, we are going to need more critical insight into the dis/advantages of a new quantum social science paradigm.
References
Aaronson, Scott. ‘Why is quantum computing so hard to explain?’ Quanta Magazine. Accessed on July 29, 2021. https://www.quantamagazine.org/why-is-quantum-computing-so-hard-to-explain-20210608/
Aerts, Diederik, Broekaert, Jan, Gabora, Liane and Sandro Sozzo. ‘Quantum structure and human thought’. Behavioral and Brain Sciences, 36 (2013): 274-276.
Amoore, Louise. Cloud ethics: Algorithms and the attributes of ourselves and others. Duke University Press, 2020.
Atmanspacher, Harald. ‘At home in the quantum world’. Behavioral and Brain Sciences, 36 (2013): 276-277.
Bacciagaluppi, Guido. ‘Quantum probability’. In Alan Hajek and Christian Hitchcock (Eds.), The Oxford Handbook of Probability and Philosophy. Oxford University Press, 2016: 545-572.
Bagarello, Fabio. Quantum concepts in the social, ecological, and biological sciences. Cambridge University Press, 2022.
Barad, Karen. Meeting the universe halfway: Quantum Physics and the Entanglement of Matter and Meaning. Duke University Press, 2007.
Behme, Christina. ‘Uncertainty about the value of quantum probability for cognitive modeling’. Behavioral and Brain Sciences, 36 (2013): 279-280.
Bitbol, Michel. ‘Quantum machines as an ecological theory of being in a (natural and human) world’. Paper presented at the Quantum social science boot camp, July 12th, 2021. https://u.osu.edu/quantumbootcamp/speakers/
Busemeyer, Jerome R. and Peter D. Bruza. Quantum models of cognition and decision. Cambridge University Press, 2015.
Busemeyer, Jerome R., Pothos, Emmanuel M., Franco, Riccardo and Jennifer S. Trueblood. ‘A quantum theoretical explanation for probability judgment errors’. Psychological Review, 118, no 2 (2011): 193–218.
Calzati, Stefano and Derrick de Kerckhove. Quantum ecology: Why and how new information technologies will reshape societies. The MIT Press, 2024
Cosmides, Leda and John Tooby. ‘Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty’. Cognition, 58 (1996): 1-73.
Da Silva, Denise Ferreira. ‘Difference without separability’. Accessed on Dec 10, 2024, at https://issuu.com/amilcarpacker/docs/denise_ferreira_da_silva. 2017.
Daston, Lorraine. Classical probability in the Enlightenment. Princeton University Press, 1988.
Daston, Lorraine. Rules: A short history of what we live by. Princeton University Press, 2022.
de Castro, Alexandre. ‘On the quantum principles of cognitive learning’. Behavioral and Brain Sciences, 36 (2013): 281-282.
de Freitas, Elizabeth. ‘The mathematical continuum: A haunting problematic’. The mathematical enthusiast, 15, no 1, (2018). Available at https://scholarworks.umt.edu/tme/vol15/iss1/9
de Freitas, Elizabeth. ‘Deleuze’s Hume: Machine speculation and the infinitude of useless hypotheses’. Paper presented at the Deleuze & Guattari Studies Conference, July 5-7, 2021. Video available at https://www.accelevents.com/e/dgsconference
de Freitas, Elizabeth. ‘Mathematics in the middle: The relationship between measurement and metamorphic matter’. Matter: Journal of New Materialist Research (2021). Available at: https://revistes.ub.edu/index.php/matter/article/view/35888
de Freitas, Elizabeth. ‘The new empiricism of the fractal fold: Rethinking monadology in digital times’. Cultural Studies ↔ Critical Methodologies, 16, no 2 (2017): 224–234.
de Freitas, Elizabeth and Nathalie Sinclair. ‘The quantum mind: Alternative ways of reasoning with uncertainty’. Canadian Journal of Science, Mathematics and Technology Education, 18 (2018): 271-283.
Deleuze, Gilles. Empiricism and subjectivity: An essay on Hume’s theory of human nature. (Trans. C.V. Boundas). Columbia University Press, 1981.
Deleuze, Gilles. Difference and repetition (Trans. Paul Patton). Columbia University Press, 1994.
Fischbein, Efraim and Ditza Schwartz. ‘The evolution with age of probabilistic, intuitively based misconceptions’. Journal for Research in Mathematics Education, 28, no 1 (1997): 96-105.
Franco, Riccardo. ‘ The conjunction fallacy and interference effects’. Journal of Mathematical Psychology, 53, (2009): 415–22.
Galloway, Alexander. The Uncomputable: Play and politics in the long digital age. Verso Books, 2021.
Gelman, Andrew and Yuling Yao. ‘Holes in Bayesian Statistics’. Journal of Physics G: Nuclear and Particle Physics (2020). arXiv:2002.06467.
Ghose, Shohini. ‘Beyond the binary: Building a quantum future’. Morals and Machines, 1 (2021). https://www.nomos-elibrary.de/10.5771/2747-5174-2021-1-44.pdf
Glissant, Édouard. Poetics of relation. (Trans. Betsy Wing). University of Michigan Press, 1997.
Grace, Randolph C. and Simon Kemp. ‘Quantum probability and comparative cognition’. Behavioral and Brain Sciences, 36 (2013): 287.
Hacking, Ian. The Emergence of Probability. Cambridge University Press, 1975.
Hampton, James A. ‘Quantum probability and conceptual combination in conjunctions’. Behavioral and Brain Sciences, 36 (2013): 290-291.
Haven, Emmanuel and Andrei Khrennikov. Quantum Social Science. Cambridge University Press, 2013.
Hayles, N. Katherine. Unthought: The power of the cognitive nonconscious. University of Chicago Press, 2017.
Joque, Justin. Revolutionary mathematics: Artificial intelligence, statistics, and the logic of capitalism. Verso, 2022.
Kahneman, Daniel, Slovic, Paul and Amos Tversky. Judgment under uncertainty: Heuristics and biases. Cambridge University Press, 1982.
Keeling, Kara. Queer times, Black futures. Duke University Press, 2019.
Khrennikov, Andrei. ‘Quantum-like modeling: Cognition, decision making, and rationality’. Mind & Society, 19 (2020): 307–310.
Maksimovic, Milan and Ivan S. Maksimovic. ‘Quantum cognitive neural networks: Assessing confidence and uncertainty with human decision-making simulations’. Big Data and Cognitive Computing, 9, no 1 (2024): 12. DOI:10.3390/bdcc9010012.
Mbembe, Achille. Necropolitics. Duke University Press, 2019.
Musaelian, Kharen, Abanov, Alexander, Berger, Jeffrey, Candelori, Luca, Kirakosyan, Vahagn, Samson, Ryan, Smith, James and Dario Villani. ‘Quantum cognition machine learning: AI needs quantum’. 2024. https://indico.qtml2024.org/event/1/contributions/106/attachments/107/111/QCML.pdf
Parisi, Luciana. ‘Computational logic and ecological rationality’. In E. Hörl and James Burton (Eds.), General ecology: The new ecological paradigm. Bloomsbury, 2017: 75-100.
Pothos, Emmanuel M. and Jerome R. Busemeyer. ‘Can quantum probability provide a new direction for cognitive modeling?’. Behavioral and Brain Sciences, 36 (2013): 255-327.
Pourciau, Sarah. ‘On the digital ocean’. Critical Inquiry, 48, no 2 (2022): 233-261.
Savage, Leonard. The foundations of statistics. Dover Publications, 1954.
Tentori, Katya and Vincenzo Crupi. ‘Why quantum probability does not explain the conjunction fallacy’. Behavioral and Brain Sciences, 36 (2013): 308-310.
Völker, Susanne. ‘Cutting together/apart: Impulses from Karen Barad’s feminist materialism for a relational sociology’. In U. Tikvah Kissmann and Joost Van Loon (Eds.), Discussing New Materialism: Methodological implications for the study of materialities. Springer Publishing, 2019: 87-106.
Von Baeyer, Hans Christian. Qbism: The future of quantum physics. Harvard University Press. 2016.
Wang, Zheng, Busemeyer, Jerome R., Atmanspacher, Harald and Emmanuel M. Pothos. ‘The potential for using quantum theory to build models of cognition’. Topics in Cognitive Science, 5, no 4 (2013): 672-688.
Warren, Calvin. ‘The catastrophe: Black feminist poethics, anti(form), and mathematical nihilism’. qui parle, 28, no 2 (2019): 353-372.
Wendt, Alexander. Quantum mind: Unifying physical and social ontology. Cambridge University Press, 2015.
1. Daston, Lorraine, Classical probability in the Enlightenment (Princeton University Press, 1988).
2. Hacking, Ian, The Emergence of Probability (Cambridge University Press, 1975).
3. Joque, Justin, Revolutionary mathematics: Artificial intelligence, statistics, and the logic of capitalism (New York: Verso, 2022), p. 152.
4. Parisi, Luciana, ‘Computational logic and ecological rationality’, in Erich Hörl and James Burton (Eds.), General ecology: The new ecological paradigm (London: Bloomsbury, 2017), p. 77.
5. Daston, Lorraine, Rules: A short history of what we live by (Princeton University Press, 2022).
6. Galloway, Alexander, The Uncomputable: Play and politics in the long digital age (New York: Verso Books, 2021).
7. Gelman, A. and Yao, Y., ‘Holes in Bayesian Statistics’, 2020.
8. See Busemeyer and Bruza (2015) for an overview. QP can be generalized to include CP as a case. There is a surfeit of divergent interpretations of quantum probability, and a complex mesh of arguments for why one interpretation is better than another. I have tried to avoid wading into these debates (which are well beyond my expertise), and here I merge a few divergent perspectives to elaborate QP as an onto-epistemic alternative to CP, following some of my previous work. See Bacciagaluppi (2016) for an introduction to QP.
9. Khrennikov, A., ‘Quantum-like modeling: Cognition, decision making, and rationality’, Mind & Society, 19 (2020): 307–310.
10. Haven & Khrennikov, 2013; Wendt, 2015, 2021.
11. Aerts et al., 2013; de Castro, 2013.
12. Bagarello, 2019; Pothos & Busemeyer, 2013; Wang et al., 2013.
13. Pothos & Busemeyer, 2009.
14. Franco, 2009; Busemeyer et al., 2011.
15. de Freitas, 2018.
16. I follow the tradition of using the term “classical probability” to refer to the rules of probability derived in the 18th and 19th century.
17. Kahneman et al., 1982.
18. Wendt, 2015.
19. Cosmides & Tooby (1996) show that rewording probability queries according to frequentist perspectives, rather than subjective Bayesian frames, can resolve some of these issues.
20. Wendt, 2015.
21. Fischbein & Schwartz, 1997.
22. de Freitas, 2018; Deleuze, 1994.
23. Savage, 1954.
24. Atmanspacher, 2011; de Freitas & Sinclair, 2018; Grace & Kemp, 2013.
25. Bitbol, 2021.
26. Gelman & Yao, 2020.
27. See Hans Christian Von Baeyer (2016) for attempts to develop a quantum Bayesianism.
28. Pothos & Busemeyer, 2013, p. 256.
29. Notably, Niels Bohr borrowed the notion of incompatibility from William James (Pothos & Busemeyer, 2013).
30. Hampton, 2013.
31. Gelman & Yao, 2020, p. 7.
32. Behme, 2013.
33. Bacciagaluppi, 2016.
34. Amoore, 2019, pp. 2-3.
35. Mbembe, 2019, p. 51.
36. Mbembe, 2019, p. 41.
37. de Freitas, 2018.
38. Warren, 2019.
39. Musaelian et al., 2024.
40. Maksimovic & Maksimovic, 2024, p. 15.
41. Calzati & de Kerckhove, 2024.
42. Aaronson, 2021.
43. Ghose, 2021.
44. Parisi, 2017, p. 81.
45. de Freitas, 2015.
46. Völker, 2019.