Poetics of Machinic Opacity: Glissant and Bayes

Article Information

  • Author(s): Conrad Moriarty-Cole
  • Affiliation(s): Bath Spa University
  • Publication Date: July 2025
  • Issue: 10
  • Citation: Conrad Moriarty-Cole. “Poetics of Machinic Opacity: Glissant and Bayes.” Computational Culture 10 (July 2025). http://computationalculture.net/poetics-of-machinic-opacity/.


Abstract

This article explores the concept of opacity, examining its multifaceted roles in conditioning knowledge and cultural existence. It begins by addressing technical opacity in machine learning, and the Bayesian techniques used to quantify uncertainty. While these methods can reduce opacity technically, they fail to address the broader, structural “machinic opacity” that emerges from the interaction of computational reason with social processes at scale. This machinic opacity represents a phenomenological dimension beyond human cognitive and experiential grasp, compounded by the complexity of large-scale machine ecologies. The article then transitions to a cultural critique, drawing on Édouard Glissant’s pluralist concept of opacity and his poetics of Relation. Glissant's framework offers an affirmative response to the alienating potential of machinic opacity, advocating for the “right to opacity” as a means to preserve ontological diversity and foster cultural creation. By juxtaposing Glissant’s ideas with Bayesian epistemology, the article argues for a nonreductive approach to understanding machinic opacity, emphasising the generative role of nonknowledge in cultural processes. Ultimately, the article proposes that embracing machinic opacity through Glissant’s poetics of Relation opens space for a cultural response to the challenges posed by computational technologies and machinic opacity. This approach not only counters the colonial logic of transparency but also highlights the creative potential inherent in the interplay between human and machinic processes despite their mutual opacity. The article concludes by advocating for a cultural poetics that recognises and utilises the productive tensions between technical and cultural opacities, fostering a deeper engagement with the complexities of contemporary computational culture.


The following is an exploration of the concept of opacity. While opacity operates in various modes with significant differences, these modes share a commonality: in each instance, opacity is a form of nonknowledge conditioning knowledge, and consequently functions as an active component in ways of being and knowing. Tracing this will entail a consideration of the technical concept of machine learning as a “black box” and of how the quantification of uncertainty in Bayesian techniques can reduce opacity. Conceived in this manner, opacity is of a technically limited scope, i.e. an opacity that is theoretically resolvable by technical means.
It will be argued, however, that there is a broader, structural “machinic opacity” occluding the creative processes of social activity that generate the social-historical field. This machinic opacity emerges from the structural dynamics of computational reason interacting in the world (society) at scale, and it is significantly compounded by the aforementioned technical opacities. The creative capacity that has been built into machines that learn is a metabolization of culture that produces a machinic dimension of the social imaginary. This machinic imaginary exceeds the human cognitive and phenomenological capacities to intuit and understand the full complexity of machinic processes. The term machinic opacity designates a phenomenological dimension of social existence that is beyond the horizon of human capacity to grasp through experience. Even with mediation there remains an excess: nonknowledge contained in the process of social-historical becoming, inflected and infected by computational reason. This hard epistemological boundary of machinic opacity differentiates it from the merely technical problem of “black-boxed” models and the technical solutions that fall within the category of XAI (Explainable AI, which aims to build models whose outputs can be more readily traced and understood, and to analyse the automated decision-making process, thus providing greater transparency and accountability). The nonknowledge of machinic opacity is the excess that remains even after the use of technical interpretation models, decomposition techniques that break down the elements of a given model, or techniques designed to translate social complexity into human-readable form. Likewise, while Bayesian approaches applied within machine learning can reduce opacity, they do so in a technical sense only and are therefore inadequate in responding to the wider social-existential problem of machinic opacity.
This article explores the challenge machinic opacity poses to culture, considering how this challenge might be embraced and overcome. Searching for an adequate response to the problem of machinic opacity, the second part of this article will consider Édouard Glissant’s pluralist concept of opacity and the broader relational ontology it supports. Through Glissant, this third operative mode of opacity is introduced as an affirmation of ontological diversity within “Totality”,1 which contains a framework for conceptualising machinic opacity as a condition of possibility for cultural creation. In each of the three conceptualisations of opacity considered in this article, uncertainty is an unavoidable component of knowledge (of a technical problem; of the world; of difference). Uncertainty is nonknowledge: the condition of possibility of knowledge. This fundamental role of uncertainty is a core insight of Bayesian epistemology. However, by rendering nonknowledge as a technical problem, the Bayesian quantification of uncertainty replicates the core problem of the imposition of transparency, which Glissant addresses by demanding the “right to opacity” against a colonial epistemology that would reduce all difference. As an alternative onto-epistemological model to Bayes, the conceptualisation of opacity in Glissant provides both a way to theorise machinic opacity in nonreductive cultural terms and an affirmative response to the otherwise alienating potential of machinic opacity. In turning to Glissant we can radicalise the core insight of Bayesian epistemology’s embrace of uncertainty while avoiding the colonial logic of seeking to render the world transparent.

Machinic opacity

The history of the development of the capacity for machines to learn is a history of the artificial creation of a novel mode of existence: the human creation of the experience of an ‘other’, which in turn rebounds upon and changes the modes of existence of the humans who participate in societies reordered by machines. With machine learning, humans have inscribed into the inert matter of silicon circuitry aspects of their psycho-biological capacity to invent and self-invent. Yet endowing machines with this creative capacity for invention does not mean that those machines replicate a human form of creativity (although they may bear derivative cultural characteristics of those who programmed them). The creative affordances of computers emerge through not only a transference but a transformation of the human capacity for creative expression in the social activity that produces the world around us (i.e. culture, or the social imaginary). By inscribing this capacity into inert matter, the capacity for creation has been reconfigured in a new form: computational reason. When a learning machine maps a problem space it distinguishes a pattern in the data. Let us call this pattern recognition a process of ‘creative patterning’: the patterns generated in the activity of learning machines serve as the material by which the world comes to be articulated by those non-human, machinic processes in a non-trivial sense. The generative and interactive capacities of contemporary computational systems are a creative articulation of a world within and alongside the human worlds of computational culture.2
Yet such computational processes are well documented as being “black boxes”,3 and the need for XAI is a major topic of research in the field.4 Let us call this “technical opacity”, insofar as the opacity is a technical problem that can be resolved through technical means. However, despite efforts at XAI, there remains an opacity of these computational processes that has not been resolved, and there is a strong case to be made that the problem will never be resolved entirely: even if individual models are made fully explainable (itself a dubious claim), at the macro-social level of large-scale machinic ecologies the complexity of interactions and externalities is highly unpredictable.5 Let us call this kernel of persistent unexplainability “machinic opacity” where it becomes a systemic effect imbued in the computational infrastructures that condition, and increasingly replace, the social activity of humans.
This machinic opacity operates differently at scale. A phenomenological and cognitive opacity operates at the macro-level of large-scale machine ecologies, in which emergent dynamics can be highly unpredictable. The layering of algorithms-upon-algorithms, systems-upon-systems, leads to complex feedback loops that are extremely difficult, if not impossible, to understand. Where computational automation replaces human decision-making and other core social activities—especially where this involves machines with generative capabilities—there is a reconfiguration of the conditions for the social-historical creation of society. This is the machinic dimension of the social imaginary (culture). It is the field of activity of machinic opacity in its fullest self-expression, such that it produces an intra-cultural opacity: an unintelligible dimension of the pathos and background structure of social life. Devoid of strategies to actively engage with this new social-historical condition, insofar as cultural life becomes computational (a common feature of any local culture as far as it submits itself to the exigencies of computational regimes), this machinic creativity has the capacity to produce an existential crisis in relation to history as a product of human agency.6
While different in kind, the opaque macro-scale machinic processes described above are emergent from operations and forces at play at the micro-scale, and thus require further examination. Significantly, driving the latter are processes of machinic signification that are obscure and uninterpretable by humans,7 even those with the requisite technical knowledge and skills.8 As Jenna Burrell succinctly phrases it, this form of opacity “stems from the mismatch between mathematical optimization in high-dimensionality characteristic of machine learning and the demands of human-scale reasoning and styles of semantic interpretation.”9 An example can be seen in an experiment by Szegedy et al., in which they found that the properties an image classifier learnt were uninterpretable and counter-intuitive.10 By making imperceptible non-random perturbations to an image, the researchers caused a convolutional neural network (CNN) to arbitrarily change its prediction, i.e. it incorrectly classified an image that appeared to the human eye to be identical to an image it had previously classified correctly. This could have been an extreme case of overfitting were it not for the fact that the model was able to generalise well. Instead, what these experimental findings imply is that certain highly abstracted micro-features, captured in the semantic properties of individual units within the network, can be indiscernible to humans yet important enough to change a classification. How this potentially plays out at scale is a core property of machinic opacity.
Responding to this special issue’s theme of Bayesian knowledge, an evaluation of the effectiveness of Bayesian techniques designed to reduce opacity in machine learning will highlight the difference between opacity in the technical sense found in the computer science literature and other socio-cultural forms of opacity that become operative in spite of, or even because of, attempts to reduce technical opacity. Firstly, in machine learning there is the problem of uncertainty quantification,11 which comes in two forms: aleatory uncertainty (noise in the data, introduced in experiment repetition, etc.) and epistemic uncertainty (the model, the prior belief informing model choice, the weight parameters, etc.). Bayesian methods, discussed here, provide a framework for explicitly modelling uncertainty. In contrast to frequentist or classical machine learning approaches, which treat parameters as deterministic quantities, Bayesian Neural Networks (BNNs) treat parameters as random variables,12 and use probability distributions over model weights rather than a single point estimate for each weight. This means that each weight has many possible values, with some being more likely than others.13 In other words, instead of a single prediction as output, BNNs provide a range of possible outcomes with associated probabilities. This allows for a more accurate understanding of the uncertainty associated with a model’s predictions, making it easier to assess the reliability of the model’s output.14 By accounting for model uncertainty, BNNs can technically enable more efficient decision-making by reducing the risk of erroneous predictions.15 This provides more transparency regarding the model because it allows the assessment of the model’s confidence in its predictions. A high-confidence prediction is more likely to be accurate, while a low-confidence prediction may indicate uncertainty or the need for further investigation, thus enabling an informed decision about the trustworthiness of the model. This is beneficial in scenarios where model predictions need to be trusted, such as in healthcare for clinical decision support.16 Note, however, that this is a technical point: it does not account for the broader political dimensions of the decisions that predicate the training and deployment of a given model. Consider, for instance, safety-critical applications where an incorrect decision can lead to severe consequences, such as autonomous weapons: BNNs can provide uncertainty estimates that prevent the system from making high-risk decisions in ambiguous situations in which the target may be a civilian, yet the taking of life itself is not questioned.17
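To make this distinction concrete, what follows is a minimal sketch in Python (using only numpy; the toy data and precision parameters are illustrative assumptions, not a reconstruction of any model cited in this article) of the Bayesian treatment of a linear model’s weights: a posterior distribution is maintained over the weights rather than a single point estimate, and both epistemic and aleatory uncertainty are carried into the predictive output.

# A minimal sketch of Bayesian weights: a full posterior distribution
# over the weights is maintained, not a single point estimate.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 1 + 2x plus observation noise (all values illustrative).
X = np.column_stack([np.ones(20), rng.uniform(-1, 1, 20)])  # bias + feature
y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.3, 20)

alpha = 2.0        # prior precision over weights (the "prior belief")
beta = 1 / 0.3**2  # noise precision (aleatory uncertainty, assumed known)

# Conjugate posterior over the weights: N(m_N, S_N) rather than a point.
S_N = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)
m_N = beta * S_N @ X.T @ y

# The predictive distribution carries weight uncertainty (epistemic, via
# S_N) and observation noise (aleatory, via 1/beta) into every prediction.
x_star = np.array([1.0, 0.8])  # a new input, with bias term
pred_mean = float(m_N @ x_star)
pred_var = 1 / beta + float(x_star @ S_N @ x_star)

print(f"prediction: {pred_mean:.2f} +/- {np.sqrt(pred_var):.2f}")

A frequentist point estimate would stop at a single weight vector; here the predictive variance widens wherever the posterior over the weights is uncertain, which is the sense in which a BNN outputs a range of outcomes rather than a single prediction.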
Beyond purely technical issues of interpretability, epistemological and political problems can still be generated by model reduction techniques (such as approximation) that seek to reduce complexity for computational efficiency. Other factors, such as choice of training data and hyperparametric decisions may also have negative consequences by encoding systemic biases. How these BNNs operate within a wider socio-cultural field is deserving of critical analysis to understand their epistemological and political stakes.
By attending to technical opacity regarding model confidence, Bayesian techniques introduce a different kind of opacity, an opacity of difference, as Ramon Amaro argues in The Black Technical Object through a discussion of computer vision models.18 Models of this type often use Bayesian techniques to reduce the complexity of the visual scene in order for it to be legible within the specific formalism of the computational reason of the model.19 These reductions are enacted through learned posterior distributions approximated from samples of real scenarios (training data).20 This is particularly problematic where such models are used for the perception of human facial features because, as Amaro highlights, it reduces the figure of the human face to the data pool of training examples, in which white faces are often overrepresented. Thus non-white faces, and particularly Black faces as Amaro notes, are excluded from the space of perception: the execution of these computational techniques entails “reducing the operation of individuation and the differences amongst the living to no more than an assemblage of contradictions that are negated and subsumed into a higher, more homogenous unity of existence.”21 The reduction of opacity in a technical sense, with a view to making the world legible to a particular subject position at the intersection of white supremacy and computational reason, simultaneously occludes certain experiences; that of Black life, in the case Amaro describes.
Despite the claims of more transparent model confidence mentioned above, systemic biases in training data are not fully resolved when using BNNs. If the training data used for facial recognition tasks over-represents white faces, the resulting BNN might exhibit reduced accuracy or higher uncertainty when encountering non-white faces. While this is because of the biased data used to inform its probabilistic reasoning, the architecture of BNNs also has an effect. In a BNN, due to the computational cost of calculating the posterior distribution in high dimensions, full inference is infeasible. This requires the use of approximations, but, as mentioned above, these can embed biases from the training data. The implications of approximation can be seen in the example of Zafar et al.’s paper, in which they apply Bayesian deep convolutional neural networks (B-DCNNs) to face recognition for surveillance. In this approach, the model’s capacity to learn a distribution over its weights via variational inference approximated by dropout plays a critical role in embedding bias.22 This process is fundamentally shaped by the training data, in this case the AT&T Face Database (formerly called ORL) and the EURECOM Kinect Face Database (EKFD). While these datasets were chosen for their inclusion of variations in pose, illumination, and accessories, there is no discussion of their demographic composition. An analysis of these datasets shows that white faces are overrepresented, while Black faces are underrepresented.23
By using these datasets, the learned weight distribution is less optimally calibrated for characteristics prevalent in the underrepresented populations. Consequently, the model’s performance may degrade for these groups. More specifically, the B-DCNN’s mechanism for quantifying uncertainty—derived from the variance of samples from this learned distribution—may end up reflecting the bias from the dataset. Faces from underrepresented groups, being less ‘familiar’ to the biased model, could be assigned higher uncertainty scores, potentially resulting in their disproportionate classification into a “doubt or rejection class”24 based on the heuristic function, thereby embedding and amplifying existing socio-political biases within the model’s decision-making process.
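The following hypothetical sketch, again in Python with numpy, does not reproduce Zafar et al.’s code; the arrays stand in for real model outputs. It illustrates the general mechanism by which such a rejection heuristic can inherit dataset bias: uncertainty is read off the spread of repeated stochastic forward passes (dropout left active at test time, approximating samples from the weight posterior), and high-uncertainty inputs are routed to a doubt class.

# A hypothetical sketch of a "doubt or rejection class" heuristic:
# T stochastic forward passes yield T softmax vectors, and the variance
# of the winning class across passes is read as uncertainty.
import numpy as np

def decide(mc_probs: np.ndarray, var_threshold: float):
    """mc_probs: (T, n_identities) softmax outputs from T MC-dropout passes."""
    mean_p = mc_probs.mean(axis=0)
    predicted = int(mean_p.argmax())
    uncertainty = mc_probs[:, predicted].var()  # spread across the passes
    if uncertainty > var_threshold:
        return "doubt/rejection class", uncertainty  # defer rather than decide
    return predicted, uncertainty

rng = np.random.default_rng(1)
# Faces well covered by the training data: the passes agree, variance is low.
covered = np.clip(rng.normal([0.85, 0.10, 0.05], 0.02, (20, 3)), 0, 1)
# Under-represented faces: the passes disagree, variance is high, so these
# inputs are disproportionately diverted to the rejection class.
underrep = np.clip(rng.normal([0.40, 0.35, 0.25], 0.20, (20, 3)), 0, 1)

print(decide(covered, var_threshold=0.01))   # accepted with a confident label
print(decide(underrep, var_threshold=0.01))  # routed to doubt/rejection

The point of the sketch is that the threshold itself is neutral; the disparity enters through which inputs the posterior happens to be uncertain about, which is a function of the training data.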
Another example is found in Matin and Valdenegro-Toro’s proposed approach using BNNs for facial emotion detection.25 Biases present in the training data they use, the FER+ dataset (of which 70.5% are white faces),26 are reproduced in the learned approximate posterior distribution over the model’s parameters, which BNN approximations aim to infer from the training data (full inference being intractable).27 During training, if the data is biased, the model will have less empirical evidence with which to precisely determine the optimal values for the parameters relevant to under-represented characteristics. This results in a less certain, or wider, approximate posterior distribution in those parameter regions, effectively embedding the data bias within the model’s internal probabilistic representation of its weights. Consequently, when the model makes predictions or estimates uncertainty for inputs from groups under-represented in the training data, the bias is reflected in the model’s output: performance may be worse, and uncertainty estimates, such as the entropy of predictive probabilities, are likely to be higher. To their credit, this is a concern the authors explicitly acknowledge and link to research highlighting demographic disparities in facial analysis.28
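As a minimal sketch of the uncertainty measure referred to here, the entropy of the mean predictive probabilities can be computed over a set of sampled forward passes. The arrays below are illustrative stand-ins, not outputs of the authors’ model.

# Predictive entropy as an uncertainty estimate: high when the sampled
# passes disagree (flattened mean distribution), low when they agree.
import numpy as np

def predictive_entropy(mc_probs: np.ndarray) -> float:
    """mc_probs: (T, n_emotions) softmax outputs from T sampled passes."""
    mean_p = mc_probs.mean(axis=0)
    return float(-np.sum(mean_p * np.log(mean_p + 1e-12)))

# Agreeing passes -> peaked mean distribution -> low entropy.
well_represented = np.tile([0.9, 0.05, 0.05], (10, 1))
# Disagreeing passes -> flattened mean distribution -> high entropy,
# the pattern the authors associate with under-represented groups.
under_represented = np.vstack(
    [np.roll([0.6, 0.25, 0.15], i % 3) for i in range(10)]
)

print(predictive_entropy(well_represented))   # ~0.39 nats
print(predictive_entropy(under_represented))  # ~1.10 nats (near the maximum)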
Thus, while uncertainty estimates can be useful for identifying potential biases and areas where the model may be less reliable, they can also mask the underlying social dynamics conditioning such disparities by encoding them in other ways, such as through the use of approximations in the above examples. In facial recognition, a BNN like the B-DCNN described above expressing higher uncertainty when encountering non-white faces does not reveal the social-historical forces conditioning the under-representation in training data but rather reproduces it at another layer. Thus, technical opacity is not eradicated; it is instead shifted to a new location, carried by machinic significations (for example, probability distributions), while also contributing to the opacity of difference in the misrecognition of non-white faces of which Amaro speaks.
However, rather than insist on reprogramming the machinic eye so that Black life becomes legible and folded into the epistemic regime of computer vision, Amaro seizes on this moment of misrecognition as an opening for an affirmation of Black existence and psychic generation:

The resultant incompatibility could then be seen as an act that, while bringing forth pre-existing substances of racialisation, can make use of this duress to catalyse future affirmative iterations of the self. Ultimately, what is prioritised…is a compassion for the self as already coherent at the encounter of artificial misrecognition—a self that is continually taking shape, as Blackness has always done, in its exploration of infinite halls of possibility.29

Amaro’s argument is to embrace this erasure as a form of counter-opacity, an affirmative moment of creation. Moving to the transsubjective level of the social-historical, we can find a similar move of affirming the creative generativity of opacity in the work of the Martinican philosopher and poet Édouard Glissant.
The following is an attempt to extract a lesson from Glissant, who was writing in response to a different (but not untethered) social-historical condition, colonialism and post-coloniality in the Antilles. In doing so we will operationalise Glissant’s concept of opacity against the idea that the machinic opacity embedded into structural processes of the social-historical is an intractable problem, while maintaining a recognition that it is a genuine opacity that persists.
As noted, Glissant’s demand for the right to opacity aligns with the argument made by Amaro. However, Glissant’s related ideas around Antillean cultural poetics (creolization), driven by a response to a historical loss—that of the abyss of the transatlantic crossing—also present another dimension of opacity as it operates culturally. Crucially, in each conceptual register—opacity and creolization—nonknowledge plays a central role in the process of cultural creation, though approached from different directions.30 Building on Amaro’s argument, a discussion of Glissant’s writings on opacity and ontology can provide an avenue for thinking about the dynamics of computational culture more broadly. Alongside the affirmation of Black existence, the ontological opacity of the social-historical world as it is articulated in multifaceted and kaleidoscopic perspectives—amongst different human experiences, as well as in the opaque machinic articulation of the world—is the generative force that produces the regeneration and reinvention of the world we call social-historical change. An affirmative engagement with the opacity of the social imaginary to itself is therefore to be embraced over a desire for full transparency and understanding. Glissant arrives at this theory of cultural poetics through sustained examination of the precedent he finds in Antillean and Black diasporic cultures in the Americas that have invented themselves in relation to historical losses and uprootedness.
Can this kind of cultural poetics serve as a framework for a similar expressive cultural invention in response to the intra-cultural opacity of the social imaginary produced by computational technologies? An exposition of Glissant’s ideas, with a focus on the ontological stakes of his notion of opacity, will demonstrate how it functions as an alternative figuration of opacity to the technical opacity previously discussed in relation to machine learning and XAI. This alternative figuration coincides with the social-historical concept of machinic opacity, both functioning at the level of culture. Operationalising these different notions of opacity will then make it possible to answer the above question regarding an affirmative poetics of existence.
To briefly recapitulate the argument: machinic opacity describes an intra-cultural phenomenon whereby a significant enough portion of social activity is directed and implemented by autonomous computational processes to inform cultural (re)production. Machinic opacity is generated by the technical opacity of machine learning and pattern recognition, and by the emergent dynamics of a myriad of computational systems interacting at scale. (There is an ongoing debate about whether machine learning should be differentiated from statistical modelling, on the grounds that the large data sets used in machine learning cannot be considered strong priors as is the case in statistical modelling. Nevertheless, the argument in this article takes the position, made by a range of authors such as Christopher Bishop, that Bayes’ theorem plays a central role in machine learning and pattern recognition today. The simplification of this position, if the reader will allow it, is that machine learning uses probability to represent uncertainty: to go about “quantifying our expression of uncertainty” and to “make precise revisions of uncertainty in the light of new evidence, as well as subsequently to be able to take optimal actions or decisions as a consequence.”31 This is a description of Bayesian probability, but it also functions as a general description of machine learning. While there are explicitly Bayesian approaches to machine learning, we can also model learning as such with Bayesian inference, whereby the inductive bias of a model in machine learning can be understood in Bayesian terms as the prior belief. This unfolding paradigm shift towards the dominance of probabilistic methods in computing is why Justin Joque argues that “Bayes…is the statistical theory of the information age”.32) The ‘patterning’ of machine learning is an abstraction that is illegible to human cognitive and phenomenological capacities without mediation and transformation—here called ‘machinic signification’. While we can translate and infer interpretations of the patterns generated in the optimisation search of a learning machine, or the high-dimensional integration of probabilities in Bayesian approaches, these machinic patterns are distributed, sub-symbolic representations of the mapping of data. As such, they are a form of abstraction that simply does not make sense to human understanding, even with approaches aimed at increasing interpretability, like the Bayesian techniques discussed previously. The implication of this is an increasing opacity built into computational culture due to the ever-increasing role of machine learning in social activity (this being a materialist definition of culture expressed in social activity of all forms—actions and abstractions). As discussed above, techniques for the reduction of the technical opacity of machine learning are nevertheless insufficient and always incomplete: Bayesian approaches can reduce technical opacity in some respects, but some degree of it remains, and, perhaps more importantly, a politically and epistemically structural “machinic opacity” is introduced once these systems begin operating in the world.
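To illustrate the Bayesian logic just described (quantifying uncertainty and revising it precisely in the light of new evidence, with the prior playing the role of an inductive bias), the following is a minimal worked sketch in Python; the Beta prior and the observations are illustrative assumptions, not drawn from any system discussed in this article.

# Bayes' theorem as "precise revisions of uncertainty in the light of
# new evidence": a Beta prior over an unknown rate is updated by each
# observation, the prior acting as the model's inductive bias.
import math

a, b = 2.0, 2.0  # Beta(2, 2) prior: a weak belief that the rate is near 0.5
observations = [1, 1, 0, 1, 1, 1, 0, 1]  # evidence arriving one item at a time

for x in observations:
    a, b = a + x, b + (1 - x)  # conjugate update: the posterior is again a Beta

mean = a / (a + b)  # revised belief about the rate
sd = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))  # remaining uncertainty
print(f"posterior mean {mean:.2f}, sd {sd:.2f}")  # e.g. 0.67, 0.13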
The local machinic opacity of models therefore has a generalised effect of social-historical opacity, considering its widespread and increasingly central role in social reproduction. This applies to many dynamics within culture: culture becomes opaque to itself insofar as there are other modes of articulating the world within culture. The machinic articulation of computational culture is one such example (culture is always plural to a degree and contains a multiplicity of opacities; this argument could be made with a range of examples), yet it is a historical dynamic with a particularly disintegrative effect due to the intensity and speed of emergence of a radically alien computational reason.
There is a tension here regarding the political dimension of culture. If the infrastructural conditions of social-historical creation produce an opacity of a radically machinic character, the latter is a potentially alienating force insofar as the foreclosure of the possibility of participating in the self-creation of society becomes a generalised feature. This is not only a solipsistic alienation of the individual that can somehow be overcome through intersubjectivity, but a deeper structural alienation from culture itself. In the case of the machinic imaginary this structural alienation derives from an infrastructural opacity of the computational systems of everyday life. The creative and generative role of machine learning in partially articulating the social imaginary and generating an internal opacity to culture changes the condition of being in the world for social individuals: it is an occultation of the collective articulation of the world (culture) in a non-trivial sense (i.e. as far as learning machines no longer function as media but rather automate social activity, such as machine-to-machine interactions in the finance industry, they have a creative role in social-historical becoming: they participate in the social imaginary). This is a machinic reading of Glissant’s argument that: “Thinking thought usually amounts to withdrawing into a dimensionless place in which the idea of thought alone persists. But thought in reality spaces itself out into the world. It informs the imaginary of peoples, their varied poetics, which it then transforms, meaning, in them its risk becomes realised.”33 Computational reason is not located in a “dimensionless place”; rather, it “spaces itself out into the world”—computation is the practical realisation of theoretical reason in machines.34 As far as computational reason is thought in the world, it is of the category Glissant is discussing in this quote, and it also “informs the imaginary of peoples…which it then transforms”.35 The replacement of human activity through automation not only reduces opportunities to act (for example, for workers to strike), but, due to the opacity of computational abstractions, the compound effect of uncountable instances of abstraction and machine-directed activity denies the possibility of interpretation of social-historical becoming. This is an estrangement from the transsubjective conditions of the production of the social world; it is a form of hermeneutic alienation, in that it prohibits interpretation of the world (by an individual or collective). As far as even a naïve hermeneutics is a necessary precondition of politics, this appears to present an impasse. This is particularly problematic regarding the degree to which this infrastructural opacity is an extension of pre-existing forms of alienation—capitalist, colonial, patriarchal. And it is an extension in the very real sense of a direct continuation of the logics and practices of domination, exploitation, and extractivism. As has been demonstrated extensively, colonial logics carry through to the present moment in the structuration of societies, both in post-colonial nations and post-imperial metropoles.
The logics and structures of thought that drove and justified colonial practices continue to structure our thinking today,36 and are built into computational technologies. Just as the practices of domination and exploitation in the colonies were subsequently brought home and applied to populations of the imperial centre,37 so a more abstract but no less insidious effect plays out in the abstractions by which computational culture is ordered.38
There is more than one possible interpretative response to the emergence of machinic opacity as a structural condition. One response is to accept the occultation of social-historical creation by machinic opacity as alienating and insurmountable (that is, short of severely limiting or destroying the machinic infrastructures that generate it), thus leading to despair and disempowerment. Another option is to respond to machinic opacity as a creative provocation to cultural reinvention driven by that very condition of nonknowledge. As signposted above, a framework already exists for such a response in the work of Édouard Glissant, specifically his poetics of Relation and the related concept of creolization. His ideas were developed in response to a particular post-colonial context, yet, as he himself suggested, this poetics of Relation, and the process of creolization that it produces under the right conditions (an equality of value between elements), is generalisable to culture beyond the initial particularities of the context from which he developed his ideas (Creole languages, and the creolization of culture in the post-colonial Americas).39 Glissant’s expansion of the poetics of creolization to global cultural dynamics (with all their local variations) demonstrates that it can serve as a cultural model in general. The extent to which the ingression of computational reason into the development of culture and society might lead to a computational creolization of culture is the degree to which humans and machines can relate to one another as equally opaque modes of being in the world—rather than through a relation of domination by machines over the human (the automation of the logics of Imperialism and Capital), or even of the human over machines (the latter mere executions of human will). This equality in Relation, I argue—with the aid of a Glissantian framework—is made affirmative by sustaining or accepting machinic opacity as an invitation to extend a cultural poetics in the face of uncertainty generated by infrastructural conditions, so long as it is coupled with a programme to uncover and assert the opacity of the human from the machinic gaze (to declare the right to opacity as Glissant does, or to sustain the incomputability of Black life as Amaro does). I make this argument in line with Glissant’s own argument regarding the positive function of opacity. However, it is also important to stress that if opacity is not mutual, and transparency is unidirectional, opacity only serves forces of domination rather than providing an opening to a poetics of Relation. Hence the right to opacity is a key element to be pursued politically, alongside the more aesthetic, cultural responses to machinic opacity explored here.
The capture and use of the computational infrastructures of daily life by forces of domination, exploitation, and extractivism developed to serve imperialism and capitalism emphasises why a post-colonial thinker like Glissant is an appropriate source to think with about computational culture. There is a direct line between the social-historical conditions to which Glissant was responding and computational culture. Moreover, the whole globe is impacted by computational technologies. While computational culture may be particularly concentrated in its development within (post-)imperial metropoles, the notion of computational culture(s) obviously also includes its expression within post-colonial contexts such as Glissant’s Martinique in the 21st century. Thus, the following is an attempt to extract a lesson from Glissant, that is, to pursue an epistemological project predicated on nonknowledge and uncertainty (an alternative, nonreductive configuration of the role of uncertainty to that of Bayesian epistemology). This is Glissant’s epistemological gesture, as Joanna Ruth Evans eloquently describes:
In arising after the end, beyond the abyss, and thus beyond the horizon … this epistemology does not require or produce a border against which to expand. … ‘knowledge of Relation of the Whole’…absorbs borders into its structural totality, via a productive acceptance of abyss, unknowability, and oblivion as the conditions of possibility of its existence.40
The machinic opacity operating within the self-creation of the social-historical world marks a genuine loss; it is a genuinely new form of alienation from the forces of historical creation. Glissant is a valuable thinker for such a context because he presents a way to respond to alienation and the loss of historical agency. Glissant is an optimistic and deeply positive thinker, but he is not naïvely so; rather, his affirmation of life is a defiant embrace of uncertainty. Nonknowledge and the abyssal are not negated; they persist and feed a generative cultural poetics of Relation. Likewise, to apply a Glissantian lens to the problem of a machinic occlusion of the (infra)structural conditions of social-historical change is not to negate the alienation this causes but rather to work through that alienation. Just as Amaro embraces the invisibility of Blackness as a means to affirm Black life, a Glissantian poetics of Relation suggests a way to respond to the loss of agency with a counter-poetics vigorous enough to generate a historical force that can contend with the alienating power of computational reason.

Glissantian Opacity and Creolization as a Theory of Culture

While Édouard Glissant’s philosophy primarily reflects upon Caribbean experience, with a clear postcolonial politics at its heart, he builds out from this context to a more general theory of culture (albeit one requiring re-specification in each instance of application) and a critique of globalisation.41 Within his writing we can locate a more pluralist ontology that, I want to suggest here, can be applied to a critique of computational culture, supplementing similar applications of Glissant to a critique of technology, such as those found in the work of Zach Blas and Nelly Y. Pinkrah.42 Glissant’s body of work is large and multifaceted, and cannot be done full justice here. Instead, unfolding his concept of opacity can provide an insight into the pluralist ontological commitments that emerge from his thought.
Opacity for Glissant is a cultural concept. While there is an individual dimension to opacity, within the broader context of Glissant’s work opacity serves to describe the irreducibility of culture and culturally specific subjectivity that each culture produces. A key theme in Glissant’s writing is a critical reflection on globalisation and the shared transcultural world, beyond which he argues for a more radical ontology of Totality. Opacity functions as a political concept43 in this regard because it is the basis upon which, Glissant argues, engagement in genuinely equitable transcultural poetics of Relation (creolization) is possible.44 Opacity is a defence against reduction by a dominating European colonial epistemological regime that demands the renunciation of a genuine alterity that cannot be subsumed within its own logic.45
These ideas, which are most prominent in Poetics of Relation but appear earlier in Caribbean Discourse and echo throughout his work, are very much an extension of his theory of creolization, speaking to the important role that the idea of the nation (the cultural nation) and Caribbean identity play in his politics.46 It is through and by opacity that a nation, or culture, exists, and through opacity that a culture can engage in Relation on an equal footing with other cultures, an engagement essential for a process of positive creolization to take place. Without the right to opacity, there is an imbalance in the valuation of one side over the other (one side is deemed totally visible and flat, naked before the dominating gaze that itself remains shrouded in complexity and depth).47 As far as creolization operates as a general theory of cultural relation, there must be opacity for there to be equality and freedom. The opposite of opacity is the transparency of the universalist colonial project and its accompanying image of Reason, which flattens all difference through reference to the worldview of colonial Europe. In imperial domination the colony is understood in the language of the metropole, including its epistemological framework (race science included). However, through creolization, the language of the metropole—French in the case of Glissant’s Martinique—individuates itself in a process of becoming opaque to the colonial French.48 The cultural, linguistic, and aesthetic process of creolization is opacity as resistance to subjugation, a resistance through a poetics of Relation that reinvents itself. There is another opacity at play here, however: in the Caribbean and the Black Americas this creolization is a way of interpreting a historical opacity of the past; the historical root lost in the abyssal chasm of the transatlantic crossing and colonial practices of cultural suppression and erasure is turned upon itself as a generative force of cultural invention.49
Creolization, as first articulated as a concept by Glissant, takes place in the specific context of the Caribbean, where the indigenous population had been completely wiped out and “new people” forcibly transported to populate “Plantation America”. Thus, the emergence of creole culture, of creole epistemology, differs from other colonial contexts such as, for instance, the rest of the Americas, where indigenous populations and descendants of European settlers both have a continuous lineage of relation to a cultural past that persists in the present.50 The Antillean context is what Glissant calls a nomadic errantry, defined by the traumatic separation from an irrecuperable past. Creolization bootstraps itself through a poetics reorienting the lost and unknowable of its past towards the future. As Glissant writes:
Because the collective memory was too often wiped out, the Caribbean writer must ‘dig deep’ into this memory, following the latent signs that he has picked up in the everyday world. Because the Caribbean consciousness was broken up by sterile barriers, the writer must be able to give expression to all those occasions when these barriers were partially broken. Because the Caribbean notion of time was fixed in the void of an imposed nonhistory, the writer must contribute to re-constituting its tormented chronology: that is, to reveal the creative energy of a dialectic reestablished between nature and culture in the Caribbean.51
Although creolization was developed in relation to language, in becoming a broader cultural concept in Glissant we can also speak of “epistemological creolization”: firstly, because of the epistemological function of language in constructing models and describing the world, but also in terms of the cultural interpretation and reorganisation of western rationality in new creolised contexts. Just as the creolization of French took place through the encounter between colonial plantation owners and enslaved Africans deprived of their connection to their cultural heritage, creolization more broadly takes place within the descriptive and analytical modes by which knowledge is constructed and the world is understood in the encounter with uncertainty and nonknowledge. This brings us back to opacity (nonknowledge): as far as it serves a function within creolization to generate equality and unpredictability,52 it places nonknowledge and uncertainty at the heart of cultural becoming (poetics). This is most starkly and violently apparent in the case of the Caribbean and the rest of the African Americas, hence its origination in Caribbean thought. However, Glissant generalises creolization to all languages and cultures:
…all languages were originally Creoles. It’s just that the speakers of those languages, as soon as they were aware of it, wanted their language to no longer be a Creole but to be specific. The dream of every human community is that its language was given to them by a god, in other words that its language is the language of an exclusive identity. … In other words the phenomenon I am describing is not in any way local: it is a far more general issue. And if I choose the term creolization, I am not referring to my home town or the Antilles or the Caribbean. It is because nothing else gives a better idea of what is happening in the world than this unforeseeable coming together of heterogeneous elements … it is the situation of the world. When I say ‘creolization’, I am not referring at all to the Creole language, I am referring to the phenomenon that has structured the Creole languages, which is not the same thing.53

The “Right to Opacity” as Ontological, Epistemological, and Political Demand

Returning to opacity, what does it mean to argue that opacity is nonknowledge? Glissant develops the concept of opacity as a corrective to theories of difference. Glissant recognises that the theory of difference is invaluable in the fight against reductive thought, such as that of racial hierarchies and notions of superiority. The theory of difference has also been a powerful legal and ethical framework for the politics of recognition. However, difference, he writes, can itself “contrive to reduce things to the Transparent”.54 This is deeply problematic for Glissant, because transparency is not a positive. Rather, transparency is the aim and tool of a colonial epistemology that seeks to “grasp” people and ideas so that they can be understood. Placing the other within an “ideal scale” of its own contrivance (such as racial hierarchy) allows euro-centric colonial identity to compare, judge, and ultimately dominate. Transparency requires a reduction to the same, understanding alterity and existence by reducing them to a point on a scale—as with the earlier example of using probability distributions to reduce the complexity of the visual field so that it becomes computationally legible.55
Instead, Glissant argues, we need to discard scales and end reduction: “Agree not merely to the right to difference but, carrying this further, agree also to the right to opacity.”56 This opacity is the guarantee of a genuine pluralism, and highlights the radical difference of what Glissant calls “Le Divers” (the diverse). Glissant’s concept of opacity is notable in that it can be read as defining an onto-epistemological boundary that cannot be crossed. That is to say, this concept need not merely be read as an epistemological divide between world views, but more radically as describing an ontologically distinct cultural mode of becoming, produced through creolization (as all culture is), that preserves opacities by transforming their unknowability into novel cultural expression. Cultural opacity flows from the singularity of a perspective that articulates a world-for-itself: opacity generates, and is generated by, a cultural poetics, with its own logic and its own aesthetic (conditioning the ongoing possibility of further creolization).
However, opacity does not mean total disengagement, or necessitate a lack of visibility. To quote Glissant again, at length:

The opaque is not the obscure, though it is possible for it to be so and be accepted as such. (Rather,) it is that which cannot be reduced, which is the most perennial guarantee of participation and confluence. We are far from the opacities of Myth or Tragedy, whose obscurity was accompanied by exclusion and whose transparency aimed at ‘grasping’. In this version of understanding the verb to grasp contains the movement of hands that grab their surroundings and bring them back to themselves. A gesture of enclosure if not appropriation.57

The origination of this concept in a critique of colonialism is clear here. Glissant is responding to the real violence of European colonialism grasping people, resources, and land, which was supported and made possible by an epistemic violence of transparency and reduction. Reduction meant the real reduction of human beings, ways of life, of the diverse poetics of existence to accord with a particular euro-centric poetics of existence: the logic of Western rationality and Christian religious aesthetic that articulated a specific image of the world.
Glissant describes the reduction and transparency of Western rationality as a “grasping”, playing on the word comprehension in the original French, from the root comprendre (which connotes both understanding, and also integration). In opposition to grasping, Glissant suggests we: “Let our understanding prefer the gesture of giving-on-and-with that opens finally on totality.”58 In the original French, giving-on-and-with is the neologism donner-avec, the donner having the double meaning of giving in the sense of generosity, but also to “look out toward”.59 Thus, through the gesture of giving-on-and-with we open with a generosity—a giving as a gift of ourselves to totality—as well as position ourselves towards it as onlookers, to see without attempting to grasp and understand as demanded by transparency. This gesture is therefore a non-reductive acknowledgement of opacity in the other, and the foundation of participating in totality.
Glissant’s use of totality might seem jarring here, considering the emphasis on the diverse and the opaque, how can there be a totality? His use of this term is an intentional co-opting and reconceptualising of a core concept in Western thought, which he argues is insufficient. Instead, he proposes a notion of totality that he calls “Relation”. Relation is an open totality evolving upon itself—it is a self-differentiating process that remains radically open. Relation is a non-totalising totality: a paradoxical totality that is never totalising because it is always moving away from itself, becoming more than itself. It is a totality from which we “subtract” the “principle of unity”: “In Relation the whole is not the finality of its parts: for multiplicity in totality is totally (sic) diversity.”60 Totality, for Glissant is total diversity (here the influence of Deleuze and Guattari on Glissant comes through particularly clearly in an only slightly obscured reference to their formulation “pluralism = monism”).61 Totality in its imperialist formulation, as simply totalising universalism, prevents us from reaching the radical pluralism of totality: a transsubjective process of social-historical becoming eternally self-differentiating.
But what has happened to opacity here? How can the disjunctive properties of opacity allow for an opening onto totality? For Glissant, “The right to opacity would not establish an autism; it would be the real foundation of Relation, in freedoms.”62 What does he mean here? Glissant insists that the right to opacity does not undermine Relation at the transsubjective level of the social-historical, and that it is in fact vital to the development of genuine Relation within Totality. Opacity does, however, create disjunction and fragmentation, due to the radical nonknowledge of any given subjective position, and necessarily so: without disjunction and fragmentation, future processes of creolization, that is, cultural emergence at its most creative, cannot exist. The poetics of Relation of which Glissant speaks is not a static matrix but an ongoing process of differentiation and convergence in infinite combinations within Totality.

Cultural Opacity Revisited

Glissant is often invoked in the media theoretical literature as a counter to technologies of transparency.63 For example, Zach Blas engages with Glissant’s notion of opacity as a response to what he describes as the transparency produced by the proliferation of information technologies of surveillance.64 At the same time, however, as we have just seen, there is another form of machinic opacity created by these same technologies that mirrors the affirmative cultural opacity found in Glissant’s work (and which, in a different way, Blas opposes to informatic transparency).
The question then becomes: are these different notions of opacity related? Do they contradict one another? In the case of the opacity of machine learning there is a unidirectional opacity that functions as a form of ‘power over’ by rendering transparent its objects of analysis while remaining itself opaque in a technical and political-economic sense. This unidirectionality is central to its political dimension, and is precisely the transparency of western rationality that Glissant critiques. The demand for opacity is therefore a political demand for a counter-opacity to technologies of transparency, as per Blas’ formulation. However, as far as technical opacity participates in the emergence of a structural opacity, here called machinic opacity, it is an onto-phenomenological analogue of Glissantian opacity. What is obscured by machinic opacity is an image of the world generated by machines participating in social activity: a “machinic imaginary”.65 Even in the process of making transparent, there always remains an opacity due to the “material-semiotic” of that which attempts to make the world transparent, that is, which articulates the world through its own opaque “material-semiotic”.66 This phenomenological and epistemological point speaks to an ontological one: opacity also operates at an intra-cultural level, as well as inter-culturally. Glissant’s demand for the right to opacity is a demand to recognise the full reality of the other’s existence as fundamentally unknowable. This applies equally, however, to the opacity of one’s own culture. The point being that Glissant’s pluralism is generalisable to culture itself. We might radicalise this claim to say that any given culture is already an expression of the open Totality in its internal relation to the diverse.

Ethicopolitical Value of Opacity: Alternatives to Bayes

Thus, what must be learnt from Glissant is that the challenge of intra-cultural opacity presented by machinic opacity does not need to be overcome. Instead, it is through and with this opacity that an ethicopolitics can be constructed. Louise Amoore makes a similar argument in her book Cloud Ethics. Amoore cites Donna Haraway’s notion of “staying with the trouble” as a way of following threads in the dark, and Judith Butler’s argument that one can only ever give a partial account of oneself, because any such account must include the conditions of emergence of oneself, a practical, if not theoretical, impossibility.67 Amoore, Haraway and Butler highlight that “the opaque and unknowable nature of making all kinds of acting subjects is the condition of possibility of having an ethicopolitical life.”68
Add to this Glissant’s insistence on opacity as a political imperative. Opacity is, in his account too, the condition of possibility of freedom: freedom from oppression by the colonial desire for transparency, a desire that persists in the epistemological and material legacies of colonialism in machine learning and its violent applications. Opacity is always present. Even in those machine learning applications that strive to enforce a transparency, and act upon said contrived transparency, opacity remains stubbornly present even as the opaque subject is destroyed. Take, for example, the machine learning system designed by a defence contractor (interviewed by Amoore) with the intention of discerning what is a “legitimate” target. Amoore explains and problematises the proposed capacity to render the drone’s target transparent: the definition of an enemy target as opposed to a school bus is deeply flawed because it is predicated on prior knowledge of school buses drawn from US-centric training data.69 Here the knowledge and actions of the system follow from the inclusion of priors based on data from a local US context, which render a supposed transparency of the subject of violence for the contemporary imperialist war machine. The brute fact of the violence, however, is that the neo-colonial subject cannot be rendered fully transparent, resulting in “accidental” drone strikes on civilians. In this case the opacity is passive, but Glissant holds that the activation of that opacity is where political agency is located.
Thus, while Bayesian inference is understood positively in computer science literature as a technical solution as far as it is a quantification of uncertainty, in the context of an ontological opacity insisted upon by Glissant, the quantification of uncertainty is problematic. The rendering of that which is opaque into a quantity entails a reduction of the difference that produces that opacity, and is an example of the colonial process of rendering everything visible and transparent, which Glissant is arguing against. Moreover, entailed in this reduction are the historical and political biases that are imported as priors into this process of quantifying uncertainty.
A pessimistic reading of this situation might be that, because of the function that opacity has in supporting a new form of the colonial drive for transparency and domination, machinic opacity is a fundamentally alienating force. Yet however alienating the existence of an intra-cultural machinic opacity might be, Glissant has shown that hope is found in the creative capacity of people to respond to alienating conditions. Hence the argument this article has put forward: that we can find a framework for overcoming the social-historical alienation produced by machinic opacity in Glissant’s notion of creolization and the counter-poetics of existence. In Sun of Consciousness (Soleil de la Conscience)70 and Caribbean Discourse (Discours Antillais),71 Glissant uses the term poetics, or imaginary, to describe the cultural construction by which a people become conscious of themselves as a people of a nation or culture. In the case of the Antilles, it is a counter-poetics responding to the alienation created by people being forced into slavery and carried across the Atlantic, removing them from history. This poetics turns alienation into a moment of creation (Fanon describes this as the creation of a new nation and new culture through post-colonial struggle).72 While it is through this creative activity that an opacity is generated, the poetics of Relation is a relation to the transversal opacity of Totality, a continuous attempt to articulate the opaque and unknowable, which itself generates further difference, and therefore opacity. Poetics undoes the relation in forging it; it generates opacity through relation, and vice versa.
While analogous with Bayesian epistemology insofar as it turns uncertainty upon itself, such a poetics, or imaginary, is of a radically different kind: uncertainty and the unknown are not quantified and reduced for instrumental purposes but qualified (as nonknowledge) by being given significance within a new cultural imaginary. For example, the historical opacity produced by colonial violence becomes the seed of cultural creation in Caribbean poetics, as Glissant argues in his philosophical writing and demonstrates in his fiction and poetry.73 Bayesian techniques are reductive insofar as they fold uncertainty into an instrumental regime of calculation that closes the aperture of being in order to capture a “truth” statement to be acted upon pragmatically at any given moment; a poetics of existence, by contrast, is an opening of the aperture of being, because it does not make truth claims but participates in an ongoing aesthetic, cultural process. In Bayesian machine learning, domain experts hold relevant information that can guide the model’s learning process. In a cultural poetics of existence, the domain expert is the poet, the artist, the activist, the community, who transform the experience of existence into aesthetic process to guide the “learning” process of cultural invention.
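The technical sense in which expert priors guide learning can likewise be sketched minimally (a normal–normal conjugate model with invented numbers, not any particular system): when data are scarce, a confident prior pulls the estimate towards the expert’s belief, while a vague prior lets the data dominate.

```python
# A minimal, hypothetical sketch of an expert prior guiding Bayesian
# estimation of an unknown mean (normal-normal conjugate model with
# known noise variance). All numbers are invented for illustration.

def posterior_mean_normal(mu0: float, tau0_sq: float,
                          xbar: float, sigma_sq: float, n: int) -> float:
    """Posterior mean given a prior N(mu0, tau0_sq) and n observations
    with sample mean xbar and known noise variance sigma_sq."""
    precision = 1.0 / tau0_sq + n / sigma_sq
    return (mu0 / tau0_sq + n * xbar / sigma_sq) / precision

# With only n=3 noisy observations averaging 14.0, a confident expert
# prior centred at 10.0 (small tau0_sq) pulls the estimate towards it ...
print(posterior_mean_normal(10.0, 0.5, 14.0, 4.0, 3))   # ~11.1

# ... while a vague prior (large tau0_sq) lets the data dominate.
print(posterior_mean_normal(10.0, 50.0, 14.0, 4.0, 3))  # ~13.9
```

The contrast drawn above lies precisely here: the Bayesian “expert” enters the process as a weighting to be traded off against data, whereas the poet, artist or community qualifies the unknown rather than reducing it to a number.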
One way that this may take place is outlined by Ramon Amaro in The Black Technical Object, where he argues that the moment of misrecognition can become an affirmative process of individuation of Black life, in alignment with the Glissantian argument for a right to opacity. As Amaro writes, learnt probabilistic distributions maintain a field of opacity beyond which “differences of living” are imperceptible to the machinic imaginary. While on the one hand this is a negation of existence, on the other it leaves open a space for the reassertion of subjectivity. As Amaro proposes, building on Fanon’s concept of “auto-actualisation”, embracing the lack of representation as an act of refusal to be represented is a “reclamation of individual freedom” which is found by “prioritising the future”,74 as opposed to the past data prioritised by machine learning (as in the earlier example of approximations drawn from training data prioritising white faces). Amaro is describing a poetics of existence that begins from the assertion of the right to opacity as Glissant states it, and upon which Fanon also insists. This is a poetics of existence that turns machine learning’s negation of Black existence back upon itself, as a process of travelling through “the dynamic creation of subjectivity to (re)imagine new representations and ways of living”.75 With Glissant, this poetics of existence takes place in the process of cultural creation that he calls creolization, thus offering a framework for understanding how machinic opacity might operate at a cultural level to generate aesthetic, cultural invention.

Notes

  1. Édouard Glissant, Poetics of Relation (Poétique de la Relation) (trans. Betsy Wing). University of Michigan Press, 1997 (1990).
  2. Conrad Moriarty-Cole. The Machinic Imaginary: A Post-Phenomenological Exploration of Computational Society (PhD Thesis). Goldsmiths Research Online, 2023, https://research.gold.ac.uk/id/eprint/34323/1/MCCS_thesis_MoriartyColeC_2023.pdf
  3. Frank Pasquale, The Black Box Society: The Secret Algorithms that Control Money and Information. Harvard University Press, 2015.
  4. Saranya A. and Subhashini R., ‘A Systematic Review of Explainable Artificial Intelligence Models and Applications: Recent Developments and Future Trends’, Decision Analytics Journal 7 (1 June 2023): 100230, https://doi.org/10.1016/j.dajour.2023.100230.
  5. For example: algorithmic trading; smart cities; or the presently unforeseeable effects of LLMs being used at scale, especially if integrated into automated processes. In a middle layer we might also include large organisations/corporations in the equation, which are arguably a form of “collective artificial intelligence” (see Kuipers, 2012; Davies, 2024) and are increasingly organised using computational automation. Kuipers, Benjamin. ‘An Existing, Ecologically-Successful Genus Of Collectively Intelligent Artificial Creatures’, Proceedings of Collective Intelligence, ArXiv, 2012, arXiv:1204.4116.
  6. Qua notions of freedom, resistance, political imaginary, etc.
  7. This notion of the monolithic human opposed to the machine is of course fraught, especially in the context of an argument engaging with post-colonial literature, and the racist construction of the category of human as exclusionary of non-whiteness and particularly its anti-Blackness (Wynter, 2003). Nevertheless, for practical reasons a term is needed to delineate what is novel about the historical emergence of computational reason. This distinction is a necessary first step in this argument to assert the potential generativity of a mutual opacity. Wynter, S. “Unsettling the Coloniality of Being/Power/Truth/Freedom: Towards the Human, After Man, Its Overrepresentation—An Argument.” The New Centennial Review, 3(3), 2003: 257–337.
  8. Epstein, Ziv, Blackeley H. Payne, Judy H. Shen, Abhimanyu Dubey, Bjarke Felbo, Matthew Groh, Nick Obradovich, Manuel Cebrian, and Iyad Rahwan. ‘Closing the AI Knowledge Gap.’ ArXiv, 2018. arXiv:1803.07233.
  9. Burrell, J. ‘How the machine “thinks”: Understanding opacity in machine learning algorithms’. Big Data and Society, 3(1), 2016: 2.
  10. Szegedy, C., Zaremba, W., Sutskever, I., Bruna, J., Erhan, D., Goodfellow, I., Fergus, R. ‘Intriguing properties of neural networks’. ArXiv, 2014. arXiv:1312.6199.
  11. Gelman, A., Carlin, J. B., Stern, H. S., Dunson, D. B., Vehtari, A., & Rubin, D. B. Bayesian Data Analysis. CRC Press, 2013; Ghahramani, Z. ‘Probabilistic machine learning and artificial intelligence’. Nature, 521(7553), 2015: 452–459.
  12. Arbel, J., Pitas, K., Vladimirova, M., & Fortuin, V. A Primer on Bayesian Neural Networks: Review and Debates. ArXiv, 2023, arXiv:2309.16314.
  13. MacKay, D J C. A practical Bayesian framework for backpropagation networks. Neural computation, 4(3), 1992: 448–472. For a visual representation of the difference see Figures 1 and 5 in Arbel et al., “A Primer on Bayesian Neural Networks”.
  14. Gal, Yarin, & Ghahramani, Z. Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning. In International Conference on Machine Learning (ICML). Proceedings of The 33rd International Conference on Machine Learning, PMLR 48: 1050-1059, 2016.
  15. Kendall, Alex, and Yarin Gal. ‘What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?’ In Advances in Neural Information Processing Systems, Vol. 30. Curran Associates, Inc, 2017.
  16. Ngartera, L., Issaka, M.A., Nadarajah, S. ‘Application of Bayesian Neural Networks in Healthcare: Three Case Studies’. Machine Learning & Knowledge Extraction, 6(4), 2024: 2639–2658.
  17. Amoore, L. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press, 2020: 19.
  18. Amaro, R. The Black Technical Object: On Machine Learning and the Aspiration of Black Being. Sternberg Press, 2023.
  19. For example, Amaro cites Potez and Lee’s (2011) review of 3D surface perception.
  20. Amaro, The Black Technical Object: 42.
  21. Ibid, 44.
  22. Umara Zafar et al., ‘Face Recognition with Bayesian Convolutional Networks for Robust Surveillance Systems’, EURASIP Journal on Image and Video Processing, 1 (10) (11 January, 2019): 4. https://doi.org/10.1186/s13640-019-0406-y
  23. AT&T’s dataset contains forty faces (ten images per face) with only one Black face (2.5% of the database), all others being white. A demographic analysis of EKFD by Min, Kose, and Dugelay (2014, 3–4) found that only 6% of the faces are African American, 6% Hispanic, and 8% Indian, compared to 40% Caucasian; the gender distribution is also skewed, at 73% male. Rui Min, Neslihan Kose, and Jean-Luc Dugelay, ‘KinectFaceDB: A Kinect Database for Face Recognition’, IEEE Transactions on Systems, Man, and Cybernetics: Systems, 44 (11) (November 2014): 1534–48, https://doi.org/10.1109/TSMC.2014.2331215.
  24. Zafar et al, ‘Face Recognition’, 4.
  25. Maryam Matin and Matias Valdenegro-Toro, ‘Hey Human, If Your Facial Emotions Are Uncertain, You Should Use Bayesian Neural Networks!’, arXiv, (17 August 2020), https://doi.org/10.48550/arXiv.2008.07426.
  26. Iris Dominguez-Catena, Daniel Paternain, and Mikel Galar, ‘Assessing Demographic Bias Transfer from Dataset to Model: A Case Study in Facial Expression Recognition’, arXiv, 2022, arXiv:2205.10049v1.
  27. Matin and Valdenegro-Toro, “Hey Human”, 3.
  28. Buolamwini, Joy, and Gebru, T. ‘Gender shades: Intersectional accuracy disparities in commercial gender classification’. Conference on Fairness, Accountability and Transparency, 2018: 77–91.
  29. Amaro, The Black Technical Object, 62.
  30. On nonknowledge see Bataille, Inner Experience and The Unfinished System of Nonknowledge. The use of it here signals the affirmative character of Glissant’s thought. Nonknowledge (opacity) is generative, conditioning the creation of knowledge, while always exceeding knowledge. Bataille, Georges. Inner Experience (trans. L. A. Boldt). State University of New York Press, 1988; Bataille, Georges. The Unfinished System of Nonknowledge (trans. S. Kendall). University of Minnesota Press, 2001.
  31. Bishop, Christopher M. Pattern Recognition and Machine Learning. Springer, 2006: 21.
  32. Joque, J. Revolutionary Mathematics: Artificial Intelligence, Statistics and the Logic of Capitalism. Verso, 2022: 151.
  33. Glissant, Poetics of Relation, 3.
  34. Ibid.
  35. Ibid; On computational reason as the synthesis of practical and theoretical reason see Parisi, L. 2019. ‘Media Ontology and Transcendental Instrumentality’. Theory, Culture and Society, 36 (6), 2019: 95–124.
  36. Quijano, A. ‘Coloniality of Power, Eurocentrism, and Latin America’. Nepantla: Views from South, 1 (3), 2000: 533–580; Wynter, Unsettling the Coloniality of Being/Power/Truth/Freedom.
  37. Trafford, J. The Empire at Home: Internal Colonies and the End of Britain. Pluto Press, 2020.
  38. Noble, Safiya Umoja. Algorithms of Oppression. NYU Press, 2018; Browne, Simone. Dark Matters: On the Surveillance of Blackness. Duke University Press, 2015; Moriarty-Cole, C., and Phillips, J. ‘The Artificial Earth: A Conceptual Morphology’. In Incomputable Earth: Technology and the Anthropocene Hypothesis. Bloomsbury, 2025.
  39. Édouard Glissant, Introduction to a Poetics of Diversity (trans. C. Britton). Liverpool University Press, 2020: 14.
  40. Evans, Joanna Ruth. “As the world trembles: borders and borderlessness in the thought of Édouard Glissant” Women & Performance: A Journal of Feminist Theory, 32 (1), 2022: 31–52.
  41. For an extended discussion of Glissant’s critique of globalisation, see Coombes, S. Édouard Glissant: Poetics of Resistance. Bloomsbury, 2018.
  42. Blas, Z. ‘Informatic Opacity’. Posthuman Glossary (eds. R. Braidotti and M. Hlavajova). Bloomsbury Academic, 2018; Pinkrah, Nelly Y. ‘After Opacity: A Turn Towards Language’. The AI Anarchies Book (eds. C. Herrmann, M. I. Ganesh, E. M. Hunchuck). Akademie der Künste, 2024.
  43. Some commentators, most notably Hallward, have argued otherwise: that there is a depoliticization in Glissant’s later, post-1980s work. Hallward, Peter. ‘Édouard Glissant Between the Singular and the Specific’. The Yale Journal of Criticism, 11(2), 1998: 441–464. I disagree for the reasons outlined here.
  44. We should note here that Glissant argues that “The notion of transculture is not adequate. Basically, the term creolization covers this notion of transculture. But the notion of transculture suggests that one could calculate and predict the results of a transculturalization; whereas creolization in my view is unpredictable.” Introduction to a Poetics of Diversity, 86.
  45. While it is beyond the scope of this article, considering the context of a discussion on computation it is worth noting that the extent to which number/quantification/datafication are inherently a process of reduction is a question of the degree to which these processes of abstraction developed historically in the service of colonial practices. To jettison all quantification because of a particular use case (however widespread and near-totalising that may have been) would be reductive.
  46. I will herein use the term ‘culture’ to align with the terminology Glissant uses in his later writings.
  47. “For creolization presupposes that the cultural elements brought together must necessarily be ‘of equivalent value’ for this creolization to be truly realized. That is to say that if some of the cultural elements brought together are seen as inferior to others, creolization does not really happen. It happens, but in a bastardized and unjust fashion.” Glissant, Introduction to a Poetics of Diversity, 8. Mutual opacity creates the conditions of equality needed for creolization, which is generative of culture.
  48. While Glissant mostly speaks of language, using literature as his key frame of reference, he is describing a cultural process more generally. As Jean-Marie Grassin writes, citing Robert Chaudenson’s Des Îles, des hommes, des langues. Essai sur la créolisation linguistique et culturelle: “Linguistic creolization cannot be separated from its cultural aspects…the creole linguistic process can be extrapolated to the general dynamics of cultural creole systems including music, cooking, popular medicine, religion, magic, oraliture (oral literature).” Grassin, J-M. ‘Toward a global theory of creolization as an emergent process by opposition to multiculturalism as a configuration of identities’. In David Gallagher (ed.), Creoles, Diasporas and Cosmopolitanisms: The Creolization of Nations, Cultural Migrations, Global Languages and Literatures, 2012: 97–112.
  49. “…the deported African has not had the opportunity of preserving … specific inheritances. But he has made something new on the basis of the only memories, that is to say the only trace thoughts, that he had left: he has created on the one hand the Creole languages and, on the other, art forms that are valid for everyone, such as the music of jazz, which has been reconstituted with the help of newly adopted instruments but on the basis of fundamental African rhythms. Although this neo-American does not sing African songs from two or three centuries ago, he re-establishes in the Caribbean, Brazil and North America, through ‘trace thought’, art forms that he offers as valid for all peoples. Trace thought seems to me to be a new dimension that in the current state of the world we must set in opposition to what I call ‘systematic thought’ or systems of thought. Systematic thought and systems of thought were prodigiously fruitful and prodigiously dominant and prodigiously deadly. Trace thought is that which today most validly opposes the false universality of systematic thought.” Glissant, Introduction to a Poetics of Diversity, 11.
  50. Glissant, E. The Caribbean Discourse (Discours Antillais) (Trans. J. Michael Dash). The University Press of Virginia, 1989 (1981): 116–117; Glissant, Introduction to a Poetics of Diversity, 4.
  51. Glissant, The Caribbean Discourse, 64–65.
  52. “…creolization is hybridity with an added value, namely unforeseeability. Thus it was completely unpredictable that ‘trace thoughts’ would lead the populations of the Americas towards the creation of such new languages or art forms.” Glissant, Introduction to a Poetics of Diversity, 8.
  53. Ibid, 15.
  54. Glissant, Poetics of Relation, 189.
  55. “Knowledge is in no way distinct from me: I am it, it is the existence which I am. But this existence is not reducible to it; this reduction would require that the known be the aim of existence and not existence the aim of the known.” Bataille, 1988, pp. 110ff. Read through Glissant, Bataille’s definition of nonknowledge as a critique of Hegel takes on an anti-colonial theme: “The completion of the circle was for Hegel the completion of man. Completed man was for him necessarily “work”. For knowledge “works”, which does neither poetry, laughter, nor ecstasy. But poetry, laughter, ecstasy are not completed man—do not provide any “satisfaction”. Short of dying of them, one leaves them like a thief…dazed, thrown back stupidly into the absence of death: into distinct consciousness, activity, work.” p. 111. Poetry, laughter, ecstasy, all those excesses of lived-life that make a person and a culture escape knowledge: they are opacities. The Hegelian completed man is impossible because “man”, as a construct of Hegel’s European “system”, cannot grasp the opacities of which Glissant speaks: “Trace thought (generative of creolization) is that which today most validly opposes the false universality of systematic thought.” (Glissant, 7)
  56. Glissant, Poetics of Relation, 190.
  57. Ibid, 191.
  58. Ibid, 191.
  59. See Betsy Wing’s ‘Translator’s Introduction’ in ibid, xiv.
  60. Ibid, 192.
  61. Deleuze, Gilles and Guattari, Félix. A Thousand Plateaus: Capitalism and Schizophrenia II, (trans. Brian Massumi). University of Minnesota Press, 1987.
  62. Glissant, Poetics of Relation, 190.
  63. See Pinkrah, “After Opacity”, for a further discussion of the media theoretical engagement with opacity and transparency.
  64. Blas, “Informatic Opacity”.
  65. Moriarty-Cole, The Machinic Imaginary.
  66. Haraway, Donna. When Species Meet. University of Minnesota Press, 2008: 4. Material-semiotic is here used to encompass the range of material and semiotic dimensions of a mode of being that articulates a world for itself. In the case of computation, the material-semiotic includes everything from the architecture of a neural net, to the mathematics of a model, the statistical techniques deployed, the pre-processing of data, and the electrical engineering of the hardware.
  67. Cited in Amoore, Cloud Ethics, 19.
  68. Ibid, 8.
  69. Ibid, 17.
  70. Glissant, Édouard. Sun of Consciousness (Soleil de la Conscience) (trans. Nathanaël). Nightboat Books, 2020 (1956).
  71. Glissant, Caribbean Discourse.
  72. Fanon, Frantz. The Wretched of the Earth (trans. R. Philcox). Grove Press, 2004.
  73. “The Genesis of the Creole societies of the Americas is founded in a different obscurity, that of the belly of the slave ship. This is what I call a digenesis.” Glissant, Édouard. Treatise on the Whole World (trans. C. Britton). Liverpool University Press, 2020: 21.
  74. Amaro, The Black Technical Object, 33.
  75. Ibid.