Public access to the web is twenty years old. Through it, digital society has developed throughout the entire world. But has this society become mündig, that is, mature, in the sense in which Immanuel Kant used this term to define the age of Enlightenment as an exit from minority, from Unmündigkeit? Certainly not: contemporary society seems on the contrary to have become profoundly regressive. Mental disorders as well as environmental, economic, political and military problems constantly proliferate and increase, while the spread of traceability seems to be used primarily to increase the heteronomy of individuals through behaviour-profiling rather than their autonomy.
If digitalisation clearly holds promise in many ways, and if (and I am convinced of this) socialising digitalisation in a reasoned and resolute way is the first condition for the world to escape from the impasse in which the obsolete consumerist industrial model finds itself, then this socialisation requires the creation and negotiation of a new legal framework, which itself presupposes the formation of new ‘Enlightenments’.
I am thus delighted that Neelie Kroes has called for a new Enlightenment philosophy for the digital age, just as Tim Berners-Lee and Harry Halpin have argued, in dialogue with the position of Vint Cerf, who co-developed the TCP/IP protocol, that internet access must become a universal right. But here, what exactly does access mean? Or again: what type of access should we claim will bring light or enlightenment, rather than darkness or shadow? Under what conditions will such access be beneficial for individuals and the societies in which they live?
Responding to these questions, and in the first place posing them correctly: such is the challenge we face today. And to try to take the measure of these questions, it is necessary to see how, for example, in his book The Shallows: What the Internet is Doing to Our Brains, Nicholas Carr outlines the context that constitutes the stakes of this global summit on the web that is meeting here this year.
The spread of digitalisation since 1992 has resulted in a true chain reaction that has transformed social life at its most public level as well as psychical life at its most intimate level. The Shallows exposes the immense distress that has accompanied this meteoric rise – a rise seen more and more as a tsunami, one that has, by his own account, greatly disrupted the mental capacities of Nicholas Carr himself, and that may well sweep away all the inherited structures of civilisation on every continent, which may in turn produce immense disillusionment, and even a terrible disaffection.
Faced with the fact that the negativity of this new situation is continuing to take hold, we must affirm the necessity and the positivity of a new state of law. To do so, we must turn to the question of the relationship between technology and law. What we refer to as the law is in fact grounded in writing. Now, digital technology constitutes the most recent stage of writing. And just as writing was in the age of Socrates, so too today the digital is for us a pharmakon: it can lead either to the destruction of the mind, or to its rebirth.
I would like here to develop the following points:
- the constitution of a new state of law, a new rule of law, founded on digital writing, in fact presupposes a new age of Enlightenment(s);
- these new Enlightenments must, however, critique the limits of the philosophy of the Aufklärung itself, notably in relation to the question raised by the ‘philosophical engineering’ developed at W3C at the instigation of Tim Berners-Lee;
- the new philosophy which must arise from the worldwide experience of the web, and more generally of the digital, across all cultures, an experience that is in this sense universal – this new philosophy, these new Enlightenments, cannot merely be a philosophy of digital lights: it must be a philosophy of shadows (and perhaps of shallows, to speak with Nicholas Carr), of the shadows which inevitably accompany all light.
We can no longer ignore the irreducibly ‘pharmacological’ character of writing – that is, its ambivalent character – whether writing be alphabetical or digital, etched in stone or inscribed on paper, or in silicon, or on screens of digital light. Writing, and more precisely printed writing, is the condition of ‘enlightenment’, and it is for this reason that Kant says that it is addressed to the ‘public who reads’. But there is never light without shadow. And it is for this reason that, in 1944, Theodor Adorno and Max Horkheimer were able to perceive, in the rationalisation of the world, the opposite of reason and of the Aufklärung.
This irreducible ambivalence, which applies to technology in general, and which the twenty-first century, like Nicholas Carr, discovers almost every day through a thousand experiences of the limits and ambiguities of technological progress – this irreducible ambivalence is what neither modern philosophy nor ancient philosophy was able to think. And this is what, faced with the new challenges of the digital world, it is today imperative for us to understand: it is with this that we must learn to think and to live differently.
*
Unlike Tim Berners-Lee and Harry Halpin, Vint Cerf argues that internet access cannot be subjected to law because digital technology is an artefact which can change – and which never stops changing. But was not writing, which in Greece was at the origin of law, as well as of geometrical thinking, itself equally artefactual? It seems that writing has become stabilised and universalised as alphabetical writing, and it seems as though, as such, through this writing and through its apparent stability, it is the universal structures of language and, beyond language, the universal structures of thought, which were discovered. This is how it may seem – but things are perhaps not quite so clear.
And here lie the true stakes of a debate with the philosophy that we have inherited, in the epoch of generalised digitalisation: what has been the role of writing, and, beyond that, of technics, in the constitution of thought, and especially of that thought which was universalised through the Enlightenment and through its emancipatory discourse? With the development of the web such a debate has become unavoidable – and it is a debate that I argue must be held within the framework of digital studies, within which it is the fundamental question.
If Tim Berners-Lee and Harry Halpin can raise the question of a universal right to internet access – a question that a philosophy underlying the conception of the web must incarnate, and of which W3C would be the bearer – this is precisely because the web is a function of a digital technical system which could be otherwise, and which could even disappear. And if they argue that this is a right, it is because this philosophy and this stability must support the need to ensure not only a certain conception of the internet, of its functions and its goals, but also the sense of mental, intellectual, spiritual, social and, let’s say, noetic progress (noetic in the sense of Aristotle) that digitalisation in general must constitute.
In order to explore these formidable questions, we must take the measure of the following two points:
- first, the digital technical system constitutes a global and contributory publication and editorialisation system that radically transforms the ‘public thing’, given that the res publica, the republic, presupposes a form of publicness, of ‘publicity’ – what the Aufklärung called an Öffentlichkeit – sustained by processes of publication;
- second, this publication system is inscribed in the history of a process of grammatisation, which conditions all systems of publication. The concept of grammatisation, as forged by Sylvain Auroux (who was the first director of the École normale supérieure de Lyon), provides important elements for the discussion inaugurated by Tim Berners-Lee around what he referred to as philosophical engineering.
With the concept of grammatisation, Auroux was able to think the technical conditions of the appearance of grammata, of the letters of the alphabet, and of their effects on the understanding and practice of language – and to do so from their pre-alphabetic conditions (ideograms and so on), passing by way of the printing press, right up to the linguistic technologies that Auroux called ‘language industries’.
I have myself extended this concept by arguing that grammatisation more generally describes all technical processes that enable behavioural fluxes or flows to be made discrete (in the mathematical sense) and to be reproduced, those behavioural flows through which are expressed or imprinted the experiences of human beings (speaking, working, perceiving, interacting and so on). If grammatisation is understood in this way, then the digital is the most recent stage of grammatisation, a stage in which all behavioural models can now be grammatised and integrated through a planetary-wide industry of the production, collection, exploitation and distribution of digital traces.
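To make this notion of discretisation a little more tangible, the short sketch below (in Python, and purely illustrative – the function name, the toy ‘Trace’ record and the timing values are hypothetical choices of presentation, not part of any existing system) turns a behavioural flow, here a spoken sentence reduced to a string, into a finite, ordered list of discrete, reproducible traces of the kind that a digital platform could then collect, compare and recombine.

```python
# A toy illustration of grammatisation as discretisation:
# a continuous behavioural flow (here, a spoken sentence reduced to text)
# is broken into a finite list of discrete, ordered, reproducible traces.
# All names and the record format are hypothetical, for illustration only.

from dataclasses import dataclass
from typing import List


@dataclass
class Trace:
    index: int        # position in the flow (its spatialised order)
    token: str        # the discrete element extracted from the flow
    duration_ms: int  # a stand-in for the temporal span the element occupied


def grammatise(flow: str, ms_per_char: int = 80) -> List[Trace]:
    """Discretise a temporal flow of speech into an ordered list of traces."""
    traces = []
    for i, word in enumerate(flow.split()):
        traces.append(Trace(index=i, token=word, duration_ms=len(word) * ms_per_char))
    return traces


if __name__ == "__main__":
    speech = "the temporal flow of a speech becomes a spatial object"
    for trace in grammatise(speech):
        print(trace)
```

Once spatialised in this way, the flow can be stored, counted, compared with other flows and calculated upon – which is, at planetary scale, what the industry of the production, collection, exploitation and distribution of digital traces does.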
*
The grammatisation of a type of behaviour consists in a spatialisation of time, given that behaviour is above all a form of time (a meaningful sequence of words, an operational sequence of gestures, a perceptual flow of sensations, and so on). Spatialising time means, for example, transforming the temporal flow of a speech such as the one I am delivering to you here and now into a textual space, a de-temporalised form of this speech: it is thus to make of it a spatial object. And this is what has been going on from alphabetic writing to digital technology, as shown by Walter Ong:
Writing […] initiated what print and computers only continue, the reduction of dynamic sound to quiescent space. 1
This spatial object can be engraved on a wall, inscribed on a clay tablet, written or printed on paper, metastabilised on a silicon memory chip, and so on – and these various supports make possible operations that are specific to each form of support, that is, proper to each stage of grammatisation.
Spoken language is what Edmund Husserl called a temporal object, that is, an audible object which appears only in disappearing. But when speech is written down, becoming through this grammatisation a spatial object, that is, a synoptically visible object, then synopsis makes possible an understanding that is both analytic (discretised) and synthetic (unified). This spatialisation is a materialisation. This does not mean that there was something ‘immaterial’ that subsequently became material: nothing is immaterial. For example, my speaking is material: it is produced by vocal organs which generate sound waves, waves carried by molecules, themselves composed of atoms, which begin to vibrate in the surrounding air, and so on.
One can speak of a visibly spatialising materialisation to the extent that there is a passage from an invisible, and as such indiscernible and unthinkable, material state, to another state, a state that can be analysed, critiqued and manipulated – in both senses that can be given to this verb, that is:
- on which analytical operations can be performed, and intelligibility can be produced; and
- with which one can manipulate minds – which is what Socrates reproached the sophists for in the case of writing, writing being the spatialisation of the time of what he called ‘living speech’.
If grammatisation is thus a process of materialisation, then hominisation is itself, moreover, and in a very general way, a process of materialisation: man is the living being who fabricates tools, and in so doing he transforms the world by never ceasing to materialise anticipations – what Husserl called protentions, and I will explain later why I must express this through the vocabulary of the founder of phenomenology.
Grammatisation is a very specific type of materialisation within a much larger process of materialisation of all kinds that Georges Canguilhem called ‘technical life’ – which distinguishes us from other living things.
Grammatisation begins during the Upper Paleolithic age, some two million years after technical life began. It enables mental and behavioural flows to be made discrete, and thus enables new mental and behavioural models to be created. In the course of the materialisation and spatialisation in which it consists, the constitutive elements of grammatised mental and behavioural flows are made discrete, and temporal mental realities, which have become identifiable through finite lists of analysable and calculable elements, are modified in return.
The visible and tangible object emerging from this spatialisation belongs to the class of what I call tertiary retention. I borrow the term ‘retention’ from Husserl. Retention refers to what is retained, through a mnesic function that is itself constitutive of a consciousness, that is, of a psychical apparatus. Within this psychical retention, Husserl distinguishes two types of retention, one of which he refers to as primary and the other as secondary.
Secondary retention, which is the constitutive element of a mental state that is always based on memory, was originally a primary retention: primary means retained in the course of a perception, and through the process of this perception, but in the present, which means that primary retention is not yet a memory, even if it is already a retention. To perceive a phenomenon is to retain and unify, in the course of the perception of the phenomenon, everything that appears as the identical ‘content’ of the perception (of the perceived phenomenon) but that each time presents a different aspect (an Abschattung).
A primary retention is what, constituting the course of a present experience, is destined to become a secondary retention of somebody who has lived this experience that has become past – secondary because, no longer being perceived, it is imprinted in the memory of the one who had the experience, and from which it may be reactivated.
But a retention, as the result of a flux and emerging from the temporal course of experience, can also become tertiary, through the spatialisation in which the grammatisation of the flow of retentions consists (and in which, more generally, any technical process of materialisation consists). This mental reality can thus be projected onto a support that is neither cerebral nor psychical but rather technical. The web grants access to such a space, through which shared, digital tertiary retentions are projected and introjected, constituting as such a new public, global and contributory space, functioning at the speed of light. What light and what shadow, what Enlightenment and what Darkness, can and must this bring us?
*
Michel Foucault spoke about the materialisation of knowledge in The Archaeology of Knowledge – though without placing it in the context of the grammatisation process, and without understanding it in relation to primary and secondary retention – when he became interested in the archives that make all knowledge possible. Knowledge is above all a collection of archived traces, that is, of ordered and modelled traces, thereby constituting an order – and submitted to this order and to this model which orders these traces. Knowledge, modelled in this way, thus conserves the trace of the old from which it comes, and of which it is the rebirth and the trans-formation, through a process that Plato described as an anamnesis.
The conservation of traces of the old is what enables the constitution of circuits of collective individuation across time and in the framework of a discipline which governs the relations between minds, which individuate themselves in concert and in the course of intergenerational transmission – through which a transindividuation process is concretised, producing what Gilbert Simondon called the transindividual, forming significations. Now, the conditions of this process are over-determined by the characteristics of grammatisation, that is, by the characteristics of the archive supports that are tertiary retentions of different epochs: ideograms, manuscript texts, prints, records, databases, metadata, and so on.
The archive is material, according to Foucault, and knowledge is essentially archived, which means that its materiality is not something which occurs after the fact, recording something which would have occurred before its materialisation: this materialisation is the very production of knowledge. It does not come after the form that it conserves, and it must be thought beyond the opposition of matter and form: it constitutes a hypermatter.
The hypermateriality of knowledge is that which, in the epoch of the web and of what it produces as new processes of transindividuation, must be studied as the condition of construction of rational forms of knowledge and of knowledge in general. We must situate the study of the hypermateriality of knowledge within the framework of a general organology which studies the supports and instruments of every form of knowledge. And in the contemporary context, this study of hypermateriality must be placed at the heart of digital studies, which must itself become the new unifying and transdisciplinary model of every form of academic knowledge.
General organology studies the relations between three types of organs characteristic of technical life: physiological organs, technical organs and social organisations. Grammatisation began thirty thousand years ago, inaugurating a specific stage of the process of co-evolution of these three organological spheres, which are inseparable from one another – as shown in an extremely clear way by the neurophysiology of reading, where, as Maryanne Wolf puts it, the brain is literally written by the socio-technical organs, and where our own brain, which she calls the ‘reading brain’, was once written by alphabetical writing, but is now written by digital writing:
We were never born to read. Human beings invented reading only a few thousand years ago. And with this invention, we rearranged the very organization of our brain, which in turn expanded the way we were able to think, which altered the intellectual evolution of our species. 2
Now, with the web, we are living through a passage from the reading brain to the digital brain, and this raises a thousand questions of rights and duties, in particular in regard to the younger generations:
[W]e make the transition from a reading brain to an increasingly digital one. […] Reading evolved historically […] and […] restructured its biological underpinnings in the brain [of what must be thought of] as a literate species. 3
And the question, during this transition, is to know:
what [it] is important to preserve. 4
It is a question of knowing what must be preserved, within the digital brain, of that which characterised the reading brain, given that writing new circuits in the brain can erase or make illegible the old circuits.
The writing of the psycho-physiological organs through the socio-technical organs constitutes the reality of the history of thought, that is, of what Hegel called and described as the phenomenology of Geist – except that within the phenomenology of tertiary retention that I am talking about here, technics is the main dynamic factor, and is so precisely insofar as it constitutes a system of tertiary retention, a dynamic ignored by Hegel.
The emergence of digital technologies, of the internet and the web, which is also the age of industrial tertiary retention, is obviously the new page (a hypertextual and hypermaterial page) on which is inscribed and read (in HTML 5) the history of thought – through what must be understood as a new system of publication constituting a new public thing, a new res publica.
The web constitutes an apparatus of reading and writing founded on automata that enable the production of metadata on the basis of digital metalanguages which change what Michel Foucault called the processes of enunciation and discursive formation. All this can only be thought on the condition of studying in detail the neurophysiological, technological and socio-political conditions of the materialisation of the time of thinking (and not only of thinking, but also of life and of the unthought of what one calls noetic beings, which is also, undoubtedly, of their unconscious, in the Freudian sense).
It is for this reason that we must develop a general organology capable of specifying the characteristics, the course and the stakes of a process which began in the Upper Paleolithic as the materialisation of the flux of consciousness projecting new kinds of mental representations forged through this very projection – and which we shall now see is also an intro-jection.
*
From out of this rupestral projection, which is also the birth of art, the exteriorisation of the mental contents of the mind begins. After the Neolithic age, specific retentional forms appear, which make possible the control of mental contents, first as the earliest forms of calculation, then as the step-by-step recording of geometric reasoning, such that the mind, self-controlling and self-critiquing, constitutes the origin of logos – as identified by Husserl in 1936 when he saw that the origin of geometry was founded on literal (that is, lettered) tertiary retention.
From the origin of philosophy and up until our own time, this process has been concealed: its study was made impossible by the metaphysics of fundamental ontology – conceived as an ontology of pure thought, that is, prior to the impurity of its exteriorisation, a thought that Kant would call ‘a priori’, and that this metaphysics would take to be the only true knowledge, where being is presumed to precede becoming, and to thus make itself knowable.
Today, grammatisation spreads and accelerates, and trans-forms all forms of knowledge, at a time when we are also learning from neurophysiology that cerebral plasticity, and the transformation of what Maryanne Wolf calls ‘mental circuitry’ through the introduction of tertiary retentions (literal tertiary retention, for example), is thinking – thinking consists in the production of new circuits, through a materialisation process that comes to modify existing circuits, and sometimes to destroy them, the question being to know what we must ‘preserve’.
This constitution of the mind through the introjection of tertiary retentions has today become visible because it can be experimentally studied by neurophysiologists equipped with apparatus through which they can observe the life of the mind, that is, the movements occurring within the cerebral apparatus, such as introjection, but also and above all between this apparatus and the apparatus of tertiary retention emerging from grammatisation, that is, from the projection of the mind outside itself.
The fact that the exteriorisation of the mind is the condition of its constitution means that the mind cannot be a pure substance that, by exteriorising itself, alienates itself through this exteriorisation. The constitution of the mind through its exteriorisation is its expression as a result of a prior impression. The projection of the mind outside itself constitutes the mind through its materialisation and spatialisation as a movement: the mind is as such mobility, motility and emotion (this is how we should interpret the theses of Antonio Damasio).
This projection that is constitutive of the mind, however, can also discharge it: it makes possible what Socrates described as a short-circuit of the life of the mind through an exteriorisation without return – that is, without re-interiorisation. Projection can in fact only constitute a mind insofar as it re-temporalises that which, spatialising itself, must be individuated and ‘interiorised’ in order to come to life. Tertiary retention is dead, and it remains so if it does not trans-form, through a reverse effect, the secondary retentions of the psychical individual affected by this tertiary retention.
This transformation of the individual is possible because the latter has, for example, ‘literalised’ his own brain, which has thus become a ‘reading brain’, and is therefore now a weave of literalised secondary retentions, that is, textualised secondary retentions, and becomes as such the object of constant self-interpretations. It is for this reason that Joseph Epstein can write that:
we are what we read. 5
And this is what Walter Ong made comprehensible when he wrote that literate human beings are:
beings whose thought processes do not grow out of simply natural powers but out of these powers as structured, directly or indirectly, by the technology of writing. Without writing, the literate mind would not and could not think as it does, not only when engaged in writing, but normally even when it is composing its thoughts in oral form. 6
In other words, even when he speaks, and expresses himself orally, the literate human being is reading himself and interpreting himself to the letter – that is, he is ‘literally’ in the course of writing himself, given that everything he reads inscribes itself in his brain, and given that everything he reads reactivates and interprets the previously and textually written circuits of his secondary retentions: the literate human being speaks like the book that he is and that he reads.
What Maryanne Wolf adds, however, referring to Stanislas Dehaene, is that the acquisition of new retentional competences through the interiorisation of tertiary retentions can also replace ‘existing circuits’, that is, destroy them, and that this is why it is a question of knowing what must be ‘preserved’. Moreover, Socrates argued that exteriorisation cannot produce real thoughts without re-interiorisation, that is, without individuation, and that it is therefore necessary, as a law and a duty, to struggle against the sophistic drift.
What consequences can we draw from these considerations within the framework of our encounter here, and from the perspective of a reactivation of the Enlightenment project in the age of the web? I will conclude by attempting to give a broad outline of an answer to this question.
*
The writing of the brain is the writing of capacities for cooperation between brains – notably as the constitution of communities of reading (that is, lettered or literate) brains, or of digital brains. Socrates, however, argued that, by enabling souls (and their brains) to be short-circuited or bypassed, the writing of the brain can also destroy both noetic and social capacities, and result in structural incapacitation, that is, lead to an inability to think for oneself.
This means that the interiorisation of technical organs by the cerebral organ, which is thereby reorganised, only constitutes a new stage of thinking on the condition that social organisations exist to ensure this interiorisation – such as, for example, the paideia practised at Plato’s Academy. The question of what must be preserved is thus a matter of social circuits, and not only of cerebral circuits.
Providing better access to the internet therefore requires thoroughly rethinking the formation and transmission of knowledge with a view to ensuring an historical understanding of the role of tertiary retention in the constitution as well as the destruction of knowledge, and with a view to deriving on this basis a practical and theoretical understanding of the digital tertiary retentions that transform cerebral and social organisations. Without such a politics, the inevitable destiny of the digital brain is to find itself slowly but surely short-circuited by automata, and thus find itself incapable of constituting a new form of society with other digital brains.
Automation makes digitalisation possible, but if it immeasurably increases the power of the mind (as rationalisation), it can also destroy the mind’s knowledge (as rationality). A ‘pharmacological’ thinking of the digital must study the contradictory dimensions of automation in order to counteract its destructive effects on knowledge. The point is not merely to ensure that there is a right of access to the internet, but to establish a right and a duty to know (through education) that there are invisible automatisms which may escape digital brains – and which may manipulate them without teaching them how to handle them.
This question arises in a context in which neuromarketing is today in a position to directly solicit the automatisms of the lower layers of the cerebral organs, by short-circuiting or bypassing the networks inscribed through education in the neo-cortex. That the automatisms of the nervous system are in this way combining with technological automatisms is the threat (that is, a shadow) against which the new Enlightenment must struggle.
Thinking is above all the history of grammatisation, the history of the relations of projections and introjections between the cerebral apparatus and tertiary retentions. It is for this reason that the question of philosophical engineering arises, as posed by Tim Berners-Lee. Philosophical engineering must lead to a close articulation between psychosomatic organs, technological organs and social organisations, while ensuring that the technological layer does not short-circuit the psychosomatic and social layers.
It is a question of creating an intelligent articulation between the social web and the semantic web, where these two must not be opposed but rather composed – through social and educational organisations completely rethought from this perspective – by means of what, at the Institut de recherche et d’innovation, we call technologies of transindividuation, through which the organs of a contributive society must be constituted.
Translated by Daniel Ross
Daniel Ross is co-director of the film The Ister and translator of four books by Bernard Stiegler: Acting Out (2009), For a New Critique of Political Economy (2010), The Decadence of Industrial Economies (2011), and Uncontrollable Societies of Disaffected Individuals (2012, forthcoming).
Notes
- Walter Ong, Orality and Literacy: The Technologizing of the Word (New York: Routledge, 2002), p. 81. ↩
- Maryanne Wolf, Proust and the Squid: The Story and Science of the Reading Brain (New York: Harper, 2007), p. 3. ↩
- Ibid., p. 4. ↩
- Ibid. ↩
- Joseph Epstein, cited in ibid., p. 5. ↩
- Ong, Orality and Literacy, p. 77. ↩