Antonio Penedo Picos
Nowadays, it is barely worth mentioning that digital language is causing an irrevocable shift towards a new society. However, the assumption that this change will consist of simply replacing oral/written communication with audiovisual communication is nothing more than a false dichotomy, one yet to be established. If anything will herald the dawn of a new digital age, it will be the definitive, ingrained preeminence of linguistics as an essential skill in managing the technologised subjects of the already-present future in which we find ourselves.
Let us consider, for a moment, how we can numerically quantify our evolution as a species: millions of years spent in a hominid state; mere hundreds of thousands to become homo sapiens, and finally the culmination of as little as tens of thousands to reach our phylogenetic zenith as homo sapiens sapiens, speeding through the revolutionary palaeolithic and neolithic eras. What caused such a shift from slow, gradual change to these subsequent leaps and bounds? We are faced with a case of what Stephen Jay Gould would call “punctuated equilibrium”, since between the very long Phase A (millions) and the disproportionately shorter C, D, E and F (each faster than the last), there is little or no sign of continuity. The first phase deals with the transition, very gradually over millions and millions of years, from amphibians or reptiles to mammals, and then, as if in a flash (in evolutionary time) we became human beings. What could account for this? We outgrew our audiovisual abilities and developed new, verbal ones, along oral and proto-orthographic lines. It is the emergence of language, the creation of morphology and syntax and the ability to assemble these pieces into lengthy paragraphs, which heralded and enabled the birth of human society. It was, in a nutshell, the brain’s capacity for symbolic thought which supercharged our physical forms and drove us to appropriate the natural world for our own ends. What made us human was not the ability to see or hear our environment, but rather to articulate it in words. It was this transition, from an audiovisual protoculture to an oral/written one, with the icons and symbols this makes possible, that brought us to where we are today. There is nothing more nonsensical than to say that “a picture is worth a thousand words”. If such a thought exists, it is only because a thousand words have been used to explain the picture’s proper significance, and because, in any case, our evolution has always been guided by feedback loops linking images with symbols and vice versa.
Digital language can only serve an emancipatory function in the modern world if it is used by minds whose synapses have been trained in decoding signs and symbols, minds which can reprocess each intellectual and perceptual action through a hermeneutic process. Steven Rose reminds us that our neurons work not digitally but analogically, and that humans understand things primarily through associated meaning, as opposed to direct contact with the phenomena we face. The global cyberspace which technology opens up does not function as optimally as it does because digital technology itself is fast, but rather because we make fast decisions when using it: we click through web pages at dizzying speeds because our neurotransmitters know how to interpret symbolic language, language which cannot exist without the support of a system of symbolic signifiers. As a consequence, the seemingly boundless communicative possibilities offered by new technology are not the result of being freed from the “burden” of traditional language, but rather they are enabled by this language itself, which has facilitated a new evolutionary leap and created space for the (re)activation of long-lost audiovisual perception as a direct result of the oral/written abilities developed by humans over recent millennia. If we can match each other’s audiovisual capabilities on a global level, it is only because of the thousands, if not millions, of words that will allow us to do so.
Let us consider then, that the last unit of measurement is not “tens of thousands of years”, but “decades” (and not many of them) in which the subjects of the new digital era are being decided. We find ourselves, no doubt, facing a new phase of human evolution whose possibilities we can barely guess at. However, the subjects of this era do not yet exist; all of us are still semio-hybrid beings. New technology, and the new norms of communication it entails, are meaningless if detached from what came before them, and with regard to this point it is worth dismantling (or cancelling) the false, two-dimensional debate between classical and audiovisual modes. If one of these modes advances, it is because it draws on the other, and because the person who seeks to advance it has been educated in previously codified norms: there is no opposition or substitution, only transitive cross-pollination and integration. School and university models, and their associated social and group dynamics, continue to be governed according to non-digital parameters, the persistence of which is paving the way for the digitalisation of society. The future, as designed by certain people, is nothing without the presence of everything they (falsely) believe must be abandoned. The more steps we take toward the digital epiphany, the more the (not yet) past will seep into online spaces.
Things will become even more complicated when we reach the next evolutionary stage made possible by new technology. This will be when the prominence of the entity known as “paragraph” will decide who is and is not tuned into the new reality. This will be when verbal modes will be unravelled into the transfuture (perhaps better described as the omni-present). In this time we will finally come to understand that without the ability to interpret phenomena through symbols we cannot and will not have community. In short, digital community cannot exist without grammar. We have evolved as humans because our brains computed the world via symbols, and as we enhanced this ability by learning to fire on all synapses, we accelerated our own genetic progress through space and time. Without language we are not human, and there will be no transhuman world that is not based on signs and symbols. The acceleration triggered by technology will not sweep grammar aside; it will radicalise it. A digital subject cannot exist without grammar.
2. Analogue prevalence in two historical parentheses
Decades, as we have seen, constitute the final unit of the evolutionary process. Even until the Second World War there were substantial pockets of illiteracy scattered throughout Europe, and many sections of society still only had access to an audiovisual understanding of the world. This state of affairs was brought to an end in the post-war era by mass education, the creation of the welfare state and the democratisation of symbols. These conditions were the analogue precursors to our digital era. It bears repeating: the great biological leaps made by the human species with respect to its previous hominid phases were made possible by the incorporation of language into our minds and bodies; the last great digital leap, which we have barely even begun, is the result of the universal democratisation of this process and its consolidation into graphemes. Without command of grammar there can be no audiovisual progress. New technological demands are the product of new demographics, and can be parenthetically categorised into two eras, with an incision cut between them.
The first period in question begins in 1945 and ends in 1989, and can be summed up as follows: the experimentation and expiration of contemporary capitalist parliamentary representation. During these decades, people were testing out the principles of collective access to power, control of this power via elections, and secular faith in the communion between representatives and the represented. With the fall of the Berlin Wall came the demise of this model, but at the same time the wall’s demolition marked the end of any possibility that the opposing bloc’s ideology could ever present a threat to political or economic praxis in the West. So, what then happened was that, when one of the two elements of the era-defining opposition (capitalism vs communism) collapsed, the other ceased to be a partial option and was rebranded under a new name: common sense. From 1989 the criteria for (and control of) governability in Europe, along with the associated socioeconomic norms and directives, came to be perceived less as cultural constructions, and more as reality itself. This defined the 1990s, whose rallying cries of market “deregulation” and “self-management” were championed by the business and banking industries, and were encouraged (even publicly cheered on) by governments.
The second period begins in May 1968 and ends in September 2001, and can be defined in even simpler terms than the last: the limits of postmodernity. After the system was rebooted in the early 1950s and put to the test in the 1960s, it became apparent—before the decade was even over—that new generations did not see themselves represented in this model and wanted little to do with it. This is the epitome of postmodernity: reflection on a phenomenon and radical suspicion of any trust placed in, or mediated by, culture. This overarching meta-principle ended abruptly with the symbolic (and physical) wound left by the 9/11 attacks, when globalisation blew up in our faces and forced us to reckon with the fact that the ethnocentrism revealed by postmodernity no longer governed the fate of the world. A new chapter was opened which is still yet to be named.
Then there is the incision between the two parenthetical eras (capitalism and postmodernity) marked by a third, much more recent date: 1995. This year was when the internet reached its democratic potential, and we will use this date to (re)set the tone of the previously mentioned periods via the following hypothesis: 1989 and 2001 were the end of the line for their respective models of growth and development; they brought us to where we are now, but are no longer relevant. The disrepute into which our parliamentary systems have fallen and the growing disconnect between citizen and politician is plain to see, and it has also become clear that we cannot continue to doubt our suspicions of what may or may not be real. The ending of these two parenthetical eras marks the double culmination of two phases, a culmination which may have led to the demise of Western nations, had they not been the birthplace and epicentre of the Internet, and therefore of the expansion of digital society and culture. Electronic means were what saved us (if we wish to think in such terms) from potential societal collapse following the end of these twin post-war projects. Were it not for the birth of web pages, blogs, social media, chats, emails, and were it not for the mass communication facilitated via digital channels, it is difficult to see how we could have moved past the fatigue, lack of dialogue and distrust in traditional media to circulate ideas and actions. What we seek to make clear is that cyberspace has taken on the role of regulating, managing and intervening in socio-civic interaction, roles previously played by assorted parliamentary, municipal, or governmental actors. What we also seek to make clear is that the characteristically postmodern suspicion of reality, never intended to drive us towards nihilism, has been energised by digital language and has led suspicion to become a form of subscription. Mistrust and wariness have become the poster children for the de-hierarchisation of digital libertarianism.
1995 is the year that reversed the declining trend of the parenthetical periods which marked the end of mid to late twentieth century experimentalism. It is just as well that the virtual world came to our rescue to make reality more navigable. So, what we have been trying to show from the beginning is that, if a lucky escape from both periods was possible, it was only because the analogue realities forged in them made their own transmutation possible when the time was right. It is here that we must once again consider the false, flawed dichotomy of a digital cultural model which believes it can do away with an analogue cultural model. If generations and patterns can be replaced, it is because there was first a satisfactory resolution and assimilation of what came before. If, in spite of all the objections we may raise to the respective processes set in motion in 1945 and 1968, we fail to recognise what we got out of these periods, then we fail to properly assess our own evolution. What took place in the second half of the last century was nothing more and nothing less than the (relatively) equal distribution of linguistic signs and signifiers among all strata of society. The second half of the twentieth century is the consolidation of what Juri Lotman would come to call the semiosphere, i.e. the insertion of the human being into a sphere of cultural signs and codes which safeguard it and increase its chances of survival. We can thus return to the initial thesis: digital culture and language will not do away with analogue culture and verbal language; rather, having borne us out of humanity’s initial audiovisual state, they will now, once again, guide us towards the new, post-2001 millennial horizon.
3. Human evolution and automated kindness
The second half of the twentieth century saw a historically unprecedented expansion in the “livability” of existence which, just as it was beginning to decline (its self-interest having destabilised the “economy-first” approach, while the deregulation feeding frenzy placed it beyond any criticism), came to be rescued by a new subject which emerged from within it.
Anthropological sequences can be divided up as follows: the signo-biological thread, the technological thread and the cyber-cultural thread, which make up an informational society (to place Manuel Castells’ analytical stamp on it), played out in a new-old environment which is nothing other than the brain itself. To all the dates mentioned thus far we can add the final (and most dramatic) one, which confirms our thesis. The date is October 2008, when a stock market crash not seen since 1929 made something transcendentally clear: the new digital era cannot work under the socioeconomic parameters that created it. It is no longer merely an ethical question of how wealth ought to be distributed; we have also created a system in which no one is truly safe, not even those at the very top of the pile who benefit the most from it. It must be emphasised that the 90s-era market deregulation and self-management not only caused such explosive, imbalanced global growth and introduced poverty to swathes of a planet that claimed to have it under control; they have also led to a reality which is, for all of us, irreconcilable with our conceptions of what is human and what is real.
As paradoxical as it may seem, we can argue that the growth and development model of the 1990s paved the way for a new kind of experience, that of “automated kindness”, which cannot be sustained by the means which created it. By looking once more at our evolutionary development, we see that what defined humans was the ability to make tools, and this ability was accelerated by the creation of that most instrumental of tools: the verbal sign. Humanity’s goal from then onwards (for the next thirty to forty thousand years) would be to overcome the harsh savagery of the natural world. Natural selection took care of the most elemental aspects of bodily subsistence: surviving harsh environments, outlasting plagues and epidemics and devouring without being devoured. We know that Charles Darwin based his theory of the evolution of species on these premises, and it is understandable that he focussed his attention on somatic subsistence and its changing forms; cerebral capacity had to be used in the service of physical survival and so the symbolic had to be used in the service of the somatic. However, in the digital age this line of reasoning must be turned on its head. We have reached a point of such comfort, of such “automated kindness” and physical wellbeing that it is now the body which enables (or rather demands) a new mindset which was hitherto inconceivable. It may appear (by now to the point of certainty) that the growth of digital language has taken place in order to facilitate commercial transactions, governmental legislation, and, in general, the ordering and maintenance of our social fabric. It is precisely because we have reached such a point that digital language is what will take us beyond our current state. The domestication of our environments makes clear the precarity of our most important environment, that which rules our every moment, but which has yet to be carefully considered: I am referring to the cerebral environment. 
Evolution accelerated because when humans truly started to develop, they began to understand how to kick-start the hundreds of billions of synapses to which they have access. It was not arms and legs which carried us so far, but rather the thinking of what we could or could not do with our arms and legs. Now, there is little progress left to be made with our bodies because the entire planet has been placed in service to them. It is the mind which will implode in the digital age, and it will be able to survive doing so because it knows how to process analogically. It has not yet dawned on us that we will not know what to do with the millions of people whose life expectancy could reach as much as 100 years. We are faced with a physical longevity which has no precedent, no instruction manual. One could consider the possibility of a 40/60 cycle, whereby at the age of forty a person is not past any sort of peak; rather, this age marks the beginning of a new sixty-year macrocycle. The fullness of life is not reached until the age of fifty, and “adult-lescence” lasts until the end of one’s thirties. There is no established roadmap for such unprecedented biological circumstances which can help us to make sense of our lives at this point. As Hans Blumenberg said, people want to live longer, but they do not think about why. Automated kindness will not dictate what gives meaning to our lives, but perhaps by making the most of automated kindness our lives will find their own meaning.
New technology and the hyper-domesticated environment in which millions of human beings live will allow them to indulge in their thoughts and feelings like never before. The question is, will they know how? When they resort to digital language, will they know what they want to say and how to encode and decode it? This is the key to the new poetics of the digital age: if life expectancy is going to reach 100 years; if parliamentary models are failing to properly manage our societies; if hitherto unquestionable ideas are being questioned; if, in spite of everything, comfort has taken precedence; if, on top of everything else, the ability to access and be shaped by information is what will define our freedom, are we analogically prepared for the digital world? Are we verbally equipped for digital citizenry? The distance we must travel, from what we are currently capable of to what this task requires, will not be measured, as many believe, from the end of the journey, but from the beginning. It is so clear that the transformation of reality will be produced by any and all means that it apparently has not dawned on us that such a task will require a subject attuned to such methods. The path being cleared by digital technology will not be paved by the correct use of different formats, interfaces or channels, but by the ability to recodify ourselves via the messages we send through them. It will be a semiotic issue, not a computing issue. We are experiencing a genetic mutation in our brains, because they carry the responsibility of having enough neural pathways prepared to accommodate any new information.
We already know that it is debatable whether evolution is headed for better or worse states, but what should be clear is that it will bring about complex, novel developments. So, it would appear that those subjects who have the purchasing power to acquire whatever technology they want and are equipped to use it will be the new vanguard, the harbingers of change. However, this is not necessarily true, as the birth of new forms of wealth will engender new forms of poverty. The poverty which is now emerging is what we could call “semiotic poverty”: a poverty suffered by those who are incapable of understanding the events occurring in their own bodily environments. Such an inability will not necessarily depend on income levels, bank balances or asset portfolios, but on the ability to distinguish which messages need to be processed or not, how to fit oneself into higher levels of comprehension and, most importantly, what proactive actions must be undertaken in order to achieve this.
4. A more than digital divide
The digital era will require, as never before, articulate, coherent linguistic subjects, assembled from ordered rather than simply juxtaposed units, capable of oral and written comprehension and of circulating what they comprehend online. New digital languages will further differentiate those who only have a basic level of sign-based subsistence from those capable of revelling in sign-based creativity. The closing of the aforementioned eras (the ending of both an economic and political model and a sociocultural value system) was galvanised by the advent of the Internet in 1995 and the globalisation of the semiosphere that it entailed. Digital language will be (and in fact already is) the stuff from which interconnected societal paradigms more resistant to hierarchical control will be built. However, in order for such a scenario to endure, we need highly literate, actively grammatical subjects, with the ability to comprehend and synthesise information, as well as transmit and generate alternatives, and education will play a significant part in this. It is right here that we are presented with the true divide between the worlds of today and tomorrow: there is so much attention being paid to bridging the digital divide that the analogue rift caused by the online world is going completely unnoticed. What we argue is that primary and secondary education, as well as higher education, are basing their programmes on media resources and aligning knowledge with the job market, when they should in fact be prioritising the relationship between teaching patterns and civic participation. Currently, their top priority is the training and specialisation of individuals to meet the labour needs of a society focused on research and development, but they fail to realise that, while workers can be specialised, the citizens of the future will be anything but sectoral.
The divide which is becoming entrenched centres on separating the working subject from the real human like never before. The former is asked (though in reality forced) to attain a deep knowledge of only that subject matter which is deemed to be productive, while the latter seems to be asked for (or advised of, as it were) nothing. We regularly hear the World Health Organisation’s statistics on the rise of mental illness, not only in the Western world but also across the globe. Can we believe that economic thresholds are the sole deciding factor in these percentages? What if we started to consider that the opposite were true? It is at this juncture that digital language and nascent cyberculture once more take centre stage. What will take place in the coming decades is that such an expansion of messages and forums will submit our limbic systems to such overload that only those who are capable of signic self-regulation will be able to flourish in the new world. Human evolution accelerated due to our mental development, and human evolution may now enter into recession due to a lack of mental ability. In this new incarnation of Darwinism, survival will not depend on muscular and skeletal prowess, but rather on the ability to tightly control cerebral processes at all times. This will not be an environment of predatory animals or famine and disease, but of a dazzling array of media vying for our attention from the moment we wake up to the moment we go to bed. Analogue training and discipline are the only solution to this so-called digital divide.
All of the above is applicable to inhabitants of high and upper-middle income countries, and lower-middle and low income nations will need to adjust their educational programmes according to the means of production and the information available to them. However, we also know that lower income economies lag further behind than ever, not only because they cannot “catch up” with other societies, but also because they are being effectively cast aside by, or trampled under, the sheer momentum of what we think of as progress. Here, we can add the missing, crucial piece to the puzzle of any reflection on digital language: the new bioethics of the third millennium. Without discourse and ensuing specific action which includes and embeds digital subjects in top-level, global organisation from a position of participative solidarity, no kind of future can be envisioned, neither in the immediate term nor further ahead. We can thank the year 1995 once more for unlocking what could not be imagined in the periods of 1945-1989 or 1968-2001, when it became apparent that group or parliamentary forums as they were understood at the time did not possess the will, ability or knowledge to account for or challenge global inequalities. If one thing defined the final decades of the twentieth century it is the groundswell of associative and alternative grassroots organising never before seen in the history of humanity, and it goes without saying that, were it not for the existence of digital technology and digital language, these movements could never have been coordinated with such speed and efficiency. Who would have thought that the cyberworld would be what ended up moulding our physical world?
We therefore now find ourselves faced with a novel alliance between technology and ethics. Until now (for example, during the two industrial revolutions of the nineteenth century), this relationship hinged on principles of domination and capital, as opposed to those of solidarity and equitable distribution. And so it bears repeating: if the new world of technology can function as it does, it is only because the swift transmission of signs and ideas, trained over centuries by subjects versed in the language of signs, can now, at last, be disseminated online ad infinitum.
5. New poetics for the digital era in times of crisis
With Voltaire’s licence, though at the risk of being naïve, we can take a page from Leibniz and try to imagine the best possible outcome for the world: millions upon millions of humans whose life expectancy is growing from one generation to the next, and a warming climate creating deserts across a planet which has been plundered of its natural resources. We will have to contend with ever-present tensions between the need for self-regeneration and the demand for expansion, between the comforts of automated domestic life and the bewildering question of the meaning of life itself. Perhaps, when all is said and done, we have not yet committed the sin of naïvely dodging the imminent apocalypse. What we have here is the final key element, which must be assembled in the new digital semiosphere through digital methods: participation via information. Living as we do in a world of endless options, we cannot forget the one option that is not clickable, and which the system does not account for: “not doing”, “not knowing”. What we can no longer expect from parliamentary systems, and what we can no longer subscribe to in religion will come to be replaced by computer systems optimised for biophilia and cooperation. The new poetics of the digital age will transform the innocuous virtual realm into the transcendental reality. This will be the definitive step from a solid to a liquid society, rather than the current dissolution described by Zygmunt Bauman. This is not to discredit his ideas, but assessing the partiality with which the lone individual weaves his ideas moves us no closer to providing imaginative solutions for the new communities of the digital era. Characterising liquid modernity as fickle and insubstantial in contrast to the stoic dependability of the solid is to forget that the latter can also be immovable and authoritarian, while the former can be malleable, adaptive, liberating, changeable, flexible, and empathetic.
Bauman would be right, if dismantling societal welfare were only to end in dismembered relativism with no criteria or reason. The lucidity with which he analyses many of the ills of our present is not matched by an ability to envision any future, nor by an even minimal analytical attempt to foresee future kindness after systemic collapse. We therefore find ourselves presented with a catch-22: we need both to confront climate change and to rebuild our economic system. This is neither good nor bad news—there are no angels and devils whispering in our ears—as the result depends firmly on what kind of backbone we will create for the skeletal structure of the new homo sapiens sapiens digitalis: the imminence of environmental change dovetails with the imminence of semiospheric change. Both will have to be synchronised and this heroic deed, if we want to tell this story in epic language and emotions, can only be carried out via techno-codified means.
We are living through a crisis, undoubtedly the greatest experienced by humans in recent centuries, and also at the dawn of the greatest revolution ever seen since prehistoric times. However, we cannot forget that crisis is a noun derived from the Greek verb “krínein”, which in fact means to judge, evaluate and decide. What decisions must be taken, and what part does digital language have to play? We have to decide how we will move from a solid, dogmatic society to a liquid society rooted in contracts, from a coercion-based society to an agreement-based society. How can such a feat be accomplished? Through an analogue formula, universally distributed via digital terminals.
New technology can play as important a role as it wants in the coordination of future secular societies. The spirituality of each individual (if it exists) will have to be made compatible with the founding of a non-theological community whose citizens, as a whole, will never be subjugated to any creed. It remains to be seen how we will move from old anthropomorphic theogonies to the new cosmogonies of modern science. What we previously called pneuma or spiritus will be reborn as bytes upon bytes stored in our devices. The eternally veiled face of the divine will be hooked up to our interfaces and screens, and its Holy Name will grace the very code of our software.
Taking 2001 to be the endpoint of postmodernity does not mean it has been vanquished. Quite the opposite. In the coming years, the period leading up to 2001 will come to be reappraised as a time of much needed rebuttal of the old and unfettered dissemination of anthropological experimentalism. However, we are no longer in the postmodern period. What is now left to figure out, following the interregnum period from 2001 to 2008, is what a society will look like in which the market, despite its best efforts, cannot and does not dictate our existential consciousness. The crisis that broke at the end of 2008 brought two things to light: deregulation does not lead to infinite wealth, and the markets are incapable of self-management. We therefore find ourselves in a new cycle which has caught many but not all of us off guard. We have known for a long time where wealth will be found in the third millennium: in the possession and exchange of information, in the attainment and assimilation of signs and symbols. We have also realised (thanks especially to computers, communicative media, and new technology) what truly remains to be developed: the brain and how it can work in service of its innate abilities. The new frontier will be psychic. It dwells within us already and will be what catalyses and provokes tangible, physical changes across our society and our planet.