“We may no longer be able to trust technology. A computer program could, without warning, become an uncontrollable force, triggered by a date, an event or a timer.”
(Clough and Mungo 223)
In 1991 the Information Security Handbook noted how “society is becoming increasingly dependent on the accurate and timely distribution of information” (Shain 4). This dependence, however, exposed society to new kinds of dangers: accidents that have to do with information disorders – viruses, worms, bugs, malicious hackers and so on.
In this essay, I focus on digital viruses as disorderly elements within digital culture. It is due to certain key principles in computing and computer security that viruses and worms have acquired their contemporary status as malicious software, that is, malware – elements of chaos, accident and disorder. According to my claim, the fear of viruses does not stem solely from the contemporary culture of digital technology. It is part of a longer genealogy of modern computing, which has emphasized issues of control, reliability and order. Viruses and worms threaten the conceptual ontology of digital culture in much the same way that epidemic diseases have figured social disorder throughout Western history. Unlike AIDS or other deadly biological viruses, computer viruses have not been known to cause human casualties, yet in recent years they have been treated as the “killer viruses of digital culture”, a phrase connoting the seriousness of the threat. A “viral perspective” on digital culture reveals how underlying articulations of order are used to construct an all-too-harmonious picture of computers in modern society.
Anti-virus manuals, guidebooks and other such publications have contributed especially to our understanding of viruses as threats to the orderly digital society. The editorial of the first issue of Virus Bulletin (July 1989) sees viruses as a cunning form of vandalism and an indication of a “sabotage mentality”. Viruses destroy information and produce uncontrollability:
Rather like Hitler’s V1 ‘flying bomb’, no-one knows when or where a computer virus will strike. They attack indiscriminately. Virus writers, whether or not they have targeted specific companies or individuals, must know that their programs, once unleashed, soon become uncontrollable. (Virus Bulletin 2)
Computer viruses mean unreliable and unexpected danger: they are a chaotic element within a system based on security and order. According to a widely embraced view, computer security consists of 1) confidentiality (the privacy of sensitive information), 2) integrity (authorized information and program exchange) and 3) availability (systems work promptly and allow access to authorized users) (e.g. Shain 5).
Viruses and other forms of malicious code are, consequently, a direct threat to these values, and to the modern episteme in general. This is what I will here define as “a computational way of thinking”. The concept refers not only to the epistemological and ontological presuppositions of actual computer science discussions, but also to the larger cultural-historical contexts surrounding the design, implementation and use of computers. One has to note, of course, that computers have never been the reliable and rational dream-machines they have been taken to be, as they are exposed to various potentials for breakdown, of which viruses and worms form only a minor part. Yet, interestingly, professional and popular depictions of digital viruses consistently present computers as otherwise integrated, coherent and pristine machines of rationality, only temporarily disturbed by the evil occurrences of external malicious software.
The virus researcher Vesselin Bontchev acknowledges how issues of trust and control are at the heart of computing and the virus threat: “a computer virus steals the control of the computer from the user. The virus activity ruins the trust that the user has in his/her machine, because it causes the user to lose his or her belief that she or he can control this machine” (31).
This definition resonates with broader cultural trends of modernization. Zygmunt Bauman has expressed the essence of modern science as an “ambition to conquer Nature and subordinate it to human needs” (39). Bauman understands this as “control management”: the moulding of things to suit human needs. The essence of modern technology proceeds along the same lines, defined through values of progress, controllability, subordination of chaos and reification of the world. From the nineteenth century on, technology became closely associated with advances in science. The values of order and control were embedded in machines and technological systems, and with time these values became the characteristics of modern technological culture.
In this vein, modernity can be defined as a new attitude towards controlling information. Capitalism and digital culture as historical phenomena share a valuation of abstraction, standardization and mechanization, which were already part of the technological culture of the nineteenth century. Similarly, Turing’s universal machine was above all a machine of ordering and translation, with which heterogeneous phenomena could be equated. This idea, concretised in typewriters, conveyor belts, assembly lines, calculators and computers, served as the basis for both digital machines and capitalism. The concrete connection was the capitalist need to control increasingly complex flows of production, circulation and signs. Rationalism – as exemplified in Babbage’s calculating engines, Taylor’s ideas of work management and cybernetics – was the image of thought incorporated in these machines (Gere 19–40).
In general, first-order cybernetics fulfilled the project of modern abstract rationalism. In other words, notions of control and order play a significant role in the archaeology of information-technological security, and these themes are especially visible in the thinking of Norbert Wiener, the pioneer of cybernetics.
Wiener’s cybernetics concerns, above all, understanding the world as communication circuits and controlling them via successful feedback loops that maintain the stasis of a system. This theory relates closely to the problem of entropy, a classical notion of nineteenth-century statistical mechanics: “Just as the amount of information in a system is a measure of its degree of organization, so the entropy of a system is a measure of its degree of disorganization; and the one is simply the negative of the other” (Wiener 11).
Wiener and the modern era share a respect for control and security. As products of modernity, cybernetics, systems theory and information theory are all in a way theories of order and cleanliness. This is the main theme of Stephen Pfohl’s essay “The Cybernetic Delirium of Norbert Wiener”, in which he describes the cultural historical background of modern cybernetic culture. To Pfohl, cybernetics does not mean a purely academic discipline but “a term connoting the most far-reaching of ultramodern forms of social control.” Pfohl delineates the genealogy of cybernetics from the early projects on anti-aircraft artillery to the functioning of the contemporary capitalist media culture. For Pfohl, Wiener’s theories connect directly to the power structures of modern society, sacrificing other ways of being, restricting other possible worlds from emerging. Paraphrasing Pfohl, cybernetics regulates and modifies the dynamic flows of the world into fixed, stabilized and controlled boundaries.
The engineering problem of logical calculation and the communication of signals without noise expands towards the more general cultural fields of power and articulation. I would especially like to pick up the notion of noise, which, as understood by Bauman, means undefinability, incoherence, incongruity, incompatibility, illogicality, irrationality, ambiguity, confusion, undecidability, ambivalence – all tropes of “the other of order” (7). For cybernetics and early computer pioneers, noise was a management problem, an obstacle to transmitting signals. Noise, the most important problem for the rise of modern discourse networks, was never solved once and for all in any historical phase but has remained part of communication acts ever since; the only resolution to the problem of non-communication was to incorporate it within the system (Kittler 242).
Computer viruses can be understood as contemporary instances of this notion of noise. They are software that short-circuits the “normal” operations of a computer and attaches itself to the basic functioning of the machine. Viruses mean a short-term wiring of noise into the components of a computer. By definition, viruses have been conceived as a threat to any computer system because a) virus activity is always uncontrollable, since the actions of the virus program are autonomous, and b) viruses behave indeterminately and unpredictably (Lamacka 195).
In a much more positive vein, this coupling of computing order and viral disorder has been noted in recent net art projects. According to the net artist Jaromil:
the digital domain produces a form of chaos – which is inconvenient because it is unusual and fertile – on which people can surf. In that chaos, viruses are spontaneous compositions which are like lyrical poems in causing imperfections in machines “made to work” and in representing the rebellion of our digital serfs.
Jaromil takes noise as his starting point and articulates how viruses also function as forms of resistance to the contemporary informational-capitalist ideology of the digital. Charlie Gere’s analysis of the connections between modern technology and capitalism is apt here as well: the abstract, standardizing and mechanizing machines of modernization serve as the basis for both the cult of the digital and contemporary capitalism, in a way that makes the two almost siblings. Thus the accidents of this techno-capitalist culture are not solely technical but also social, in that they are articulated on the plane of society and cultural interaction. Viruses can thus be understood as the “unwanted bads” that are a by-product of the post-industrial culture of producing goods (Van Loon), and they can also be viewed alongside other mass-mediated apocalyptic monsters threatening the order of contemporary Western culture, as Luca Lampo from the net art group _[epidemiC]_ suggests:
We feel that “The Virus” is the “stranger”, the “other”, in our machine, a sort of digital sans papier—uncontrollable diversity. Once Hollywood, like Empire, finished killing “Indians” and the “Soviet Russians”, the Hollywood propaganda machine had to build other anti-Empire monsters to keep alive the social imaginary of 2001: aliens, meteors, epidemic… so many monsters.
In this light, while being technical bits of code that from time to time cause trouble for users, viruses also act as social signs which can be activated in various contexts. For representatives of the official computer culture, viruses and worms are signs of disorder, chaos and crime that undermine the presumed reliability of digital culture, which would otherwise function “normally”. Yet, according to some commentators, viral disorder need not mean sheer anarchy but a space for variation and experimentation that resists the one-way ideology of computer rationalism (see Sampson; Cohen; Deleuze). For some, that ideology has been crystallized in the figure of Microsoft, a popular target for virus attacks.
This view accentuates that the genealogy of computers and rationalism analysed above is but one potential history. There is always the possibility of writing a counter-memory of the disorderly, accidental, probabilistic and contingent nature of technological culture. Hence, viruses might also prove to be intellectual tools with which to create new concepts and viewpoints on digital culture and on the cultural history of computing and technology in general. Martin Heidegger (§ 16) already proposed that modern technology reveals itself at the moment of its breaking.
In this sense, viruses reveal the functioning of a certain ideological or micro-political constitution of digital order. The challenge is not to take a notion of a “healthy” cultural network without disturbances as the starting point, but to see elements of break-up as part and parcel of those systems. Even if we are used to thinking of systems as orderly and harmonious, “[i]n the beginning there was noise”, as Serres (13) notes. This underlines the conceptual space we should give to the parasites that reveal the networks of power which otherwise remain unnoticed.