INTERFACE MYTHOLOGIES
- XANADU UNRAVELED
Christian Ulrik Andersen & Søren Pold
“We are seduced by the interface into neglecting the work behind it, and the operationalization and instrumentalization of dreams that takes place. The interface appears mythical, absolute and frozen.”
TECHNO MYTHS
Data mining, machine learning and other disciplines involved in finding patterns in data promise a future with new insights that will enable a new mode of intelligence. However, as with much other technological marketing, this is also a myth. In our interface criticism, we propose to engage with ubiquity, openness, participation and other aspects of this intelligence as mythological constructions which are presented to us via interfaces.
Following on from Roland Barthes’ seminal studies of visual culture, where he discusses everything from striptease to washing powder, we intend to engage with the illusions of technologies. In many ways it is, for instance, an illusion to believe that a computer system can really forecast everything. As with weather forecasts, predictions of traffic, browsing, and other behaviours are faulty. Machine learning works by approximation and by generating generalized functions of behaviour, which are only generalizations after all; and similarly, the data we produce is captured by technologies that constantly have to deal with the noise of many simultaneous and ambiguous actions. However, from the perspective of a mythology, the important aspect is not whether the generated algorithms work or not, but how they become part of our reality. For instance, they function as speech acts that create correlations between ‘data analytics’ and ‘intelligence’, and this performative act may have a real impact when we rely on this alleged intelligence – when we market products, control traffic, fight terrorism or predict climate changes.
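To make this mechanism concrete, the following minimal sketch (in Python; the data, numbers and names are invented for illustration and are not drawn from any of the systems discussed here) shows a model ‘learning’ traffic behaviour by fitting a generalized function to noisy captured data. Its forecasts remain approximations; the residual errors are precisely the simultaneous, ambiguous actions that the generalization smooths over.

# Hypothetical sketch: machine learning as approximation over noisy, captured behaviour.
import random

# Simulated "captured" behaviour: hour of day -> traffic volume, plus noise
data = [(h, 100 + 10 * h + random.gauss(0, 25)) for h in range(24)]

# Least-squares fit of a straight line: the "generalized function" of behaviour
n = len(data)
mean_x = sum(h for h, _ in data) / n
mean_y = sum(v for _, v in data) / n
slope = sum((h - mean_x) * (v - mean_y) for h, v in data) / \
        sum((h - mean_x) ** 2 for h, _ in data)
intercept = mean_y - slope * mean_x

def predict(hour):
    """The model's forecast: a generalization, not the behaviour itself."""
    return intercept + slope * hour

# The residuals are the noise of many simultaneous, ambiguous actions
# that the generalization necessarily smooths away.
errors = [abs(predict(h) - v) for h, v in data]
print("mean absolute error:", sum(errors) / n)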
The mythologization of technology that takes place in these speech acts does not imply that how the technology ‘really works’ is hidden; rather, it lies in the ability to automatically associate certain images with certain significations in an absolute manner. To follow on from Roland Barthes, the mythologization of our smart technologies removes the history of intelligent systems, smartness, ubiquity, openness, and so forth, from the linguistic act. Just as we do not question that Einstein’s famous equation, and equations more generally, are keys to knowledge – as Barthes describes – intelligent systems for smart cities, state security, logistics, and so on suddenly appear absolute.1 Along with openness, participation and other techno myths, ‘smartness’ appears as an algorithmic reality we cannot question.
However, all techno myths should be seen as expressions of how we want the world to be, rather than what it really is. In order to perform an interface criticism, we do not need to discuss whether the technologies are true or false – for the smart techniques of data mining, machine learning, and so forth, obviously work – but we need to realize that their myths are also part of our reality. As Philip Agre has noted, we subject our actions to the system that needs to capture them as data; and this deeply affects the way we produce, socialize, participate, engage, and so on.2 The monitoring of academic production and the capture of citations are, for instance, used to create indexes that indicate impact. Ideally, this can affect the efficiency of academia and be a relevant parameter for funding opportunities, careers, and the like. Even though this efficiency may be absent, the data capture still has an effect on the perception and performance of academic work; it is constitutive of our habitat and subtly affects our habits.
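As a concrete, hypothetical illustration of such an index (the h-index is one well-known example, though the article does not single it out), the sketch below shows how captured citation counts are reduced to a single number that then stands in for ‘impact’:

# Hypothetical illustration: captured citation data becomes an "index that indicates impact".
# The h-index is the largest h such that h of an author's texts have at least h citations each.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# The captured number is indifferent to *why* a text is cited --
# for its insight, its mistakes, or its summary of others' work.
print(h_index([12, 9, 7, 3, 1]))  # -> 3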
In many ways, the technological myths always feel real, and they are dominant actors that affect a range of areas – from the perception of the weather, to our cities, to our cultural production and consumption. We have every reason to question not only whether the technology works, but also the implications of its myths. It is often when we realize the pointlessness of our actions (that texts can be cited for their mistakes rather than their insights, or for their summaries of knowledge rather than their epochal value) that we begin, structurally, to question the absolute assertions about the world embedded in the myth, and also to envision alternatives.
In this article, we do not want to dismiss intelligent, open, participatory or other technologies, but to discuss how technologies participate in the construction of myths. To us, this criticism fundamentally involves a mythology – a critical perspective on the interface that explores how the interface performs as a form of algorithmic writing technology that supposedly transcends signs, culture and ideology. Focusing on the interface as a language diverts attention away from technology’s immediate assertions about reality – the technical fix – and highlights the materiality of its staging. The aim will be to discuss how technologies perform as dreams of emancipatory or other post-semiotic, idealized futures, and to argue for the need for an interface mythology that critically addresses technologies as myths and unravels them as value systems and tools for writing – of both future functionalities and future cultures.
There is a general tendency to develop technology in the light of cultural utopias. The development of hypertext is a very good example of this. With the emergence of hypertext in the sixties (and later the WWW, weblogs, social media, and much more), the development of various forms of textual networks has been intrinsically linked to strong visions of new ways of producing, experiencing and sharing text. One of the strongest proponents of such visions has been Theodor H. (Ted) Nelson. Nelson’s Xanadu is a lifelong project, and it has been the point of departure for numerous reflections on the development of hypertext. Perhaps the best known of these texts is Computer Lib/Dream Machines from 1974, a self-published book featuring illustrations, cartoons and essays on various topics, all aiming in different ways to explore alternative ways of thinking related to computers.
Furthermore, the book can be read from both ends. One end offers ordinary readers a technical explanation of how computers work; as Nelson writes: “Any nitwit can understand computers, and many do. Unfortunately, due to ridiculous historical circumstances, computers have been a mystery to most of the world.”3 The other end is meant to make the reader see the development of the computer as a “choice of dreams.”4 According to Nelson, what prevents us from dreaming is the developer’s incomprehensible language (or, as he labels it, “cybercrud”), which in his view is just an excuse to make people do things in a particular way; that is, to let the technocratic visions of culture stand unchallenged.
Already in 1965 Nelson invented the term hypertext for a new kind of file structure for cultural and personal use:
"The kinds of file structures required if we are to sue the computer for personal files and as an adjunct to creativity are wholly different in character from those customary in business and scientific data processing. They need to provide the capacity for intricate and idiosyncratic arrangements, total modifiability, undecided alternatives, and thorough internal documentation." (...) "My intent was not merely to computerize these tasks but to think out (and eventually program) the dream file: the file system that would have every feature a novelist or absentminded professor could want..."5
In this way, Nelson was already in 1965 aware that developing alternative uses of the computer was closely linked to developing alternative versions of the technical structure, and even of the file system. He continued – and still continues – to develop his idea of hypertext, premiering the first publicly accessible version at the Software exhibition of technological and conceptual art in New York in 1970. The visions and dreams reflect a recognition that the power of computation – or of computer liberation – is linked to visions of a new medium; that the inner signals of cathode ray tubes are related to signs and signification, and therefore to cultural visions. In other words, they are linked to the hypothesis that the computer interface, at all levels, and not just the graphical user interface, is an interface between the technical and the cultural. When text, for instance, is treated by protocols, there is a double effect: not only does the cultural form of the text change (e.g. from book to hypertext), but the technology itself also appears as a deposition of cultural values. This is why the discussion of the future of text and images, on the web and in e-books, also appears as a discussion of text protocols and formats.
Many writers and theorists have adopted Nelson’s visions of alternatives, and of new modes of producing, reading and sharing text. For example, in his book Writing Space, Jay David Bolter explored what writing was before hypertext and what it could potentially become with it.6 Bolter’s main hypothesis was that print would no longer determine the presentation and organisation of text, nor the production of knowledge. Readers would become writers, and this would undermine the authority of print; writing would become liquid, and we would experience a space of creative and collective freedom. However, as we have experienced on today’s Internet, not everything has turned out so rosy. There are plenty of reasons to look more critically at Facebook, Twitter, wikis and other services.
Nelson’s Xanadu system already included an advanced management instrument, the so-called ‘silver stands’: stations where users can open accounts, dial up and access the information of the system, process publications and handle micropayments. Nelson himself compares this to a McDonald’s franchise, and the silver stands somehow resemble the Internet cafés of the late 90s and early 2000s, or the commercial, centralized platforms of Web 2.0. Furthermore, copying content in the Xanadu system is restricted to dynamic “transclusions” that include the current version of the original text and assure a small royalty when accessed – a so-called “transcopyright”.
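The idea of transclusion can be sketched, very loosely, in code. The following toy example is hypothetical – all class and variable names are invented, and Nelson’s actual Xanadu designs are far more elaborate – but it illustrates the two properties described above: a transclusion points into the original document rather than copying it, so readers always see the current version, and each access can accrue a small royalty for the original author.

# Toy sketch of transclusion and transcopyright (invented names, not Nelson's implementation).
class Document:
    def __init__(self, author, text):
        self.author = author
        self.text = text
        self.royalties = 0.0

class Transclusion:
    def __init__(self, source, start, end, royalty_per_access=0.001):
        self.source = source          # reference to the original, not a copy
        self.start, self.end = start, end
        self.royalty_per_access = royalty_per_access

    def render(self):
        # Reading through the transclusion credits the original author.
        self.source.royalties += self.royalty_per_access
        return self.source.text[self.start:self.end]

original = Document("Nelson", "A file structure for the complex and the changing.")
quote = Transclusion(original, 0, 16)
print(quote.render())          # shows a span of the original
original.text = "An updated file structure for the complex."  # original revised
print(quote.render())          # reflects the current version of the source
print(original.royalties)      # 0.002 accrued from two accesses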
When looking at the services of Facebook, Google, Amazon, Apple, and so on today, it is similarly obvious that the common production modes characteristic of a free writing space are accompanied by strict control mechanisms. There are, for instance, strict protocols for the sharing, searching, writing and reading of text, and these protocols often ensure an accumulation of capital and compromise the anonymity and freedom of the participant. In other words, the instrumentalization of the dream includes everything but the dream itself. The envisioned shared, distributed, free and anonymous writing space is in fact a capitalised and monitored client-server relation.
This critique of contemporary interface culture is perhaps not news, but what we want to stress here is the effect of the instrumentalization of dreams and visions. What this indicates is that down the ‘reactionary path’ (that is, the path of instrumentalization), our dreams turn into myths. However, the ethos of the dreams remains, and becomes automatically associated with the technical systems.
The dream of a shared writing space, a Xanadu, that overcomes the problems of representation facing linear text forms; the hypertext system’s instrumentalization of this dream; the mythological status of such systems; and the attendant critique of them – all fit into a three-phase model of media presented by the German media theorist Hartmut Winkler.
From a linguistic perspective, all new media are, in the first phase, considered post-symbolic, concrete and iconic communication systems that present a solution to the problem of representation, or the arbitrariness of the sign. Winkler even sees the development of media as “deeply rooted in a repulsion against arbitrariness”, a “long line of attempts to find a technical solution to the arbitrariness” dating back to the visual technical media of the 19th century.7 Similarly, hypertext was perceived as establishing a truer relation between form and content, because of its more intuitive, democratic, less hierarchical and nonlinear structure. It will often be the investment in the dreams that pays for their technical implementation: you not only buy new functionality, you buy a new way of living, working, thinking and dreaming. In this way, the development of hypertext, the WWW, social media – and also computer games and virtual reality, with their alleged liberation of the user – is driven by an urge to fulfil a dream, a vision of a new future.
In the second phase, the utopias become natural, stable and hegemonic. Through subsumption by market forces they become commodified, and are sold as myths of being part of a media revolution. However, subscribing to this reality also entails an explicit lack of visions of alternative futures, and it therefore lacks the critical, activist and heroic dimensions of the first phase.
It is, however, also a phase in which people begin to study the media and learn how to read and write with them. In other words, the new medium enters a phase where it is seen as a language, and hence where the arbitrariness of the sign is reinstalled. In the third phase, this arbitrariness has turned into disillusionment over the medium’s limitations – which, however, also constitutes the ground for new visions, new media technologies, new interfaces, and new media revolutions.
The question is how far we are, today, from Ted Nelson’s critique of centralised data processing and IBM-like visions of efficiency and intelligence. In several ways, it seems as if we are in a phase where we might soon begin to regard big data, smart systems, social intelligence, and so forth, as a language; where we begin to see through the technological systems’ mythological status, or at least their dark sides in the form of control and surveillance. This is by no means an easy phase. As Ted Nelson also noted, “Most people don’t dream of what’s going to hit the fan. And computer and electronics people are like generals preparing for the last war.”8 The developers of technology and their supporters will often insist that their system is the future, and that users’ actions need to follow the system’s intrinsic logic.
From a design perspective, the assumption will typically be that the clearer the representation of the computer’s signal processes appears (or the mapping of mental and symbolic labour – the formalization of labour into computer language performed by the programmer), the more user-friendly and understandable the user interface becomes. For computer semiotics, the aim was ultimately to create better interface design. However, in relation to an interface criticism, it is noteworthy how computer semiotics also explains how the design process in itself contributes to the mythological status of the interface – its absolute assertions about the world.9 In other words, the myths of interfaces are not only established through how they are represented elsewhere (how they are talked about, written about, advertised, etc.), but also through the interfaces themselves, and how they are designed. It is in its design as a medium, and in its claims to an iconic status as a communication system, that we find the interface’s operationalized mythology. And, in a general perspective, this is not unlike how media such as photography, film, the panorama, and so on have, according to Hartmut Winkler, tried to operate in earlier times.
To read this myth demands that one begin to read the media – or, in our case, the interface. The interface is a tool for reading and writing, not an absolute representation of the world. We must, therefore, begin to pay attention to the sign-signal relations established in interface design, as a particular production mode, a particular kind of labour: a production of signs that at once reflects cultural and historical processes and leaves an imprint on the world and on how we organise and deal with it.
For instance, the software of the print industry, as Nelson also demonstrates, both reflects the historical and cultural origins of print and negotiates the reality of text as searchable, sequential, iterative, sortable, and so forth. Our file formats and standards for storing and showing data also reflect such processes. Jonathan Sterne, for instance, has recently analysed how the diameter of the Compact Disc directly reflects its relation to the cassette tape, and how the mp3 format embeds an audio culture of listening in its sound compression – which directly challenges the conception of technological progress as equal to increased fidelity.10 Even the electrical circuits and the signal processes deep inside the computer can be viewed as the result of language acts, as Wendy Chun has pointed out.11
Computer software and its formats and platforms promise us dreams of the future: of technological progress, better opportunities to make our music portable and shareable, better ways of organising our work, and so forth. It is often these dreams that carry technological development. However, the dreams have a tendency to freeze and gain an air of absoluteness and hegemony. This happens through their commodification and appropriation into a reality of power and control. Technology is marketed as a utopia of being in the midst of a media revolution. But in this phase the cultural and historical residues are hidden. We are seduced by the interface into neglecting the work behind it, and the operationalization and instrumentalization of dreams that takes place. The interface appears mythical, absolute and frozen. We do not see the mp3 format’s compression of sound as the result of an audio culture, but as the only possible scenario, a technological fact; we do not see workers’ IT systems as the result of a negotiation of labour processes; and we do not see the operating system’s metaphorization of actions as anything other than the result of natural selection in the evolution of technologies. To get out of the deception of the technological facts, we need interface mythologies – critical readings of the interface myths.
Suggested citation: Andersen, Christian Ulrik and Pold, Søren (2018). “Interface Mythologies – Xanadu Unraveled.” In Interface Critique Journal Vol.1. Eds. Florian Hadler, Alice Soiné, Daniel Irrgang. DOI: 10.11588/ic.2018.0.44738
This article is released under a Creative Commons license (CC BY 4.0).
Christian Ulrik Andersen, PhD, is Associate Professor at Aarhus University, Dept. of Digital Design and Information Studies. Inspired by network and software culture, his research addresses the intersection between software and cultural performativity. In particular, he addresses the notion of “interface criticism” as performed in a variety of design and arts practices.
Søren Bro Pold (Aarhus University) has published on digital and media aesthetics. His main research field is interface criticism, which discusses the role and the development of the interface for art, literature, aesthetics, culture and IT.
Selected Publications:
Andersen, Christian Ulrik, and Søren Pold, eds. Interface Criticism: Aesthetics Beyond Buttons. Århus: Århus Universitetsforlag, 2011.
Andersen, Christian Ulrik, and Søren Pold. The Metainterface: The Art of Platforms, Cities and Clouds. Cambridge, MA & London, England: MIT Press, 2018.
References
Agre, Philip E. “Surveillance and Capture: Two Models of Privacy.” In The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort, 737-60. Cambridge, MA: MIT Press, 2003.
Barthes, Roland. Mythologies. Transl. Annette Lavers. New York: Hill and Wang, a division of Farrar, Straus & Giroux, 1972.
Bolter, J. David. Writing Space: The Computer, Hypertext, and the History of Writing. Hillsdale, NJ: L. Erlbaum Associates, 1991.
Chun, Wendy Hui Kyong. Programmed Visions: Software and Memory. Software Studies. Cambridge, MA: MIT Press, 2011.
Nelson, Theodor H. “Computer Lib / Dream Machines.” In The New Media Reader, edited by Nick Montfort and Noah Wardrip-Fruin, 301-38. Cambridge, MA: MIT Press, 2003 (1974/1987).
Nelson, Theodor H. “A File Structure for the Complex, the Changing, and the Indeterminate.” In The New Media Reader, edited by Nick Montfort and Noah Wardrip-Fruin, 133-45. Cambridge, MA: MIT Press, 2003 (1965).
Pold, Søren, and Christian Ulrik Andersen. The Metainterface: The Art of Platforms, Cities and Clouds. Cambridge, MA: MIT Press, 2018.
Sterne, Jonathan. Mp3: The Meaning of a Format. (Sign, Storage, Transmission). Durham: Duke University Press, 2012.
Winkler, Hartmut. Docuverse. Ratisbon: Boer, 1997.
Footnotes
1 Roland Barthes, Mythologies, transl. Annette Lavers (New York: Hill and Wang, a division of Farrar, Straus & Giroux, 1972).
2 Philip E. Agre, “Surveillance and Capture: Two Models of Privacy,” in The New Media Reader, ed. Noah Wardrip-Fruin and Nick Montfort (Cambridge, Massachusetts and London, England: MIT Press, 2003). According to Agre there are two dominant notions of surveillance. Surveillance is often perceived in visual metaphors (i.e., ‘Big Brother is watching’); however, computer science mostly builds on a tradition of capturing data in real time, and is often perceived in linguistic metaphors (‘association’, ‘correlation’, etc.). Hence these metaphors are also better suited to describe the kinds of surveillance taking place when data capture permeates social life, friendship, creative production, logistics, and other areas of life.
3 Theodor H. Nelson, “Computer Lib / Dream Machines,” in The New Media Reader, ed. Nick Montfort and Noah Wardrip-Fruin (Cambridge, MA: MIT Press, 2003 (1974/1987)), 302.
4 Ibid. 305.
5 Theodor H. Nelson, “A File Structure for the Complex, the Changing, and the Indeterminate,” in The New Media Reader, ed. Nick Montfort and Noah Wardrip-Fruin (Cambridge, MA: MIT Press, 2003 (1965)), 134.
6 J. David Bolter, Writing Space: The Computer, Hypertext, and the History of Writing (Hillsdale, NJ: L. Erlbaum Associates, 1991).
7 Hartmut Winkler, Docuverse (Ratisbon: Boer, 1997), 214.
8 Nelson, “Computer Lib / Dream Machines,” 305.
9 On computer semiotics and the work of Frieder Nake and Peter Bøgh Andersen, see Søren Pold and Christian Ulrik Andersen, The Metainterface: The Art of Platforms, Cities and Clouds (Cambridge, MA and London, England: MIT Press, 2018).
10 Jonathan Sterne, Mp3: The Meaning of a Format, Sign, Storage, Transmission (Durham: Duke University Press, 2012).
11 Wendy Hui Kyong Chun, Programmed Visions: Software and Memory (Cambridge, MA and London, England: MIT Press, 2011).