FROM MY TO ME
Me is cheap, Me is easy to control, Me is easy to channel, Me is a slave of its own reflection, Me is a slave of the platforms that make the reflection glossy. Me is data. Me is data closest to metadata. This makes Me just perfect to satisfy advertisers and to sate neural networks.
This article is an elaboration on the statements about the WWW, web design and personal websites I made in my recent talks[1] and articles, as well as those included in this volume. As the editor (and probably the readers as well) noticed, as soon as I look for counter-examples to new media products made following the cruel and hypocritical UX paradigm, I come up with a website – or more precisely, with a website of a particular genre – “the 90s GeoCities”.[2]
This selectivity has reasons and is intentional. As a keeper and researcher of the One Terabyte of Kilobyte Age[3] archive, I am surrounded by GeoCities sites built and abandoned by amateur webmasters between 1995 and 2009. Amateur websites are central to my argument because they are the corpus of the archive and of my research on web history. This focus is not accidental, though – it developed from the thesis that personal web pages are the conceptual and structural core of the WWW.
Their emergence was accidental, their time was short, their value and influence were downplayed, they were erased or hidden. And since this arrogance of the IT industry and Human Computer Interaction (HCI) circles was also not accidental, but followed the call of the “invisible computer”, the core instrument of which is alienating the users from their medium, I chose to argue for the opposite and to illustrate the argument with artefacts that highlight moments in the history of the medium when its users were in power.
The choice of the word “moments” and the use of the past tense is also intentional and deserves comment. The fact that the time of personal pages is over is self-evident. What is obfuscated by today’s early web nostalgia (netstalgia) trend, though, is the fact that there was never a time for them.
Just as there was no Web 1.0 period as such. First of all, the term is retrospective. And second: the Web 2.0 marketing claim made by the Silicon Valley of 2004[4] about the Web’s future should not be allowed to define the 10 years of web history prior to it – a period that was neither homogeneous nor simply its opposite. There was no 2.0 cut into the history of the Web that left certain content and forms – namely personal websites – behind.
Nor was there some sort of evolution or natural development that made people stop building their personal websites. Professionalisation and faster Internet, often named as reasons for amateur pages dying out, could just as well have become reasons for the opposite: for a bright, rich and long tradition of people building their cyberhomes themselves.
There was no time in the history of the Web when building your home was celebrated and acknowledged by opinion leaders. The idea that you should invest time in building your corners of cyberspace was mercilessly suppressed by hosting service providers and “fathers” of the Internet. The sarcastic “They may call it a home page, but it's more like the gnome in somebody's front yard”[5] was stated not by some social networking prophet, not by, metaphorically speaking, Mark Zuckerberg or Jack Dorsey, but by Tim Berners-Lee himself, and it happened as early as 1996, the year we usually see as a golden age of amateur pages.
I have several suggestions for those who decide to make their home page in the third decade of the twenty-first century. Most of them will appear at the end, but there is one I’d like to make right away:
Don’t see making your own web page as nostalgia, don’t participate in creating the netstalgia trend. What you make is a statement, an act of emancipation. You make it to continue a 25-year-old tradition of liberation.
To understand the history of the Web and the role of its users, it is important to acknowledge that people who built their homes, houses, cottages, places, realms, crypts, lairs, worlds, dimensions [Fig.1–13] were challenging the architecture and the protocols, protocols in a figurative not technical meaning. Users hijacked the first home page of the browser and developed this concept in another direction.[6] A user building, moving in, taking control over a territory was never a plan. It was a subversive practice, even in 1995.
“Q: The idea of the ‘home page’ evolved in a different direction.
A: Yes. With all respect, the personal home page is not a private expression; it's a public billboard that people work on to say what they're interested in. That's not as interesting to me as people using it in their private lives. It's exhibitionism, if you like. Or self-expression. It's openness, and it's great in a way, it's people letting the community into their homes. But it's not really their home. They may call it a home page, but it's more like the gnome in somebody's front yard than the home itself.”[7]
Tim Berners-Lee didn’t intend to be sarcastic. It would be fair to quote the rest of the answer to see that what he called for was giving web users better, faster and more seamless[8] ways to connect.
“People don’t have the tools for using the Web for their homes, or for organizing their private lives; they don't really put their scrapbooks on the Web. They don’t have family Webs. There are many distributed families nowadays, especially in the high-tech fields, so it would be quite reasonable to do that, yet I don’t know of any.”[9]
Such “webs” started to arrive some years later in the form of LiveJournal, Friendster, Facebook and other platforms that clearly showed web users that their part was to be connected and deliver content, not to build anything.
I don’t think that in 1996 anybody was really hurt or stopped making web pages because of the remark the father of the Web made. People building what was “not really their home” were reading other texts at that time: HTML manuals, web graphics tips and tricks, and source codes of each other’s websites. They would rather buy HTML for Dummies or Home Sweet Home Page and the Kitchen Sink than the WWW Consortium corporate journal.
Mentioning web design manuals is not a side remark here, but a suggestion to pay closer attention – as documents[10] of the epoch – to the books that explained the World Wide Web to newcomers and taught them to make web pages: books such as Teach Yourself Web Publishing with HTML 3.2 in 14 Days; Building Your Own Website; Jazz Up Your Web Site in a Weekend; Frontpage Web Publishing & Design for Dummies; Publish It on the Web! – and other titles that shout: the Web is the future, the future belongs to you, learn HTML and embrace the future! The older the manual, the younger the medium, the more powerful and diversified the role of the manual’s reader, the Web user. But in the context of this article I send you there not to look for the “good old days”. The manuals are also evidence of personal web pages and their authors being ridiculed by experts: on the very same pages that motivated a newcomer you can often read “amateur” as a negative adjective.
“This page shouts ‘Amateur’”[11]
“There's nothing that says, ‘I'm an amateur Web designer and I don't know what I'm doing’ like 3-D logos”[12]
“Visit an amateur home page and see how excessive scrolling drags its nails across the blackboard of the user's experience”[13]
As early as 1996, personal home pages as a genre and early web makers (as a group) were being made fun of and blamed for all the ugly stuff. It is the year when David Siegel published Creating Killer Web Sites. Describing the history of the WWW up to that moment, he announced a third generation of web design that would come to replace the second-generation site, which for him was the world of the amateur web and which he described as follows: “At worst, noisy backgrounds and interminable waits for sound files make these sites unbearable. At best, they are nice white sites with color-coordinated icons”.[14]
“The audience for personal pages is basically only one person -- the creator of the site.”[15]
“It's perfectly OK for you to be as wild and crazy as you want because the only people who will probably visit your site are friends and family – and they are well aware of your lack of aesthetic taste.” [16]
“ […] they cram every page with embedded MIDI (music) files, pointlessly scrolling JavaScript messages, huge full-color photographs, animated GIFs (flames and dripping blood are especially popular), and blinking and moving text [...] That is bad design, and (we think) bad markup, even if it validates – which is pretty unlikely because folks attracted to dripping blood animations tend not to spend much time learning about web standards.”[17]
The last quote is from Taking Your Talent to the Web, the book with the most beautiful title ever given to a manual. That’s why I borrowed it to be our library’s pseudonym.[18] I also find this book very wise in many aspects: first and foremost for Zeldman’s conviction regarding the medium specificity of web practice and his attempt to divorce it from graphic design in this and other texts. Also, the work that he and his colleagues do at A Book Apart, a publishing house that makes manuals for contemporary web designers, should not be underestimated. But I also think that it was a big mistake to neglect amateurs’ contributions to the development of the Web’s language.
In my opinion, people struggling to position a dripping blood animation in between two skulls and under <marquee>ENTER IF YOU DARE</marquee>, and to pick an appropriate MIDI tune to sync with the blood drip, made an important contribution to showing the beauty and the limitations of web browsers and HTML code.
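For readers who never saw such a page, here is a minimal sketch of the kind of markup being described – the image and sound file names are my own placeholders, not files from the archive:

<marquee>ENTER IF YOU DARE</marquee>
<!-- two skulls framing the dripping blood -->
<img src="skull.gif"> <img src="blood_drip.gif"> <img src="skull.gif">
<!-- a looping MIDI tune; <bgsound> was Internet Explorer's element for this, Netscape users relied on <embed> -->
<bgsound src="graveyard.mid" loop="infinite">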
Making fun of amateurs and blaming them is only half of the problem. More damaging for the history of the Web was the ignoring of personal home pages and their authors in “how-to” books.
Neither the usability camp (Jakob Nielsen) nor the creativity camp (Jeffrey Zeldman, David Siegel), nor their followers, spared a page to analyse the home pages of amateurs, sorting things out exclusively among themselves. From time to time they (as in Nielsen, Zeldman, Flanders) mentioned artists and web artists as exceptions to the rules they established, but never the web vernacular. Even after designers of “photoshop” sites and unviable dot-com hybrids discredited the profession, experts suggested looking for new ideas among... professionals.
“I find inspiration in noncommercial Web creations”,[19] claims Veen, and gives examples of designer portfolios.
“In order to move beyond a conservative, copycat style, you must look beyond the inbred corporate web to the personal sites of today's leading web designers”[20] echoes Cloninger.
Danish researcher Ida Engholm in her 2002 paper “Digital style history: the development of graphic design on the Internet” wrote, “Web design has become an aesthetic phenomenon in its own right and with its own means of expression.”[21]
She continues: “Until now few attempts have been made from the perspective of aesthetic theory to develop reflective approaches to web design.” Ida Engholm was too cautious and modest with this remark. To my knowledge she was the first to attempt such an approach in the international academic press. And one can see that she was strongly informed (or misinformed) by the “how-to books” of the above-mentioned Siegel, Cloninger, Zeldman.
She writes: “[…] web design didn’t develop in a vacuum but shares features with development trends in 20th century design and art and with traditional design areas such as industrial design and graphic communication.” Following Cloninger she looks for web design roots in Swiss Style and Grunge, and discusses Kilobyte Minimalism, Hello Kitty and other popular online, but still graphic, design styles.
Indeed, web design didn’t develop in a vacuum: it grew out of the vernacular web, and it grew in opposition to vernacular expression. But there was obviously an information and education vacuum created around it by the authors of design manuals and other experts and evangelists.
Only in 2008, in Fresher Styles for Web Designers, Cloninger, following Cory Arcangel’s Dirtstyle,[22] introduced “1996 Dirt style”, which he attributed to Myspace, blingee.com and other sites and communities “greatly influenced by hobbyist created personal home pages circa 1996”,[23] without giving a single example of any website from that era.
No wonder that young web designers think that responsive web design was invented this century, although Ethan Marcotte never hid the fact that he only coined the term,[24] bringing back and popularising the principle of the liquid layout, which was very popular among personal home page makers of the mid 90s; and no wonder that Aarron Walter, the author of Designing for Emotion[25] – a web design manual that explains step by step how to create a service in such a way that its users think there is a real person behind it – dares to deliver his point without once mentioning a personal home page.
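A liquid layout, to recall the principle, sizes its columns in percentages of the browser window rather than in pixels, so the page stretches and shrinks with the window; Marcotte’s responsive design later added media queries on top of such fluid grids. A minimal sketch in the 90s fashion (the two columns and their contents are invented for illustration):

<table width="100%">
  <tr>
    <!-- both columns are sized relative to the window, not in fixed pixels -->
    <td width="25%">navigation links</td>
    <td width="75%">welcome text, guestbook, under-construction GIF</td>
  </tr>
</table>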
Webmasters and their production were an easy target. Professional designers, evangelists – they all seized the opportunity: ridiculing, discrediting, alienating, imposing clean styles and templates, usurping the right to make design decisions.
And they succeeded; they protected the Internet from “wrong” colour combinations, from annoying background sounds, from marquees and blinking. But in the long term it was the beginning of the end of web design itself. The rhetoric of alienation that design experts practised in 1996 was picked up by IT giants a decade later.
To quote a tweet from four years ago by Vincent Flanders (the extensively quoted above Flanders, who, book by book, article by article, humiliated websites that were too bright, too loud, too confusing): “in 2016 web design is what Google wants it to be”.[26] Even more true in 2020.
There is no web design and there are no web designers any more; there are graphic designers and developers again – front-end and back-end developers this time. For me as a net artist and new media design educator, this splitting of the web designer into graphic designer and front-end developer is bitter, because it is the death of a very meaningful profession.
“Web publishing is one of the few fields left where the generalist is valuable. To make a great site you need to know a little bit about writing, photography, publishing, UNIX system administration, relational database management systems, user interface design, and computer programming,”[27] writes Philip Greenspun in Philip and Alex’s Guide to Web Publishing in 1999. It would be naive to think that it would work the same way two decades later, taking into account the complexity of modern online products. Still, the web designer is a generalist in a leading position. And knowing a bit of everything is not even the most important part of the profession. The generalist as web designer is a person who sees the medium being designed and shows it to the users, a person who grows up together with the medium (and never gets old, because the medium is forever new) and who has the potential to reshape it, because intelligence is still the ends.
“Web designers are still there though, I think. Just maybe more and more are actually growing into Frontend developers or turning to something more specific like becoming UI/UX designers, or Product designers. It's less browser focused maybe, less ‘web’? Even though most of these still technically rely on web protocols and technologies”,[28] writes Émilie Gervais – net artist at night and “full-stack developer with more experiences as a front-end developer” – in our email correspondence. She sees it more optimistically but still confirms the shift: the Web is not a medium but an underlying technology.
Underlying and invisible. Most of the digital products and interfaces we use today are in fact browsers opened in kiosk mode. The majority of mobile apps, digital signage, installations, and other big and small “experiences” are constructed with HTML, CSS and JavaScript. Front-end developers who can talk with screens and layouts in these languages are in demand, celebrated, well paid … but harmless; they master technologies without ambitions to master the medium.
Without web designers, the Web is left to front-end developers who implement Material Design guidelines (“what Google wants it to be”), to graphic designers mix-n-matching “illustrations for every occasion”[29] – and for the rest of us there is Artificial Design Intelligence (ADI).[30]
“There is no room for ornament on the web. People want to look at Instagram […] because their brain already understands how Instagram is laid out. In my opinion the goal of an artist vs a UX/UI/product designer is totally opposite. To combat templatization and minimalism I try to exaggerate designs with ephemeral styles and effects,”[31] says Steph Davidson. She is web art director at Bloomberg, a publishing house that actually makes an effort[32] to revive the genre – with a website that is different. Bloomberg designers are not the only ones. There are exceptions, and we identify them as such. For example, every work of the German web design duo Christoph Knoth and Konrad Renner makes people say “wow, the Web (design) is alive”. They confirm that “there is a small movement that is fusing web design back together with new tools. We design and develop frontends and backends and it feels like a perfect habitat for our work. We are the living proof”.[33]
This “small movement” is very important for rescuing the profession and the idea that one – be it a publishing house, a festival, a journalistic investigation, a person – needs a website.
“[…] the idea of a site and its relationship to our online identity has far more depth of possibility than ever before, which makes me think the concept of having one’s own site online might never be more relevant given how ‘homeless’ our digital presence is online currently,”[34] writes Jim Groom, co-founder of the Reclaim Hosting initiative.
The homeless status is a reality for individuals, who never know when Facebook will implode together with their images and interactions, and for institutions begging Google and Wikipedia to edit their “knowledge panels”. Experts and celebrities are not better settled than naive users of Instagram.
Nothing is more eloquent than popular tech journalist Katie Notopoulos tweeting: “I had an idea for a blog, but realized that there's nowhere to like, make a new blog (rip tumblr), so I think the best blogging platform now is.... a really long caption on an Instagram?”[35] or the aforementioned web design guru David Siegel, whose web home today is a link list on Medium.[36] Many links to his own text about the future of the Web, once published on dsiegel.com, point to the Wayback Machine.
The father of hypertext gave up updating hyperland.com and directed it to his YouTube channel.[37] The mother of Post-Internet made a spectacular home page[38] for marisaolson.com – the rest of her portfolio is outsourced to blocks and channels on are.na. Among the ruins of online portfolios rises the home page of artist Petra Cortright,[39] who links everything she’s done between 2012 and 2019 to “petra cortright 2019 2018 2017 2016 2015 2014 2013 2012” on lmgtfy.com – a very contemporary gesture, which could be interpreted as both despair and arrogance.
In this situation, I think, a new role and understanding of web designers could be rebuilding homes; showing gnomes the way out of the corporations’ front yards, if I may steal Tim Berners-Lee’s metaphor.
These are not the “ornaments” per se that Davidson mentions, nor the awesome audiovisual effects Knoth and Renner provide to their clients; it is the notion of having an appearance – which they bring back by exaggerating it – and subsequently a place of your own outside of standard interfaces and the grids of algorithmic timelines.
*
To turn designers and users away from technology and back to the medium, one should try to adjust the optics and see the people who made the Web; to write the history of the users (not of the corporations that released this or that product or update) and frame it in a continuum of their actions, views and self-identification. Not an easy task, because on the Web we are always confronted with revolutions, with histories of big men and with binary time[40] and space: before/after, web 1.0/web 2.0, desktop/mobile, flat/material.
Drawing on my slow climb from 1995 to 2004 in the 1TB archive, on my personal journey online that started in 1994 and is still not over, as well as on two decades of teaching new media designers to see and show the environment they work with, I suggest we recognise several trajectories that we (web users) have taken since 1993.[41]
From web designer to front-end developer could be one of these trajectories. This is partially introduced on the previous pages. To make it complete I’d first of all have to place it in a more complex, forking path, starting from webmaster (not web designer), following the genesis and metamorphosis of that profession (passion) through time and niches of the Web.
[Fig. 1–13]
Another trajectory, which would demand a longer text, is Under Construction → Update → Upload: a history of the Web that distinguishes three generations – three “U”s. Under Construction stands for building the Web; Update for having a difficult relationship with the Web – not having time for the Web, it’s complicated, “get a real life”, and more [Fig. 14–16]; and Upload for users’ involvement reduced to feeding the forms with photos, texts, or other types of “generated content”.
[Fig. 20]
Let’s have a closer look at “topgun’s Home Page” [Fig. 20], made and last updated in September 1995. It is a significant one for the archive: first of all because it is the oldest; second, because it is one whose author I could trace, which is rarely the case; and third, because the creator – the person behind Bruce, who is testing how to make a web page – is none other than Ganesh Kumar Bangah, a big name in the South-East Asian IT world: it was he who bought Friendster in 2009.[42]
In 1995 he was 16 years old and made his first home page by modifying a sample page made by David Bohnett, the founder of GeoCities himself, who was 40 at the time but had maybe only a few months more experience with the Web than Ganesh Kumar Bangah. David Bohnett’s first page was not saved, but in an interview he recalled that it was visually identical to Ganesh’s (it was anonymous and placed in the Hollywood neighbourhood). This sample suggested two major ideas to the users signing up to his platform: their pages should or could be “under construction” and contain “links to other sites of the net” [Fig. 21–23]. A must that people took seriously, replacing Bohnett’s links with their own. Making links, being a node, was the duty,[43] the reason or an excuse to be online. You are maybe not an expert in anything, you are not a fan of anybody, but you can provide links to others, and that’s a noble role. These links could be to search engines [Fig. 18, 24], and this didn’t look like a paradox.
[Fig. 14–16]
“Links are the spice that makes the Web so interesting. Links perform the magic [...]”[44]
“There are no rules about which documents can point where – a link can point to anything that the creator finds interesting”[45]
“If you are building a site for people in growing roses, don't stop with just pictures of your roses; include the list of rose resource links”[46]
“Good home pages provide useful resources and links to other Web documents. Web is a project in community authorship”[47]
“There are sites that help you find people, sites that help you find jobs, sites that help you find other web sites,”[48] stated the authors of the design manual Home Sweet Home Page and the Kitchen Sink in 1997, and they didn’t mean Google or search engines; they meant that this is a valid reason to create a website.
“There are plenty of sites around the World Wide Web that exist only to provide a Web ‘mouse potato’ with huge lists of links to pages that are informative, entertaining, or “cool”[49]
“Traditional home pages easily degenerate into an endless vertical list of links.”[50] David Siegel’s remark sounds like a prophecy, knowing what happened to his own web presence. Indeed, webmasters were aware of this and often made an effort to transform the list into something rather intriguing, imagining and structuring it as a lava lamp [Fig. 25] or a Christmas tree [Fig. 26].
[Fig. 18, 21–24]
The latter, “Links are us”, deserves special attention. It provided 100 links to what were, in 1999, important sources. Netscape is still there, Google is already there. Hans Hollenstein links to whitehouse.gov as well as to “~” folders on .edu servers. But what does he put at the top of his Tree Full of Links? What’s the shiny Christmas star? Is it Microsoft? Apple? Yahoo? No, it is the author’s own complete solution to the Rubik’s Cube[51] as a Java applet… His invention, his pride and his right to make the link to it more prominent than the links to the giants.
Back to our times. In the winter of early 2020, I taught a project called “go as deep or stay as shallow”, which is a quote from Joshua Quittner’s Way New Journalism manifesto,[52] an optimistic text published on Hotwired in 1995, where Quittner called on the journalists of 25 years ago not to be afraid of making links, to immerse themselves in the world of hypertext and hyperimages OUT there, outside of their text or publishing platform. The group I was teaching was very young. I knew I would be the first to tell them about the difference between the Internet and the WWW, the history of hypertext and the hyperlink, the values of EtoE and the treasures of p2p, the urgency of breaking out of walled gardens, and the importance of not obeying the one link Instagram allows you. I was prepared to start from the basics. What I was not prepared for was that the students would ask me what I meant by the only link that Instagram allows its users. Where is it?
They didn’t know about the link, they didn’t see it, and were not missing anything. I was trying to fire up a resistance against the cruel policy of Instagram, but achieved the opposite. It made Instagram even more generous in their eyes.
Then I told this to an older student of mine. By “older” in this case, I mean she had already had a conversation with me about blue underlined words some semesters ago and had produced several great projects. She said that, with all due respect to all the links I made, Instagram’s policy of not allowing links is great, it helps her to stay concentrated and to see only what she wants to see.
This is not a story about stupid young people,[53] it is the destiny of computer users of all generations. Adapting, forgetting, delegating.
So often we hear and say that things change very fast. I don’t know what is fast or what is slow, but what is clear to me is that the adaptation of computer users’ mindsets keeps up with this pace. First you stop making linx, then you stop following the ones made by others, then you ask, “what’s a link?” Like the girl in the Apple commercial asks “What’s a computer?”,[54] a question that is supposed to portray the ultimate quality (transparency as invisibility) of a consumer electronic product.
Computer users accepted that making links is not their business. Instagram’s one and only link in bio is not a question of the number of links but of the fact that the decision to make hypertext is not a prerogative of the users.
“Free speech in hypertext implies the ‘right to link’, which is the very basic building unit for the whole Web”[55] writes Tim Berners-Lee in 2000. He adds, “if the general right to link is not upheld for any reason, then fundamental principles of free speech are at stake, and something had better be changed.”[56]
Links were indeed perceived as so “basic” and “fundamental” that no contributor to the user rights platform thought about adding a demand to link in 2013 or later. I noticed this while finishing this text and tried to improve the situation by placing my demand.[57] But one thing that has long existed is the unwillingness of corporations to make external links, and the rise of walled gardens, where hypertext exists only inside[58] and links are made between documents, not servers. Another is service providers taking away the technical possibility of turning text into hypertext, media into hypermedia, even inside one platform.
The <a href> tag is the most essential tag of HTML. A is for “anchor”, HREF is for Hypertext REFerence – <A HREF> is to tie, to weave, to knit. One would think it is the essence and the core, but we see more and more signs that in a year or two it will be “deprecated”, that browsers will just ignore it like some sort of <blink> or <marquee>, as if it were something decorative but unnecessary, just a feature that can be removed.
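For the record, the element in its simplest form – the address and the anchor text here are placeholders of my own:

<!-- href holds the hypertext reference; the text in between is what the reader clicks -->
<a href="http://example.org/roses.html">my rose garden</a>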
Content management systems and WYSIWYG web publishing (among other solutions that made publishing instant) made a very attractive offer to their users: authors don’t need tags to make links, just type “https://” and the platform will recognise it and automatically turn the address into a link. But a decade later they started to change their minds, and URLs stayed inactive, appearing more as noise than information. Since 2016, Instagram users have wondered how to make links, how to get around “non-clickable URLs”,[59] as hyperlinks are now called – an absurd collocation for an environment based on hyperlinks. “For the Web, the external link is what would allow it to actually become ‘worldwide’”,[60] to quote its inventor once again.
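That original offer is easy to sketch. The following is not any platform’s actual code, just an illustration, under my own assumptions, of the kind of “linkification” such tools performed: plain text is scanned for strings starting with http:// or https://, and each match is wrapped in an anchor tag.

<p id="post">Read more at https://example.org/essay.html and tell me what you think.</p>
<script>
  // wrap every bare URL in the post in an <a href> pointing to itself
  var post = document.getElementById('post');
  post.innerHTML = post.innerHTML.replace(
    /(https?:\/\/[^\s<]+)/g,
    '<a href="$1">$1</a>'
  );
</script>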
There are more sad neologisms around, for example the “Clickable Links”[61] extension I installed to make URLs “work” in Chrome, or “Linkificator”, its analogue for Firefox. Not to mention PANs like linktr.ee and il.ink, apps that you have to install to get around the only link Instagram allows. The mere existence of these apps shouts about the absurdity of today's web, the hypocrisy of social networks and the misery of their users. “The only link you’ll ever need” is linktr.ee’s slogan, with which I marked the current moment in the trajectory.
“... hyperlinks aren’t just the skeleton of the web: They are its eyes, a path to its soul,” as Iranian blogger Hossein Derakhshan wonderfully put it in his 2015 post on Medium, titled “The web we have to save”.[62]
Derakhshan spent six years in prison for his posts online. He was released, went back on the Internet and found it terrible that Facebook would not let him link properly and control the presentation of his texts. He was absolutely right in his critique.
At the same time I remember being puzzled when reading this text 5 years ago, because I realised that in his memories WordPress was paradise for links and the golden age for hypertext and the Web we have to save. How could this be? In my chronology, WordPress was the platform that started to take away users’ control over the links; it is precisely WordPress that should be blamed for disrespecting hypertext, as it filled the Web with zombie links.
The question is rhetorical. We know the answer: we (users of free publishing tools) forget or adapt or accept very quickly.
Much like the false memories about WordPress is the current Myspace nostalgia,[63] namely the part where people recall their time on this platform as a time when they were coders. US scholar Kate M. Miltner presented her research about it, “‘MySpace Had Us All Coding’: A Nostalgic (Re)imagining of ‘Web 2.0’”,[64] at last year’s conference “The Web That Was”. Again, I had the impression that she was talking about another Internet or another Myspace, because I remember the opposite: in 2007 I wrote about Myspace as a platform that took HTML as source code away from people.[65]
But it is true: when you compare the Myspace of that time with any service of today, or even with the Myspace of today, you feel like you were a coder, if not a programmer. You could copy and paste glittering text code, decide whether the sparkles are purple or pink.
I asked the audience whether, in a few years’ time, teenagers who are now on Instagram would recall 2019 as a paradise, as a free, wild web, a place where they were coders. Can it be that people who are on Instagram now will be nostalgic about the freedoms they had?
“Of course! Thank you, Instagram – we were allowed to upload!” Alex Gekker of Amsterdam University shouted from his seat.
Indeed, happy times when you could decide yourself to post a pic and not your phone doing it for you automatically. We will be recalling the 2010s as a time when we could post images ourselves.
Good old times... Remember Instagram where you could post an image?
Remember Google that allowed you to type your search request? We had Twitter! You could unfollow people! Yes! Yes, in 2020 there were browsers that had a location bar and you could type in an address of a site!!
What? Address bar? Website? You could type? Was there a sort of typewriter?
Delegating, adapting, forgetting.
Another timeline that vividly exposes this path would be from making a website for your own dog to reposting someone’s cat. There are transitions in between these extremes: making a website for a cat, or posting your own cat. It is a trajectory to follow, to investigate. Again it is not binary, not just a dog’s web for Web 1.0 vs a cat’s web for Web 2.0. Though my research shows that cats, which later became a front-running symbol[66] of being online, played a small role in early web culture, and had another function.[67]
[Fig. 25–26]
The page [Fig. 27] is one of 848 pages tagged as “dog” in the One Terabyte of Kilobyte Age archive (as of June 7, 2020). The most spectacular ones have become part of an ongoing online exhibition.[68] Many of these pages are made in memory of a dog, many to celebrate a new puppy; some are personal, others belong to breeders. There are monumental and very simple ones, and some that I found especially stunning. I tag them as “dog” and “webmasters” and there are 99 of them at the moment. Almost 100 of 848 dogs claimed they made their web page themselves [Fig. 28–30]. We (people who are a bit older than these pages) know that it is not true. But for how long will this knowledge be there?
Chances are that the number of people who have ever heard about web pages made by people themselves is getting smaller every month. At the same time, the chances that your dog, cat or hamster doesn’t need you to share its pictures and sounds online are getting higher every day. I’m sure that if you return to this exhibition ten years from now and see the screenshots, you won’t be surprised by dogs showing off their pages or posts. Theoretically, some sort of Alexa could probably already do it today: automatically photographing your pet, streaming it live, translating its barking into words and whatever else.
And that’s why I invite you to go into these pages in more depth: not to forget that these dogs were not dogs but people who spent a few weekends learning how to make a web page, and it was so exciting and so much fun that they also made them for their dogs. People, not dogs, not AI, not UX were making decisions about URLs, links, navigation, layouts, colour palettes and content.
Webmasters of the 1990s built homes, worlds and universes. But also, outside of intergalactic ambitions, they strongly pushed the concept of something being mine. The first-person possessive determiner “my” took on a very strong meaning – “my” because I build it, I control this presentation; my interests, my competences, my obsessions: in the trajectory from my to me, I suggest following its decline.
[Fig. 27–30]
[Fig.31–45] My Steven King, my Korn, my page for Sandra Bullock, my Eminem, somebody else’s Eminem, my t.A.T.u., My Orlando Blum, your Orlando Bloom, Martin’s Mylène Farmer, Julia’s John Malkovich, Jacob’s pictures of Pamela Anderson. They are Jacob’s because he scanned them and put them online. My space for Leo, and my territory without him. Patricia’s Xena, but not only because she is her fan; it is a page about Patricia’s dreamworld where she is Zena. And this is a very important dimension of My. An alternative my-self. Alternative space where one can be someone else, someone that they want to be. Emphasis on MY! [Fig.46]
The growing idea that things can belong to the person who wrote the HTML code, or scanned the pictures, or collected something was unprofitable and dangerous. Today, users put a gate or a door on their page [Fig. 47, 48]. And what tomorrow? Will they start to think that the files behind them belong to them? And the day after tomorrow, will they come round to thinking that their data should not be exposed or sold?
Today they change the colour of the scroll bar [Fig. 49, 50], adapting it to the theme of their imaginary world, so what’s next? Will they come around to the idea of installing a browser extension, or even writing one?
Dangerous!
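Recolouring the scroll bar, by the way, took nothing more than a few lines of Internet Explorer’s non-standard CSS; the colours below are my own, not taken from any page in the archive:

<style>
  body {
    scrollbar-face-color: #9900cc;   /* the draggable thumb and the buttons */
    scrollbar-track-color: #000000;  /* the channel behind them */
    scrollbar-arrow-color: #ff66ff;  /* the arrows on the buttons */
  }
</style>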
[Fig. 51]
Through the second half of the 90s, service providers took many actions to reduce and restrict: rewriting Terms of Service (ToS) and taking away frameworks;[69] not developing tools that would make it easy to update and communicate – editors, guestbooks, or web rings; and developing tools and services that would (theoretically) require the least effort, simultaneously promoting the idea of IRL, of some real life[70] that you were allegedly missing while making your web page.
But the smartest and most effective move the industry made (the aforementioned measures wouldn’t work without it) was to push people from My to Me: to introduce forms that would motivate people to see themselves as the main – and then the only – content of what they do online. I’d like to stress that although early web pages (or home pages) are remembered as personal, the person who made them was not the initial content; that turn took place later.[71]
[Fig. 31–50]
Just ME! Me! I’m me and there is no one else like me in the rest of the entire world. All about me and more. John, Kevin, Becky, Jake, Jason, Steve. [Fig.52–61]
Alongside the motivation to promote your ME that came from manuals and articles, there were some smaller, almost technical steps made by providers.
For example, as soon as Yahoo bought GeoCities, they replaced the sample pages discussed above with templates. Personal Page Blue, introduced in summer 1999, is maybe the best known.[72] What you see in Fig. 62 is not only the original design, but also the original text, which in humorous form invited you not to be shy and to talk about yourself:
“Hi. Let me tell you about myself. Il [sic] like to eat. Sometimes I drink. Often I even sleep! And then in between all of those, I'll need to go to visit the bathroom. Most of the time I do all of those, practically every day! And sometimes I do things with other people. I used to go to school, but now I work. My favorite color is blue.”
[Fig. 63,64] Many registered their profiles but didn’t bother to change the text or never got to that point [Fig.65–67]. Text removed, picture exchanged, text exchanged, but not the picture. All possible combinations and variations, which never led to a page that would grow or be updated.[73]
Another frequently picked and abandoned “About Me“ template was techie2; it was reminiscent of the Matrix fonts and colour combinations [Fig. 68–70].
I want to believe that Fig. 71 is Mark Zuckerberg trying out GeoCities by moving into the Wall Street neighbourhood three months before Facebook became operational. But I know there are good arguments to prove me wrong.
The screenshot in Fig.72 is almost identical, but pay attention to the address line.
It is not in the neighbourhood,[74] but is a vanity profile – also a change introduced by Yahoo in 1999, another measure to make people think in terms of “me” not “my” categories.
Recently, at a One Terabyte of Kilobyte Age workshop, a participant asked if it would make sense to visualise this rise of Me by arranging the pages according to the position of the About Me button in the navigation menu and seeing how it developed over time. I thought this would be rather a simplification, and I would object to the algorithmic approach anyway, but what I have seen with my own eyes confirms that the About Me button indeed made a career for itself and moved from the bottom to the top [Fig. 73–77].
In later history (Facebook), we may remember the switch to the Timeline, which was a push in the direction of telling the story of your life,[75] of immersing yourself in the history of your “me”.
I think it is also possible to distinguish the pinnacle of the transition from My to Me. It was very well highlighted (or even pushed) by the Person of the Year 2006 cover of Time magazine.[76] You (me) were praised and celebrated and left in front of the mirror, to make selfies and post them on channels that would go bankrupt if their users didn’t produce – and produce for free.
Where My was dangerous, Me was perfect. Me is cheap, Me is easy to control, Me is easy to channel, Me is a slave of its own reflection, Me is a slave of the platforms that make the reflection glossy. Me is data. Me is data closest to metadata. This makes Me just perfect to satisfy advertisers and to sate neural networks.
What can be done? How to reclaim My?
Don’t collaborate! Don’t post your texts where you are not allowed to turn them into hypertext.
Don’t post your pictures where you can’t link them to whatever you like. Don’t use content management systems that turn your GIFs into JPEGs. Don’t use hashtags, don’t accept algorithmic timelines. In short, make a web page and link to others who still have one.
Leaving monopolists and/or using alternatives is easy to suggest. And many of us have made the first step – created a page on neocities.org or on tilde.club, bought a superglue.it kit and hosted a home page at our actual home, or supported the Reclaim Hosting initiative.
In December 2019, I asked the founders of the aforementioned projects whether they thought all these 5-year-old initiatives were still active. They were not optimistic about winning the competition with the giants (Dan Phiffer[77] rightfully pointed out that I asked him about the Tilde Club not on Tilde Club but on Twitter). At the same time, Vasiliev’s,[78] Drake’s[79] and Ford’s[80] answers – as well as Jim Groom’s aforementioned thoughts on “homelessness” – suggested that there are more reasons to emancipate in 2020 than there were in 2013, or, better said, that those reasons are stronger in 2020 than in 2013, and that this may be the time and the motivation to leave.
But to quote developer and passionate “tilderer” Jon Bell: “How can we make something like this last longer than a sunrise?”
I think that leaving the platforms and meeting somewhere else is not enough, or not even the biggest deal. The challenge is to get away from Me, from the idea that you are the centre of your online presence. Don’t take this imposed, artificial role into the new environments. It will poison and corrupt the best of initiatives.
References
[2] In my recent article GeoCities’ afterlife and web history, I write about the shortcomings and hazards of this term https://blog.geocities.institute/archives/6418; access: October 29, 2020.
[3] The archive keeps 381,934 GeoCities pages rescued by Archive Team in 2009, and restored by Dragan Espenschied in 2011. Materials and outcomes of the research are published on https://blog.geocities.institute/. 72 screenshots a day have been (and continue to be) posted in chronological order on https://oneterabyteofkilobyteage.tumblr.com/ since February 7, 2013.
[4] “Web 2.0 is the business revolution in the computer industry caused by the move to the internet as platform, and an attempt to understand the rules for success on that new platform.” Tim O’Reilly, Web 2.0 Compact definition: Trying again, O’Reilly Radar, December 10, 2006, http://radar.oreilly.com/2006/12/web-20-compact-definition-tryi.html. An early less polemic definition of the term was given by O’Reilly a year before in What is Web 2.0, https://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html, access: June 7, 2020.
[5] Tim Berners-Lee, On simplicity, standards, and “intercreativity”, in: The Web After Five Years: World Wide Web Journal: 1 (3) (Inc. O’Reilly Media), p.8. Also online https://www.w3.org/People/Berners-Lee/1997/w3j-3-iview.html; access: April 4, 2019.
[6] Originally the home page was defined as “the hypertext document you see when you first enter the web”, Ed Krol, The Whole Internet User’s Guide & Catalog, Revised Edition (Sebastopol, CA1992), p. 229. But the concept was evolving and splitting: “The welcome page for a server is often now called a ‘home’ page because it is a good choice for a client to use as a home (default) page. The term ‘home’ page means the default place to start your browser. Don’t be confused by this, though. There are two separate concepts.” https://web.archive.org/web/19970605145352/http://www.w3.org:80/pub/WWW/Provider/Style/Etiquette.html; access: October 29, 2020. In addition, a “home” page is a hypertext document, which is a starting point for a user. Users with hypertext editors make their own home pages. Other users use home pages provided for anyone in their organisation. An example is the CERN home page, https://www.w3.org/Talks/Tour/FindingOurWay.html; access: October 29, 2020. In My corner of the internet, 2014, https://blog.geocities.institute/archives/5118, I provide an overview of different meanings and the history of the term.
[7] The Web After Five Years, p. 8.
[8] Some paragraphs later in the interview, Tim Berners-Lee emphasises again that connected computers should be tools, services, portals that are given to people: “When you turn on your computer what you should see is information, what you should deal with is information. You should be able to create it, to absorb it; you should be able to exchange it freely in the informational space. The computer should just be your portal into the space, in my view.” The Web After Five Years, p. 8.
[9] The Web After Five Years, p. 8.
[10] At this moment the GRI library has 83 items published between 1993 and 2002. Today they are my source of information, being very often the only reference to the websites that ceased to exist before they were preserved in any other way. A screenshot from a web design manual is nothing close to an archived website, it is also much less than an interview with a designer or developer of it. Screenshots are not sufficient, you can’t call them “good enough”, rather “at least something”, but as these web projects were too complex for web archives and too embarrassing for their creators to keep and recall – there is “at least something” you can reference, analyse or attempt to reconstruct.
[11] Vincent Flanders and Michael Willis, Web Pages That Suck: Learn Good Design by Looking at Bad Design (San Francisco, CA 1998), p. 111.
[12] Vincent Flanders and Dean Peters, Son of Web Pages That Suck: Learn Good Design by Looking at Bad Design (San Francisco, CA 2002), p. 204.
[13] Jeffrey Zeldman, Taking Your Talent to the Web: A Guide for the Transitioning Designer (Indianapolis, IN 2001), p. 95.
[14] David Siegel, Creating Killer Web Sites: The Art of Third-Generation Site Design (Indianapolis, IN 1996), p. 29.
[15] Flanders and Willis, Web Pages That Suck, p. 13.
[16] Flanders and Peters, Son of Web Pages That Suck, p. 2.
[17] Zeldman, Taking Your Talent to the Web, p. 189.
[18] The library is not only research material for my GRI colleagues and students, but also an object, an interactive sculpture called “Taking Your Talent to the Web”. It goes to events and exhibitions and, according to the space, takes a form of a book piles or bookshelves. Visitors are welcome to flip through, take photos of the front and back covers, or just stare (and take selfies) at the particoloured row of book spines and read the titles that say a lot about the epoch of the early Web: The Web Design WOW! Book; Cyberspace for Beginners; Graphical Treasures on the Internet; Finding Images Online; Home Sweet Home Page and the Kitchen Sink.
[19] Jeffrey Veen, Art and Science of Web Design (Minneapolis, MN 2000), p. 71.
[20] Curt Cloninger, Fresh Styles for Web Designers: Eye Candy from the Underground (Indianapolis, IN 2001), p. 8.
[21] Ida Engholm, Digital style history: The development of graphic design on the Internet. Digital Creativity 13 (December 1, 2002), pp. 193–211, https://doi.org/10.1076/digc.13.4.193.8672
[22] http://www.coryarcangel.com/things-i-made/2002-009-dirtstyle-design, https://web.archive.org/web/20021208124943/http://www.dirtstyledesign.com/; access: October 29, 2020.
[23] Curt Cloninger, Fresher Styles for Web Designers: More Eye Candy from the Underground (Berkeley, CA 2008).
[24] Ethan Marcotte, Responsive web design, A List Apart (blog), May 25, 2010, https://alistapart.com/article/responsive-web-design/; access: October 29, 2020.
[25] Aarron Walter, Designing for Emotion (New York 2012).
[26] “People keep asking me, ‘What’s web design in 2016?’ Simple. It’s whatever Google wants it to be.” February 2, 2016, https://twitter.com/vincentflanders/status/694362260060389376; access: October 29, 2020.
[27] Philip Greenspun, Philip and Alex’s Guide to Web Publishing (San Francisco, CA 1999), p. xxi.
[28] Émilie Gervais in personal email on February 20, 2020.
[29] www.humaaans.com is an illustration library, which became “an overwhelming trend in editorial and web illustration over the past few years, with particular prevalence currently in the realm of tech.[...] adopting a visual language that signals positivity, and connectedness is a tool to paper over the social and political harm and divisiveness their products create – and illustration has increasingly become a centerpiece of the strategy” as Rachel Hawley describes it in Don’t worry, these gangly-armed cartoons are here to protect you from big tech, Eye on Design (blog), August 21, 2019, https://eyeondesign.aiga.org/dont-worry-these-gangley-armed-cartoons-are-here-to-protect-you-from-big-tech/; access: October 29, 2020.
[30] “World, meet the first ever AI (artificial intelligence) solution for website design and creation: Wix ADI.” Wix ADI: Design AI that will change website creation, June 8, 2016, https://www.wix.com/blog/2016/06/wix-artificial-design-intelligence/; access: October 29, 2020.
[31] Steph Davidson in personal email on February 26, 2020.
[32] https://pad.profolia.org/bloomberg – a list of Bloomberg’s special issues compiled by Steph Davidson on July 17, 2017; access: October 29, 2020.
[33] Christoph Knoth in personal email on March 3, 2020.
[34] Jim Groom, “Lifebits, the Next Corner of Cyberspace?” Bavatuesdays (blog), January 13, 2020, https://bavatuesdays.com/lifebits-the-next-corner-of-cyberspace/; access: October 29, 2020.
[35] Katie Notopoulos on Twitter on December 14, 2018, https://twitter.com/katienotopoulos/status/1073392120847851520
[36] “I built my first home page in early 1994. This is my new home online. It contains links to everything I have ever written, created, or been part of.” David Siegel on January 10, 2016, https://medium.com/@pullnews/david-siegel-jack-of-none-998a70be0e57; access: October 29, 2020.
[37] https://www.youtube.com/user/TheTedNelson; access: October 29, 2020.
[38] “Home page” in the second meaning of the word, i.e. the first page of the site.
[39] https://www.petracortright.com/hello.html; access: October 29, 2020.
[40] The pace known as Internet time (or Netscape Time) is not only about velocity, but about the drama of change that could happen in a very short time. When lecturing about WWW history, I emphasise it by adding to the common saying: “there may be 7 calendar years in one Internet year, but there are 100 years in between 1996 and 1997”, referring to the overnight sinking of connection prices, the Wired cover that announced the death of web pages, and the release of Netscape Communicator, which suggested thinking about the Web as an application, not sites. The same observation can be applied to the events of 1995, when the Netscape browser was released to “kill” its predecessor Mosaic, and even more to 1993, when Mosaic appeared as the first alternative to the WWW. In Architects of the Web, Robert H. Reid marks everything that was on the Internet before Mosaic as B.C., where C is “commercialization”.
[41] Release of Mosaic browser. “NCSA’s Mosaic™ wasn’t the first Web browser. But it was the first to make a major splash. In November 1993, Mosaic v. 1.0 broke away from the small pack of existing browsers by including features – like icons, bookmarks, a more attractive interface, and pictures – that made the software easy to use and appealing to ‘non-geeks’.” http://www.ncsa.illinois.edu/enabling/mosaic; access: October 29, 2020.
[42] Friendster CEO: I made you, Zuckerberg, Observer (blog), July 1, 2011, https://observer.com/2011/07/friendster-ceo-i-made-you-zuckerberg/; access: October 29, 2020.
[43] Olia Lialina, Vernacular Web in Digital Folklore Reader (Lialina, Espenschied, 2009), p. 27. Also online, http://art.teleportacia.org/observation/vernacular/links.html; access: October 29, 2020.
[45] Krol, The Whole Internet User’s Guide & Catalog, p. 231.
[46] Paul E. Robichaux, Jazz Up Your Web Site: In a Weekend (Rocklin, CA 1997), p. 16.
[47] Bryan Pfaffenberger, Publish It on the Web!: Windows Version (Boston 1995), p. 61.
[48] Robin Williams and Dave Mark, Home Sweet Home Page and the Kitchen Sink, Pap/Cdr edition (Berkeley, CA 1997), p. 45.
[49] Paul McFedries, The Complete Idiot’s Guide to Creating an HTML Web Page, 2nd book & CD edition (Indianapolis, IN 1997), p. 15.
[50] Siegel, Creating Killer Web Sites, p. 33.
[51] http://www.geocities.com/Eureka/7185/cb2.htm
[52] Quittner on Way New Journalism http://archive.gyford.com/1995/11/13/HotWiredDemo/i-agent/index.htm; access: October 29, 2020.
[53] Let me also mention that the students’ project came up with great work, including Lyricslinks by Tim Jack Schmit, which is a music video you have to compile yourself by following the links through different platforms. https://pad.profolia.org/tj_lyricslinksthat; access: October 29, 2020.
[54] Dennis Green, Apple is running an ad where a kid asks, “What’s a computer?” – and people find it infuriating. Business Insider on January 24, 2018, https://www.businessinsider.com/apple-whats-a-computer-ad-sparks-anger-2018-1. Video available on YouTube https://youtu.be/pI-iJcC9JUc; access: October 29, 2020.
[55] Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web, 1st edition (San Francisco 2000), p. 139.
[56] Ibid., p. 141.
[57] The right to link, https://userrights.contemporary-home-computing.org/iqoxfwg/; access: October 29, 2020.
[58] The trailblazers web-surfing event and competition was conceptualised in 2010 by my project group at Merz Akademie, as a competition where participants can exercise (or show off) their skills to go around through the walls of walled gardens. Announcements and documentation of the events available at http://nm.merz-akademie.de/trailblazers/; access: October 29, 2020.
[59] https://www.quora.com/How-do-I-add-clickable-links-in-Instagram-comments; access: October 29, 2020.
[60] Berners-Lee, Weaving the Web, p. 33.
[61] https://www.laurentvw.com/project/clickable-links, Laurent Van Winckel, 2012.
[62] Hossein Derakhshan, The Web we have to save, Medium, September 12, 2019, https://medium.com/matter/the-web-we-have-to-save-2eb1fe15a426; access: October 29, 2020.
[63] MySpace and the coding legacy it left behind, Codecademy News, February 14, 2020, https://news.codecademy.com/myspace-and-the-coding-legacy/
[64] https://easychair.org/smart-program/RESAW19/; access: October 29, 2020.
[65] O. Lialina, A vernacular Web 2, 2007 http://contemporary-home-computing.org/vernacular-web-2/; access: October 29, 2020.
[66] “There is a giant gulf between doing something and doing nothing. And someone who makes a lolcat and uploads it – even if only to crack their friends up – has already crossed that chasm to doing something.” Clay Shirky (in an interview with NPR), What happens when people migrate to the Internet?, https://www.npr.org/story/127760715, June 1, 2010; access: October 29, 2020. As a side note, I’d like to mention that 10 years ago I read this as praise of the web vernacular, an invitation to the masses to go online and be part of online culture, and didn’t see that, in fact, statements like this were also ignoring the abyss of the amateur Web, equating personal home pages with doing nothing.
[67] More on the topic can be found at https://blog.geocities.institute/archives/tag/cat and text Rascal, a Labrador, Mochi, a pug, and other webmasters at https://blog.geocities.institute/archives/655; access: October 29, 2020.
[68] On the Internet, Everybody Knows You Had a Dog, https://dogs.geocities.institute/, regularly updated since August 2015; 264 screenshots at the time of writing this text. The title is an allusion to Peter Steiner’s famous cartoon published in The New Yorker on July 5, 1993, captioned “On the Internet, nobody knows you’re a dog”.
[69] “In protest of Section 8”, https://blog.geocities.institute/archives/5049; “is death for WEB sites as us”, https://blog.geocities.institute/archives/6144; access: January 21, 2021.
[70] “I assume you have a life away from your computer screen”, McFedries, The Complete Idiot’s Guide to Creating an HTML Web Page, p. xvi.
[71] David Bohnett in an unpublished interview on January 26, 2019: “[…] it [GeoCities] was also intended to be thematic and subject matter based. It never even occurred to me that people would like to talk about themselves and talk about where they eat and where they want to make it. So, it was all about themes and that’s why you have a neighborhood of themes.”
[72] There are 2124 specimens in the GeoCities Archive.
[73] More about GeoCities users trying to cope with the template in my post PersonalPageBlue https://blog.geocities.institute/archives/2736 (2011); access: October 29, 2020.
[74] A comprehensive list of all GeoCities Homestead Neighbourhoods and Suburbs by Blade https://www.bladesplace.id.au/geocities-neighborhoods-suburbs.html; access: October 29, 2020.
[75] At the end of 2011, Facebook introduced a new layout and structure for their users’ profiles – Timeline – described by the company itself in the Help section as “your collection of the photos, stories, and experiences that tell your story.” Anticipating its success, Wired described it in even more monumental terms: “Timeline is potentially an omnivorous collector of personal data that you can format to tell your story.” Steven Levy, With Timeline, Facebook bids to reinvent the social biography, Wired, November 11, 2011, https://www.wired.com/2011/11/timeline-facebook/; access: October 29, 2020.
[76] https://en.wikipedia.org/wiki/You_(Time_Person_of_the_Year); access: October 29, 2020.
[77] “This is how I started using the Internet in 1998, when I started college. This is the future I would like to displace the corporate social media dystopia. But here we are on twitter dot com still, so it’s all a work in progress. Tilde itself is just one attempt of many to create an alternative. See also: http://mltshp.com, http://are.na, Mastodon, Secure Scuttlebutt, etc.” Dan Pfeifer, DM Twitter, January 16, 2020.
[78] “While the reasons for one wanting to have their own ‘corner of cyberspace’ are now much more defined, today's average users are looking for hand-holding help for actually doing it. Contemporary users are not like the bravehearts of the 90s and it feels to me that making self-hosted platforms less nerdy and more ‘popular’ (without making it dumb, of course) would help the people in rebuilding the Web we lost.” Danja Vasiliev in personal email on January 13, 2020.
[79] “I do think that one of the trends we will see over the next few years is a general exodus from social networks into saner alternatives – and I do believe one of those main alternatives will be creating personal websites again, where you have complete control over how you present yourself and what content you want to provide, and in what arrangement.” Kyle Drake in personal email on January 6, 2020.
[80] “I am certain we need more spaces like this, places where you can experiment and be both dumb and kind in equal measure and people either leave you to it, or help you along.” Paul Ford on September 17, 2020 on http://tilde.club/~ford/; access: October 29, 2020.
[Figures 52–77]
To understand the history of the Web and the role of its users, it is important to acknowledge that the people who built their homes, houses, cottages, places, realms, crypts, lairs, worlds, dimensions [Fig.1–13] were challenging the architecture and the protocols – protocols in a figurative, not a technical, sense. Users hijacked the browser’s first home page and developed this concept in another direction.[6] A user building, moving in, taking control of a territory was never part of the plan. It was a subversive practice, even in 1995.
“Q: The idea of the ‘home page’ evolved in a different direction.
A: Yes. With all respect, the personal home page is not a private expression; it's a public billboard that people work on to say what they're interested in. That's not as interesting to me as people using it in their private lives. It's exhibitionism, if you like. Or self-expression. It's openness, and it's great in a way, it's people letting the community into their homes. But it's not really their home. They may call it a home page, but it's more like the gnome in somebody's front yard than the home itself.”[7]
Tim Berners-Lee didn’t intend to be sarcastic. It would be fair to quote the rest of the answer to see that what he called for was giving web users better, faster and more seamless[8] ways to connect.
“People don’t have the tools for using the Web for their homes, or for organizing their private lives; they don't really put their scrapbooks on the Web. They don’t have family Webs. There are many distributed families nowadays, especially in the high-tech fields, so it would be quite reasonable to do that, yet I don’t know of any.”[9]
Such “webs” started to arrive some years later in the form of LiveJournal, Friendster, Facebook and other platforms that clearly showed web users that their part was to be connected and deliver content, not to build anything.
I don’t think that in 1996 anybody was really hurt or stopped making web pages because of the remark the father of the Web made. People building what was “not really their home” were reading other texts at that time: HTML manuals, web graphics tips and tricks, and source codes of each other’s websites. They would rather buy HTML for Dummies or Home Sweet Home Page and the Kitchen Sink than the WWW Consortium corporate journal.
Mentioning web design manuals is not a side remark here, but a suggestion to pay closer attention to the books that explained the World Wide Web to newcomers and taught them to make web pages, as documents[10] of the epoch: books such as Teach Yourself Web Publishing with HTML 3.2 in 14 Days; Building Your Own Website; Jazz Up Your Web Site in a Weekend; FrontPage Web Publishing & Design for Dummies; Publish It on the Web! – and other titles that shout: the Web is the future, the future belongs to you, learn HTML and embrace the future! The older the manual, the younger the medium, and the more powerful and diversified the role of the manual’s reader, the Web user. But in the context of this article I send you there not to look for the “good old days”. The manuals are also evidence of personal web pages and their authors being ridiculed by experts: on the very same pages that motivated a newcomer you can often find “amateur” used as a negative adjective.
“This page shouts ‘Amateur’”[11]
“There's nothing that says, ‘I'm an amateur Web designer and I don't know what I'm doing’ like 3-D logos”[12]
“Visit an amateur home page and see how excessive scrolling drags its nails across the blackboard of the user's experience”[13]
As early as 1996, personal home pages as a genre and early web makers as a group were being made fun of and blamed for all the ugly stuff. It is the year David Siegel published Creating Killer Web Sites. Describing the history of the WWW up to that moment, he announced a third generation of web design that would come to replace the second-generation site, which for him meant the world of the amateur web and which he described as follows: “At worst, noisy backgrounds and interminable waits for sound files make these sites unbearable. At best, they are nice white sites with color-coordinated icons”.[14]
“The audience for personal pages is basically only one person -- the creator of the site.”[15]
“It's perfectly OK for you to be as wild and crazy as you want because the only people who will probably visit your site are friends and family – and they are well aware of your lack of aesthetic taste.” [16]
“ […] they cram every page with embedded MIDI (music) files, pointlessly scrolling JavaScript messages, huge full-color photographs, animated GIFs (flames and dripping blood are especially popular), and blinking and moving text [...] That is bad design, and (we think) bad markup, even if it validates – which is pretty unlikely because folks attracted to dripping blood animations tend not to spend much time learning about web standards.”[17]
The last quote is from Taking Your Talent to the Web, the book with the most beautiful title ever given to a manual. That’s why I borrowed it as our library’s pseudonym.[18] I also find this book very wise in many respects: first and foremost for Zeldman’s conviction regarding the medium specificity of web practice and his attempt, in this and other texts, to divorce it from graphic design. The work that he and his colleagues do at A Book Apart, a publishing house that makes manuals for contemporary web designers, cannot be overestimated either. But I also think that it was a big mistake to neglect amateurs’ contributions to the development of the Web’s language.
In my opinion, people struggling to position a dripping blood animation between two skulls and under <marquee>ENTER IF YOU DARE</marquee>, and to pick an appropriate MIDI tune to sync with the blood drip, made an important contribution to showing the beauty and the limitations of web browsers and HTML code.
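What such a struggle looked like in practice is easier to see in markup. Below is a minimal, hypothetical reconstruction – the file names and the tune are invented for illustration, not quoted from any archived page – of the kind of HTML a webmaster of around 1996 might have hand-written, with the non-standard tags that browsers of the time tolerated:

    <html>
    <head><title>ENTER IF YOU DARE</title></head>
    <body background="tiled-skulls.gif" text="#ff0000">
      <!-- scrolling warning: <marquee> was never part of any standard -->
      <marquee>ENTER IF YOU DARE</marquee>
      <center>
        <!-- the dripping blood sits between the two skulls by trial and error -->
        <img src="skull.gif"> <img src="blooddrip.gif"> <img src="skull.gif">
      </center>
      <!-- background MIDI: <bgsound> for Internet Explorer, <embed> for Netscape -->
      <bgsound src="spooky.mid" loop="infinite">
      <embed src="spooky.mid" autostart="true" loop="true" hidden="true">
    </body>
    </html>

There was no CSS positioning to rely on; layout was negotiated tag by tag, with the View Source of somebody else’s page as the main manual.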
Making fun of amateurs and blaming them is only half of the problem. More damaging for the history of the Web was that “how-to” books ignored personal home pages and their authors altogether.
Neither the usability camp (Jakob Nielsen) nor the creativity camp (Jeffrey Zeldman, David Siegel), nor their followers, spared a page to analyse the home pages of amateurs, sorting things out exclusively among themselves. From time to time they (Nielsen, Zeldman, Flanders) mentioned artists and web artists as exceptions to the rules they established, but never the web vernacular. Even after the designers of “photoshop” sites and unviable dot-com hybrids discredited the profession, experts suggested looking for new ideas among... professionals.
“I find inspiration in noncommercial Web creations”,[19] claims Veen, and gives examples of designer portfolios.
“In order to move beyond a conservative, copycat style, you must look beyond the inbred corporate web to the personal sites of today's leading web designers”[20] echoes Cloninger.
Danish researcher Ida Engholm in her 2002 paper “Digital style history: the development of graphic design on the Internet” wrote, “Web design has become an aesthetic phenomenon in its own right and with its own means of expression.”[21]
She continues: “Until now few attempts have been made from the perspective of aesthetic theory to develop reflective approaches to web design.” Ida Engholm was too cautious and modest with this remark. To my knowledge she was the first to attempt such an approach in the international academic press. And one can see that she was strongly informed (or misinformed) by the “how-to books” of the above-mentioned Siegel, Cloninger, Zeldman.
She writes: “[…] web design didn’t develop in a vacuum but shares features with development trends in 20th century design and art and with traditional design areas such as industrial design and graphic communication.” Following Cloninger she looks for web design roots in Swiss Style and Grunge, and discusses Kilobyte Minimalism, Hello Kitty and other popular online, but still graphic, design styles.
Indeed, web design didn’t develop in a vacuum: it grew out of the vernacular web, and it grew in opposition to vernacular expression. But there obviously was an information and education vacuum created around it by the authors of design manuals and other experts and evangelists.
Only in 2008, in Fresher Styles of Web Design, did Cloninger, following Cory Arcangel’s Dirtstyle,[22] introduce “1996 Dirt style”, which he attributed to MySpace, blingee.com and other sites and communities “greatly influenced by hobbyist created personal home pages circa 1996”[23] – without giving a single example of any website from that era.
No wonder young web designers think that responsive web design was invented this century, although Ethan Marcotte never hid the fact that he only coined the term[24] and brought back and popularised the principle of liquid layouts, which was very popular among personal home page makers of the mid-90s. And no wonder Aaron Walter, the author of Designing for Emotion[25] – a web design manual that explains step by step how to create a service in such a way that its users think there is a real person behind it – dares to deliver his point without once mentioning a personal home page.
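The principle is simple enough that amateurs arrived at it long before it had a name. A liquid layout of the mid-90s kind could look roughly like the following sketch – a hypothetical example, not a quotation from any manual – where widths are given in percentages, so the page stretches and shrinks with the browser window:

    <html>
    <body>
      <!-- percentage widths: the layout reflows with the window, no fixed canvas -->
      <table width="100%" border="0">
        <tr>
          <td width="25%" valign="top">my links, my guestbook, my webrings</td>
          <td width="75%">Welcome to my home page!</td>
        </tr>
      </table>
    </body>
    </html>

Responsive web design later added media queries and flexible images on top of this, but the underlying idea – that a page adapts to whatever window it is given instead of dictating a fixed one – was already there.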
Webmasters and their production were an easy target. Professional designers and evangelists all seized the opportunity: ridiculing, discrediting, alienating, imposing clean styles and templates, usurping the right to make design decisions.
And they succeeded: they protected the Internet from “wrong” colour combinations, annoying background sounds, marquees and blinking. But in the long term it was the beginning of the end of web design itself. The rhetoric of alienation that design experts practised in 1996 was picked up by the IT giants a decade later.
To quote a tweet from four years ago by Vincent Flanders (the same Flanders extensively quoted above, who, book by book and article by article, humiliated websites that were too bright, too loud, too confusing): “in 2016 web design is what Google wants it to be”.[26] It is even more true in 2020.
There are no web design and web designers any more; there are graphic designers and developers again – front-end and back-end developers this time. For me, as a net artist and new media design educator, this splitting of the web designer into graphic designer and front-end developer is bitter, because it is the death of a very meaningful profession.
“Web publishing is one of the few fields left where the generalist is valuable. To make a great site you need to know a little bit about writing, photography, publishing, UNIX system administration, relational database management systems, user interface design, and computer programming,”[27] writes Philip Greenspun in Philip and Alex’s Guide to Web Publishing in 1999. It would be naive to think that it would work the same way two decades later, given the complexity of modern online products. Still, the web designer is a generalist in a leading position. Yet knowing a bit of everything is not the most important part of the profession. The generalist as web designer is a person who sees the medium being designed and shows it to the users, a person who grows up together with the medium (and never gets old, because the medium is forever new) and who has the potential to reshape it, because intelligence is still the ends.
“Web designers are still there though, I think. Just maybe more and more are actually growing into Frontend developers or turning to something more specific like becoming UI/UX designers, or Product designers. It's less browser focused maybe, less ‘web’? Even though most of these still technically rely on web protocols and technologies.”[28] Émilie Gervais – net artist at night and “full-stack developer with more experiences as a front-end developer” – sees it more optimistically in our email correspondence, but still confirms the shift: the Web is not a medium but an underlying technology.
Underlying and invisible. Most of the digital products and interfaces we use today are in fact browsers opened in kiosk mode. The majority of mobile apps, digital signage, installations, and other big and small “experiences” are constructed with HTML, CSS and JavaScript. Front-end developers who can talk to screens and layouts in these languages are in demand, celebrated, well paid … but harmless; they master technologies without the ambition to master the medium.
Without web designers, the Web is left to front-end developers who implement Material Design guidelines (“what Google wants it to be”) and graphic designers mix-n-matching “illstrations for every occasion”[29] – and for the rest of us there is Artificial Design Intelligence (ADI).[30]
“There is no room for ornament on the web. People want to look at Instagram […] because their brain already understands how Instagram is laid out. In my opinion the goal of an artist vs a UX/UI/product designer is totally opposite. To combat templatization and minimalism I try to exaggerate designs with ephemeral styles and effects,”[31] says Steph Davidson. She is web art director at Bloomberg, a publishing house that actually makes an effort[32] to revive the genre – with a website that is different. Bloomberg designers are not the only ones. There are exceptions, and we identify them as such. For example, every work by the German web design duo Christoph Knoth and Konrad Renner makes people say “wow, the Web (design) is alive”. They confirm that “there is a small movement that is fusing web design back together with new tools. We design and develop frontends and backends and it feels like a perfect habitat for our work. We are the living proof”.[33]
This “small movement” is very important for rescuing the profession, and with it the idea that one – be it a publishing house, a festival, a journalistic investigation, a person – needs a website.
“[…] the idea of a site and its relationship to our online identity has far more depth of possibility than ever before, which makes me think the concept of having one’s own site online might never be more relevant given how ‘homeless’ our digital presence is online currently,”[34] writes Jim Groom, co-founder of the Reclaim Hosting initiative.
This homeless status is a reality for individuals, who never know when Facebook will implode together with their images and interactions, and for institutions begging Google and Wikipedia to edit their “knowledge panels”. Experts and celebrities are no better settled than naive users of Instagram.
Nothing is more eloquent than popular tech journalist Katie Notopoulos tweeting: “I had an idea for a blog, but realized that there's nowhere to like, make a new blog (rip tumblr), so I think the best blogging platform now is.... a really long caption on an Instagram?”,[35] or the aforementioned web design guru David Siegel, whose web home today is a link list on Medium.[36] Many links to his own texts about the future of the Web, once published on dsiegel.com, point to the Wayback Machine.
The father of hypertext gave up updating hyperland.com and redirected it to his YouTube channel.[37] The mother of Post-Internet made a spectacular home page[38] for marisaolson.com – the rest of her portfolio is outsourced to blocks and channels on Are.na. Among the ruins of online portfolios rises the home page of artist Petra Cortright,[39] who links everything she has done between 2012 and 2019 to “petra cortright 2019 2018 2017 2016 2015 2014 2013 2012” on llmgtfy.com – a very contemporary gesture, which could be interpreted as both despair and arrogance.
In this situation, I think, a new role and a new self-understanding for web designers could be rebuilding homes: showing the gnomes the way out of the corporations’ front yards, if I may steal Tim Berners-Lee’s metaphor.
It is not the “ornaments” per se that Davidson mentions, and not the awesome audiovisual effects Knoth and Renner provide to their clients; it is the notion of having an appearance – which they bring back by exaggerating it – and, subsequently, a place of your own outside the standard interfaces and the grids of algorithmic timelines.
*
To turn designers and users away from technology and back to the medium, one should try to adjust the optics and see the people who made the Web: to write the history of the users (not of the corporations that released this or that product or update) and to frame it in a continuum of their actions, views and self-identification. Not an easy task, because on the Web we are always confronted with revolutions, with histories of big men and with binary time[40] and space: before/after, web 1.0/web 2.0, desktop/mobile, flat/material.
My slow climb from 1995 to 2004 through the 1TB archive, my personal journey online that started in 1994 and is still not over, as well as two decades of teaching new media designers to see and show the environment they work with, all suggest that we should recognise several trajectories we (web users) have taken since 1993.[41]
From web designer to front-end developer could be one of these trajectories. It is partially introduced on the previous pages. To make it complete, I would first of all have to place it on a more complex, forking path, starting from the webmaster (not the web designer) and following the genesis and metamorphosis of that profession (passion) through the times and niches of the Web.