It is difficult to imagine, in this age of generic, impersonal computers and corporations, a time when individually branded computer hardware attracted phenomenal, even fanatical, loyalty and devotion from its users. Sure, today you might see the occasional Windows95-vs-Linux flame war on the computing newsgroups, but that's just fighting over software, which is different. I won't explain how; it's complex.
The 1980s were the age of the home computer. These machines were touted as complete computing solutions on which you could do an assignment, write a letter, experiment with BASIC programming, and play the occasional game if the mood took you. The first to achieve massive popularity in the early '80s was the Commodore 64, a machine with an 8-bit processor, a three-voice mono sound chip, 16 colours and attitude to the hilt. Soon after (around 1984) came the Apple Macintosh, a simple machine that could only display two colours, but had a mouse and a rudimentary graphical operating system, and, more through inspired marketing than anything else, its unique identity won it a niche that IBM couldn't touch.
1985 was the year when the home computer came of age with the introduction of Commodore's Amiga. If you pushed it, the Amiga could display 4096 colours at once, and it had stereo sound. It was designed by Californian surfers who loved the B-52s (the first motherboard was called the Rock Lobster, and when the machine crashed, up popped a sign saying "Guru Meditation Error"), and it was a dream in beige plastic. By today's standards, the games and programs available were unsophisticated, and so was the hardware; on the other hand, the software was cheap where the hardware was expensive. Still, the hardware's limitations forced the programmers of the time to create ingenious solutions and push the machine to its limit. Many of the games of the period still look reasonable today, and they certainly play a lot better. And so a generation grew up with these ingenious and creative machines. Part of the appeal was the idea of standing up against a dominant culture, typified by the monolithic IBM and its business-oriented, resolutely identity-less PC, a machine which proved so featureless as to be mimicked, or cloned, by countless other manufacturers.
By the mid-1990s, however, things weren't looking so good for the home computer. Commodore had gone bankrupt (although its curse lives on, bankrupting any company that tries to buy the brand), and the Amiga had fallen far behind the curve in technical specifications (rumours persist that the engineers had lost the machine's blueprints, forcing them to "hack", or reverse-engineer, their own hardware). The majority of casual users slowly left the platform (although many dedicated acolytes remain), migrating to IBM PC clones, tempted by lower costs and standardised software -- and saddled with the worst operating system ever devised for a computer.
By the late 1990s, computer processing power had advanced to the point where software could create a virtual computer inside a real computer's memory. Such a program emulates the hardware, instruction set and operating environment of the original machine, allowing software written specially for that machine to run on an entirely different computer. The first steps were to emulate the simple computers and video game consoles of the early eighties. Once we saw a Commodore 64 screen running on a PC, it was obvious that anything was possible.
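The idea is simpler than it sounds. At the heart of any emulator is a fetch-decode-execute loop: the host program holds the guest machine's state (registers, memory) in ordinary data structures and interprets its opcodes one at a time. The sketch below uses an invented toy instruction set for illustration -- it is not the 6510 of a real Commodore 64, and the opcode names and values are assumptions -- but the structure is the same one a real emulator uses.

```python
class ToyCPU:
    """A minimal sketch of an emulated CPU (hypothetical instruction set)."""

    # Invented opcodes: load accumulator, add, store, halt.
    LDA, ADC, STA, HLT = 0x01, 0x02, 0x03, 0xFF

    def __init__(self, program):
        self.memory = bytearray(256)            # the guest machine's RAM
        self.memory[:len(program)] = program    # program loaded at address 0
        self.pc = 0                             # program counter
        self.a = 0                              # accumulator register
        self.halted = False

    def step(self):
        op = self.memory[self.pc]               # fetch
        self.pc += 1
        if op == self.LDA:                      # decode and execute
            self.a = self.memory[self.pc]; self.pc += 1
        elif op == self.ADC:
            self.a = (self.a + self.memory[self.pc]) & 0xFF; self.pc += 1
        elif op == self.STA:
            self.memory[self.memory[self.pc]] = self.a; self.pc += 1
        elif op == self.HLT:
            self.halted = True

    def run(self):
        while not self.halted:
            self.step()

# Load 40, add 2, store the result at address 0x80, halt.
cpu = ToyCPU([ToyCPU.LDA, 40, ToyCPU.ADC, 2, ToyCPU.STA, 0x80, ToyCPU.HLT])
cpu.run()
print(cpu.memory[0x80])  # 42
```

A real emulator does the same thing with a few hundred opcodes instead of four, plus emulated video, sound and I/O chips, which is why it took late-1990s processing power to run one at full speed.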
The discourse of technology is inextricably linked with a modernist notion of evolution, progress and advancement. Why, then, would computer users with the most powerful domestic computing hardware available choose to cripple these machines by using them to run a ten-year-old operating system and software? One answer is that this old software could do things that remain impossible even on today's machines. But perhaps these things fill a more abstract, less pragmatic need: an identification with a coherent subculture; a return to rebellion against the mainstream. One thing is sure: humans will colonise and give life to anything, even something as seemingly cold and lifeless as a computer.