
The Shareware Scene, Part 2: The Question of Games

In one of the last interviews he gave before his death, shareware pioneer Jim Button said that he “had written off the idea of shareware games” prior to the beginning of the 1990s. At the time, it seemed a reasonable position to take, one based on quite a bit of evidence. While any number of people had tried to sell their games this way, there had been no shareware success stories in games to rival those of Andrew Fluegelman, Jim Button, or Bob Wallace.

Naturally, many pondered why this should be so. The answers they came up with were often shot through with the prejudices of the period, which held that programming or playing frivolous games was a less upstanding endeavor than making or using stolid business software. Still, even the prejudiced answers often had a ring of truth. You had a long-term relationship with your telecommunications program, database, or word processor, such that sending its author a check in order to join the mailing list, acquire a printed manual, and be assured of access to updates felt as much like a wise investment as merely “the honest thing to do.” But you had a more transient relationship with games; you played a game only until you beat it or got tired of it, then moved on to the next one. Updates and other forms of long-term support just weren’t a factor at all. No one could seem to figure out how to untangle this knot of motivation and contingency and make shareware work for games.

Luckily, there was an alternative to the shareware model for those game programmers who lacked the right combination of connections, ambitions, and talents to go the traditional commercial route — an alternative that offered a better prospect than shareware during the 1980s of getting paid at least a little something for one’s efforts. It was the odd little ghetto of the disk magazines, and so it’s there that we must start our story today.


The core idea behind the disk magazines is almost as old as personal computing itself. In February of 1978, Ralph McElroy of Goleta, California, published the first issue of CLOAD, a monthly collection of software for the Radio Shack TRS-80, the first pre-assembled microcomputer to rack up really impressive sales numbers. “To join the somewhat elite club of computer users,” wrote McElroy in his introductory editorial, “one [previously] had to learn the mysterious art of speaking in a rather obscure tongue” — i.e., one had to learn to program. Before any commercial software industry to speak of existed, CLOAD proposed to change that by offering “vast quantities of software to be shared.” It was actually distributed on cassette tape rather than floppy disk — a disk drive was still a very exotic piece of hardware in 1978 — but otherwise it put all the pieces into place.

By 1981, the TRS-80’s early momentum was beginning to flag and the more capable Apple II was coming on strong. Jim Mangham, a programmer at the Louisiana State University Medical Center in Shreveport, decided that the market was ready for a CLOAD equivalent for the Apple II — albeit published not on cassettes but on floppy disks, which were now steadily gaining traction. He recruited a buddy named Al Vekovius to join him in the venture, and the two prepared the first issue of something they called The Harbinger. They called up Softalk magazine, the journal of record for early Apple II users, to discuss placing an advertisement, whereupon said magazine’s founder and editor Al Tommervik got so excited by their project that he asked to become an investor and official marketing partner. Thus The Harbinger acquired the rather less highfalutin name of Softdisk to connote its link with the print magazine.

Starting with just 50 subscribers, Mangham and Vekovius built Softdisk into a real force in Apple II computing. Well aware that they couldn’t possibly write enough software themselves to fill a disk every single month, they worked hard from the beginning to foster a symbiotic relationship with their readership; most of the programs they published came from the readers themselves. In the early days, the spirit of reciprocity extended to the point of expecting readers to mail their disks back each month; this both allowed Mangham and Vekovius to save money on media and provided a handy way for readers to send in their programs and comments. Even after this practice was abandoned in the wake of falling disk prices, Softdisk subscribers felt themselves to be part of a real digital community, long before the rise of modern social media made such things par for the course. At a time when telecommunications was a slow, difficult, complicated endeavor, Softdisk provided an alternative way of feeling connected with a larger community of people who were as passionate as oneself about a hobby which one’s physical neighbors might still regard as hopelessly esoteric.

Thus Mangham and Vekovius’s little company Softdisk Publishing slowly turned into a veritable disk-magazine empire. In time, Mangham stepped back from day-to-day operations, becoming a nearly silent partner to Vekovius, always the more business-focused of the pair. He expanded Softdisk to two disks per issue in August of 1983; started reaching retail stores by January of 1984; launched a companion disk magazine called Loadstar for the Commodore 64 in June of 1984. Softdisk survived the great home-computer bust of the second half of 1984, which took down Softalk among many other pioneering contemporaries, then got right back to expanding. In November of 1986, Vekovius launched a third disk magazine by the name of Big Blue Disk, for MS-DOS-based computers; it soon had a monthly circulation of 15,000, comparable to that of Softdisk and Loadstar. A fourth disk magazine, for the Apple Macintosh this time, followed in 1988. At least a dozen competitors sprang up at one time or another with their own disk magazines, but none seriously challenged the cross-platform supremacy of the Softdisk lineup.


In order to encourage software submissions, all of the Softdisk magazines ran a periodic programming competition called CodeQuest. Readers were encouraged to send in programs of any type, competing for prizes of $1000 for the top submission, $500 for second place, and $250 for third place, on top of the money Softdisk would pay upon eventually publishing the winning software. Big Blue Disk’s second incarnation of the contest ended on January 31, 1988, yielding two winners that were fairly typical disk-magazine fare: the gold-winning The Compleat Filer was a file-management program to replace the notoriously unfriendly MS-DOS command line, while the bronze-winning Western was a sort of rudimentary text-based CRPG set in, you guessed it, the Old West. But it was the silver winner — a game called Kingdom of Kroz, submitted by one Scott Miller from a suburb of Dallas, Texas — that interests us today.

At the time of the contest, Miller didn’t seem to be going much of anywhere in life. In his late twenties, he was still attending junior college in a rather desultory fashion whilst working dead-end gigs at the lower end of the data-processing totem pole, such as babysitting his college’s computer lab. His acquaintances hardly expected him to ever move out of his parents’ house, much less change an industry. Yet this seeming slacker had reserves of ambition, persistence, marketing acumen, and sheer dogged self-belief that would in the end prove a stick in the eye to every one of his doubters. Scott Miller, you see, wanted to make money from videogames — make a lot of money. And by God, he was going to find a way to do it.

The young Scott Miller.

Before entering the CodeQuest contest, he’d written a column on games for the local newspaper, written a book on how to beat popular arcade games, and, last but not least, tested the early shareware market for games: he’d written and distributed a couple of shareware text adventures under the name of Apogee Software — a name which would later become very, very famous among a certain segment of gamers. But on this occasion he was disappointed by the response, just like everyone else making shareware games at the time. Unlike most of those others, though, Miller didn’t give up. If shareware text adventures wouldn’t do the trick, he’d just try something else.

Put crudely, Kingdom of Kroz was a mash-up of the old mainframe classic Rogue and the arcade game Gauntlet — or, if you like, a version of Rogue that played in real time and had handcrafted levels instead of procedurally-generated ones. It wasn’t much to look at — like classic Rogue, it was rendered entirely in ASCII graphics — but many people found it surprisingly addictive once they got into it. It went over very well indeed with Big Blue Disk’s subscribers when it appeared in the issue dated June 1988 — went over so well that Miller provided two sequels, called Dungeons of Kroz and Caverns of Kroz, almost immediately, although the magazine wouldn’t find an opening for them in its editorial calendar until the issues dated March and September of 1989.

While he waited on Big Blue Disk to release those sequels, Miller started to explore a new idea for marketing games outside the traditional publishing framework. In fact, this latest idea would eventually prove his greatest single stroke of marketing genius, even if its full importance would take some time yet to crystallize. He would later sum up his insight in an interview: “People aren’t willing to pay for something they’ve already got in their hands, but they are willing to pay if it gets them something new.” Call it a cynical notion if you must, but, in the context of games at least, it would prove the only way to make shareware pay on a scale commensurate with Scott Miller’s ambitions.

Miller and George Broussard, his longtime best friend and occasional partner in the treacherous world of shareware, made an engine for multiple-choice trivia games — not exactly a daunting programming challenge after the likes of Kroz. They compiled sets of questions dealing with different topics: general trivia, vocabulary, the original Star Trek and Star Trek: The Next Generation. They created “volumes” in each category consisting of 100 questions. Then they released the first volume of each category online, accompanied by an advertisement for additional volumes for the low, low price of $4 each.

Alas, the scheme proved not to be a surefire means of selling trivia games; the economics of getting just 100 questions for $4 were perhaps a bit dodgy even in the late 1980s, when just about everything involving computers cost vastly more than it does today. But a seed had been planted; the next time Miller tried something similar, he would finally hit pay dirt.

The next time in question came in the second half of 1989, just after Big Blue Disk published the last Kroz game. The magazine’s contract terms were far more generous than those of any traditional software publisher: Miller had retained the Kroz copyright throughout, and the magazine’s license to it became non-exclusive as soon as it published the third and last game of the trilogy. Miller, in other words, could now do whatever he wished with his three Kroz games, while still benefiting from the buzz their appearance in Big Blue Disk had caused in some quarters.

Kingdom of Kroz

So, he decided to try the same scheme he had used with his trivia games: release the first part of the trilogy for free, but ask people to send him $7.50 each for the second and third parts. A tactic that had prompted an underwhelming response the first time around worked out much better this time. Unlike those earlier exercises in multiple choice, the Kroz trilogy was made up of real games — or, perhaps better said, was actually one real game artificially divided into three. After you’d played the first part of said game, you wanted to see the rest of it through.

In short, Scott Miller — and shareware gaming in general — finally got their equivalent to that day when Jim Button returned home from a Hawaiian vacation to find his basement drowning in paid registrations. Suddenly Miller as well was drowning in mail, making thousands of dollars every month. He’d done it; his dogged persistence had paid off. He’d found a way around the machinations of the big publishers, found a way to sell games on his own terms, cracked the code of shareware gaming. His sense of vindication after so many years of struggle must defy description.

From here, things happened very, very quickly. Miller whipped up a second trilogy of Kroz games to sell under the same model — first part free, second and third must be paid for — and was rewarded with more checks in the mail. Most people at this point would have been content to continue writing lone-wolf games and reaping huge rewards — but Miller was, as I’ve already noted, a man of unusual ambition. At heart, he was more passionate about marketing games than programming them; in fact, he would never program another game at all after the second Kroz trilogy.

Already before 1989 was over, he had reached out to a Silicon Valley youth named Todd Replogle, who had created and uploaded to various bulletin-board systems a little action-adventure called Caves of Thor that was similar in spirit to the Kroz games. Miller convinced Replogle to re-release his free game under the Apogee imprint, and to make two paid sequels to accompany it. Replogle followed that trilogy up with a tetralogy called Monuments of Mars. Meanwhile George Broussard returned to the scene to make two more four-volume series, called Pharaoh’s Tomb and Arctic Adventure.

By 1991, Apogee was off and running as a real business. Miller quit his dead-end day jobs, moved out of his parents’ house, convinced Broussard to join him as a full-time partner, found an accountant, leased himself an office, and started hiring helpline attendants and clerical help to deal with a workload that was mushrooming for all the right reasons. His life had undergone a head-spinning transformation in the span of less than two years.

At this point, then, we might want to ask ourselves in a more holistic way just why Apogee became so successful so quickly. Undoubtedly, a huge part of the equation is indeed the much-vaunted “Apogee model” of selling shareware: hook them with a free game, then reel them in with the paid sequels. Yet that wasn’t a silver bullet in and of itself, as Miller’s own early lack of success with his trivia games illustrates. It had to be executed just right — which tells us that Miller got it just right the second time around. The price of $7.50 was enough to make the games extremely profitable for Apogee in relation to the negligible amounts of money it took to create and market them, but cheap enough that customers could take the plunge without feeling guilty about it or needing to justify it to a significant other. Likewise, each game was perfectly calibrated to be just long enough for the customer not to feel cheated, but not so long that she spent hours playing it which she could have sunk into another Apogee game.

If all of this sounds a bit mercenary, so be it; Miller was as hard-nosed as capitalists come, and he certainly wasn’t running Apogee as a charity. Yet it’s seldom good business, at least in the long run, to sell junk, and this too Miller understood. Apogee maintained a level of quality control that was often lacking even at the big publishers, who felt compelled to release a game before its time to meet the Christmas market or to pump up the quarterly numbers. Apogee games, on the other hand, seldom appeared under a Christmas tree, and Miller had no shareholders other than his best friend to placate. “Our philosophy is never to let an arbitrary date dictate when we release a game,” said Miller in an interview. As a result, their games were small but also tight: bug-free, stable, consistent. They evinced a sense of care, felt like creations worth paying a little something for. Soon enough, people learned that they could trust Apogee. If none of Apogee’s early games were revolutionary advances within the medium, there were few to no complete turkeys among them either.

I’ll be the first to admit that the Apogee style of game does little for me. Still, my personal tastes in no way blind me to the reality that these unprepossessing but well-crafted little games filled a space in the market of the early 1990s that the big publishers were missing entirely as they rushed to cement a grand merger of Silicon Valley and Hollywood and begin the era of the “interactive movie.” While the boxed-games industry went more and more high-concept, with prices and system requirements to match, Apogee kept things simple and fun, as befit their slogan: “Apogee means action!” Apogee games were quick to play, quick to get in and out of; they had some of the same appeal that the earliest arcade games had, albeit implemented in a more user-friendly way, with the addictive addition of a sense of progression through their levels. The traditional industry regarded this sort of thing as hopelessly passé on a personal computer, suitable only for videogame consoles like the Nintendo Entertainment System. But, as the extraordinary success of Nintendo and the only slightly less extraordinary success of Apogee both demonstrated, people still wanted these sorts of games. Their near-complete absence from the boxed-computer-game market left a massive hole which Scott Miller was happy to fill. Younger people with limited disposable income found Apogee particularly appealing; they could buy six or seven Apogee games for the price of one boxed production that would probably just bore them anyhow.

But of course a business model as profitable as Miller’s must soon attract rivals who hope to execute it even better. Already in 1992, a company called Epic MegaGames appeared to challenge Apogee for the title of King of Shareware; they as well employed Scott Miller’s episodic approach, and also echoed Apogee’s proven action-first design aesthetic. Shareware gaming was becoming a thriving shadow industry of its own, right under the noses of the big boys who were still chasing after their grand cinematic fantasias. They would have gotten the shock of their lives if they had ever bothered to compare their slim profit margins to the fat ones of Apogee and Epic. As it was, though, they felt nary an inkling in their ivory towers that a proletarian revolution in ludic aesthetics was in the offing out there on the streets. But even they wouldn’t be able to ignore it for much longer.


This shareware sales chart from July of 1993 shows how dominant Apogee was at that time. Seven out of the top ten games are theirs, with a further two going to Epic MegaGames, their only remotely close competitor. Although the fast-and-simple design aesthetic in which those companies specialized ruled the charts, they pulled with them a long tail of many other types of shareware games, as we’ll see in the next part of this article. The very fact that there existed a sales chart like this one at all says much about how quickly shareware gaming had exploded.

Many of you doubtless have an inkling already of where this series of articles must go from here — of how not only the story of Apogee Software but also that of Softdisk Publishing will feed directly into that of the most transformative computer game in history. And never fear, I’ll get to all of that — but in my next article rather than this one.

For in addition to that other story which threatens to suck all the oxygen out of the room, there are a thousand other, smaller ones of individual creators being inspired to program all kinds of games and sell them as shareware in the wake of Apogee’s success. Exactly none of them made as much money from their endeavors as did Scott Miller, but some became popular enough to still be remembered today. Indeed, many of us who were around back then still have our obscure little hobby horses from the shareware era that we like to take out and ride from time to time. My personal favorite of the breed might just be Pyro II, a thunderously non-politically-correct puzzle game in which you play a pyromaniac who must burn down famous buildings all over the world. Truly, though, the list of old shareware games that come up in any given discussion is guaranteed to be almost as long as the list of old-timers reminiscing about them. The shareware gaming scene in the aggregate which took off after Apogee’s success touched a lot of people’s lives, regardless of how much money this or that individual game might have earned.

Like the Apogee games, many other shareware titles identified holes in the market which the big publishers, who all seemed to be rushing hell-bent in the exact same direction, were failing to fill. In many cases, these were genres in which the traditional industry had actually done very well in the past, but which it had now judged to no longer be worth its while. For example, the years between the collapse of Infocom in 1989 and the beginning of the Internet-based Interactive Fiction Renaissance circa 1995 were marked by quite a number of shareware text adventures. Likewise, as boxed CRPGs got ever more plot- and multimedia-heavy at the expense of the older spirit of free-form exploration, other shareware programmers rushed to fill that gap. Still others mimicked the look and feel of the old ICOM Simulations graphic adventures, while lots more catered to the eternal need just to blow some stuff up after a long, hard day. There were shareware card games, board games, strategy games, fighting games, action puzzlers, proto-first-person shooters of various stripes, and even ballistics simulators.

In terms of presentation, most of these shareware games were dead ringers for the games that had been sold on store shelves five to ten years earlier. And by the same token, the people who made them in the 1990s were really not all that different from the bedroom programmers who had built the boxed-games industry in the 1980s. Just as many creators of non-game shareware were uncomfortable with time-limited or otherwise crippled software, not all creators of shareware games embraced the Apogee model — not even after it had so undeniably demonstrated its efficacy. Even then, some idealistic souls were still willing to place their faith in people sending in checks simply because it was the right thing to do. All of which is to say that shareware gaming encompassed a vast swath of motivations, styles, and approaches. Apogee, Epic, and that other company which we’ll get to in my next article tend to garner all the press when the early 1990s shareware scene is remembered today, but they were by no means the sum total of its personality.

By way of illustration, I’d like to conclude this article with a short case study of a shareware partnership that didn’t make its principals rich, that didn’t even allow them to quit their day jobs. In fact, neither partner ever really even tried to achieve either of those things. They just made games in two unfashionable styles which they still happened to love, and said games made some other people with the same tastes very happy. And that was more than enough for Daniel Berke and Matthew Engle.


Excelsior Phase I: Lysandia

Matthew remembers his best childhood Christmas ever as the one in 1983, when he was twelve years old and his family got an Apple IIe computer. A sheet of Apple-logo stickers came in the box that housed the computer, and Matthew stuck one of them on his notebook. Soon Daniel, another student at his Los Angeles-area school, noticed the sticker and came over to chat. “I’ve got an Apple II also!” he said. Just like that, a lifelong friendship was born.

The two joined an informal community of fellow travelers, the likes of which could be found in school cafeterias and playgrounds all over the country, swapping tips and exploits and most of all games. Their favorites of the games they traded were the text adventures of Infocom and the Ultima CRPGs of Origin Systems; if the pair’s friendship was born over the Apple II, it was cemented during the many hours they spent plumbing the depths of Zork together. Matthew and Daniel eventually joined the minority of kids like them who took the next step beyond playing and trading games: they started to experiment with making them. Their roles broke down into a classic game-development partnership: the analytical Daniel took to programming like a duck to water, while the more artistically-minded Matthew was adept at drawing and storytelling.

So many things in life are a question of timing — not least the careers of game developers. One story which Matthew Engle shared with me when I interviewed him in preparation for this article makes that point disarmingly explicit. In 1986, Daniel, Matthew, and another friend created a BASIC text adventure called Zapracker, which they attempted to sell through their local software stores. Matthew:

We made our own boxes and packaged the game with the floppy disk and the manual, just like Richard Garriott did back in the day. Our box was designed to hang on a peg in a software store. We got on a bus with 25 or so copies and visited a few different stores. We’d say, “Hey, would you like to sell this on consignment? You get half the money and we get half.” A few stores took us up on it, and we sold a few copies.


Zapracker (A Lost Classic?)



This tale is indeed almost eerily similar to that of Richard Garriott selling a Ziploc-bagged Akalabeth through his local Computerland just six years earlier; if anything, our heroes in 1986 would appear to have put more effort into their packaging, and perhaps into their game as well, than Garriott did into his. But in that short span of time, the possibility of parlaying a homemade game hanging on a rack in a local computer store into an iconic franchise had evaporated. Instead Daniel and Matthew would have to go another route.

Their game-making efforts were growing steadily more sophisticated, as evinced by Daniel’s choice of programming languages: after starting off in Apple II BASIC, he moved on to an MS-DOS C compiler. Unknowingly adopting the approach that had already been used by everyone from Scott Adams to Infocom, from Telarium to Polarware to Magnetic Scrolls, Daniel wrote an interpreter in C which could present a text adventure written in a domain-specific language of his own devising. Matthew then wrote most of the text for what became Skyland’s Star, a science-fiction scenario.
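For readers curious what this division of labor actually buys, here is a deliberately tiny sketch of the interpreter-plus-data architecture. Daniel worked in C, and his actual language and data format are not documented here; the rooms, commands, and names below are invented for illustration, with Python standing in for his C. The essential idea is that the game world is pure data, and one generic loop can “play” any world handed to it.

```python
# The game world is data, not code; a new adventure means new data only.
# (Hypothetical example; this is not Daniel Berke's actual format.)
WORLD = {
    "bridge": {
        "desc": "The bridge of a derelict starship.",
        "exits": {"aft": "engine-room"},
    },
    "engine-room": {
        "desc": "Dead engines loom in the dark.",
        "exits": {"fore": "bridge"},
    },
}

def play_turn(state, command):
    """The generic interpreter: resolve one player command against the data."""
    room = WORLD[state["location"]]
    if command == "look":
        return room["desc"]
    if command in room["exits"]:
        state["location"] = room["exits"][command]
        return WORLD[state["location"]]["desc"]
    return "You can't do that."

state = {"location": "bridge"}
print(play_turn(state, "look"))  # The bridge of a derelict starship.
print(play_turn(state, "aft"))   # Dead engines loom in the dark.
```

The payoff is the same one Infocom and the others had discovered: the interpreter gets written once, while a writer like Matthew can produce an entire new game purely as data.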

During the pair’s last year in high school, the Los Angeles school district and the manufacturing conglomerate Rockwell International co-sponsored a contest for interesting student projects in computer science. Once Daniel and Matthew decided to enter it, it gave them a thing which many creators find invaluable: a deadline. They finished up their game, and submitted it alongside the technological framework that enabled it. They were soon informed that their project was among the finalists, and were invited to a dinner and awards ceremony at a fancy hotel. Matthew:

All of the finalists were there, demonstrating their entries. We did a couple of interviews for a local TV station. Then the dinner started. They started running down the list of winners, and before we knew it, it was down to two finalists: my and Dan’s project and another one. Then they announced the other one as second place; we had won. It was quite a night!

Matthew Engle and Daniel Berke win the contest with Skyland’s Star in 1989. That’s Daniel’s Apple IIGS running the game; he wrote it on that machine in MS-DOS via a PC Transporter emulator card.

Daniel and Matthew gave little initial thought to monetizing their big win. After finishing high school in 1989, they went their separate ways, at least in terms of physical location: Daniel moved to New York to study computer science, while Matthew stayed in Los Angeles to study film. But they kept in touch, and soon started talking about making another game, this time in the spirit of their other favorite type from the 1980s: an old-school Ultima.

It was 1991 by now, and, fed by the meteoric success of Apogee, shareware games of many different stripes were appearing. Daniel and Matthew as well finally caught the fever. They belatedly released Skyland’s Star as shareware for $15, using it as a sort of test case for the eventual marketing of their Ultima-alike. They were among those noble or naïve souls who eschewed the Apogee model in favor of releasing their whole game at once. Instead of offering the rest of the game as an enticement, Daniel and Matthew offered a printed instruction manual, hint book, and map — nice things to have, to be sure, but perhaps not things that played on the psychological compulsions of gamers so powerfully as the literal rest of a game which they dearly wanted to finish. Daniel and Matthew weren’t overwhelmed with registrations.

Progress on the Ultima-like game, which was to be called Excelsior Phase I: Lysandia, was inevitably slowed by their respective university studies; the biggest chunk of the work got done in the summers of 1991, 1992, and 1993, when Daniel came back to Los Angeles and they both had more free time. Then they would sit for hours many days at their favorite pizza restaurant, sketching out their plans. Matthew did most of the scenario design, graphics, and writing, while Daniel did all of the programming.

Calling themselves by now 11th Dimension Entertainment, they finished and released Excelsior in 1993 as shareware, with a registration price of $20. Once again, they relied on a manual, a hint book, and a map alongside players’ consciences to convince them to register. Although it certainly didn’t become an Apogee-sized success story, Excelsior did garner more attention and registrations than had Skyland’s Star. It was helped not only by its being in a (marginally) more commercially viable genre, but also by its coming into a world that was just on the cusp of the Internet Revolution, with the additional distribution possibilities which that massive change to the way that everyday people used their computers brought with it.

As they were finishing Excelsior, Daniel and Matthew had also been finishing their degree programs. Daniel got a programming job at Electronic Arts after a few false starts, while Matthew started a career in Hollywood that would put him, ironically given the retro nature of Excelsior, on teams making cutting-edge CD-ROM-enabled multimedia products at companies like Disney Interactive. Despite their busy lives, they were both still excited enough by independent game development, and gratified enough by the response to Excelsior I, that they embarked on a sequel in 1994. Whereas Excelsior I had aimed for a point somewhere between Ultima IV and Ultima V, Excelsior II took Ultima VI as its model, with all of the increased graphics sophistication that would imply. For this reason not least, the partners wound up spending fully five years making it, communicating almost entirely electronically.

The sheer quantity of labor which Matthew in particular put into this retro-game with limited commercial prospects could have been motivated only by love. Matthew:

We went all out. I ultimately made about 3800 16 × 16-pixel tiles. It was an exhausting process. For every tile, I had to specify whether you could walk on it or it would block you. There was also transparency; we had layers of tiles, overlaid upon one another. There might be a grass tile, then the player-character tile. Then, if you’re walking through a doorway, for example, the arch at the top of the doorway.

Then, after that exhausting process, began the arduous process of putting the tiles down to create the map, which was 500 × 500 tiles if I’m not mistaken — so, 250,000 tiles to place. Plus all of the town and castle and dungeon maps had to be created.
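The scheme Matthew describes, with a walkability flag on every tile and transparent layers drawn bottom-up, can be sketched in a few lines. This is a hypothetical illustration in Python; the tile names and flags are invented for the example, not taken from the actual Excelsior II engine.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tile:
    name: str
    walkable: bool             # may the player stand on this tile?
    transparent: bool = False  # drawn over whatever lies beneath it

# A handful of invented tiles in the spirit of Matthew's description.
GRASS  = Tile("grass",  walkable=True)
WALL   = Tile("wall",   walkable=False)
PLAYER = Tile("player", walkable=False, transparent=True)
ARCH   = Tile("arch",   walkable=True,  transparent=True)

def draw_order(layers):
    """A map cell is a stack of layers rendered bottom-up, so the archway
    over a doorway ends up drawn on top of the player sprite beneath it."""
    return [t.name for t in layers]

# Walking through a doorway: grass underfoot, the player, the arch overhead.
doorway = [GRASS, PLAYER, ARCH]
print(draw_order(doorway))  # ['grass', 'player', 'arch']
print(doorway[0].walkable)  # True: movement checks the ground layer
```

Multiply a structure like this by 250,000 hand-placed cells, plus every town, castle, and dungeon, and the scale of Matthew’s labor of love comes into focus.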

By the time they released Excelsior Phase II: Errondor in 1999, software distribution had changed dramatically from what it had been six years before. It was now feasible to accept credit-card registrations online, and to offer registrants the instant satisfaction of downloadable PDF documents and the like. The motivating ethic of the original shareware movement was alive and well in its way, but, just as with other types of software, the phrase “shareware games” was soon to fall out of use. The more tactile, personal side of the shareware experience, entailing mailed checks, documents, and disks, had already mostly faded into history. Excelsior II did reasonably well for a niche product in this brave new world, but even before its release Daniel and Matthew knew that it would be their last game together. “We realized we just didn’t have it in us to do an Excelsior III,” says Matthew.

In the end, the two of them sold roughly 500 copies each of Excelsior I and II — “small potatoes” by any standard, as Matthew freely admits. He believes that they made perhaps $5000 to $10,000 in all on their games, after the cost of postage and all those printed manuals was subtracted.

I must confess that I personally have some reservations about the 11th Dimension games. It seems to me that Skyland’s Star’s scenario isn’t quite compelling enough to overcome the engine’s limited parser and lack of player conveniences, and that the Excelsior games, while certainly expansive and carefully put together, rely a bit too much on needle-in-the-haystack hunting over their enormous maps. Then again, though, I have the exact same complaints about the Ultima games which Excelsior emulates, which would seem to indicate that Daniel and Matthew actually achieved their goal of bringing old-school Ultima back to life. If you happen to like those Ultima games a little more than I do, in other words, you’ll probably be able to say the same about the Excelsior games. One thing that cannot be denied is that all of the 11th Dimension games reflect the belief on the part of their makers that anything worth doing at all is worth doing well.

Shareware gave a place for games like those of Daniel and Matthew to live and breathe when the only other viable mode of distribution was through the boxed publishers, who interested themselves only in a fairly small subset of the things that games can do and be. Long before the likes of Steam, the shareware scene was the indie-games scene of its time, demonstrating all of the quirky spirit which that phrase has come to imply. While the big boys were all gazing fixedly at the same few points in the middle distance, shareware makers dared to look in other directions — even, as in the case of Daniel and Matthew, to look behind them. In the face of a mainstream industry which seemed hell-bent on forgetting its history, that was perhaps the most radically indie notion of them all.

(Sources: the books Masters of Doom by David Kushner, Rocket Jump: Quake and the Golden Age of First-Person Shooters by David L. Craddock, and Sophistication & Simplicity: The Life and Times of the Apple II Computer by Steven Weyhrich; Computer Gaming World of December 1992, January 1993, March 1993, May 1993, June 1993, July 1993, September 1993, January 1994, February 1994, and June 1994; Game Developer of January/February 1995; PC Powerplay of May 1996; Questbusters of November 1991; Los Angeles Times of February 6 1987; the tape magazine CLOAD of February 1978; the disk magazine Big Blue Disk of January 1988, May 1988, June 1988, March 1989, April 1989, September 1989, and August 1990. Online sources include the archives on the old 3D Realms site, the M & R Technologies interview with Jim Knopf, Samuel Stoddard’s Apogee FAQ, Al Vekovius’s old faculty page at Louisiana State University Shreveport, Stephen Vekovius’s appearance on All Y’all podcast, “Apogee: Where Wolfenstein Got Its Start” at Polygon, Benj Edwards’s interview with Scott Miller for Game Developer, and Matt Barton’s interview with Scott Miller. Most of all, I owe a warm thank you to Matthew Engle for giving me free registered copies of the 11th Dimension games and talking to me at length about his experiences in shareware games.

In the interest of full disclosure as well as a full listing of sources, I have to note that a small part of this article is drawn from lived personal experience. I actually knew Scott Miller and George Broussard in the late 1980s and early 1990s, albeit only in a very attenuated, second-hand sort of way: Scott dated my sister for several years. Scott and George came by my room from time to time to see the latest Amiga games when I was still in high school. Had I known that my sister’s love life had provided me with a front-row seat to gaming history, and that I would later become a gaming historian among other things, I would doubtless have taken more interest in them. As it was, though, they were just a couple of older guys with uncool MS-DOS computers wanting to see what an Amiga could do.

A year and a half to two years after finishing high school, I interviewed for a job at Apogee, which was by then flying high. Again, had I known what my future held I would have paid more attention to my surroundings; I retain only the vaguest impression of a chaotic but otherwise unremarkable-looking office. Scott and George were perceptive enough to realize that I would never fit in with them, and didn’t hire me. For this I bear them no ill will whatsoever, given that their choice not to do so was the best one for all of us; I’m pretty sure I would have been miserable there. I got a job at a record store instead, which remains the best job I’ve ever had, except for the pay.

I believe that the day of that interview in 1992 was the last time I ever saw Scott and George. Scott and my sister broke up permanently shortly thereafter if not before.

The company once known as Apogee, which is now known as 3D Realms, has released all of their old shareware games for free on their website. Daniel Berke and Matthew Engle continue to maintain their old games in updated versions that work with modern incarnations of Windows; you can download them and purchase registrations on the 11th Dimension Entertainment home page.)

 


The Shareware Scene, Part 1: The Pioneers

The digital society which we’ve created over the last few decades has upended many of our traditional notions about commerce. Everyday teenagers now stress over their ratings and advertising revenues on YouTube; gamers in “free” games pay staggering sums for the privilege of advancing through them a little faster (wasn’t the actual playing supposed to be the point of a game?); “clicks” and “likes” have become commodities that are traded in the same way that soybean futures are in the “real” world; consumers have become speculators in their own future entertainment on crowd-funding platforms like Kickstarter; a writer like me can ask for support from readers like you to allow me to make content that I then give away for free. (Thank you for that!) And, in the most direct parallel to our main topic for today, even some of the biggest corporations on the planet have learned to give away their products for free, then ask us to pay for them later.

Some of these new modes of commerce reflect the best in us, some perhaps the very worst. They all share in common, however, the quality of being markedly different from the old model wherein you paid someone an upfront amount of money and got some concrete good or service in exchange. As those of you with elderly parents or grandparents may well have learned, our modern digital economies have departed so far from that model in some areas that just explaining how they work to someone still wedded to the old ways can be a daunting task indeed. (I know that my 86-year-old father has literally no idea what I do all day or how I can possibly be earning money from it…) Maybe we too should ask the question that so many of our elders are already asking themselves every day: exactly how did we get from there to here so quickly?

It’s a bigger question than any one article can possibly answer. Still, it does turn out that we can trace at least one point of origin of our strange new ways of commerce to a trio of American pioneers who, all within a year of one another, embraced a new model for selling software — a model which has, one might say, taken over the world.


Andrew Fluegelman

The first of our pioneers is one Andrew Fluegelman. Born in 1943, Fluegelman within his first 35 years of life finished law school, passed the Bar exam, took up and then gave up corporate law, and settled into a whole new career as the owner, editor, and sole employee of the Headlands Press, a boutique book publisher in Marin County, California. He worked from time to time with the techno-utopian visionary Stewart Brand on The Whole Earth Catalog, and even the books he edited and published on his own had much the same counter-cultural DIY flavor: The New Games Book (a selection of friendly outdoor sporting activities for groups of adults), How to Make and Sell Your Own Record, Worksteads: Living and Working in the Same Place. Yet for all their hippie bona fides, Headlands books went out under the larger imprint of the international publishing titan Doubleday. The ability to speak the language of both the idealistic dreamer and the everyday businessperson proved a vital asset for Fluegelman throughout his life.

Like Brand and so many others of a similar bent, Fluegelman saw great potential in the personal computer as a force for social liberation. Therefore in 1981, before ever actually purchasing a computer of his own, he signed a contract with Doubleday to embark on a new book project, this time with himself in the role of coauthor rather than just editor. It was to be an exploration of the role of computers in the writing process, in terms of both current practicalities and future potential. He would of course need to buy himself a computer to complete the project. Just as he was about to pull the trigger on an Apple II, the IBM PC was announced. “I took one look at it and just had this gut feeling,” he said in a later interview. “This is what I want.”

While he waited for the machine he had ordered to arrive, Fluegelman, who had never touched a computer before in his life, started teaching himself BASIC from books. Even after the computer came in, learning to word-process on it remained on the back burner for a time while he continued to pursue his new passion for programming. His bible was that touchstone of a generation of amateur programmers, David Ahl’s million-selling book BASIC Computer Games. Fluegelman:

I got Ahl’s [book], and I said, “This is just what I want to do.” I typed [one of the games] in. It took me a day to get the bugs out and get the thing to run. And as soon as I saw the program running, I immediately started thinking, “Well, gee, I’d really like to add up the scores, and say this, and make a little noise…” I’d look through the book, and I’d say, “Oh, there’s something I could use. What happens if I stick it in there?”

I’m a real believer in the Berlitz method of programming. Which is: you learn how to say, “Please pass the salt,” [then] you look in the dictionary and look up the word for “pepper,” stick it in there, and, by God, someone gives you the pepper. And you know you’re making progress. Purely trial and error.

I liked it a lot. I abandoned all bodily functions for about a month.

Programmers are born as much as made. You either feel the intrinsic joy of making a machine carry out your carefully stipulated will or you don’t; the rest is just details. Clearly Fluegelman felt the joy.

Still, the book project wouldn’t wait forever. Fluegelman and Jeremy Joan Hewes, his coauthor, had the idea that they would indeed write the book together, but with each working on his or her own machine from his or her own office. They would share their files electronically; it would be one more way of practicing what they intended to preach in the book proper, about the new methods of working that were unlocked by the computer. But Hewes had an older CP/M computer rather than a flashy new IBM PC, and this stopped them in their tracks — for the only telecommunications package currently available for the latter came from IBM themselves, and could only swap files using IBM’s proprietary protocols. Fluegelman thus found himself in the ironic position of being able to trade files with an IBM mainframe, but not with most of his peers in the world of personal computing. He could see only one solution:

[I] started out to write a communications program. I said, “Gee, I’d really like to do this, and I’d like to do that, and we should have a dialing directory, and we should have some macros…” And I just kept adding to it for my own use.

We eventually typeset the book using the program I wrote. In the process, I gave it to a lot of my friends, and they started using it. At the time it was the only program that let you do these things on the IBM PC; this was the early spring of 1982. And inevitably one of my friends said, “You know, you really ought to publish that.”

If I hadn’t been in the publishing business for eight years, I would have gone the traditional route — find a publisher, royalties — but I’d been through all that, and I’d seen the pitfalls and all the ways things can get derailed. And this was kind of a new medium, and I was still very exhilarated by it. And I said, having had all this fun, I just can’t go the same publishing route that I’ve gone before.

Throughout his life, Fluegelman had a special relationship with San Francisco’s Golden Gate Bridge. “I think it’s a power point,” he said once only semi-facetiously. “I have more inspirations driving across the Golden Gate Bridge…” One day shortly after finishing his program, he was driving across while thinking back to the pledge drive he had seen the night before on the local PBS television station.

My American readers will doubtless recognize the acronym, but, for the benefit of those of you in other places: PBS stands for “Public Broadcasting Service.” It’s a network of over-the-air television stations which show children’s programs (most famously Sesame Street) as well as documentaries, news, and high-culture content such as symphony concerts and dramatizations of classic literature. Although the stations are free to watch, they are unlike other free stations in that they don’t sustain themselves with advertising. Instead they rely on a limited degree of taxpayer funding, but most of all on donations, in any amount and frequency, from viewers who appreciate their content and consider it worth supporting. In some ways, then, PBS can be called the great forefather of the many non-coercive digital-funding models of today. And indeed, the tale of Andrew Fluegelman makes the otherwise tangential thread that runs from PBS to so many modern Internet economies much more direct.

For, driving across his favorite bridge that day, Fluegelman had a PBS-inspired epiphany. He would market his little telecommunications package under the name of PC-Talk, using a method no one had ever dreamed of before.

I said, I’ll just set it out there, encourage people to use it. If they like it, I’ll ask them to send me some money. [He set the initial “suggested” donation at $25.]

So, I sent out the first version of the program that way. I put some notices on The Source and CompuServe: I’ve got this program, I wrote it, it’ll do this and this. It’s available for free, but if you like it, send me the money. And even if you don’t like it, still make copies for your friends because maybe they’ll like it and send some money.

The response was really overwhelming. I was getting money! I remember on the first day I got a check in the mail, and I just couldn’t believe it. I almost got driven out of business filling orders. At the time I was still producing books, and software programming was my own late-night thing. And suddenly I was standing there all day filling orders and licking stamps and sending things out, and I had to hire someone to start doing that. I was totally unprepared for it.

While I had written the program to work very well in my own situation, once you start sending software out into the world you start hearing about people with all sorts of crazy circumstances that you haven’t anticipated at all. I think if I had tried to publish this first version of the program [conventionally], people would have reacted very negatively. But they didn’t because I’d sent it out in this unrestricted way. So people would write back and say, “This is great, but why don’t you add this? Why don’t you try this?” In many cases people even helped me re-program to deal with their situations. And I ended up calling that “freeback” instead of “feedback” because it was really getting free support back from the community.

The usually savvy Fluegelman did make a couple of puzzling decisions during these early days. The first was to name his revolutionary scheme for software distribution “Freeware.” If you twist your synapses around just right, you can almost arrive at the sense he was trying to convey, but under any more straightforward reading the name becomes dangerously counter-intuitive. Thousands upon thousands of developers who came after Fluegelman would work desperately, but only partially successfully, to make people understand that their software wasn’t in fact “free” in the sense that using it regularly placed no ethical demand upon the user to financially compensate the creator.

Then, having come up with such a flawed name, the lawyer in Fluegelman came to the fore: he went out and trademarked it. He imagined creating a proprietary “Freeware catalog,” collecting a lot of software that was marketed on the same model. Accordingly, he also included in his program’s liner notes a request for other programmers with useful software of their own to contact him, thereby to join him in a “unique marketing experiment.”

In the meanwhile, PC-Talk’s success was such that it quickly caught the attention of the business-computing mainstream. Already in August of 1982, the widely read InfoWorld magazine published an article on the subject, under the heading “CA man likens ‘Freeware’ to user-supported TV.” Fluegelman noted sensibly therein that, rather than fighting against the natural desire people had to make copies of their software and share them with their friends, Freeware leveraged it. He estimated that five copies of PC-Talk were made for every one that was downloaded directly from one of the commercial online services or sent out on disk by himself in response to a mailed request — and, unlike a conventional software publisher, he thought this ratio was just great.


Jim Knopf/Button

Our second pioneer was a far more experienced programmer than Fluegelman. Seattle-area resident Jim Knopf was only one year older than our first pioneer, but had already worked for IBM for many years as a systems analyst by the dawn of the microcomputer era. He built his first personal computer himself in 1978, then sold it to partially finance an Apple II. Among other things, he used that machine to keep track of the names and addresses of his church’s congregation. Knopf later wrote that “I liked what I produced so much [that] the program itself became a hobby — something I continued to work on and improve in my spare time.”

When the IBM PC was released in 1981, Knopf sold his Apple II and bought one of those instead. His first project on his new computer was to write a new version of his database program. As soon as said program was far enough along, Knopf started sharing it with his colleagues at IBM. They in turn shared it with their friends, and soon the database, which he called Easy File, went beyond his office, beyond Seattle, beyond Washington State. People encouraged him to upload it to the early online services; this he obligingly did, and it spread still faster.

Knopf was gratified by its popularity, but also bothered by it in a certain way. His database was still under active development; he was improving it virtually every week. But how to get these updates out to users? He included a note in the program asking users to “register” themselves so he could keep in touch with them; he maintained the resulting mailing list in Easy File itself. Yet keeping everyone up to date was prohibitively complicated and expensive in a world where most software was still passed around on floppy disks — a world where the idea of a program as a changing, improving entity rather than a static tool that just was what it was barely existed in the minds of most people. “How could I identify which of the users were serious ones – those that desired and required enhancements?” Knopf later wrote about his mindset at the time. “How could I afford to send mailings to notify them of the availability of improvements?”

So, in September of 1982, Knopf made a few moves which would define his future. First, he changed his own name for purposes of business. Worried that his Germanic surname would be too difficult for potential customers to pronounce and remember, he quite literally translated it into English. “Knopf,” you see, is the German word for the English “button” — and so Jim Knopf became Jim Button. (I’ll refer to him by the latter name from now on. Coincidentally, “Jim Knopf” is also the name of a character from a popular series of children’s books in Germany.) Next, he registered a company that referenced his new nom de plume: Buttonware. And, last but by no means least, he added a new note to his program. “I would ask those who received it to voluntarily send a modest donation to help defray my costs,” remembered Button later. “The message encouraged users to continue to use and share the program with others, and to send a $10 donation only if they wanted to be included in my mailing list.”

The very first person to contact Button in response told him that his approach was just the same as the one used by another program called PC-Talk. Button found himself a copy of PC-Talk, read its pitch to other programmers interested in joining the ranks of Freeware, and sent his own Easy File to Andrew Fluegelman. Fluegelman phoned Button excitedly on the same day that he received the package in the mail. The two of them hit it off right away.

While they waited for Fluegelman to find enough other quality software to make up his Freeware Catalog, the two agreed to form a preliminary marketing partnership. Button would rename his Easy File to PC-File and raise its price to $25 to create a kinship between the two products, and each program would promote the other, along with the Freeware trademark, in its liner notes. Button:

My wife said I was “a foolish old man” if I thought even one person would voluntarily send me money for the program. I was more optimistic. I suspected that enough voluntary payments would come to help pay for expansions to my personal-computer hobby – perhaps several hundred dollars. Maybe even a thousand dollars (in my wildest dreams!).

As it happened, he would have to learn to dream bigger. Like PC-Talk, PC-File turned into a roaring success.


The founding staff of PC World magazine. Andrew Fluegelman stands in the very back, slightly right of center.

Both programs owed much of their early success to the extracurricular efforts of the indefatigable Andrew Fluegelman. Shortly after releasing PC-Talk to such gratifying interest, Fluegelman had given the final manuscript of his word-processing book to Doubleday, who would soon publish it under the title Writing in the Computer Age. Still as smitten as ever by the potential of personal computing, he now embarked on his third career: he became a full-time computer journalist. He initially wrote and edited articles for PC Magazine, the first periodical dedicated to the IBM PC, but got his big break when he was asked to join the staff of a new rival known as PC World. Within a few issues, Fluegelman became editor-in-chief there.

Not coincidentally, the magazine lavished glowing coverage upon PC-Talk and PC-File. The latest version of Button’s program, for example, got a six-page feature review — as much space as might be devoted to a major business-software release from the likes of Microsoft or VisiCorp — in PC World’s September 1983 issue. “What was previously a very desirable program is now just about mandatory for much of the PC population,” the review concluded. “If you use PC-File and don’t send Jim Button a check, the guilt will kill you. And it should.”

Button and his family were vacationing in Hawaii when the review appeared. Button:

The response was overwhelming. Our house sitter had to cart the mail home daily in grocery sacks.

When we arrived home, the grocery sacks were strewn all over the basement floor. We had to step over and around them just to get into our basement office. My son, John, worked days, evenings, and weekends just catching up on the mail. Life would never be the same for any of us!

Button would later date the beginning of Buttonware as a real business to these events. Nine months later, he quit his job with IBM, by which time he was making ten times as much from his “moonlighting” gig as from his day job.

Ironically, though, Button had already parted ways to some extent with Fluegelman by the time that life-changing review appeared. Fluegelman was finding it difficult to focus on his idea of starting a Freeware catalog, given that he was already spending his days running one of the biggest magazines in the computer industry and his evenings improving and supporting PC-Talk. Button:

Andrew got questions about my program and I got questions and requests about his. Checks were sent to the wrong place. The work required to correct all this grew exponentially. We had to make the separation.

Button came up with his own moniker for the distribution model he and Fluegelman had pioneered: “user-supported software.” That name was perhaps less actively misleading than “Freeware,” but still didn’t really get to the heart of the matter. Other names that were tried, such as “quasi-public domain,” were even worse. Luckily, the perfect moniker — one that would strike exactly the right note, and do it in just two syllables at that — was about to arrive along with Bob Wallace, the third principal in our little drama.


In this iconic picture of the early Microsoft, Bob Wallace is in the middle of the back row.

Like Jim Button, Bob Wallace was based in Seattle, and was a veteran of the kit era of personal computing. In fact, his experience with microcomputers stretched back even further than that of his counterpart: he had been the founder in 1976 of the Northwest Computer Society, one of the first hobbyist user groups in the country. Shortly thereafter, he was recruited from the computer store where he worked by Paul Allen, whereupon he became Microsoft’s ninth employee. In time, he became the leading force behind Microsoft’s implementation of the Pascal programming language. But, as an unreformed hippie whose social idealism paralleled his taste for psychedelic drugs, he found both Microsoft’s growing bureaucracy and its founders’ notoriously sharp-elbowed approach to business increasingly uncongenial as time went on. In March of 1983, he was for the first time refused permission to barge into Bill Gates’s office unannounced to argue some technical point or other, as had always been his wont. It was the last straw; he quit in a huff.

Taking note of Fluegelman and Button’s success, he wrote a word processor using his own Pascal implementation and released it as PC-Write under the same payment model. To encourage its distribution, he added an extra incentive: he sent a special registration code to any user who mailed in the suggested donation of $75, which she was then expected to enter into her copy of the program. When she gave this copy to others, it was thus tagged with its source. If any users of those copies sent in the fee, Wallace would send $25 to the user whose tag it bore; he later claimed that at least one person made $500 in these commissions. In its roundabout way, the scheme pioneered the idea of not just asking users for a donation out of the goodness of their hearts, but of marking and altering the functionality of the software for those who sent in the payment, all through the use of the soon-to-be-ubiquitous mechanism of the registration code.

But Wallace’s biggest contribution of all came in the form of a name. And therein lies a tale in itself.

Back in July of 1982, an InfoWorld magazine editor named Jay Lucas had started a column on “freeware” without being aware of Fluegelman’s counter-intuitive use of that term; Lucas took the word to mean any and all freely distributed software, whether the author asked for an eventual payment in return or not. The following spring, Fluegelman contacted the magazine to inform them of his trademark and ask them to cease and desist from violating it. So, Lucas launched a contest among his readers to come up with a new name. He reported in the InfoWorld dated May 30, 1983, that “at least a dozen” readers had sent in the same suggestion: “shareware.” He announced that he would be using this name henceforth. At the time, he still made no distinction between “free” software that came with financial strings attached and software that didn’t. He was, in other words, effectively using “shareware” as a synonym for all types of freely distributed software.

But when Bob Wallace saw the name, he knew that it was perfect for his distribution model: pithy, catchy, with all the right intimations. He contacted Lucas, who told him that he was free to use it; InfoWorld made no legal claim on the name. So, when PC-Write went out later that year, it described itself as “shareware.”

In early 1984, Softalk IBM, a brief-lived spinoff of a much-loved Apple II magazine, hired one Nelson Ford to write a regular column about “public-domain software.” Unsure what he should call the distribution model being used by each of Fluegelman, Button, and Wallace under a different name, he started off by employing the manifestly inadequate placeholder “quasi-public domain.” But in his May 1984 column, he announced a contest of his own: “A free disk of software and widespread publicity for the person sending in the best name for quasi-PD, contribution-suggested software. Since Andy won’t let anyone use ‘freeware,’ we’ll have to come up with another catchy name.”

He received such dubious suggestions as “conscience-wear” — “the longer you use the software, the more it wears on your conscience if you do not pay” — and “tryware.” But, just as Lucas had over at InfoWorld, Ford kept receiving one suggestion above all: “shareware.” Unaware of the name’s origin at InfoWorld, but well aware of its use by Wallace, he suspected that “shareware” would be as impossible for him to appropriate as “freeware.” Nevertheless, he inquired with Wallace — and was pleasantly surprised to be told that he was more than welcome to it. Ford announced the new name in the August 1984 issue of Softalk IBM.

It’s questionable whether the actual column in which he made the announcement was all that influential in the end, given that the issue in which it appeared was also the last one that Softalk IBM ever published. Still, Ford himself was a prominent figure online and in user-group circles. His use of the name going forward in those other contexts, combined with that of Jay Lucas in InfoWorld, probably had a real impact. Yet one has to suspect that it was PC-Write itself which truly spread the name hither and yon.

For, perhaps because a word processor, unlike a telecommunications program or a database, was a piece of software which absolutely every computer owner seemed to need, Wallace was even more successful with his first piece of shareware than the two peers who had beaten him onto the scene had been with theirs. The company he founded, which he called QuickSoft, would peak with annual sales of more than $2 million and more than 30 employees, while PC-Write itself would garner more than 45,000 registered users. Staying true to his ideals, Wallace would always refuse to turn it into a boxed commercial product with a price tag in the hundreds of dollars, something many conventional software publishers were soon pressuring him to do. “I’m out to make a living, not a killing,” he said.

Jim Button was less inclined to vocalize his ideals, but one senses that much the same sentiment guided him. Regardless, he too did very well for himself. Already by 1984, he was getting approximately $1000 worth of checks in the mail every day. While PC-File itself never garnered quite the popularity of PC-Write — about 7000 users registered their copies in the end — Button soon branched out well beyond that first effort. Buttonware would peak with annual sales of $4.5 million and 35 employees.

Those who jumped on the shareware bandwagon afterward would find it very difficult to overtake these two pioneers in terms of either income or market impact. As late as 1988, Compute! magazine judged that the two most impressive shareware products on the market were still PC-File and PC-Write, two of the first three ever released. But PC-Talk would have a shorter lifespan — and, much more tragically, so would its creator.


The founding staff of Macworld magazine. Andrew Fluegelman can just be seen at the very back, slightly left of center.

The PC World issue with the landmark review of PC-File was still on newsstands when Andrew Fluegelman had his next life-changing encounter with a computer: he was one of a select few invited to Apple for an early unveiling of the new Macintosh. He was so smitten by this whole new way of operating a computer that he immediately began lobbying for a companion magazine to PC World, to be named, naturally enough, Macworld. Its first issue appeared in time to greet the first Macintosh buyers early in 1984. Fluegelman held down the editor-in-chief job there even as he continued to fill the same role at PC World.

He was utterly unfazed to thus be straddling two encampments between which Apple was trying to foment a holy war. He spoke about the differences between the two aesthetics of computing in an interview that, like so much of what he said back then, rings disarmingly prescient today:

People [say the Macintosh is] more of a right-brain machine and all that. I think there is some truth to that. I think there is something to dealing with a graphical interface and a more kinetic interface; you’re really moving information around, you’re seeing it move as though it had substance. And you don’t see that on [an IBM] PC. The PC is very much a conceptual machine; you move information around the way you move formulas, elements on either side of an equation. I think there’s a difference.

I think the most important thing is to realize that computers are tools, that unless you want to become an expert programmer, the main thing that a computer provides you is the ability to express yourself. And if it’s letting you do that, if you now have hands on those tools, then you can be a force for good out in the world, doing the things that you used to do, that you’re still doing — representing your own ideas, not changing your persona to suddenly become a “computer person.”

And I think that may be the advantage of the Macintosh.

At bottom, Fluegelman himself wasn’t really a “computer person” in the sense of Button and Wallace, both of whom had been programming since the 1960s. And then, running not one but two of the biggest computer magazines in the country could hardly leave him with much free time. Thus PC-Talk was somewhat neglected, and other telecommunications software — some of it released under the burgeoning shareware model — took its place. Fluegelman accepted this with equanimity; he was never inclined to stay in one place for very long anyway. In an interview conducted at the very first Macworld Expo in January of 1985, he spoke of his excitement about the future — both his personal future and the world’s technological future:

I think this is just the next adventure for a lot of us to get into. I know the intellectual excitement the [computer] has caused for me. It’s really been a rejuvenation, and anything that gets you that pumped up has got to be something that you can use in a good way.

I also think that people who do get excited about computers and involved in all this are almost uniformly intelligent, interesting people. I never have been as socially involved, as interconnected with as many different kinds of people, as when I started getting involved with computers. I think that the easier it is for people to express themselves, and to share their views with others, that’s got to be a good democratic force.

It’s great to go along for 40 years and still find your life changing and new things happening. It makes you look forward to what’s going to happen when you’re 60, what’s going to happen when you’re 80.

Quotes like these are hard to square with what happened to Andrew Fluegelman just six months later.

On July 6, 1985, Fluegelman left his office as usual at the end of a working day, but never arrived at his home; he simply disappeared. A week later, police discovered his Mazda hatchback parked near the toll plaza at the entrance to the Golden Gate Bridge. They found a note addressed to his wife and family inside, but its contents have never been published. Nevertheless, we can piece some things together. It seems that his health hadn’t been good; he’d been suffering from colitis, for which he’d begun taking strong medication that was known to significantly impact many patients’ psychology — and, indeed, friends and colleagues in the aftermath mentioned that he’d been acting erratically in the final few days before his disappearance. There are reports as well that he may have recently received a cancer diagnosis. At any rate, the implications seem clear: the 41-year-old Andrew Fluegelman went back to one of his favorite places in the world — the bridge where he had invented the revolutionary concept of shareware if not the name — and jumped 220 feet into the water below. His body was never recovered.

The legacy of those brief four years between his discovery of the joys of BASIC and his death by suicide encompasses not only the shareware model but also PC World and especially Macworld. The latter went on to become arguably the most literate, thoughtful computer magazine ever, one of the vanishingly few to evince a genuine commitment to good writing in the abstract. In doing so, it merely held to the founding vision of its first editor-in-chief. One can’t help but wonder what else this force of nature might have done, had he lived.


At shareware’s peak in the early and mid-1990s, at least one glossy newsstand magazine was devoted exclusively to the subject in quite a number of countries.

By that fateful day in 1985, shareware was already becoming an unstoppable force, with more and more programmers throwing their hats into the ring. To be sure, most of them didn’t build seven-figure businesses out of it, as Jim Button and Bob Wallace did. Inevitably for a distribution model that placed all of its quality control on the back end, much of the shareware that was released wasn’t very good at all. Yet even many of those who didn’t get to give up their day jobs did receive the satisfaction and capitalistic validation of being paid real money, at least every once in a while, for something they had created. In time, this loose-knit band of fellow travelers began to take on the trappings of a movement.

To wit: in February of 1987, a “Meeting of Shareware Authors” assembled in Houston to chat and kibitz about their efforts. Out of that meeting grew the Association of Shareware Professionals six months later, with founding chairmen Jim Button and Bob Wallace. In the years that followed, the ASP published countless shareware catalogs and pamphlets; they even published a 780-page book in 1993 called The Shareware Compendium, which represented the last attempt anyone ever made to list in one place all of the staggering quantity of shareware that was available by that point. But perhaps even more importantly, the ASP acted as a social outlet for the shareware authors themselves, a way of sharing hints and tips, highs and lows, dos and don’ts with one another.

There arose more big success stories out of all this ferment. For example, one Phil Katz was responsible for what remains today the most tangible single software artifact of the early shareware scene. In 1986, he started a little company called PKWare to distribute a reverse-engineered shareware clone of ARC, the most popular general-purpose compression program of the time. When the owners of ARC came after him with legal threats, he switched gears and in 1989 released PKZIP, which used an alternative, much more efficient compression format of his own design. Although he sold PKZIP as shareware — $25 donation requested, $47 for a printed manual — he also scrupulously documented the compression format it used and left the door open for other implementations of it. He was rewarded with sweet revenge: ZIP quickly superseded ARC all across the digital world. Striking a fine balance between efficiency and ease of implementation, not to mention being unencumbered by patents, it has remained the world’s most common compression format to this day, a de facto standard that is now built right into many operating systems.
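Just how thoroughly the openly documented ZIP format won out can be seen in the fact that it ships today in the standard libraries of most programming languages. As a minimal sketch (the file name and contents here are purely illustrative), Python's built-in zipfile module can write and read the format with no third-party software at all:

```python
# A minimal sketch showing that the ZIP format Phil Katz documented is now
# part of many standard libraries -- here, Python's built-in zipfile module.
# The archive name and contents are hypothetical examples.
import io
import zipfile

buffer = io.BytesIO()

# Write an archive using DEFLATE, the compression method PKZIP popularized.
with zipfile.ZipFile(buffer, "w", compression=zipfile.ZIP_DEFLATED) as archive:
    archive.writestr("readme.txt", "Please register your copy of this software.")

# Read it back: any ZIP-aware tool on any platform could do the same.
with zipfile.ZipFile(buffer, "r") as archive:
    names = archive.namelist()
    text = archive.read("readme.txt").decode()

print(names)  # ['readme.txt']
print(text)
```

Because Katz published the format rather than keeping it proprietary, countless independent implementations like this one could interoperate freely, which is exactly how ZIP displaced ARC.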

Another success story is less earthshaking and more esoteric, but instructive nonetheless as an illustration of just how far the shareware model could be stretched. In a time when desktop publishing was one of the biggest buzzwords in computing, a veteran of print publishing named Gary Elfring took a hard look at the current state of digital fonts, and noted how expensive those offered by major foundries like Adobe tended to be. He started Elfring Soft Fonts to distribute shareware typefaces, and made a lot of money from them in the late 1980s and early 1990s, before the established vendors of word processors and operating systems got their acts together in that department.

I could go on and on with such stories, but suffice to say that many people did very, very well from shareware during its heyday.

Like any movement, shareware also came complete with internecine disputes. One constant source of tension was the many third parties who collected shareware which they didn’t own on physical media for distribution. As early as 1984, the librarian of the Silicon Valley Computer Society users group caused an uproar when he started selling floppy disks filled with shareware for $6 apiece, a figure somewhat above the cost of blank disks and postage alone. “It’s not legal,” said Andrew Fluegelman flatly at the time. “I’m opposed to it because when somebody spends even $6 for a disk, they feel they’ve paid for it and see little reason to pay again for it. I’m concerned about somebody building a product around my product.” But, in a rare break with Fluegelman, Jim Button had a different point of view: “With that [price], all he’s doing is helping me distribute sample copies.” He continued in later years to believe that “distribution is one of the cornerstones of sales. All other factors being equal, if you can double your distribution you will double your sales.”

In the end, Button’s point of view carried the day. Shareware authors were never entirely comfortable with the “parasites” who profited off their software in this way, and Fluegelman’s worry that many users would fail to distinguish between paying a cataloger and paying the actual creator of the software was undoubtedly well-founded. Yet the reality was that the vast majority of computer owners would not go online until the World Wide Web struck in the mid-1990s. In the meantime, floppy disks — and eventually CD-ROMs — were the only realistic mechanism for reaching all of these otherwise isolated users. The catalogers and the authors had to learn to live with one another in an uneasy symbiotic relationship.

Another, even more bitter dispute within the ranks of shareware was touched off near the end of the 1980s, when some authors started opting to “encourage” registration by releasing crippled versions of their software — programs that only functioned for a limited time, or that blocked access to important features — that could only have their full potential unlocked via the input of a valid registration code. Although Bob Wallace had ironically pioneered the idea of a registration code that was input directly into a program, he and most of the other early shareware pioneers hated to see the codes used in this way. For the socially conscious Wallace, it was a moral issue; his vision for shareware had always been to collect payment from those who could pay, but not to deprive those who couldn’t of quality software. Button as well preferred to rely upon the honor system: “Don’t get off on the wrong foot with your users with things like crippled programs, time-limited programs, and other negative incentives to register your software. If you can’t trust your users to pay for truly good software, then you should stay out of the shareware business.” Under the influence of these two founding chairmen, the ASP refused for a time to admit shareware authors who freely distributed only crippled versions of their software.

In the end, though, the ASP would be forced to relax their stance, and “crippleware” would become nearly synonymous with shareware in many circles, for better or for worse. In 1989, Nelson Ford, the earlier popularizer of the name “shareware,” set up a service for authors which let people register their software over the telephone using their credit cards instead of having to mail checks or cash through the post. The ease of passing out registration codes this way, without having to send out disks and/or documentation or do any additional work at all, probably led many more authors to go the crippleware route. In fairness to those who decided to implement such schemes, it should be noted that they didn’t have the advantages that went along with being first on the scene, and were often marketing to less committed computer users with a less nuanced sense of the ethics of intellectual property and the sheer amount of work that goes into making good software of any stripe.


In a strange sort of way, Windows 10 is actually a shareware product.

The buzz around shareware gradually faded in the second half of the 1990s, and by soon after the turn of the millennium the term was starting to seem like an antiquated relic of computing’s past. Even the Association of Shareware Professionals eventually changed their name to the Association of Software Professionals, before doddering off entirely. (A website still exists for the organization today, but it now bills itself as strictly an historical archive.)

Yet it would be profoundly inaccurate to say that shareware died as anything but a name. On the contrary: it conquered the world to such an extent that it became the accepted means of distributing much or most software, and as such is no longer in need of any particular name. Just about everyone is selling shareware today — not only the sometimes honest, sometimes dodgy small vendors of “try before you buy” utilities of many types, but also some of the biggest corporations in the world. Microsoft, for example, now distributes Windows using what is essentially the shareware model: users download a copy for free, enjoy a limited trial period, and then need to purchase a registration code if they wish to go on using it. Many other software developers have stuck to their idealistic guns and put their creations out there uncrippled, asking for a donation only from those who can afford it. And, as I mentioned to open this piece, the overarching spirit of shareware, if you will, has infected countless digital economies that don’t involve downloads or registration keys at all.

Jim Button and Bob Wallace got to see some of these later developments, but they weren’t active participants in most of them. Wallace gradually divested himself from Quicksoft after 1990. Ever the hippie, he devoted his time to the study and promotion of psychedelic drugs and other “mind-expanding technologies” via publications and foundations. He died in 2002 at age 53 from a sudden attack of pneumonia that may or may not have been related to his quest for chemical transcendence.

Jim Button (né Knopf) very nearly died even younger. At the age of 49 in 1992, he had a major heart attack. He survived, but wasn’t sure that he could continue to cope with the stress of running his shareware business. At the time, big players like Microsoft were pouring enormous resources into their own productivity software, and the likes of little Buttonware had no real hope of competing with them anymore. This combination of factors prompted Button to slowly wind his company down; after all, his decade in shareware had already left him with enough money to enjoy a comfortable early retirement. He died in 2013, a few weeks shy of his 71st birthday. He continued until the end to downplay his role in the evolution of software distribution and digital culture. “I’m not a visionary man,” he said. “I never saw the future, but I was lucky enough to be in the right place at the right time, with the right ideas and a proper amount of energy.”

Some might say that the “right ideas” are synonymous with vision, but no matter; we’ll let him keep his modesty. What he and his fellow pioneers wrought speaks for itself. All you have to do is look around this place we call the Internet.

(Sources: the books The New Games Book by the New Games Foundation, Writing in the Computer Age by Andrew Fluegelman and Jeremy Joan Hewes, and Gates by Stephen Manes and Paul Andrews; Softalk IBM of May 1984, June 1984, July 1984, and August 1984; Byte of June 1976, June 1983, July 1984, March 1985, and September 1987; 80 Computing of May 1987; Ahoy! of February 1984; CompuServe Magazine of December 1990 and March 1992; Family Computing of March 1984; InfoWorld of July 5 1982, August 23 1982, December 20 1982, March 7 1983, May 30 1983, June 27 1983, July 30 1984, September 17 1984, October 22 1984, July 29 1985, December 23 1985, August 25 1986, and December 7 1987; MicroTimes of May 1985 and August 1985; Games Machine of October 1987; Compute! of February 1985 and June 1988; PC World of September 1983; Macworld premiere issue. Online sources include The Association of Software Professional’s website, Michael E. Callahan’s “History of Shareware” on Paul’s Picks, The Charley Project‘s entry on Andrew Fluegelman’s disappearance, the Shareware Junkies interview with Jim “Button” Knopf, “Jim Button: Where is He Now?” at Dr. Dobb’s, the M & R Technologies interview with Jim Knopf, and the Brown Alumni Monthly obituary of Bob Wallace. My thanks to Matthew Engle for giving me the picture of Shareware Magazine included in this article.)

 
 


The (7th) Guest’s New Clothes

Once upon a time, two wizards decided to remake the face of computer gaming with the help of a new form of magic known as CD-ROM. They labored for years on their task, while the people waited anxiously, pouncing upon the merest hint the wizards let drop of what the final product would look like.

At long last — well after the two wizards themselves had hoped — the day of revelation came. Everyone, including both the everyday people and the enlightened scribes who kept them informed on the latest games, rushed to play this one, which they had been promised would be the best one ever. And at first, all went as the wizards had confidently expected. The scribes wrote rapturously about the game, and hordes of people bought it, making the wizards very rich.

But then one day a middle-aged woman, taking a break from reckoning household accounts by playing the wizards’ game, said to her husband, “You know, honey, this game is really kind of slow and boring.” And in time, a murmur of discontent spread through many ranks of the people, gaining strength all the while. The cry was amplified by a disheveled young man with a demon of some sort on his tee-shirt and a fevered look in his eyes: “That’s what I’ve been saying all along! The wizards’ game sucks! Play this one instead!” And he hunched back down over his computer to continue playing his very different sort of game, muttering something about “gibs” and “frags” as he did so.

The two wizards were disturbed by this growing discontent, but resolved to win the people over with a new game that would be just like their old one, except even more beautiful. They worked on it too for years to make it as amazing as possible. Yet when they offered it to the people, far fewer of them bought it than had bought their first game, and their critics grew still louder and more strident. They tried yet one more new game of the same type, yet more beautiful, but by now the people had lost interest entirely; few could even be bothered to criticize it. The wizards started bickering with each other, each blaming the other for their failures.

One of the wizards, convinced he could do better by himself, went away to make still more games of the same type, but the people remained stubbornly uninterested; he finally gave up and found another occupation. From time to time, he tries again to see if the people want another game like the one they seemed to love so much on that one occasion long ago, but he is invariably disappointed.

The other wizard — perhaps the wiser of the two — said, “If you can’t beat ’em, join ’em.” He joined the guild that included the violent adolescent with the demon on his shirt, and enjoyed a return to fortune if not fame.

Such is the story of Trilobyte Games in a nutshell. Today, we remember 1993 as the year that Cyan Productions and id Software came to the fore with Myst and Doom, those two radically different would-be blueprints for gaming’s future. But we tend to forget that the most hyped company and game of the year were in fact neither of those pairings: they were rather Trilobyte and their game The 7th Guest. Echoing the conventional wisdom of the time, Bill Gates called The 7th Guest “the future of multimedia,” and some even compared Graeme Devine and Rob Landeros, the two “wizards” who had founded Trilobyte together, to John Lennon and Paul McCartney. Sadly for the wizards, however, The 7th Guest had none of the timeless qualities of the Beatles’ music; it was as of its own time as hula hoops, love beads, or polyester leisure suits were of theirs.


Rob Landeros and Graeme Devine

Unlike their alter egos in the Beatles, Graeme Devine and Rob Landeros grew up in vastly different environments, separated not only by an ocean but by the equally enormous gulf of seventeen years.

Born in Glasgow, Scotland, in 1966, Devine was one of the army of teenage bedroom coders who built the British games industry from behind the keyboards of their Sinclair Spectrums. His first published work was actually a programming utility rather than a game, released as part of a more complete Speccy programmer’s toolkit by a company known as Softek in the spring of 1983. But it was followed by his shoot-em-up Firebirds just a few months later. That game’s smooth scrolling and slick presentation won him a reputation. Thus one day the following year the phone rang at his family’s home; a representative from Atari was on the line, asking if he would be free to port their stand-up arcade and console hit Pole Position to the Spectrum.

Over the next several years, Devine continued to port games from American publishers to the Europe-centric Spectrum, while also making more original games of his own: Xcel (1985), Attack of the Killer Tomatoes (1986), Metropolis (1987). His originals tended to be a bit half-baked once you really dove in, but their technical innovations were usually enough to sustain them, considering that most of them only cost a few quid. Metropolis, the first game Devine programmed for MS-DOS machines, provides a prime example of both his technical flair and his complete lack of detail orientation. A sort of interactive murder mystery taking place in a city of robots, sharing only a certain visual sensibility with the Fritz Lang film classic of the same name, it includes almost-decipherable “voice acting” for its characters, implemented without the luxury of a sound card and played entirely through the IBM PC’s internal speaker. The game itself, on the other hand, is literally unfinished; it breaks halfway through its advertised ten cases. Perhaps Devine decided that, given that he included no system for saving his rather time-consuming game, no one would ever get that far anyway.

Metropolis

Metropolis was published through the British budget label Mastertronic, whose founder Martin Alper was a force of nature, famous as a cultivator of erratic young talent like Devine. Alper sold Mastertronic to Richard Branson’s Virgin empire just after Metropolis was released, and soon after that absconded to Southern California to oversee the newly formed American branch of Virgin Games. On a routine visit back to the Virgin mother ship in London in 1988, he dropped in on Devine, only to find him mired in a dark depression; it seemed his first serious girlfriend had just left him. “England obviously isn’t treating you well,” said Alper. “Why don’t you come with me to California?” Just like that, the 22-year-old Devine became the head of Virgin Games’s American research and development. It was in that role that he met Rob Landeros the following year.

Landeros’s origin story was about as different from Devine’s as could be imagined. Born in 1949 in Redlands, California, he had lived the life of an itinerant bohemian artist. After drifting through art school, he spent much of the 1970s in hippie communes, earning his keep by drawing underground comic books and engraving tourist trinkets. By the early 1980s, he had gotten married and settled down somewhat, and found himself fascinated by the burgeoning potential of the personal computer. He bought himself a Commodore 64, learned how to program it in BASIC, and even contributed a simple card game to the magazine Compute!’s Gazette in the form of a type-in listing.

But he remained a computer hobbyist only until the day in early 1986 that an artist friend of his by the name of Jim Sachs showed him his new Commodore Amiga. Immediately struck by the artistic possibilities inherent in the world’s first true multimedia personal computer, Landeros worked under Sachs to help illustrate Defender of the Crown, the first Amiga game from a new company called Cinemaware. After that project, Sachs elected not to stay on with Cinemaware, but instead recommended Landeros for the role of the company’s art director. Landeros filled that post for the next few years, illustrating more high-concept “interactive movies” which could hardly have been more different on the surface from Devine’s quick-and-dirty budget games — but which nevertheless tended to evince some of the same problems when it came to the question of their actual gameplay.

Whatever its flaws in that department, the Cinemaware catalog had convinced Martin Alper over at Virgin that it was an early proof of concept for gaming’s future. As Cinemaware founder Bob Jacob and many others inside and outside his company well recognized, their efforts were hobbled by the need to rely on cramped, slow floppy disks to store all of their audiovisual assets and stream them into memory during play. But with CD-ROM on the horizon for MS-DOS computers, along with new graphics and sound cards that would make the platform even more audiovisually capable than the Amiga, that could soon be a restriction of the past. Alper asked Devine to interview Landeros for the role of Virgin’s art director.

Landeros was feeling “underappreciated and underpaid” at Cinemaware, as he puts it, so he was very receptive to such an offer. When he called Devine back after hearing the message the latter had left on his answering machine, he found the younger man in an ebullient mood. He had just gotten engaged to be married, Devine explained, to a real California girl — surely every cloistered British programmer’s wildest fantasy. Charmed by the lad’s energy and enthusiasm, Landeros let himself be talked into a job. And indeed, Devine and Landeros quickly found that they got on like a house on fire.

Tall and skinny and bespectacled, with unkempt long hair flying everywhere, Devine alternated the euphoria with which he had first greeted Landeros with bouts of depression such as the one Martin Alper had once found him mired in.  Landeros was calmer, more grounded, as befit his age, but still had a subversive edge of his own. When you first met him, he had almost a patrician air — but when he turned around for the first time, you noticed a small ponytail snaking down his back. While Devine was, like so many hackers, used to coding for days or weeks on end, sometimes to the detriment of his health and psychological well-being, Landeros needed a very good reason indeed to give up his weekend motorcycle tours. Devine was hugely impressed by Landeros’s tales of his free-spirited life, as he was by the piles of self-inked comic books lying about his home; Landeros was repeatedly amazed simply at the things Devine could make computers do. The two men complemented each other — perhaps were even personally good for one another in some way that transcends job and career.

Their work at Virgin, however, wasn’t always the most exciting. The CD-ROM revolution proved late in arriving; in the meantime, the business of making games continued pretty much as usual. In between his other duties, Devine made Spot, an abstract strategy game which betrayed a large debt to the ancient board game of Go whilst also serving as an advertisement for the soft drink 7 Up; if not quite a classic, it did show more focus than his earlier efforts. Meanwhile Landeros did the art for a very Cinemaware-like cross-genre concoction called Spirit of Excalibur. In his spare time, he also helped his friend and fellow Cinemaware alumnus Peter Oliphant with a unique word-puzzle/game-show hybrid called Lexi-Cross. (Rejected by Alper because “game shows need a license in order to sell,” it was finally accepted by Interplay after that company’s head Brian Fargo brought a copy home to his wife and she couldn’t stop playing it. Nonetheless, it sold hardly at all, just as Alper had predicted.)

Devine and Landeros were itching to work with CD-ROM, but everywhere they went they were told that the market just wasn’t there yet. As they saw it, no one was buying CD-ROM drives because no one was making compelling enough software products for the new medium. It was a self-fulfilling prophecy, a marketplace Gordian knot which someone had to break. Accordingly, they decided to put together their own proposal for a showpiece CD-ROM game. Both were entranced by Twin Peaks, the darkly quirky murder-mystery television series by David Lynch, which had premiered in the spring of 1990 and promptly become an unlikely mass-media sensation. Sitting in the airport together one day, they overheard the people around them debating the question of the year: who killed Laura Palmer?

Imagine a game that can fascinate in the same way, mused Devine. And so they started to brainstorm. They pictured a game, perhaps a bit like the board game Clue — tellingly, the details of the gameplay were vague in their minds right from the start — that might make use of a Twin Peaks license if such a thing was possible, but would go for that sort of vibe regardless. Most importantly, it would pull out all the stops to show what CD-ROM — and only CD-ROM — could do; there would be no floppy version. Indeed, the project would be thoroughly uncompromising in all of its hardware requirements, freeing it from the draconian restrictions that came with catering to the lowest common denominator. It would require one of a new generation of so-called “Super” VGA graphics cards, which would let it push past the grainy resolution of 320 × 200, still the almost universal standard in games, to a much sharper 640 × 480.
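The jump in resolution was no small thing in hardware terms. A bit of back-of-the-envelope arithmetic (my own, not from contemporary sources) shows why the Super VGA requirement was so aggressive:

```python
# Rough arithmetic on why 640x480 was so much more demanding than the
# prevailing 320x200 standard of early-1990s games.
standard_pixels = 320 * 200   # 64,000 pixels -- the common VGA game resolution
svga_pixels = 640 * 480       # 307,200 pixels -- the Super VGA mode Trilobyte demanded

# At one byte per pixel (256 colors), every full-screen frame carries
# nearly five times as much data to store, decompress, and push to the card.
ratio = svga_pixels / standard_pixels
print(ratio)  # 4.8
```

Every screen of artwork, every frame of animation, thus cost almost five times the storage and bandwidth of a conventional game's, which is precisely why only CD-ROM could plausibly hold it all.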

To keep the development complications from spiraling completely out of control, it could take place in a haunted house that had a group of people trapped inside, being killed one by one. Sure, Agatha Christie had done it before, but this would be different. Creepier. Darker. A ghost story as well as a mystery, all served up with a strong twist of David Lynch. “Who killed Laura Palmer? Who killed Laura Palmer? We wanted to create that sort of intrigue,” remembers Landeros.

When they broached the possibility of a Twin Peaks game with Alper, he was definitive on one point: there wasn’t enough room in his budget to acquire a license to one of the hottest media properties in the country. They should therefore focus their thinking on a Twin Peaks-like game, not the real thing. Otherwise, he was noncommittal. “Give me a detailed written proposal, and we’ll see,” he said.

At this point in our story, it would behoove us to know something more of Martin Alper the man, a towering figure whose shadow loomed large over all of Virgin Games. A painter and sculptor of some talent during his free time, Alper was also an insatiable culture vulture, reading very nearly a novel per day and seeing several films per week. His prodigious consumption left no space for games. “I’ve never played any game,” he liked to boast. “What interests me is the cultural progress that games can generate. I’m looking to make a difference in society.” He liked to think of himself as a 1990s incarnation of Orson Welles, nudging his own group of Mercury Players into whole new fields of creative expression. When Devine and Landeros’s detailed proposal landed on his desk in November of 1990, full of ambition to harness the current zeitgeist in the service of a new medium, it hit him right where he lived. Even the proposed budget of $300,000 — two to three times that of the typical Virgin game — put him off not at all.

So, he invited Devine and Landeros to a lunch which has since gone down in gaming lore. After the niceties had been dispensed with, he told the two bluntly that they had “no future at Virgin Games.” He enjoyed their shock for a while — a certain flair for drama was also among his character traits — then elaborated. “Your idea is too big to be developed here. If you stayed here, you’d quickly overrun our offices. I can’t afford to let you do that. Other games have to be made here as well.”

“What do you suggest?” ventured Devine.

And so Alper laid out his grand plan. They should start their own studio, which Virgin Games would finance. They could work where they liked and hire whomever they liked, as long as the cost didn’t become too outrageous and as long as they stayed within 90 minutes of Virgin’s headquarters, so that Alper and David Bishop, the producer he planned to assign to them, could keep tabs on their progress. And they would have to plan for the eventuality of a floppy-disk release as well, if, as seemed likely, CD-ROM hadn’t yet caught on to a sufficient degree with consumers by the following Christmas, the game’s proposed release date. They were simple requirements, not to mention generous beyond Devine and Landeros’s wildest dreams. Nevertheless, they would fail to meet them rather comprehensively.

In the course of his hippie wanderings, Landeros had fallen in love with the southern part of Oregon. After the meeting with Alper, he suggested to Devine that they consider setting up shop there, where the biking and motorcycling were tremendous, the scenery was beautiful, the people were mellow, and the cost of living was low. When Devine protested that one certainly couldn’t drive there from Virgin’s offices within 90 minutes, Landeros just winked back. Alper hadn’t actually specified a mode of transportation, he noted. And one could just about fly there in an hour and a half.

On December 5, 1990, the pair came for the first time to Jacksonville, Oregon, a town of just 2000 inhabitants. It so happened that the lighting of the town Christmas tree was taking place that day. All of the people had come out for the occasion, dressed in Santa suits and Victorian costumes, caroling and roasting chestnuts. Just at sunset, snow started to fall. Devine, the British city boy far from home, looked around with shining eyes at this latest evolution of his American dream. Oregon it must be.

So, during that same visit, they signed a lease on a small office above a tavern in an 1884-vintage building — wood floors, a chandelier on the ceiling, even a fireplace. They hired Diane Moses, a waitress from the tavern below, to serve as their office manager. Then they went back south to face the music.

The 7th Guest was created in this 1884-vintage building in Jacksonville, Oregon, above a tavern which is now known as Boomtown Saloon.

Alper was less than pleased at first that they had so blatantly ignored his instructions, but they played up the cheap cost of living and complete lack of distractions in the area until he grudgingly acquiesced. The men’s wives were an even tougher sell, especially when they all returned to Jacksonville together in January and found a very different scene: a bitter cold snap had caused pipes to burst all over town, flooding the streets with water that had now turned to treacherous ice, making a veritable deathtrap of the sidewalk leading up to their new office’s entrance. But the die was now cast, for better or for worse.

The studio which Devine and Landeros had chosen to name Trilobyte officially opened for business on February 1, 1991. The friends found that working above a tavern had its attractions after a long day — and sometimes even in the middle of one. “It’s fun to watch the fights spill out onto the street,” said Devine to a curious local newspaper reporter.

The first pressing order of business was to secure a script for a game that was still in reality little more than a vague aspiration. Landeros had already made contact over the GEnie online service with Matthew Costello, a horror novelist, gaming journalist, and sometime tabletop-game designer. He provided Trilobyte with a 100-page script for something he called simply Guest. Graeme Devine:

We presented the basic story to Matt, and he made it into a larger story, built the characters and the script. He created it out of what was really just a sketch. We were anxious that the [setting] be very, very closed. One that would work as a computer environment. That’s what he gave us.

The script took place within a single deserted mansion, and did all of its storytelling through ghostly visions which the player would bump into from time to time, and which could be easily conveyed through conveniently non-interactive video snippets. Like so many computer games, in other words, Guest would be more backstory than story.

Said backstory takes place in 1935, and hinges on a mysterious toy maker named Henry Stauf — the anagram of Faust is intentional — who makes and sells a series of dolls which cause all of the children who play with them to sicken and die. When the people of his town figure out the common thread that connects their dead children, they come for him with blood in their eyes. He barricades himself in his mansion to escape their wrath — but sometime shortly thereafter he lures six guests into spending a night in the mansion, with a promise of riches for those who survive. Falling victim either to Stauf’s evil influence or their own paranoia, or both, the six guests all manage to kill one another, Agatha Christie-style, over the course of the night, all without ever meeting Stauf himself in the flesh. But there is also a seventh, uninvited guest, a street kid named Tad who sneaks in and witnesses all of the horror, only to have his own soul trapped inside the mansion. It becomes clear only very slowly over the course of the game that the player is Tad’s spirit, obsessively recapitulating the events of that night of long ago, looking for an escape from his psychic prison in the long-deserted mansion.

The backstory of how Stauf came to take up residence in his mansion is shown in the form of a narrated storybook right after the opening credits.

The only thing missing from Costello’s script was any clear indication of what the player would be expected to do in the course of it all. Trilobyte planned to gate progress with “challenges to the player’s intellect and curiosity. Our list of things to avoid includes: impossible riddles, text parsers, inventories, character attribute points, sword fights, trolls, etc. All actions are accomplished via mouse only. Game rules will either be self-explanatory or simple enough to discover with minimal experimentation.” It sounded good in the abstract, but it certainly wasn’t very specific. Trilobyte wouldn’t seriously turn to the game part of their game for a long, long time to come.

The question of Guest‘s technical implementation was almost as unsettled, but much more pressing. Devine and Landeros first imagined showing digitized photographs of a real environment. Accordingly, they negotiated access to Jacksonville’s Nunan House, a palatial three-story, sixteen-room example of the Queen Anne style, built by a local mining magnate in 1892. But, while the house was fine, the technology just wouldn’t come together. Devine had his heart set on an immersive environment where you could see yourself actually moving through the house. Despite all his technical wizardry, he couldn’t figure out how to create such an effect from a collection of still photographs.


The Mansion

The Nunan House in Jacksonville, Oregon, whose exterior served as the model for the Stauf Mansion. The interior of the latter was, however, completely different, with the exception only of a prominent central staircase.



A breakthrough arrived when Devine and Landeros shared their woes with a former colleague from Virgin, an artist named Robert Stein. Stein had been playing for several months with 3D Studio, a new software package from a company known as Autodesk which let one build and render 3D scenes and animations. It was still an awkward tool in many ways, lagging behind similar packages for the Commodore Amiga and Apple Macintosh. Nonetheless, a sufficiently talented artist could do remarkable things with it, and it had the advantage of running on the MS-DOS computers on which Trilobyte was developing Guest. Devine and Landeros were convinced when Stein whipped up a spooky living room for them, complete with a ghostly chair that flew around of its own accord. Stein soon came to join them in Jacksonville, becoming the fourth and last inhabitant of their cozy little office.


3D Studio

The 7th Guest was the first major game to make extensive use of Autodesk’s 3D Studio, a tool that would soon become ubiquitous in the industry. Here we see the first stage of the modeling process: the Shaper, in which an object is created as a two-dimensional geometric drawing, stored in the form of points and vectors.

In the Lofter, an object’s two dimensions are extruded into three, as the X- and Y-coordinates of its points are joined to Z-coordinates.

The Materials Editor is used to apply textured surfaces to what were previously wire-frame objects.

The 3D Editor is used to build a scene by hanging objects together in a virtual space and defining the position, color, and intensity of light sources.

The Keyframer is used to create animation. The artist arranges the world in a set of these so-called key frames, then tells the computer to extrapolate all of the frames in between. The process was an extremely time-consuming one on early-1990s computer hardware; each frame of a complex animation could easily take half an hour to render.
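The in-betweening the Keyframer performs can be illustrated with a minimal sketch. This assumes simple linear interpolation between two-dimensional positions; 3D Studio's actual interpolation worked on full 3D transforms and used smoother spline curves, but the principle — the artist fixes a few key frames, the computer fills in the rest — is the same.

```python
# Sketch of keyframe in-betweening ("tweening"): given an object's
# position at two key frames, compute every frame in between by
# linear interpolation. (Illustrative only; not 3D Studio's code.)

def tween(key_a, key_b, frames):
    """Return `frames` evenly spaced positions from key_a to key_b,
    inclusive of both endpoints."""
    (xa, ya), (xb, yb) = key_a, key_b
    steps = frames - 1
    return [(xa + (xb - xa) * i / steps, ya + (yb - ya) * i / steps)
            for i in range(frames)]

# A flying chair moves from (0, 0) to (10, 4) over 5 frames.
print(tween((0, 0), (10, 4), 5))
# → [(0.0, 0.0), (2.5, 1.0), (5.0, 2.0), (7.5, 3.0), (10.0, 4.0)]
```

The artist's labor scales with the number of key frames, not the number of rendered frames — which is why the rendering, at up to half an hour per frame, became the bottleneck instead.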



Even using 3D Studio, Guest must fall well short of the ideal of an immersive free-scrolling environment. At the time, only a few studios — most notably Looking Glass Technologies and, to a much more limited extent, id Software of eventual Doom fame — were even experimenting with such things. The reality was that making interactive free-scrolling 3D work at all on the computer hardware of the era required drastic compromises in terms of quality — compromises which Trilobyte wasn’t willing to make. Instead they settled for a different sort of compromise, in the form of a node-based approach to movement. The player is able to stand only at certain pre-defined locations, or nodes, in the mansion. When she clicks to move to another node, a pre-rendered animation plays, showing her moving through the mansion.
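The node-based scheme amounts to a simple lookup: a graph of standpoints, with each edge naming the pre-rendered clip that plays when the player clicks to move along it. The following is a minimal sketch of the idea; the node names and clip filenames are invented for illustration and are not taken from the actual game.

```python
# Sketch of node-based movement: the player can stand only at
# pre-defined nodes, and each legal move between adjacent nodes
# streams a pre-rendered transition animation. (Hypothetical names.)

TRANSITIONS = {
    ("foyer", "staircase"): "foyer_to_stairs.clip",
    ("staircase", "foyer"): "stairs_to_foyer.clip",
    ("staircase", "library"): "stairs_to_library.clip",
}

def move(current, destination):
    """Return the clip to stream for this move, or None if the
    destination isn't directly reachable from the current node."""
    return TRANSITIONS.get((current, destination))

print(move("foyer", "staircase"))  # → foyer_to_stairs.clip
print(move("foyer", "library"))    # → None: no direct pre-rendered path
```

The trade-off is exactly the one described above: the illusion of moving through a fully rendered space, bought at the price of only ever being able to stand where the artists anticipated.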

Just streaming these snippets off CD fast enough to play as they should taxed Devine’s considerable programming talents to the utmost. He would later muse that he learned two principal things from the whole project: “First, CD-ROM is bloody slow. Second, CD-ROM is bloody slow.” When he could stretch his compression routines no further, he found other tricks to employ. For example, he got Landeros to agree to present the environment in a “letter-boxed” widescreen format. Doing so would give it a sense of cinematic grandeur, even as the black bars at the top and bottom of the monitor dramatically reduced the number of pixels Devine’s routines had to move around. A win-win.

With the interior of the mansion slowly coming into being, the time was nigh to think about the ghostly video clips which would convey the story. Trilobyte recruited local community-theater thespians to play all the parts; with only $35,000 to spend on filming, including the camera equipment, they needed actors willing to work for almost nothing. The two-day shoot took place in a rented loft in Medford, Oregon, on a “stage” covered with green butcher paper. The starring role of Stauf went to Robert Hirschboeck, a fixture of the annual Oregon Shakespeare Festival, which was (and is) held in nearby Ashland. Diane Moses, Trilobyte’s faithful office manager, also got a part.

Robert Hirschboeck, the semi-professional Shakespearean actor who played the role of Stauf in The 7th Guest and its sequel. He was bemused by the brief fame the role won him: “I’ll be walking down the street and meet someone with all the CD-ROM gear, and they’ll say, ‘Ah, man, I’ve been looking at your ugly mug for 60 hours this week.'”

Trilobyte believed, with some justification, that their game’s premise would allow them to avoid some of the visual dissonance that normally resulted from overlaying filmed actors onto computer-generated backgrounds: their particular actors represented ghosts, which meant it was acceptable for them to seem not quite of the world around them. To enhance the impression, Trilobyte added flickering effects and blurry phosphorescent trails which followed the actors’ movements.


The Chroma-Key Process

A technique known as chroma-keying was used by The 7th Guest and most other games of the full-motion-video era to blend filmed actors with computer-generated backgrounds. The actor is filmed in front of a uniform green background. After digitization, all pixels of this color are rendered transparent. (This means that green clothing is right out for the actors…)

Meanwhile a background — the “stage” for the scene — has been created on the computer.

Finally, the filmed footage is overlaid onto the background.
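In code, the per-pixel logic of the chroma-key step reduces to a single test: is this pixel close enough to the key color to be treated as transparent? Here is a simplified sketch of that test and the compositing it enables, with frames represented as flat lists of RGB tuples. Production keyers of course worked on digitized video, handled soft edges, and often operated in friendlier color spaces; this shows only the core idea.

```python
# Sketch of chroma-key compositing: pixels matching the uniform green
# backdrop become "transparent" and let the rendered stage show through.

KEY = (0, 255, 0)   # the green-screen key color
TOLERANCE = 60      # how close a channel must be to count as "green"

def is_key_color(pixel, key=KEY, tol=TOLERANCE):
    """A pixel matches the key if every channel is within tolerance."""
    return all(abs(c - k) <= tol for c, k in zip(pixel, key))

def composite(foreground, background):
    """Overlay filmed footage onto a rendered background: key-colored
    pixels are replaced by the corresponding background pixels."""
    return [bg if is_key_color(fg) else fg
            for fg, bg in zip(foreground, background)]

# A tiny 4-pixel "frame": one actor pixel amid green screen.
actor_frame = [(0, 255, 0), (180, 140, 120), (0, 250, 10), (0, 255, 0)]
stage_frame = [(30, 30, 40)] * 4

print(composite(actor_frame, stage_frame))
# → [(30, 30, 40), (180, 140, 120), (30, 30, 40), (30, 30, 40)]
```

The tolerance is what makes green clothing fatal for the actors: any garment pixel falling within it would be keyed out along with the backdrop.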



While Trilobyte built their 3D mansion and filmed their actors, the project slipped further and further behind schedule. Already by May of 1991, they had to break the news to Alper that there was no possibility of a Christmas 1991 release; Christmas 1992 might be a more realistic target. Luckily, Alper believed in what they were doing. And the delay wasn’t all bad at that; it would give consumers more time to acquire the SVGA cards and CD-ROM drives they would need to run Guest — for by now it was painfully clear that a floppy-disk version of the game just wasn’t going to happen.

In January of 1992, Devine, Landeros, and Stein flew to Las Vegas for the Winter Consumer Electronics Show. They intended to keep a low profile; their plan was simply to check out the competition and to show their latest progress to Alper and his colleagues. But when he saw what they had, Alper broke out in goosebumps. Cinema connoisseur that he was, he compared it to Snow White and the Seven Dwarfs, Walt Disney’s first feature film, which forever changed the way people thought about cartoon animation. What Snow White had done for film, Alper said, Guest could do for games. He decided on the spot that it needed to be seen, right there and then. So, he found a computer on the show floor that was currently demonstrating a rather yawn-inducing computerized version of Scrabble and repurposed it to show off Guest. To make up for the fact that Trilobyte’s work had no music as of yet, he put on a CD of suitably portentous Danny Elfman soundtrack extracts to accompany it.

Thanks to this ad hoc demonstration, Guest turned into one of the most talked-about games of the show. Its stunning visuals were catnip to an industry craving killer apps that could nudge reluctant consumers onto the CD-ROM bandwagon. Bill Gates hung around the demo machine like a dog close to feeding time. Virgin’s competitor Origin Systems, of Wing Commander and Ultima fame, also sat up and took notice. They highlighted Guest as the game to watch in their internal newsletter:

Here’s a tip: keep an eye out for Guest, a made-for-CD-ROM title from Oregon developer Trilobyte for Virgin Games. In it, you explore a 22-room haunted mansion, complete with elaborate staircases, elegant dining rooms, a gloomy laboratory, and see-through ghosts. The version we saw is in a very primitive stage; there’s no real story line yet and many of the rooms are only rendered in black and white. But the flowing movement and brilliant detail in a few scenes which are fleshed-out are nothing less than spectacular. Ask anybody who saw it.

None of the press or public seemed to even notice that it was far from obvious what the player was supposed to do amidst all the graphical splendor, beyond the vague notion of “exploring.” The Trilobyte trio flew back to Oregon thoroughly gratified, surer than ever that all of their instincts had been right.

Still, with publicity came expectations, and also cynicism; Bill Gates’s enthusiasm notwithstanding, a group of multimedia experts at Microsoft said publicly that what Trilobyte was proposing to do was simply impossible. Some believed the entire CES demo had been a fake.

Trilobyte remained a tiny operation: there were still only Devine, Landeros, Stein, and Moses in their digs above the tavern. Other artists, as well as famed game-soundtrack composer George “The Fat Man” Sanger, worked remotely. But Devine, who had always been a lone-wolf coder, refused to delegate any of his duties now, even when they seemed about to kill him. “I’ve never seen someone work so hard on a project,” remembers one Virgin executive. The Fat Man says that “Graeme wanted to prove everyone else a liar. He knew he was going to be able to do it.” This refusal to delegate began to cause tension with Alper and others at Virgin, especially as it gradually became clear that Trilobyte was going to miss their second Christmas deadline as well. Virgin had now sunk twice the planned $300,000 into the project, and the price tag was still climbing. Incredibly, Trilobyte’s ambitions had managed to exceed the 650 MB of storage space on a single CD, a figure that had heretofore seemed inconceivably enormous to an industry accustomed to floppy disks storing barely 1 MB each; Guest was now to ship on two CDs. Devine and Landeros agreed to work without salary to appease their increasingly impatient handlers.

Only in these last months did an already exhausted Devine and Landeros turn their full attention to the puzzles that were to turn their multimedia extravaganza into a game. Trilobyte was guided here by a simple question: “What would Mom play?” They found to their disappointment that many of the set-piece puzzles and board and card games they wanted to include were still under copyright. Their cutting-edge game would have to be full of hoary puzzles plundered from Victorian-era texts.

But at least Trilobyte could now see the light at the end of the tunnel. In January of 1993, they made a triumphant return to CES, this time with far more pomp and circumstance, to unveil the game they were now calling The 7th Guest. Alper sprang for a haunted-house mock-up in the basement of the convention hall, to which only a handpicked group of VIPs were admitted for a “private screening.” Bill Gates was once again among those who attended; he emerged a committed 7th Guest evangelist, talking it up in the press every chance he got. And why not? It blew Sherlock Holmes Consulting Detective, the current poster child for CD-ROM gaming, right out of the water. Sherlock‘s herky-jerky video clips, playing at a resolution of just 160 X 100, paled next to The 7th Guest‘s 3D-rendered SVGA glory.

When it was finally released in April of 1993, the reaction to The 7th Guest exceeded Virgin and Trilobyte’s fondest hopes. Virgin began with a production run of 60,000, of which they would need to sell 40,000 copies to break even on a final development budget of a little over $700,000. They were all gone within days; Virgin scrambled to make more, but would struggle for months to keep up with demand. “Believe it or not, The 7th Guest really does live up to all the hype,” wrote Video Games and Computer Entertainment magazine. “It takes computer entertainment to the next level and sets new standards for graphics and sound.” What more could anyone want?



Well, in the long run anyway, a lot more. The 7th Guest would age more like raw salmon than fine wine. Already just two and a half years after its release to glowing reviews like the one just quoted, the multimedia trade magazine InterAction was offering a much more tepid assessment:

As a first-generation CD-ROM-based experience, The 7th Guest broke new ground. It also broke a lot of rules – of course, this was before anyone knew there were any rules. The music drowns out the dialog; the audio is not mixable. The video clips, once triggered, can’t be interrupted, which in a house of puzzles and constant searching leads to frustration. How many times can you watch a ghost float down a hallway before you get bored?

Everywhere The 7th Guest evinces the telltale signs of a game that no one ever bothered to play before its release — a game the playing of which was practically irrelevant to its real goals of demonstrating the audiovisual potential of the latest personal computers. Right from the moment you boot it up, when it subjects you to a cheesy several-seconds-long sound clip you can’t click past, it tries your patience. The Ouija Board used to save and restore your session seems clever for about half a minute; after that it’s simply excruciating. Ditto the stately animations that sweep you through the mansion like a dancing circus elephant on Quaaludes; the video clips that bring everything to a crashing halt for a minute or more at a time; the audio clips of Stauf taunting you which are constantly freezing the puzzles you’re trying to solve. The dominant impression the game leaves you with is one of slowness: the slowness of cold molasses coming out of the jar, of a glacier creeping over the land, of the universe winding down toward its heat death. I get fidgety just thinking about it.

One of the game’s few concessions to player convenience is this in-game map. Yet it’s made so annoying to use that you hardly want to. First, you have to click through a menu screen which forces you to watch it tediously fading in and out, like every screen in the game. And then you have to watch the game fill in the map with colors square by exasperating square to indicate where you’ve solved the puzzles and where you still have puzzles remaining. This game would make an excellent trial of patience for a Zen school, if such institutions exist.

The puzzles that are scattered through the rooms of the mansion gate your progress, but not for any reason that is discernible within the environment. When you solve certain puzzles, the game simply starts letting you go places you couldn’t go before. In practice, this means that you’re constantly toing and froing through the mansion, looking for whatever arbitrary new place the game has now decided to let you into. And, as already noted, moving around takes forever.

The puzzles themselves were already tired in 1993. Landeros has been cheeky enough to compare The 7th Guest to The Fool’s Errand, Cliff Johnson’s classic Macintosh puzzler, but the former’s puzzles haven’t a trace of the latter’s depth, grace, wit, or originality. Playing The 7th Guest exposes a pair of creators who were, despite being unquestionably talented in other ways, peculiarly out of their depth when it came to the most basic elements of good game design.

For example, one of the puzzles, inevitably, is an extended maze, which the vast majority of players solve, assuming they do so at all, only through laborious trial and error. “The solution to the maze was on a rug in one of the bedrooms,” notes Devine. “We thought people would copy that down.” A more experienced design team would have grasped that good game design requires consistency: all of the other puzzles in the game are completely self-contained, a fact which has trained the player long before she encounters the maze not to look for clues like this one in the environment. Alternately, testers could have told the designers the same thing. The 7th Guest provides yet one more illustration of my maxim that the difference between a bad game and a good one is the same as that between a game that wasn’t played before its release and one that was. “Our beta testing was, well, just us,” admits Devine.

Another infamous lowlight — easily the worst puzzle in the game in purely abstract design terms — is a shelf of lettered soup cans which you must rearrange to spell out a message. The problem is that the sentence you’re looking for makes sense only under a mustily archaic Scottish diction that vanishingly few players are likely to be familiar with.

But the worst puzzle in practical terms is actually Devine’s old abstract strategy game Spot, imported wholesale, albeit with the intelligence of your computer opponent cranked up to literally superhuman levels. It’s so difficult that even the official strategy guide throws up its hands, offering only the following clarification: “It is not necessary to beat this game to advance through The 7th Guest, and you will not be missing anything if you can’t beat it. To our knowledge, nobody has a consistent strategy to beat this game, not even Graeme!” The most serious problem here, even beyond the sheer lunacy of including a mini-game that even the programmer doesn’t know how to beat, is that the player doesn’t know that the puzzle is unnecessary. Thus she’s likely to waste hours or days on an insurmountable task, thinking all the while that it must gate access to a critical part of the plot, just like all the other puzzles. (What did I say about consistency?) Its presence is unforgivably cruel, especially in a game that advertised itself as being suitable for casual players.

None of the other puzzles are quite as bad as these, but they are samey — three of the 22 are chess puzzles, doubtless all drawn from the same Victorian book — at wild variance with one another in difficulty, and just generally dull, in addition to being implemented in ways calculated to maximize their tedium. Playing the game recently to prepare for this article, I never once felt that rush that accompanies the solution of a really clever puzzle. Working through these ones does indeed feel like work, made all the more taxing by the obstinately form-over-function interface. The best thing to be said about the puzzles is that they can all be bypassed by consulting an in-game hint book in the mansion’s library, albeit at the cost of missing the video clips that accompany their successful solutions and thus missing out on that part of the plot.

Still, one might want to argue that there is, paradoxical though it might sound, more to games than gameplay. Aesthetics have a value of their own, as does story; certainly The 7th Guest is far from the first adventure game with a story divorced from its puzzles. In all of these areas as well, however, it’s long since curdled. The graphics, no longer able to dazzle the jaded modern eye with their technical qualities, stand revealed as having nothing else to offer. There’s just nothing really striking in the game’s visual design — no compelling aesthetic vision. The script as well manages only to demonstrate that Matthew Costello is no David Lynch. It turns out that subversive surrealistic horror is harder to pull off than it looks.

As for the actors… I hesitate to heap too much scorn on them, given that they were innocent amateurs doing their best with a dodgy script in what had to feel like a thoroughly strange performing situation. Suffice to say, then, that the acting is about as good as that description would suggest. On the other hand, it does seem that they had some fun at least some of the time by hamming it up.


Indeed, the only claim to aesthetic or dramatic merit which The 7th Guest can still make is that of camp. Even Devine acknowledges today that the game is more silly than scary. He now admits that the story is “a bit goofy” and calls the game “Scooby Doo spooky” rather than drawing comparisons to The Shining and The Haunting, as he did back in the day. Which is progress, I suppose — but then, camp is such a lazy crutch, one that far too many games try to lean upon.



“The 7th Guest just kept selling and selling,” says its producer David Bishop of the months after its release. “We’d look at the sales charts and it had incredible legs. Sales were picking up, not slowing down.” By the end of 1996, the game would sell well over 2 million copies. Trilobyte was suddenly flush with cash; they earned $5 million in royalties in the first year alone. Nintendo gave them a cool $1 million upfront for the console rights; Paul Allen came along with another $5 million in investment capital. Trilobyte moved out of their little office above the tavern into a picturesque old schoolhouse, and started hiring the staff that had been so conspicuously missing while they made their first game. Then they moved out of the schoolhouse into a 29,000-square-foot monstrosity, formerly a major bank’s data center.

The story of Trilobyte after The 7th Guest becomes that of two merely smart men who started believing that they really were the infallible geniuses they were being hyped as. “Trilobyte thought they could pick up any project and it would turn to gold,” says one former Virgin staffer. “They had huge egos and wanted to grow,” says another. Even writer Matthew Costello says that he “could see the impact the attention from The 7th Guest had on [Devine and Landeros’s] perceptions of themselves.”

Despite the pair’s heaping level of confidence and ambition, or perhaps because of it, Trilobyte never came close to matching the success of The 7th Guest. The sequel, called The 11th Hour, shipped fully two and a half years later, but nonetheless proved to be just more of the same: more dull puzzles, more terrible acting, more technically impressive but aesthetically flaccid graphics. The zeitgeist instant for this sort of thing had already passed; after a brief flurry of early sales, The 11th Hour disappeared. Other projects came and went; Trilobyte spent $800,000 on Dog Eat Dog, a “workplace-politics simulator,” before cancelling it. Meanwhile Clandestiny, another expensive game in the mold of The 7th Guest, sold less than 20,000 copies to players who had now well and truly seen that the guest had no clothes.

Dog Eat Dog, Trilobyte’s never-released “workplace-politics simulator.”

Rob Landeros gradually revealed himself to be a frustrated filmmaker, always a dangerous thing to have around a game-development studio. Worse, he was determined to push Trilobyte into “edgy” content, rife with adult themes and nudity, which he lacked sufficient artistic nuance to bring to life in ways that didn’t feel crass and exploitative. When Devine proved understandably uncomfortable with his direction, the two fast friends began to feud.

The two founders were soon pulling in radically different directions, with Landeros still chasing the interactive-movie unicorn as if Doom had never happened, while Devine pushed for a move into real-time 3D games like the ones everyone else was making. New Media magazine memorably described Landeros’s Tender Loving Care as “a soft-porn film with a weak plot and rancid acting” after getting a sneak preview; the very name of Devine’s Extreme Warfare sounded like a caricature of bro-gamer culture. The former project was eventually taken by an embittered Landeros to a new company he founded just to publish it, whereupon it predictably flopped; the latter never got released at all. Trilobyte was officially wound up in January of 1999. “In the end, I never outran the shadow of The 7th Guest,” wrote Devine in a final email to his staff. “Mean old Stauf casts his long and bony shadow across this valley, and Trilobyte will always be remembered for those games and none other.”

In the aftermath, Devine continued his career in the games industry as an employee rather than an entrepreneur, working on popular blockbusters like Quake III, Doom 3, and Age of Empires III. (Good things, it seems, come to him in threes.) Landeros intermittently tried to get more of his quixotic interactive movies off the ground, whilst working as a graphic designer for the Web and other mediums. He’s become the keeper of the 7th Guest flame, for whatever that is still worth. In 2019, he launched a remastered 25th anniversary edition of the game, but it was greeted with lukewarm reviews and little enthusiasm from players. It seems that even nostalgia struggles to overcome the game’s manifest deficiencies.

The temptation to compare The 7th Guest to Myst, its more long-lived successor in the role of CD-ROM showcase for the masses, is all but irresistible. One might say that The 7th Guest really was all the things that Myst was so often accused of being: shallow, unfair, a tech demo masquerading as a game. Likewise, a comparison of the two games’ respective creators does Devine and Landeros no favors. The Miller brothers of Cyan, the makers of Myst, took their fame and fortune with level-headed humility. Combined with their more serious attitude toward game design as a craft, this allowed them to weather the vicissitudes of fortune — albeit not without a few bumps along the way, to be sure! — and emerge with their signature franchise still intact. Devine and Landeros, alas, cannot make the same claim.

And yet I do want to be careful about using Myst as a cudgel with which to beat The 7th Guest. Unlike so many bad games, it wasn’t made for cynical reasons. On the contrary: all indications are that Devine and Landeros made it for all the right reasons, driven by a real, earnest passion to do something important, something groundbreaking. If the results largely serve today as an illustration of why static video clips strung together, whether they were created in a 3D modeler or filmed in front of live actors, are an unstable foundation on which to build a compelling game, the fact remains that we need examples of what doesn’t work as well as what does. And if the results look appallingly amateurish today on strictly aesthetic terms, they shouldn’t obscure the importance of The 7th Guest in the history of gaming. As gaming historians Magnus Anderson and Rebecca Levene put it, “The 7th Guest wasn’t anywhere near the league of professional film-making, but it moved games into the same sphere — a non-gamer could look at The 7th Guest and understand it, even if they were barely impressed.”

A year before Myst took the Wintel world by storm, The 7th Guest drove the first substantial wave of CD-ROM uptake, doing more than any other single product to turn 1993 into the long-awaited Year of CD-ROM. It’s been claimed that sales of CD-ROM drives jumped by 300 percent within weeks of its release. Indeed, The 7th Guest and CD-ROM in general became virtually synonymous for a time in the minds of consumers. And the game drove sales of SVGA cards to an equal degree; The 7th Guest was in fact the very first prominent game to demand more than everyday VGA graphics. Likewise, it undoubtedly prompted many a soul to take the plunge on a whole new 80486- or Pentium-based wundercomputer. And it also prompted the sale of countless CD-quality 16-bit sound cards. Thanks to The 7th Guest‘s immense success, game designers after 1993 had a far broader technological canvas on which to paint than they had before that year. And some of the things they painted there were beautiful and rich and immersive in all the ways that The 7th Guest tried to be, but couldn’t quite manage. While I heartily and unapologetically hate it as a game, I do love the new worlds of possibility it opened.

(Sources: the books La Saga des Jeux Vidéo by Daniel Ichbiah, Grand Thieves and Tomb Raiders: How British Video Games Conquered the World by Magnus Anderson and Rebecca Levene, and The 7th Guest: The Official Strategy Guide by Rusel DeMaria; Computer Gaming World of December 1990, May 1991, November 1992, October 1994, November 1994, June 1995, November 1998, December 1999, and July 2004; Electronic Entertainment of June 1994 and August 1995; Game Players PC Entertainment Vol. 5 No. 5; InterActivity of February 1996; Retro Gamer 85, 108, 122, and 123; Video Games and Computer Entertainment of August 1993; Zero of May 1992; Run 1986 Special Issue; Compute!’s Gazette of April 1985 and September 1986; ZX Computing of April 1986; Home Computing Weekly of July 19 1983; Popular Computing Weekly of May 26 1983; Crash of January 1985; Computer Gamer of December 1985 and February 1986; Origin Systems’s internal newsletter Point of Origin dated January 17 1992. Online sources include Geoff Keighley’s lengthy history of Trilobyte for GameSpot, John-Gabriel Adkins’s “Two Histories of Myst,” and “Jeremiah Nunan – An Irish Success Story” at the Jacksonville Review.

The 25th anniversary edition of The 7th Guest is available for purchase at GOG.com, as is the sequel The 11th Hour.)

 


Lemmings 2: The Tribes

When the lads at DMA Design started making the original Lemmings, they envisioned that it would allow you to bestow about twenty different “skills” upon your charges. But as they continued working on the game, they threw more and more of the skills out, both to make the programming task simpler and to make the final product more playable. They finally ended up with just eight skills, the perfect number to neatly line up as buttons along the bottom of the screen. In the process of this ruthless culling, Lemmings became a classic study in doing more with less in game design: those eight skills, combined in all sorts of unexpected ways, were enough to take the player through 120 ever-more-challenging levels in the first Lemmings, then 100 more in the admittedly less satisfying pseudo-sequel/expansion pack Oh No! More Lemmings.

Yet when the time came to make the first full-fledged sequel, DMA resurrected some of their discarded skills. And then they added many, many more of them: Lemmings 2: The Tribes wound up with no fewer than 52 skills in all. For this reason not least, it’s often given short shrift by critics, who compare its baggy maximalism unfavorably with the first game’s elegant minimalism. To my mind, though, Lemmings 2 is almost a Platonic ideal of a sequel, building upon the genius of the original game in a way that’s truly challenging and gratifying to veterans. Granted, it isn’t the place you should start; by all means, begin with the classic original. When you’ve made it through those 120 levels, however, you’ll find 120 more here that are just as perplexing, frustrating, and delightful — and with even more variety to boot, courtesy of all those new skills.



The DMA Design that made Lemmings 2 was a changed entity in some ways. The company had grown in the wake of the first game’s enormous worldwide success, such that they had been forced to move out of their cozy digs above a baby store in the modest downtown of Dundee, Scotland, and into a more anonymous office in a business park on the outskirts of town. The core group that had created the first Lemmings — designer, programmer, and DMA founder David Jones; artists and level designers Mike Dailly and Gary Timmons; programmer and level designer Russell Kay — all remained on the job, but they were now joined by an additional troupe of talented newcomers.

Lemmings 2 also reflects changing times inside the games industry in ways that go beyond the size of its development team. Instead of 120 unrelated levels, there’s now a modicum of story holding things together. A lengthy introductory movie — which, in another telling sign of the times, fills more disk space than the game itself and required almost as many people to make — tells how the lemmings were separated into twelve tribes, all isolated from one another, at some point in the distant past. Now, the island (continent?) on which they live is facing an encroaching Darkness which will end all life there. Your task is to reunite the tribes, by guiding each of them through ten levels to reach the center of the island. Once all of the tribes have gathered there, they can reassemble a magical talisman, of which each tribe conveniently has one piece, and use it to summon a flying ark that will whisk them all to safety.

It’s not exactly an air-tight plot, but no matter; you’ll forget about it anyway as soon as the actual game begins. What’s really important are the other advantages of having twelve discrete progressions of ten levels instead of a single linear progression of 120. You can, you see, jump around among all these tribes at will. As David Jones said at the time of the game’s release, “We want to get away from ‘you complete a level or you don’t.'” When you get frustrated banging your head against a single stubborn level — and, this being a Lemmings game, you will get frustrated — you can just go work on another one for a while.

Rather than relying largely on the same set of graphics over the course of its levels, as the original does, each tribe in Lemmings 2 has its own audiovisual theme: there are beach-bum lemmings, Medieval lemmings, spooky lemmings, circus lemmings, alpine lemmings, astronaut lemmings, etc. In a tribute to the place where the game was born, there are even Scottish Highland lemmings (although Dundee is actually found in the less culturally distinctive — or culturally clichéd — Lowlands). And there’s even a “classic” tribe that reuses the original graphics; pulling it up feels a bit like coming home from an around-the-world tour.


Teaching Old Lemmings New Tricks

In this Beach level, a lemming uses the “kayak” skill to cross a body of water.

In this Medieval level, one lemming has become an “attractor”: a minstrel who entrances all the lemmings around him with his music, keeping them from marching onward. Meanwhile one of his colleagues is blazing a trail in front for the rest to eventually follow.

In this Shadow level, the lemming in front has become a “Fencer.” This allows him to dig out a path in front of himself at a slight upward angle. (Most of the skills in the game that at first seem bewilderingly esoteric actually do have fairly simple effects.)

In this Circus level, one lemming has become a “rock climber”: a sort of super-powered version of an ordinary climber, who can climb even a canted wall like this one.

In this Polar level, a lemming has become a “roper,” making a handy tightrope up and over the tree blocking the path.

In this Space level, we’ve made a “SuperLem” who flies in the direction of the mouse cursor.


Other pieces of plumbing help to make Lemmings 2 feel like a real, holistic game rather than a mere series of puzzles. The first game, as you may recall, gives you an arbitrary number of lemmings which begin each level and an arbitrary subset of them which must survive it; this latter number thus marks the difference between success and failure. In the sequel, though, each tribe starts its first level with 60 lemmings, who are carried over through all of the levels that follow. Any lemmings lost on one level, in other words, don’t come back in the succeeding ones. It’s possible to limp to the final finish line with just one solitary survivor remaining — and, indeed, you quite probably will do exactly this with a few of the tribes the first time through. But it’s also possible to finish all but a few of the levels without killing any lemmings at all. At the end of each level and then again at the end of each tribe’s collection of levels, you’re awarded a bronze, silver, or gold star based on your performance. To wind up with gold at the end, you usually need to have kept every single one of the little fellows alive through all ten levels. There’s a certain thematic advantage in this: people often note how the hyper-cute original Lemmings is really one of the most violent videogames ever, requiring you to kill thousands and thousands of the cuties over its course. This objection no longer applies to Lemmings 2. But more importantly, it sets up an obsessive-compulsive-perfectionist loop. First you’ll just want to get through the levels — but then all those bronze and silver performances lurking in your past will start to grate, and pretty soon you’ll be trying to figure out how to do each level just that little bit more efficiently. The ultimate Lemmings 2 achievement, needless to say, is to collect gold stars across the board.

This tiered approach to success and failure might be seen as evidence of a kinder design sensibility, but in most other respects just the opposite is true; Lemmings 2 has the definite feel of a game for the hardcore. The first Lemmings does a remarkably good job of teaching you how to play it interactively over the course of its first twenty levels or so, introducing you one by one to each of its skills along with its potential uses and limitations. There’s nothing remotely comparable in Lemmings 2; it just throws you in at the deep end. While there is a gradual progression in difficulty within each tribe’s levels, the game as a whole is a lumpier affair, especially in the beginning. Each level gives you access to between one and eight of the 52 available skills, whilst evincing no interest whatsoever in showing you how to use any of them. There is some degree of thematic grouping when it comes to the skills: the Highland lemmings like to toss cabers; the beach lemmings are fond of swimming, kayaking, and surfing; the alpine lemmings often need to ski or skate. Nevertheless, the sheer number of new skills you’re expected to learn on the fly is intimidating even for a veteran of the first game. The closest Lemmings 2 comes to its predecessor’s training levels are a few free-form sandbox environments where you can choose your own palette of skills and have at it. But even here, your education can be a challenging one, coming down as it still does to trial and error.

Your first hours with the game can be particularly intimidating; as soon as you’ve learned how one group of skills works well enough to finish one level, you’re confronted with a whole new palette of them on the next level. Even I, a huge fan of the first game, bounced off the second one quite a few times before I buckled down, started figuring out the skills, and, some time thereafter, started having fun.

Luckily, once you have put in the time to learn how the skills work, Lemmings 2 becomes very fun indeed — every bit as rewarding as the first game, possibly even more so. Certainly its level design is every bit as good — better in fact, relying more on logic and less on dodgy edge cases in the game engine than do the infamously difficult final levels of the first Lemmings. Even the spiky difficulty curve isn’t all bad; it can be oddly soothing to start on a new tribe’s relatively straightforward early levels after being taxed to the utmost on another tribe’s last level. If the first Lemmings is mountain climbing as people imagine it to be — a single relentless, ever-steeper ascent to a dizzying peak — the second Lemmings has more in common with the reality of the sport: a set of more or less difficult stages separated by more or less comfortable base camps. While it’s at least as daunting in the end, it does offer more ebbs and flows along the way.

One might say, then, that Lemmings 2 is designed around a rather literal interpretation of the concept of a sequel. That is to say, it assumes that you’ve played its predecessor before you get to it, and are now ready for its added complexity. That’s bracing for anyone who fulfills that criterion. But in 1993, the year of Lemmings 2‘s release, its design philosophy had more negative than positive consequences for its own commercial arc and for that of the franchise to which it belonged.

The fact is that Lemmings 2‘s attitude toward its sequel status was out of joint with the way sequels had generally come to function by 1993. In a fast-changing industry that was quickly attracting new players, the ideal sequel, at least in the eyes of most industry executives, was a game equally welcoming to both neophytes and veterans. Audiovisual standards were changing so rapidly that a game that was just a couple of years old could already look painfully dated. What new player with a shiny new computer wanted to play some ugly old thing just to earn a right to play the latest and greatest?

That said, Lemmings 2 actually didn’t look all that much better than its predecessor either, flashy opening movie aside. Part of this was down to DMA Design still using the 1985-vintage Commodore Amiga, which was still very popular as a gaming computer in Britain and other European countries, as their primary development platform, then porting the game to MS-DOS and various other more modern platforms. Staying loyal to the Amiga meant working within some fairly harsh restrictions, such as that of having no more than 32 colors on the screen at once, not to mention making the whole game compact enough to run entirely off floppy disk; hard drives, much less CD-ROM drives, were still not common among European Amiga owners. Shortly before the release of Lemmings 2, David Jones confessed to being “a little worried” about whether people would be willing to look beyond the unimpressive graphics and appreciate the innovations of the game itself. As it happened, he was right to be worried.

Lemmings and Oh No! More Lemmings sold in the millions across a bewildering range of platforms, from modern mainstream computers like the Apple Macintosh and Wintel machines to antique 8-bit computers like the Commodore 64 and Sinclair Spectrum, from handheld systems like the Nintendo Game Boy and Atari Lynx to living-room game consoles like the Sega Master System and the Nintendo Entertainment System. Lemmings 2, being a much more complex game under the hood as well as on the surface, wasn’t quite so amenable to being ported to just about any gadget with a CPU, even as its more off-putting initial character and its lack of new audiovisual flash did it no favors either. It was still widely ported and still became a solid success by any reasonable standard, mind you, but likely sold in the hundreds of thousands rather than the millions. All indications are that the first game and its semi-expansion pack continued to sell more copies than the second even after the latter’s release.

In the aftermath of this muted reception, the bloom slowly fell off the Lemmings rose, not only for the general public but also for DMA Design themselves. The franchise’s true jump-the-shark moment ironically came as part of an attempt to re-jigger the creatures to become media superstars beyond the realm of games. The Children’s Television Workshop, the creator of Sesame Street among other properties, was interested in moving the franchise onto television screens. In the course of these negotiations, they asked DMA to give the lemmings more differentiated personalities in the next game, to turn them from anonymous marchers, each just a few pixels across, into something more akin to individualized cartoon characters. Soon the next game was being envisioned as the first of a linked series of no fewer than four of them, each one detailing the further adventures of three of the tribes after their escape from the island at the end of Lemmings 2, each one ripe for trans-media adaptation by the Children’s Television Workshop. But the first game of this new generation, called The Lemmings Chronicles, just didn’t work. The attempt to cartoonify the franchise was cloying and clumsy, and the gameplay fell to pieces; unlike Lemmings 2, Lemmings Chronicles eminently deserves its underwhelming critical reputation. DMA insiders like Mike Dailly have since admitted that it was developed more out of obligation than enthusiasm: “We were all ready to move on.” When it performed even worse than its predecessor, the Children’s Television Workshop dropped out; all of its compromises had been for nothing.

Released just a year after Lemmings 2, Lemmings Chronicles marked the last game in the six-game contract that DMA Design had signed with their publisher Psygnosis what seemed like an eternity ago — in late 1987 to be more specific, when David Jones had first come to Psygnosis with his rather generic outer-space shoot-em-up Menace, giving no sign that he was capable of something as ingenious as Lemmings. Now, having well and truly demonstrated their ingenuity, DMA had little interest in re-upping; they were even willing to leave behind all of their intellectual property, which the contract Jones had signed gave to Psygnosis in perpetuity. In fact, they were more than ready to leave behind the cute-and-cuddly cartoon aesthetic of Lemmings and return to more laddish forms of gaming. The eventual result of that desire would be a second, more long-lasting worldwide phenomenon, known as Grand Theft Auto.

Meanwhile Sony, who had acquired Psygnosis in 1993, continued off and on to test the waters with new iterations of the franchise, but all of those attempts evinced the same vague sense of ennui that had doomed Lemmings Chronicles; none became hits. The last Lemmings game that wasn’t a remake appeared in 2000.

It’s interesting to ask whether DMA Design and Psygnosis could have managed the franchise better, thereby turning it into a permanent rather than a momentary icon of gaming, perhaps even one on a par with the likes of Super Mario and Sonic the Hedgehog; they certainly had the sales to compete head-to-head with those other videogame icons for a few years there in the early 1990s. The obvious objection is that Mario and Sonic were individualized characters, while DMA’s lemmings were little more than a handful of tropes moving in literal lockstep. Still, more has been done with less in the annals of media history. If everyone had approached Lemmings Chronicles with more enthusiasm and a modicum more writing and branding talent, maybe the story would have turned out differently.

Many speculate today that the franchise must inevitably see another revival at some point, what with 21st-century pop culture’s tendency to mine not just the A-list properties of the past, but increasingly its B- and C-listers as well, in the name of one generation’s nostalgia and another’s insatiable appetite for kitsch. Something tells me as well that we haven’t seen the last of Lemmings, but, as of this writing anyway, the revival still hasn’t arrived.

As matters currently stand, then, the brief-lived but frenzied craze for Lemmings has gone down in history, alongside contemporaries like Tetris and The Incredible Machine, as one more precursor of the casual revolution in gaming that was still to come, with its very different demographics and aesthetics. But in addition to that, it gave us two games that are brilliant in their own right, that remain as vexing but oh-so-rewarding as they were in their heyday. Long may they march on.

One other surviving tribute to Dundee’s second most successful gaming franchise is this little monument at the entrance to the city’s Seabraes Park, erected by local artist Alyson Conway in 2013. Lemmings and Grand Theft Auto… not bad for a city of only 150,000 souls.

(Sources: the book Grand Thieves and Tomb Raiders by Magnus Anderson and Rebecca Levene; Compute! of January 1992; Amiga Format of May 1993 and the special 1992 annual; Retro Gamer 39; The One of November 1993; Computer Gaming World of July 1993.

Lemmings 2 has never gotten a digital re-release. I therefore make it available for download here, packaged to be as easy as possible to get running under DOSBox on your modern computer.)

 
 


The 68000 Wars, Part 6: The Unraveling

Commodore International’s roots are in manufacturing, not computing. They’re used to making and selling all kinds of things, from calculators to watches to office furniture. Computers just happen to be a profitable sideline they stumbled into. Commodore International isn’t really a computer company; they’re a company that happens to make computers. They have no grand vision of computing in the future; Commodore International merely wants to make products that sell. Right now, the products they’re set up to sell happen to be computers and video games. Next year, they might be bicycles or fax machines.

The top execs at Commodore International don’t really understand why so many people love the Amiga. You see, to them, it’s as if customers were falling in love with a dishwasher or a brand of paper towels. Why would anybody love a computer? It’s just a chunk of hardware that we sell, fer cryin’ out loud.

— Amiga columnist “The Bandito,” Amazing Computing, January 1994

Commodore has never been known for their ability to arouse public support. They have never been noted for their ability to turn lemons into lemonade. But, they have been known to take a bad situation and create a disaster. At least there is some satisfaction in noting their consistency.

— Don Hicks, managing editor, Amazing Computing, July 1994

In the summer of 1992, loyal users of the Commodore Amiga finally heard a piece of news they had been anxiously awaiting for half a decade. At long last, Commodore was about to release new Amiga models sporting a whole new set of custom chips. The new models would, in other words, finally do more than tinker at the edges of the aging technology which had been created by the employees of little Amiga, Incorporated between 1982 and 1985. It was all way overdue, but, as they say, better late than never.

The story of just how this new chipset managed to arrive so astonishingly late is a classic Commodore tale of managerial incompetence, neglect, and greed, against which the company’s overtaxed, understaffed engineering teams could only hope to fight a rearguard action.



Commodore’s management had not woken up to the need to significantly improve the Amiga until the summer of 1988, after much of its technological lead over its contemporaries running MS-DOS and MacOS had already been squandered. Nevertheless, the engineers began with high hopes for what they called the “Advanced Amiga Architecture,” or the “AAA” chipset. It was to push the machine’s maximum screen resolution from 640 X 400 to 1024 X 768, while expanding its palette of available colors from 4096 to 16.8 million. The blitter and other pieces of custom hardware which made the machine so adept at 2D animation would be retained and vastly improved, even as the current implementation of planar graphics would be joined by “chunky-graphics” modes which were more suitable for 3D animation. Further, the new chips would, at the user’s discretion, do away with the flicker-prone interlaced video signal that the current Amiga used for vertical resolutions above 200 pixels, which made the machine ideal for desktop-video applications but annoyed the heck out of anyone trying to do anything else with it. And it would now boast eight instead of four separate sound channels, each of which would now offer 16-bit resolution — i.e., the same quality as an audio CD.

All of this was to be ready to go in a new Amiga model by the end of 1990. Had Commodore been able to meet that timetable and release said model at a reasonable price point, it would have marked almost as dramatic an advance over the current state of the art in multimedia personal computing as had the original Amiga 1000 back in 1985.

Sadly, though, no plan at Commodore could long survive contact with management’s fundamental cluelessness. By this point, the research-and-development budget had been slashed to barely half what it had been when the company’s only products were simple 8-bit computers like the Commodore 64. Often only one engineer at a time was assigned to each of the three core AAA chips, and said engineer was more often than not young and inexperienced, because who else would work 80-hour weeks at the salaries Commodore paid? Throw in a complete lack of day-to-day oversight or management coordination, and you had a recipe for endless wheel-spinning. AAA fell behind schedule, then fell further behind, then fell behind some more.

Some fifteen months after the AAA project had begun, Commodore started a second chip-set project, which they initially called “AA.” The designation was a baseball metaphor rather than an acronym; the “AAA” league in American baseball is the top division short of the major leagues, while the “AA” league is one rung further down. As the name would indicate, then, the AA chipset was envisioned as a more modest evolution of the Amiga’s architecture, an intermediate step between the original chipset and AAA. Like the latter, AA would offer 16.8 million colors — of which 256 could be onscreen at once without restrictions, more with some trade-offs —  but only at a maximum non-interlaced resolution of 640 X 480, or 800 X 600 in interlace mode. Meanwhile the current sound system would be left entirely alone. On paper, even these improvements moved the Amiga some distance beyond the existing Wintel VGA standard — but then again, that world of technology wasn’t standing still either. Much depended on getting the AA chips out quickly.

But “quick” was an adjective which seldom applied to Commodore. First planned for release on roughly the same time scale that had once been envisioned for the AAA chipset, AA too fell badly behind schedule, not least because the tiny engineering team was now forced to split their energies between the two projects. It wasn’t until the fall of 1992 that AA, now renamed to the “Advanced Graphics Architecture,” or “AGA,” made its belated appearance. That is to say, the stopgap solution to the Amiga’s encroaching obsolescence arrived fully two years after the comprehensive solution to the problem ought to have shipped. Such was life at Commodore.

Rather than putting the Amiga out in front of the competition, AGA at this late date could only move it into a position of rough parity with the majority of the so-called “Super VGA” graphics cards which had become fairly commonplace in the Wintel world over the preceding year or so. And with graphics technology evolving quickly in the consumer space to meet the demands of CD-ROM and full-motion video, even the Amiga’s parity wouldn’t last for long. The Amiga, the erstwhile pioneer of multimedia computing, was now undeniably playing catch-up against the rest of the industry.

The Amiga 4000

AGA arrived inside two new models which evoked immediate memories of the Amiga 2000 and 500 from 1987, the most successful products in the platform’s history. Like the old Amiga 2000, the new Amiga 4000 was the “professional” machine, shipping in a big case full of expansion slots, with 4 MB of standard memory, a large hard drive, and a 68040 processor running at 25 MHz, all for a street price of around $2600. Like the old Amiga 500, the Amiga 1200 was the “home” model, shipping in an all-in-one-case form factor without room for internal expansion, with 2 MB of standard memory, a floppy drive only, and a 68020 processor running at 14 MHz, all for a price of about $600.

The two models were concrete manifestations of what a geographically bifurcated computing platform the Amiga had become by the early 1990s. In effect, the Amiga 4000 was to be the new face of Amiga computing in North America; ditto the Amiga 1200 in Europe. Commodore would make only scattered, desultory attempts to sell each model outside of its natural market.



Although the Amiga 500 had once enjoyed some measure of success in the United States as a games machine and general-purpose home computer, those days were long gone by 1992. That year, MS-DOS and Windows accounted for 87 percent of all American computer-game sales and the Macintosh for 9 percent, while the Amiga was lumped rudely into the 4 percent labeled simply “other.” Small wonder that very few American games publishers still gave any consideration to the platform at all; such games as were still available for the Amiga in North America usually had to be acquired by mail order, often as pricey imports from foreign climes. Then, too, most of the other areas where the Amiga had once been a player, and as often as not a pioneer — computer-based art and animation, 3D modeling, music production, etc. — had also fallen by the wayside, with most of the slack for such artsy endeavors being picked up by the Macintosh.

The story of Eric Graham was depressingly typical of the trend. Back in 1986, Graham had created a stunning ray-traced 3D animation called The Juggler on the Amiga; it became a staple of shop windows, filling at least some of the gap left by Commodore’s inept marketing. User demand had then led him to create Sculpt 3D, one of the first two practical 3D modeling applications for a consumer-class personal computer, and release it through the publisher Byte by Byte in mid-1987. (The other claimant to the status of absolute first of the breed ran on the Amiga as well; it was called Videoscape 3D, and was released virtually simultaneously with Sculpt 3D.) But by 1989 the latest Macintosh models had also become powerful enough to support Graham’s software. Therefore Byte by Byte and Graham decided to jump to that platform, which already boasted a much larger user base who tended to be willing to pay higher prices for their software. Orphaned on the Amiga, the Sculpt 3D line continued on the Mac until 1996. Thanks to it and many other products, the Mac took over the lead in the burgeoning field of 3D modeling. And as went 3D modeling, so went a dozen other arenas of digital creativity.

The one place where the Amiga’s toehold did prove unshakeable was desktop video, where its otherwise loathed interlaced graphics modes were loved for the way they let the machine sync up with the analog video-production equipment typical of the time: televisions, VCRs, camcorders, etc. From the very beginning, both professionals and a fair number of dedicated amateurs used Amigas for titling, special effects, color correction, fades and wipes, and other forms of post-production work. Amigas were used by countless television stations to display programming information and do titling overlays, and found their way onto the sets of such television and film productions as Amazing Stories, Max Headroom, and Robocop 2. Even as the Amiga was fading in many other areas, video production on the platform got an enormous boost in December of 1990, when an innovative little Kansan company called NewTek released the Video Toaster, a combination of hardware and software which NewTek advertised, with less hyperbole than you might imagine, as an “all-in-one broadcast studio in a box” — just add one Amiga. Now the Amiga’s production credits got more impressive still: Babylon 5, seaQuest DSV, Quantum Leap, Jurassic Park, sports-arena Jumbotrons all over the country. Amiga models dating from the 1980s would remain fixtures in countless local television stations until well after the millennium, when the transition from analog to digital transmission finally forced their retirement.

Ironically, this whole usage scenario stemmed from what was essentially an accidental artifact of the Amiga’s design; Jay Miner, the original machine’s lead hardware designer, had envisioned its ability to mix and match with other video sources not as a means of inexpensive video post-production but rather as a way of overlaying interactive game graphics onto the output from an attached laser-disc accessory, a technique that was briefly in vogue in videogame arcades. Nonetheless, the capability was truly a godsend, the only thing keeping the platform alive at all in North America.

On the other hand, though, it was hard not to lament a straitening of the platform’s old spirit of expansive, experimental creativity across many fields. As far as the market was concerned, the Amiga was steadily morphing from a general-purpose computer into a piece of niche technology for a vertical market. By the early 1990s, most of the remaining North American Amiga magazines had become all but indistinguishable from any other dryly technical trade journal serving a rigidly specialized readership. In a telling sign of the times, it was almost universally agreed that early sales of the Amiga 4000, which were rather disappointing even by Commodore’s recent standards, were hampered by the fact that it initially didn’t work with the Video Toaster. (An updated “Video Toaster 4000” wouldn’t finally arrive until a year after the Amiga 4000 itself.) Many users now considered the Amiga little more than a necessary piece of plumbing for the Video Toaster. NewTek and others sold turnkey systems that barely even mentioned the name “Commodore Amiga.” Some store owners whispered that they could actually sell a lot more Video Toaster systems that way. After all, Commodore was known to most of their customers only as the company that had made those “toy computers” back in the 1980s; in the world of professional video and film production, that sort of name recognition was worth less than no name recognition at all.

In Europe, meanwhile, the nature of the baggage that came attached to the Commodore name perhaps wasn’t all that different in the broad strokes, but it did carry more positive overtones. In sheer number of units sold, the Amiga had always been vastly more successful in Europe than in North America, and that trend accelerated dramatically in the 1990s. Across the pond, it enjoyed all the mass-market acceptance it lacked in its home country. It was particularly dominant in Britain and West Germany, two of the three biggest economies in Europe. Here, the Amiga was nothing more nor less than a great games machine. The sturdy all-in-one-case design of the Amiga 500 and, now, the Amiga 1200 placed it on a continuum with the likes of the Sinclair Spectrum and the Commodore 64. There was a lot more money to be made selling computers to millions of eager European teenagers than to thousands of sober American professionals, whatever the wildly different price tags of the individual machines. Europe was accounting for as much as 88 percent of Commodore’s annual revenues by the early 1990s.

And yet here as well the picture was less rosy than it might first appear. While Commodore sold almost 2 million Amigas in Europe during 1991 and 1992, sales were trending in the wrong direction by the end of that period. Nintendo and Sega were now moving into Europe with their newest console systems, and Microsoft Windows as well was fast gaining traction. Thus it comes as no surprise that the Amiga 1200, which first shipped in December of 1992, a few months after the Amiga 4000, was greeted with sighs of relief by Commodore’s European subsidiaries, followed by much nervous trepidation. Was the new model enough of an improvement to reverse the trend of declining sales and steal a march on the competition once again? Sega especially had now become a major player in Europe, selling videogame consoles which were both cheaper and easier to operate than an Amiga 1200, lacking as they did any pretense of being full-fledged computers.

If Commodore was facing a murky future on two continents, they could take some consolation in the fact that their old arch-rival Atari was, as usual, even worse off. The Atari Lynx, the handheld game console which Jack Tramiel had bilked Epyx out of for a song, was now the company’s one reasonably reliable source of revenue; it would sell almost 3 million units between 1989 and 1994. The Atari ST, the computer line which had caused such a stir when it had beat the original Amiga into stores back in 1985 but had been playing second fiddle ever since, offered up its swansong in 1992 in the form of the Falcon, a would-be powerhouse which, as always, lagged just a little bit behind the latest Amiga models’ capabilities. Even in Europe, where Atari, like Commodore, had a much stronger brand than in North America, the Falcon sold hardly at all. The whole ST line would soon be discontinued, leaving it unclear how Atari intended to survive after the Lynx faded into history. Already in 1992, their annual sales fell to $127 million — barely a seventh of Commodore’s.

Still, everything was in flux, and it was an open question whether Commodore could continue to sell Amigas in Europe at the pace to which they had become accustomed. One persistent question dogging the Amiga 1200 was that of compatibility. Although the new chipset was designed to be as compatible as possible with the old, in practice many of the most popular old Amiga games didn’t work right on the new machine. This reality could give pause to any potential upgrader with a substantial library of existing games and no desire to keep two Amigas around the house. If your favorite old games weren’t going to work on the new machine anyway, why not try something completely different, like those much more robust and professional-looking Windows computers the local stores had just started selling, or one of those new-fangled videogame consoles?



The compatibility problems were emblematic of the way that the Amiga, while certainly not an antique like the 8-bit generation of computers, wasn’t entirely modern either. MacOS and Windows isolated their software environment from changing hardware by not allowing applications to have direct access to the bare metal of the computer; everything had to be passed through the operating system, which in turn relied on the drivers provided by hardware manufacturers to ensure that the same program worked the same way on a multitude of configurations. AmigaOS provided the same services, and its technical manuals as well asked applications to function this way — but, crucially, it didn’t require that they do so. European game programmers in particular had a habit of using AmigaOS only as a bootstrap. Doing so was more efficient than doing things the “correct” way; most of the audiovisually striking games which made the Amiga’s reputation would have been simply impossible using “proper” programming techniques. Yet it created a brittle software ecosystem which was ill-suited for the long term. Already before 1992 Amiga gamers had had to contend with software that didn’t work on machines with more than 512 K of memory, or with a hard drive attached, or with or without a certain minor revision of the custom chips, or any number of other vagaries of configuration. With the advent of the AGA chipset, such problems really came home to roost.
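To make the failure mode concrete, here is a deliberately simplified sketch in Python. It is not real AmigaOS code: the chipset model, the register addresses, and both “games” are invented for illustration. The point is only that a program which requests services through an OS-supplied driver survives a hardware revision that moves a register, while a program which hard-codes the old address silently breaks:

```python
# A toy model, NOT real AmigaOS code: the register layout and both "games"
# are invented to illustrate why driver-mediated I/O survives a hardware
# revision while bare-metal pokes do not.

class Chipset:
    """Simulated video hardware with one register that sets the border color."""
    def __init__(self, color_reg_addr):
        self.color_reg_addr = color_reg_addr  # where this revision put the register
        self.registers = {}                   # address -> value

    def poke(self, addr, value):
        """Raw hardware access: writes wherever it's told, no questions asked."""
        self.registers[addr] = value

    def border_color(self):
        return self.registers.get(self.color_reg_addr, 0)

class Driver:
    """The OS-supplied abstraction: it alone knows the register's real address."""
    def __init__(self, chipset):
        self.chipset = chipset

    def set_border_color(self, value):
        self.chipset.poke(self.chipset.color_reg_addr, value)

def polite_game(driver):
    """Goes through the OS, the 'proper' way the manuals asked for."""
    driver.set_border_color(0x0F)

def bare_metal_game(chipset):
    """Hard-codes the register address that the original chipset used."""
    chipset.poke(0x180, 0x0F)

# On the original chipset, both approaches happen to work...
old = Chipset(color_reg_addr=0x180)
bare_metal_game(old)
assert old.border_color() == 0x0F

# ...but on a revised chipset that moved the register, only the driver path does.
new = Chipset(color_reg_addr=0x1A0)
polite_game(Driver(new))
assert new.border_color() == 0x0F   # the updated driver still does the right thing

broken = Chipset(color_reg_addr=0x1A0)
bare_metal_game(broken)
assert broken.border_color() == 0   # the poke landed on a now-meaningless address
```

In this toy model, updating the driver for the new chipset transparently fixes every well-behaved program at once; nothing short of patching each individual game can fix the bare-metal ones. That, in miniature, was the AGA compatibility problem.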

An American Amiga 1200. In the context of American computing circa 1992, it looked like a bizarrely antediluvian gadget. “Real” computers just weren’t sold in this style of all-in-one case anymore, with peripherals dangling messily off the side. It looked like a chintzy toy to Americans, whatever the capabilities hidden inside. A telling detail: notice the two blank keys on the keyboard, which received printed legends only in those continental European markets whose languages required the extra characters. Rather than tool up to produce more than one physical keyboard layout, Commodore just let them sit there in all their pointlessness on the American machines. Can you imagine Apple or IBM, or any other reputable computer maker, doing this? To Americans, and to an increasing number of Europeans as well, the Amiga 1200 just seemed… cheap.

Back in 1985, AmigaOS had been the very first consumer-oriented operating system to boast full-fledged preemptive multitasking, something that neither MacOS nor Microsoft Windows could yet lay claim to even in 1992; they were still forced to rely on cooperative multitasking, which placed them at the mercy of individual applications’ willingness to voluntarily cede time to others. Yet the usefulness of AmigaOS’s multitasking was limited by its lack of memory protection. Thanks to this lack, any individual program on the system could write, intentionally or unintentionally, all over the memory allocated to another; system crashes were a sad fact of life for the Amiga power user. AmigaOS also lacked a virtual-memory system that would have allowed more applications to run than the physical memory could support. In these respects and others — most notably its graphical interface, which still evinced nothing like the usability of Windows, much less the smooth elegance of the Macintosh desktop — AmigaOS lagged behind its rivals.

It is true that MacOS, dating as it did from roughly the same period in the evolution of the personal computer, was struggling with similar issues: trying to implement some form of multitasking where none at all had existed originally, kludging in support for virtual memory and some form of rudimentary memory protection. The difference was that MacOS was evolving, however imperfectly. While AmigaOS 3.0, which debuted with the AGA machines, did offer some welcome improvements in terms of cosmetics, it did nothing to address the operating system’s core failings. It’s doubtful whether anyone in Commodore’s upper management even knew enough about computers to realize they existed.

This quality of having one foot in each of two different computing eras dogged the platform in yet more ways. The very architectural approach of the Amiga — that of an ultra-efficient machine built around a set of tightly coupled custom chips — had become passé as Wintel and to a large extent even the Macintosh had embraced modular architectures where almost everything could live on swappable cards, letting users mix and match capabilities and upgrade their machines piecemeal rather than all at once. One might even say that it was down almost as much to the Amiga’s architectural philosophy as it was to Commodore’s incompetence that the machine had had such a devil of a time getting itself upgraded.

And yet the problems involved in upgrading the custom chips were as nothing compared to the gravest of all the existential threats facing the Amiga. It was common knowledge in the industry by 1992 that Motorola was winding down further development of the 68000 line, the CPUs at the heart of all Amigas. Indeed, Apple, whose Macintosh used the same CPUs, had seen the writing on the wall as early as the beginning of 1990, and had started projects to port MacOS to other architectures, using software emulation as a way to retain compatibility with legacy applications. In 1991, they settled on the PowerPC, a CPU developed by an unlikely consortium of Apple, IBM, and Motorola, as the future of the Macintosh. The first of the so-called “Power Macs” would debut in March of 1994. The whole transition would come to constitute one of the more remarkable sleights of hand in all computing history; the emulation layer combined with the ported version of MacOS would work so seamlessly that many users would never fully grasp what was really happening at all.

But Commodore, alas, was in no position to follow suit, even had they had the foresight to realize what a ticking time bomb the end of the 68000 line truly was. AmigaOS’s more easygoing attitude toward the software it enabled meant that any transition must be fraught with far more pain for the user, and Commodore had nothing like the resources Apple had to throw at the problem in any case. But of course, the thoroughgoing, eternal incompetence of Commodore’s management prevented them from even seeing the problem, much less doing anything about it. While everyone was obliviously rearranging the deck chairs on the S.S. Amiga, it was barreling down on an iceberg as wide as the horizon. The reality was that the Amiga as a computing platform now had a built-in expiration date. After the 68060, the planned swansong of the 68000 family, was released by Motorola in 1994, the Amiga would literally have nowhere left to go.

As it was, though, Commodore’s financial collapse became the more immediate cause that brought about the end of the Amiga as a vital computing platform. So, we should have a look at what happened to drive Commodore from a profitable enterprise, still flirting with annual revenues of $1 billion, to bankruptcy and dissolution in the span of less than two years.



During the early months of 1993, an initial batch of encouraging reports, stating that the Amiga 1200 had been well-received in Europe, was overshadowed by an indelibly Commodorian tale of making lemons out of lemonade. It turned out that they had announced the Amiga 1200 too early, then shipped it too late and in too small quantities the previous Christmas. Consumers had chosen to forgo the Amiga 500 and 600 — the latter being a recently introduced ultra-low-end model — out of the not-unreasonable belief that the generation of Amiga technology these models represented would soon be hopelessly obsolete. Finding the new model unavailable, they’d bought nothing at all — or, more likely, bought something from a competitor like Sega instead. The result was a disastrous Christmas season: Commodore didn’t yet have the new computers everybody wanted, and couldn’t sell the old computers they did have, which they’d inexplicably manufactured and stockpiled as if they’d had no inkling that the Amiga 1200 was coming. They lost $21.9 million in the quarter that was traditionally their strongest by far.

Scant supplies of the Amiga 1200 continued to devastate overall Amiga sales well after Christmas, leaving one to wonder why on earth Commodore hadn’t tooled up to manufacture sufficient quantities of a machine they had hoped and believed would be a big hit, for once with some justification. In the first quarter of 1993, Commodore lost a whopping $177.6 million on sales of just $120.9 million, thanks to a massive write-down of their inventory of older Amiga models piling up in warehouses. Unit sales of Amigas dropped by 25 percent from the same quarter of the previous year; Amiga revenues dropped by 45 percent, thanks to deep price cuts instituted to try to move all those moribund 500s and 600s. Commodore’s share price plunged to around $2.75, down from $11 a year before, $20 the year before that. Wall Street estimated that the whole company was now worth only $30 million after its liabilities had been subtracted — a drop of one full order of magnitude in the span of a year.

If anything, Wall Street’s valuation was generous. Commodore was now dragging behind them $115.3 million in debt. In light of this, combined with their longstanding reputation for being ethically challenged in all sorts of ways, the credit agencies considered them to be too poor a risk for any new loans. Already they were desperately pursuing “debt restructuring” with their major lenders, promising them, with little to back it up, that the next Christmas season would make everything right as rain again. Such hopes looked even more unfounded in light of the fact that Commodore was now making deep cuts to their engineering and marketing staffs — i.e., the only people who might be able to get them out of this mess. Certainly the AAA chipset, still officially an ongoing project, looked farther away than ever now, two and a half years after it was supposed to have hit the streets.

The Amiga CD32

It was for these reasons that the announcement of the Amiga CD32, the last major product introduction in Commodore’s long and checkered history, came as such a surprise to everyone in mid-1993. CD32 was perhaps the first comprehensively, unadulteratedly smart product Commodore had released since the Amiga 2000 and 500 back in 1987. It was as clever a leveraging of their dwindling assets as anyone could ask for. Rather than another computer, it was a games console, built around a double-speed CD-ROM drive. Commodore had tried something similar before, with the CDTV unit of two and a half years earlier, only to watch it go down in flames. For once, though, they had learned from their mistakes.

CDTV had been a member of an amorphously defined class of CD-based multimedia appliances for the living room — see also the Philips CD-I, the 3DO console, and the Tandy VIS — which had all been or would soon become dismal failures. Upon seeing such gadgets demonstrated, consumers, unimpressed by ponderous encyclopedias that were harder to use and less complete than the ones already on their bookshelves, trip-planning software that was less intuitive than the atlases already in their glove boxes, and grainy film clips that made their VCRs look high-fidelity, all gave vent to the same plaintive question: “But what good is it really?” The only CD-based console which was doing well was, not coincidentally, the only one which could give a clear, unequivocal answer to this question. The Sega Genesis with CD add-on was good for playing videogames, full stop.

Commodore followed Sega’s lead with the CD32. It too looked, acted, and played like a videogame console — the most impressive one on the market, with specifications far outshining the Sega CD. Whereas Commodore had deliberately obscured the CDTV’s technological connection to the Amiga, they trumpeted it with the CD32, capitalizing on the name’s association with superb games. At heart, the CD32 was just a repackaged Amiga 1200, in the same way that the CDTV had been a repackaged Amiga 500. Yet this wasn’t a problem at all. For all that it was a little underwhelming in the world of computers, the AGA chipset was audiovisually superior to anything the console world could offer up, while the 32-bit 68020 that served as the CD32’s brain gave it much more raw horsepower. Meanwhile the fact that at heart it was just an Amiga in a new form factor gave it a huge leg up with publishers and developers; almost any given Amiga game could be ported to the CD32 in a week or two. Throw in a price tag of less than $400 (about $200 less than the going price of a “real” Amiga 1200, if you could find one), and, for the first time in years, Commodore had a thoroughly compelling new product, with a measure of natural appeal to people who weren’t already members of the Amiga cult. Thanks to the walled-garden model of software distribution that was the norm in the console world, Commodore stood to make money not only on every CD32 sold but also from a licensing fee of $3 on every individual game sold for the console. If the CD32 really took off, it could turn into one heck of a cash cow. If only Commodore could have released it six months earlier, or have managed to remain financially solvent for six months longer, it might even have saved them.

As it was, the CD32 made a noble last stand for a company that had long made ignobility its calling card. Released in September of 1993 in Europe, it generated some real excitement, thanks not least to a surprisingly large stable of launch titles, fruit of that ease of porting games from the “real” Amiga models. Commodore sold CD32s as fast as they could make them that Christmas — which was unfortunately nowhere near as fast as they might have liked, thanks to their current financial straits. Nevertheless, in those European stores where CD32s were on-hand to compete with the Sega CD, the former often outsold the latter by a margin of four to one. Over 50,000 CD32s were sold in December alone.

The Atari Jaguar. There was some mockery, perhaps justifiable, of its “Jetsons” design aesthetic.

Ironically, Atari’s last act took much the same form as Commodore’s. In November of 1993, following a horrific third quarter in which they had lost $17.6 million on sales of just $4.4 million, they released a game console of their own, called the Jaguar, in North America. In keeping with the tradition dating back to 1985, it was cheaper than Commodore’s take on the same concept — its street price was under $250 — but not quite as powerful, lacking a CD-ROM drive. Suffering from a poor selection of games, as well as reliability problems and outright hardware bugs, the Jaguar faced an uphill climb; Atari shipped fewer than 20,000 of them in 1993. Nevertheless, the Tramiel clan confidently predicted that they would sell 500,000 units in 1994, and at least some people bought into the hype, sending Atari’s stock soaring to almost $15 even as Commodore’s continued to plummet.

For the reality was, the rapid unraveling of all other facets of Commodore’s business had rendered the question of the CD32’s success moot. The remaining employees who worked at the sprawling campus in West Chester, Pennsylvania, purchased a decade before when the VIC-20 and 64 were flying off shelves and Jack “Business is War” Tramiel was stomping his rival home-computer makers into dust, felt like dwarfs wandering through the ancient ruins of giants. Once there had been more than 600 employees here; now there were about 50. There was 10,000 square feet of space per employee in a facility where it cost $8000 per day just to keep the lights on. You could wander for hours through the deserted warehouses, shuttered production lines, and empty research labs without seeing another living soul. Commodore was trying to lease some of it out for an attractive rent of $4 per square foot, but, as with most of their computers, nobody seemed all that interested. The executive staff, not wanting the stigma of having gone down with the ship on their resumes, were starting to jump for shore. Commodore’s chief financial officer threw up his hands and quit in the summer of 1993; the company’s president followed in the fall.

Apart from the CD32, for which they lacked the resources to manufacture enough units to meet demand, virtually none of the hardware piled up in Commodore’s European warehouses was selling at all anymore. In the third quarter of 1993, they lost $9.7 million, followed by $8 million in the fourth quarter, on sales of just $70 million. After a second disastrous Christmas in a row, it could only be a question of time.

In a way, it was the small things rather than the eye-popping financial figures which drove the point home. For example, the April 1994 edition of the New York World of Commodore show, for years already a shadow of its old vibrant self, was cancelled entirely due to lack of interest. And the Army and Air Force Exchange, which served as a storefront to American military personnel at bases all over the world, kicked Commodore off its list of suppliers because they weren’t paying their bills. It’s by a thousand little cuts like these, each representing another sales opportunity lost, that a consumer-electronics company dies. At the Winter Consumer Electronics Show in January of 1994, at which Commodore did manage a tepid presence, their own head of marketing told people straight out that the Amiga had no future as a general-purpose computer; Commodore’s only remaining prospects, he said, lay with the American vertical market of video production and the European mass market of videogame consoles. But they didn’t have the money to continue building the hardware these markets were demanding, and no bank was willing to lend them any.

The proverbial straw which broke the camel’s back was a dodgy third-party patent relating to a commonplace programming technique used to keep a mouse pointer separate from the rest of the screen. Commodore had failed to pay the patent fee for years, the patent holder eventually sued, and in April of 1994 the court levied an injunction preventing Commodore from doing any more business at all in the United States until they paid up. The sum in question was a relatively modest $2.5 million, but Commodore simply didn’t have the money to give.

On April 29, 1994, in a brief, matter-of-fact press release, Commodore announced that they were going out of business: “The company plans to transfer its assets to unidentified trustees for the benefit of its creditors. This is the initial phase of an orderly voluntary liquidation.” And just like that, a company which had dominated consumer computing in the United States and much of Europe for a good part of the previous decade and a half was no more. The business press and the American public showed barely a flicker of interest; most of them had assumed that Commodore was already long out of business. European gamers reacted with shock and panic — few had realized how bad things had gotten for Commodore — but there was nothing to be done.

Thus it was that Atari, despite being chronically ill for a much longer period of time, managed to outlive Commodore in the end. Still, this isn’t to say that their own situation at the time of Commodore’s collapse was a terribly good one. When the reality hit home that the Jaguar probably wasn’t going to be a sustainable gaming platform at all, much less sell 500,000 units in 1994 alone, their stock plunged back down to less than $1 per share. In the aftermath, Atari limped on as little more than a patent troll, surviving by extracting judgments from other videogame makers, most notably Sega, for infringing on dubious intellectual property dating back to the 1970s. This proved an ironically more profitable endeavor for them than actually selling computers or game consoles. On July 30, 1996, the Tramiels finally cashed out, agreeing to merge the remnants of their company with JT Storage, a maker of hard disks which saw some lingering value in the trademarks and the patents. It was a liquidation in all but name; only three Atari employees transitioned to the “merged” entity, which continued under the same old name of JT Storage.

And so disappeared the storied name of Atari and that of Tramiel simultaneously from the technology industry. Even as the trade magazines were publishing eulogies to the former, few were sorry to see the latter go, what with their long history of lawsuits, dirty dealing, and abundant bad faith. Jack Tramiel had purchased Atari in 1984 out of the belief that creating another phenomenon like the Commodore 64 — or for that matter the Atari VCS — would be easy. But the twelve years that followed were destined always to remain a footnote to his one extraordinary success, a cautionary tale about the dangers of conflating lucky timing and tactical opportunism with long-term strategic genius.

Even so, the fact does remain that the Commodore 64 brought affordable computing to millions of people all over the world. For that every one of those millions owes Jack Tramiel, who died in 2012, a certain debt of gratitude. Perhaps the kindest thing we can do for him is to end his eulogy there.



The story of the Amiga after the death of Commodore is long, confusing, and largely if not entirely dispiriting; for all these reasons, I’d rather not dwell on it at length here. Its most positive aspect is the surprisingly long commercial half-life the platform enjoyed in Europe, over the course of which game developers still found a receptive if slowly dwindling market ready to buy their wares. The last glossy newsstand magazine devoted to the Amiga, the British Amiga Active, didn’t publish its last issue until the rather astonishingly late date of November of 2001.

The Amiga technology itself first passed into the hands of a German PC maker known as Escom, who actually started manufacturing new Amiga 1200s for a time. In 1996, however, Escom themselves went bankrupt. The American PC maker Gateway 2000 became the last major company to bother with the aging technology when they bought it at the Escom bankruptcy auction. Afterward, though, they apparently had second thoughts; they did nothing whatsoever with it before selling it onward at a loss. From there, it passed into other, even less sure hands, selling always at a discount. There are still various projects bearing the Amiga name today, and I suspect they will continue until the generation who fell in love with the platform in its heyday have all expired. But these are little more than hobbyist endeavors, selling their products in minuscule numbers to customers motivated more by their nostalgic attachment to the Amiga name than by any practical need. It’s far from clear what the idea of an “Amiga computer” should even mean in 2020.

When the hardcore of the Amiga hardcore aren’t dreaming quixotically of the platform’s world-conquering return, they’re picking through the rubble of the past, trying to figure out where it all went wrong. Among a long roll call of petty incompetents in Commodore’s executive suites, two clear super-villains emerge: Irving Gould, Commodore’s chairman since the mid-1960s, and Mehdi Ali, his final hand-picked chief executive. Their mismanagement in the latter days of the company was so egregious that some have put it down to evil genius rather than idiocy. The typical hypothesis says that these two realized at some point in the very early 1990s that Commodore’s days were likely numbered, and that they could get more out of the company for themselves by running it into the ground than they could by trying to keep it alive. I usually have little time for such conspiracy theories; as far as I’m concerned, a good rule of thumb for life in general is never to attribute to evil intent what can just as easily be chalked up to good old human stupidity. In this case, though, there’s some circumstantial evidence lending at least a bit of weight to the theory.

The first and perhaps most telling piece of evidence is the two men’s ridiculously exorbitant salaries, even as their company was collapsing around them. In 1993, Mehdi Ali took home $2 million, making him the fourth highest-paid executive in the entire technology sector. Irving Gould earned $1.75 million that year — seventh on the list. Why were these men paying themselves as if they ran a thriving company when the reality was so very much the opposite? One can’t help but suspect that Gould at least, who owned 19 percent of Commodore’s stock, was trying to offset his losses in the one area by raising his personal salary in the other.

And then there’s the way that Gould, an enormously rich man whose personal net worth was much higher than that of all of Commodore by the end, was so weirdly niggardly in helping his company out of its financial jam. While he did loan fully $17.4 million back to Commodore, the operative word here is indeed “loan”: he structured his cash injections to ensure that he would be first in line to get his money back if and when the company went bankrupt, and stopped throwing good money after bad as soon as the threshold of the collateral it could offer up in exchange was exceeded. One can’t help but wonder what might have become of the CD32 if he’d been willing to go all-in to try to turn it into a success.

Of course, this is all rank speculation, which will quickly become libelous if I continue much further down this road. Suffice to say that questionable ethics were always an indelible part of Commodore. Born in scandal, the company would quite likely have ended in scandal as well if anyone in authority had been bothered enough by its anticlimactic bankruptcy to look further. I’d love to see what a savvy financial journalist could make of Commodore’s history. But, alas, I have neither the skill nor the resources for such a project, and the story is of little interest to the mainstream journalists of today. The era is past, the bodies are buried, and there are newer and bigger outrages to fill our newspapers.



Instead, then, I’ll conclude with two brief eulogies to mark the end of the Amiga’s role in this ongoing history. Rather than eulogizing in my own words, I’m going to use those of a true Amiga zealot: the anonymous figure known as “The Bandito,” whose “Roomers” columns in the magazine Amazing Computing were filled with cogent insights and nitty-gritty financial details every month. (For that reason, they’ve been invaluable sources for this series of articles.)

Jay Miner, the gentle genius, in 1990. In interviews like the one to which this photo was attached, he always seemed a little befuddled by the praise and love which Amiga users lavished upon him.

First, to Jay Miner, the canonical “father of the Amiga,” who died on June 20, 1994, at age 62, of the kidney disease he had been battling for most of his life. If a machine can reflect the personality of a man, the Amiga certainly reflected his:

Jay was not only the inventive genius who designed the custom chips behind the Atari 800 and the Amiga, he also designed many more electronic devices, including a new pacemaker that allows the user to set their own heart rate (which allows them to participate in strenuous activities once denied to them). Jay was not only a brilliant engineer, he was a kind, gentle, and unassuming man who won the hearts of Amiga fans everywhere he went. Jay was continually amazed and impressed at what people had done with his creations, and he loved more than anything to see the joy people obtained from the Amiga.

We love you, Jay, for all the gifts that you have given to us, and all the fruits of your genius that you have shared with us. Rest in peace.

And now a last word on the Amiga itself, from the very last “Roomers” column, written by someone who had been there from the beginning:

The Amiga has left an indelible mark on the history of computing. [It] stands as a shining example of excellent hardware design. Its capabilities foreshadowed the directions of the entire computer industry: thousands of colors, multiple screen resolutions, multitasking, high-quality sound, fast animation, video capability, and more. It was the beauty and elegance of the hardware that sold the Amiga to so many millions of people. The Amiga sold despite Commodore’s neglect, despite their bumbling and almost criminal marketing programs. Developers wrote brilliantly for this amazing piece of hardware, creating software that even amazed the creators of the hardware. The Amiga heralded the change that’s even now transforming the television industry, with inexpensive CGI and video editing making for a whole new type of television program.

Amiga game software also changed the face of entertainment software. Electronic Arts launched themselves headlong into 16-bit entertainment software with their Amiga software line, which helped propel them into the $500 million giant they are today. Cinemaware’s Defender of the Crown showed people what computer entertainment could look like: real pictures, not blocky collections of pixels. For a while, the Amiga was the entertainment-software machine to have.

In light of all these accomplishments, the story of the Amiga really isn’t the tragedy of missed opportunities and unrealized potential that it’s so often framed as. The very design that made it able to do so many incredible things at such an early date — its tightly coupled custom chips, its groundbreaking but lightweight operating system — made it hard for the platform to evolve in the same ways that the less imaginative, less efficient, but modular Wintel and MacOS architectures ultimately did. While it lasted, however, it gave the world a sneak preview of its future, inspiring thousands who would go on to do good work on other platforms. We are all more or less the heirs to the vision embodied in the original Amiga Lorraine, whether we ever used a real Amiga or not. The platform’s most long-lived and effective marketing slogan, “Only Amiga Makes It Possible,” is of course no longer true. It is true, though, that the Amiga made many things possible first. May it stand forever in the annals of computing history alongside the original Apple Macintosh as one of the two most visionary computers of its generation. For without these two computers — one of them, alas, more celebrated than the other — the digital world that we know today would be a very different place.

(Sources: the books Commodore: The Final Years by Brian Bagnall and my own The Future Was Here; Amazing Computing of November 1992, December 1992, January 1993, February 1993, March 1993, April 1993, May 1993, June 1993, July 1993, September 1993, October 1993, November 1993, December 1993, January 1994, February 1994, March 1994, April 1994, May 1994, June 1994, July 1994, August 1994, September 1994, October 1994, and February 1995; Byte of January 1993; Amiga User International of June 1988; Electronic Gaming Monthly of June 1995; Next Generation of December 1996. My thanks to Eric Graham for corresponding with me about The Juggler and Sculpt 3D years ago when I was writing my book on the Amiga.

Those wishing to read about the Commodore story from the perspective of the engineers in the trenches, who so often accomplished great things in less than ideal conditions, should turn to Brian Bagnall’s full “Commodore Trilogy”: A Company on the Edge, The Amiga Years, and The Final Years.)
