Journey

The parser is and has always been both the text adventure’s ace in the hole and its Achilles heel. Devotees will tell you, correctly in my opinion, that it offers possibilities for interaction — even, one might say, possibilities for interactive wonder — allowed by no other interface. Detractors will tell you, also correctly, that it’s too persnickety, too difficult to use, that in its opacity and inscrutability it violates every rule of modern interface design. Devotees will reply, yet again in my opinion correctly, that if you take away the parser you take away the magic. What can compare with typing some crazy command and seeing it work? What, the detractors reply, can frustrate more than figuring out what to do, not being able to get the parser to acknowledge your efforts, turning to a walkthrough, and finding out you were simply using the wrong verb or the wrong phrasing? And so we go, round and round and round. This waltz of point and counterpoint says as much about the text adventure’s decidedly limited mass appeal as it does about why some of us love the form so darn much.

For most of the text adventure’s lifespan people have been devising various ways to try to break the cycle, to capture at least some of the magic without any of the pain. Even Infocom, whose parser was legendary in its day, had a go in their final days at doing away with the gnarly, troublesome thing altogether, via a game called Journey.

The idea that became Journey can be dated to November 6, 1987, when a proposed “new project” emerged from an internal planning meeting. By that point, attitudes about Infocom’s future prospects had broken into two schools of thought. One view, still dominant inside Infocom’s own offices but viewed with increasing skepticism in the headquarters of their corporate masters Mediagenic,[1] held that the fundamental model of interaction that Infocom’s games had always utilized, that of reading text and typing commands in response, was still commercially viable in the broad strokes. What was needed was to make that model a bit more visually appealing and accessible, by adding pictures and other audiovisual pizazz to break up their walls of text and by making the parser smarter and friendlier. The other view held that Infocom needed to throw out all their old approaches — among them their parser — and tackle their new role as Mediagenic’s designated “master storytellers” with an entirely blank slate. Conservatives versus radicals, denialists versus realists — call the camps what you will, the lines were drawn.

True to the dominant internal opinion, Infocom put the majority of their resources into one last kick at the can for their parser-based games, putting three new illustrated but still parser-driven text adventures into development. They hedged their bets just a little, however, by making sure the new version 6 Z-Machine they had in development to power those games could support purely mouse-based point-and-click interaction as well as the traditional keyboard-driven approach. And then they started this “new project” of theirs to see what the possibilities for non-parser-based adventuring might really be.

The meeting notes state that said new project should be “true to [the] corporate philosophy”; that it should “embody the concept of ‘interactive storytelling’”; that it should “employ a simple, intuitive user interface unlike the one used in our traditional IF games”; and that, while initially “intended for use on existing home computers,” it should be “readily adaptable to other interactive media, such as CD-I, DVI, Nintendo, etc.” Finally, the plan called for “minimal (or optional) use of text.” This last would fall by the wayside in light of Infocom’s limited resources and complete lack of experience working in anything other than text; instead they would settle for lots of pictures to accompany the text. Otherwise, though, the game Marc Blank wrote in response to this plan would hew quite closely to it.

Ironically, it was Blank who had been the mastermind behind the magnificent parser, first implemented as part of the original Zork at MIT, that had been so key to Infocom’s ascendancy during the first half of the 1980s. Now he would be working on the interface that might just become its replacement if the conservative camp should prove mistaken in their faith in the old ways. But then, Blank wasn’t much of a sentimentalist. Assuming he thought of it at all, the idea of sounding the death knell of the traditional Infocom game didn’t bother him one bit. On the contrary, this new project was a perfect fit for Blank, exactly the sort of medium-advancing technical challenge he loved. He insists today that throughout his work with Infocom game design and story were always secondary in his mind to the technology that enabled them. Thus virtually every one of the games with which he was most intimately involved, whether as the officially recognized Implementor or the self-styled “wizard behind the curtain” enabling the creativity of another, pushed Infocom’s technology forward in one way or another. That would be more true than ever of Journey, which Blank created, as he had Border Zone, from the West Coast, working as an independent contractor rather than an Infocom employee. Blank:

Journey was an experiment to find out whether you could play an interactive story without having to type. It was all about whether you could still have people feel they had the ability to do a lot of different things, but not force them to guess words or use a keyboard. A lot of people just don’t like that; they aren’t good at it. It’s a turn-off. For me, the idea was to just experiment with another style of evolving the story — a different interface, just to see where it would go.

Even more so conceptually than technically, this new interface of his was going to be a tricky business. A bunch of hard-branching links in the form of a computerized Choose Your Own Adventure book was likely to appeal to no one. At the same time, though, to simply write a traditional text adventure in which the parser was a menu-based labyrinth of verbs and nouns would be both technically impractical — for one thing, there wouldn’t be enough space on the screen; for another, even the new version 6 Z-Machine didn’t support scrolling menus — and unplayable in its sheer complication. Blank would need to thread the needle, staking out a middle ground between the extreme granularity of Zork and the huge irreversible plot swings that accompany almost every branch in a Choose Your Own Adventure book. To a rather remarkable degree really, he succeeded in doing just that.

Blank’s first brilliant stroke was to make Journey, if not quite a full-fledged CRPG, at least a CRPG-like experience. You the player identify most closely with a single character named Tag, who also serves as the author of the past-tense “chronicle” of the adventure that you’re helping him to create. You’re responsible for managing several of his companions in adventure as well, however, each with his own strengths, weaknesses, and special abilities. Most notably, the wizard Praxix can cast spells, each of which requires a certain combination of reagents which you’ll need to collect over the course of your Journey. Many problems can be solved in multiple ways, using different spells or combinations of spells, the special abilities of one character or another, and/or your own native cleverness. While the scope of possibility in Journey is undeniably limited in comparison to a traditional Infocom game, in practice it feels broader than you might expect.

[Screenshot: Journey’s menu-driven interface]

To understand a little better how that might be, let’s have a closer look at the interface, as shown in the screenshot above. You’ll notice that the menu at the bottom of the screen is divided into five columns. The first contains possibilities that apply to the entire party — usually involving movement — along with access to the “Game” menu of utility commands. The second column, which isn’t actually clickable, lists each character in the party; the party can include up to five people, who can come and go according to choices and circumstances. The third, fourth, and fifth columns contain “verbs” applying only to the individual party member whose row they inhabit; these also come and go as circumstances change. Many verbs will lead to a further menu or menus of “nouns.” For example, asking Praxix the wizard to “cast” leads first to a direct-object list of available spells, and then on to an indirect-object list of possible spell targets, as shown in the screenshot below. Clicking on the name of Bergon to the far right on that screen would complete a command equivalent to typing “cast elevation on Bergon” in a traditional Infocom game. The whole system is elegant and well thought-through. Limited though it may be in contrast to a parser, it nevertheless presents a vastly larger possibility space than a Choose Your Own Adventure story, not least because it has a world model behind it that’s not all that far removed from the one found in any other Infocom game.

[Screenshot: casting one of Praxix’s spells through the nested menus]
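
For readers who like to see mechanisms spelled out, the menu system is easy to model in code. The sketch below is purely my own illustration in Python, not anything derived from Infocom’s actual ZIL source; the class and function names are invented, and the menus are pared down to a single verb. It shows how a cascade of menu picks can assemble the same command a parser would have accepted:

```python
# A minimal sketch of Journey-style menu-driven command building.
# All names here are hypothetical; Infocom's real implementation
# was written in ZIL, and its data structures are not public.

class Character:
    def __init__(self, name, verbs):
        self.name = name
        # verbs maps each verb to a list of successive "noun" menus,
        # e.g. "cast" opens a spell menu, then a target menu.
        self.verbs = verbs

def build_command(character, verb, choices):
    """Walk the verb's noun menus, taking one choice per level, and
    return the equivalent parser-style command string."""
    menus = character.verbs[verb]
    picked = []
    for menu, choice in zip(menus, choices):
        if choice not in menu:
            raise ValueError(f"{choice!r} is not offered in this menu")
        picked.append(choice)
    if len(picked) == 2:  # direct and indirect object
        return f"{verb} {picked[0]} on {picked[1]}"
    return " ".join([verb] + picked)

praxix = Character("Praxix", {
    "cast": [["Elevation", "Glow"], ["Bergon", "Tag", "the chasm"]],
})

print(build_command(praxix, "cast", ["Elevation", "Bergon"]))
# -> cast Elevation on Bergon
```

The point of the exercise is that the possibility space stays combinatorial, every verb crossed with every noun menu the world model currently offers, while the player never faces a blank prompt.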

Journey is, as you’ve doubtless gathered by now, a high-fantasy story, a quality that, combined with the CRPG-like flavor, delighted a beleaguered marketing department still searching desperately for a counter to the huge popularity of the Ultima, Bard’s Tale, and Advanced Dungeons & Dragons series. Looking for a way to distinguish it from Infocom’s more traditional “graphical interactive fiction,” marketing dubbed it a “role-play chronicle” — not exactly a phrase that trips off the tongue. Blank:

I wanted to call it ‘role-playing fiction.’ They came back with role-play chronicle, and I said, “What does that mean?” They said, “Well, it’s like a chronicle,” and I said, “Yeah, it sort of is because it’s told in the past tense.” So they just sort of invented a phrase. It’s not my favorite, but it’s passable, and I don’t think Journey will stand or fall on what category you put it in. There are a lot of games that are called this type or that, but what really matters is what people think of them.

Awkward though marketing’s name may have been, there is indeed some truth behind it. One of the more interesting aspects of the game is its commitment to the idea of being a chronicle — or, if you like, a novel — that you, through Tag, are creating as you play. If you choose to make a transcript of your adventure, you can opt to have it exclude your explicit command choices, leaving just the text that appears in response. The end result can read surprisingly well — a little disjointed at times, yes, but far better than, say, Zork would read in this format.

There is, granted, no denying the story’s derivative nature; this is a game that absolutely oozes Tolkien, a fact that Infocom’s marketing department, far from concealing or denying, trumpeted. Journey, runs the game’s official announcement in Infocom’s The Status Line newsletter, is “a classic narrative in the exciting tradition of Tolkien” that “plunges you into an uncharted world of dwarves, elves, nymphs, and wizards.” True to its inspiration, Tag, ultimately the hero of the story, is seemingly the meekest and weakest of a group of disparate companions who form a fellowship and set out on a lonely quest to save their land from an encroaching evil that threatens their civilization’s very existence. Sound familiar? Name a proper noun in The Fellowship of the Ring, and chances are it has an analogue in Journey.

For instance, in place of Tolkien’s magic rings Journey has magic stones as the key to defeating the Dread Lord, its version of Sauron. In this extract, Gandalf… I mean, the great wizard Astrix tells the party of the true nature of their quest.

"I have been following your progress with great interest," the Wizard said, stroking his stringy gray beard. "You are a very resourceful group, that is certain!"

His voice then became dark. "The question is: Have you mettle enough to make siege on the Dread Lord himself?" And then, smiling, the darkness fell from his voice, and he answered his own question, "We shall see, I suppose; we shall see."

Leading us to his hearth, he sat us in a semi-circle around the blazing fire and spoke. "There is a story I must tell, a story of Seven Stones. Created in a time lost to living memory, these Stones contained the very strength and essence of our world. Of the Seven, Four were entrusted to the races of men who could use them best: Elves, Dwarves, Nymphs, and Wizards.

"These are the Four: the Elf Stone, green as the forests of old, and the Dwarf Stone, brown as the caverns of Forn a-klamen; the Nymph Stone, blue as the deep waters of M'nera, and the Wizard Stone, red as the dark fire of Serdi.

"The four races are now sundered, and the Four have long been kept apart, but now, with the Dread Lord rearing his misshapen head in our lands, we must bring them together again. For with them, we can hope to find the Two, and then, finally, the One with whose help we can destroy all Evil.

"For it is told that having the Four, it is possible to find the Two; so, also, do the Two give witness to their master, the One that in elder days was called the Anvil!"

Yet somehow Journey is far less cringe-worthy than it ought to be. For a designer who stubbornly, almost passive-aggressively insists today that the technology “was more important than the story” to him, Blank delivered some pretty fine writing at times for Infocom. Journey is full of sturdy, unpretentious prose evoking a world that, overwhelmingly derivative though it is, really does manage to feel epic and interesting in a way too few other gaming fictions have matched in my experience. I was always interested to explore the world’s various corners, always happy and genuinely curious when the opportunity arose for Tag to learn a little more about it from one of the other characters. Coming from me, someone who generally finds the real world much more interesting than fantastical ones, that’s high praise.

Indeed, when I first began to play Journey I was surprised at how much I enjoyed the whole experience. Not expecting to think much of this oddball effort released in Infocom’s dying days, I’d put off playing it for a long, long time; Journey was the very last of the 35 canonical Infocom games that I actually played. Yet when I finally did so I found it a unique and very pleasant experience. It felt very much like what I presumed it was trying to be: a more easygoing, relaxed take on the adventure game, where I could feel free to just take in the scenery and enjoy the story instead of stressing too much over puzzles or worrying overmuch about logistics. The game’s own rhetoric, obviously trying to wean players of conventional interactive fiction onto this new way of doing things, encourages just such a relaxed approach. “Try to play as much as possible without overusing Save,” says the manual. “There are no ‘dead ends’ in Journey; feel free to experiment and take chances. Every action you take will cause the story to move forward.” This idea of a text adventure with no dead ends encourages comparisons with the contemporary works of Lucasfilm Games in the graphic-adventure realm, who were working toward the same goal in response to the notoriously player-hostile designs of Sierra. Marc Blank’s contemporary interview comments make the comparison feel even more apt:

We’ve learned a lot about interactive storytelling, but it’s been sort of clunky and not directed. I thought it would be interesting to design a story in which you really couldn’t get stuck. The choices you have to make are more tied into the story than into the minutia of manipulating objects. That really led to the whole style of telling the story and the interface. All that came out of the desire to try something like that.

So, yes, Journey and I had a great relationship for quite a while. And then it all went off the rails.

The first sneaking suspicion that something is rotten at the core of Journey may come when it hits you with some puzzles midway through that suddenly demand you type in phrases at a command line. Not only are these puzzles a betrayal of the “no-typing” premise that Infocom had hoped would make Journey amenable to game consoles and standalone CD-ROM players, they aren’t even particularly worthy in their own right, requiring intuitive leaps that feel borderline unfair, especially in contrast to the consummate ease with which the rest of the game is played. But, alas, they’re far from the worst of Journey’s sins.

For there inevitably comes a point when you realize that everything Infocom has been saying about their game and everything the game has been implying about itself is a lie. Far from being the more easy-going sort of text adventure that it’s purported to be, Journey is a minefield of the very dead ends it decries, a cruel betrayal of everything it supposedly stands for. It turns out that there is exactly one correct path through the dozens of significant choices you make in playing the game to completion. Make one wrong choice and it’s all over. Worse — far worse — more often than not you are given no clue about the irrecoverable blunder you’ve just made. You might play on for hours before being brought up short.

The worst offenders to all notions of fairness and fun cluster around the magic system and its reagents. Remember those puzzles I mentioned that can be solved in multiple ways? Well, that’s true enough in the short term, but in the long term failing to solve each one in the arbitrary right way — i.e., solving it by using a spell instead of your wits, or simply by using the wrong spell — leaves you high and dry later on, without the reagents you need to get further. Playing Journey becomes an exercise in stepping again and again through the story you already know, clicking your way hurriedly through the same text you’ve already read ten times or more, making slight adjustments each time through so as to get past whatever dead end stymied you last time. This process is exactly as much fun as it sounds. In contrast to this exercise in aggravation, Shogun’s summary halting with a “this scene is no longer winnable” message when you fail to do what the novel’s version of Blackthorne did suddenly doesn’t seem so bad.
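
The structural flaw is just as easy to state in code. This sketch is again my own invention (the spell names and reagent costs are made up, and Journey’s real bookkeeping is surely more elaborate), but it shows how an early choice that looks like a success can silently create a dead end hours later:

```python
# An invented illustration of Journey's reagent dead ends;
# not actual game data.
pouch = {"grey powder": 1, "red powder": 1}

spell_costs = {
    "Glow":      {"grey powder": 1},
    "Elevation": {"grey powder": 1, "red powder": 1},
}

def cast(spell):
    """Consume the spell's reagents if they're all on hand."""
    need = spell_costs[spell]
    if all(pouch[r] >= n for r, n in need.items()):
        for r, n in need.items():
            pouch[r] -= n
        return True
    return False

# Early puzzle: solvable by wits alone, but casting Glow also "works",
# and the game happily accepts it and moves the story forward.
cast("Glow")

# Hours later, a mandatory obstacle demands Elevation. Too late.
if not cast("Elevation"):
    print("Dead end: the grey powder went to Glow, with no warning at the time.")
```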

How incredible to think that Journey and Shogun stemmed from Marc Blank and Dave Lebling, designers of the original Zork and Infocom’s two most veteran Implementors of all. These two of all people ought to have known better. Both games’ failings feel part and parcel of the general malaise infecting everything Infocom did or tried to do after 1987. Absolutely nothing that anyone did seemed to come out right anymore.

Like those of Shogun, Journey’s 100-plus pictures are the work of artist Donald Langosy.

As bizarre as it is to see such frankly awful game design from a company like Infocom and an Implementor like Marc Blank, the disconnect between the rhetoric and the reality of Journey is still stranger. “Unlike other games you may have played, there are virtually no dead ends,” the manual promises. “Any action you take will advance the story toward one of its many endings.” I suppose there’s a germ of truthfulness here if you count a dead end only as being stranded in a walking-dead situation; the nature of Journey’s interface means that you will always get a clear message that the jig is up once you’ve run out of options to move forward, sometimes even accompanied by a helpful hint about where you might have messed up way back when. Still, the assertion seems disingenuous at best. When people talk about multiple endings and multiple paths through an interactive story, this isn’t quite what they mean. Ditto Blank’s contemporary claim that there are “dozens” of “alternative endings,” and “very few places where you get killed.” Really, what’s the practical difference between a losing ending that involves death and one that leaves Tag and his friends defeated in their quest? The Dread Lord wins either way.

Today, none of the people left at Infocom during this final unpleasant period of the company’s existence are particularly eager to talk about those painful end times or the final batch of underwhelming games they produced. Thus I’ve never seen anyone even begin to address the fraught question of just what the hell they were thinking in trying to sell this sow’s ear of a game as a silk purse. Part of the disconnect may have stemmed from the physical distance between Marc Blank and the people at Infocom who wrote the manual and did the marketing; this distance prevented Blank from being as intimately involved in every aspect of his game’s presentation as had long been the norm for the in-house team of Imps. And part of the problem may be that the rhetoric around the game was never modified after the original vision for Journey became the cut-down reality necessitated by time pressure and the space limitations of even the latest version 6 Z-Machine. (While Journey’s text feels quite expansive in comparison to the typical parser-based Infocom game, Blank was still limited to around 70,000 words in total; the perception of loquacity is doubtless aided by the fact that, Journey’s scope of player possibility being so much more limited, a much larger percentage of that text can be deployed in service of the main channel of the narrative rather than tributaries that many or most players will never see.) Regardless of the reasons, Journey stands as the most blatant and shameless instance of false advertising in Infocom’s history. It’s really, really hard to square marketing’s claim of “no dead ends” with a game that not only includes dead ends but will end up being defined by them in any player’s memory. Infocom was usually better than this — but then, that’s a statement one finds oneself making too often when looking at their final, troubled run of games.

True to the Tolkien model to the last, Infocom planned to make Journey the first of a trilogy of games, the latter entries of which would likely have been written by other authors. Blank proposed starting on an untitled sort of narrative war game as his own next project, “a variant of traditional FRP [fantasy-role-playing] games in which the predominant activity is combat on the battlefield level, as opposed to the hand-to-hand level.” It would use the menu-driven Journey interface to “make a complex game simple to use and learn” and to “provide a narrative force to the unfolding of the war.” But events that followed shortly after the concurrent release and complete commercial failure of Journey and Shogun in March of 1989 put the kibosh on any further use of Journey’s interface in any context.

And that’s a shame because its interface had huge potential to bridge the gap between the micromanagement entailed by a parser and the sweeping, unsatisfyingly arbitrary plot-branching of a Choose Your Own Adventure book. It’s only in the past decade or so that modern authors have returned to the middle ground first explored by Blank in Journey, constructing choice-based works that include a substantial degree of world modeling behind their text and a more sophisticated approach to interaction than a tangle of irrevocable hard branches. In the years since they began to do so, the quantity of choice-based works submitted to the annual Interactive Fiction Competition has come to rival or exceed those of more traditional parser-based games, and commercial developers like Inkle Studios have enjoyed some financial success with the model. While they provide a very different experience than a parser-based game, my own early engagement with Journey demonstrates how compelling games of this stripe can be on their own terms. And they’re certainly much more viable than traditional text adventures as popular propositions, being so much more accessible to the parser-loathing majority of players.

Unsatisfactory though it is as a game, Journey marks Infocom’s final mad flash of innovation — a flash of innovation so forward-thinking that it would take other developers working in the field of interactive narrative a good fifteen years to catch up to it. Perhaps, then, it’s not such a terrible final legacy after all for Marc Blank in his role as Infocom’s innovator-in-chief — a role he continued to play, as Journey so amply proves, right to the end.

(Sources: As usual with my Infocom articles, much of this one is drawn from the full Get Lamp interview archives which Jason Scott so kindly shared with me. Some of it is also drawn from Jason’s “Infocom Cabinet” of vintage documents. Plus the May 1989 issue of Questbusters, and the very last issue of Infocom’s The Status Line newsletter, from Spring 1989.)

Footnotes

1 Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article.

Manhole, Anyone?

As part of my research for an upcoming article, I would really like to beg, borrow, or buy a copy of the 1989 CD-ROM version of The Manhole for the black-and-white Macintosh. Note that this means neither the 1988 floppy-disk release, nor the 1994 Masterpiece edition, nor any other re-release. If you happen to have a line on this rarity, I’d hugely appreciate it if you could contact me and let me know. I’d be equally happy with a digital or physical copy, and am willing to pay for the latter.

Thanks a million, and see you in a few days with my next proper article!

Update: Reader Casey Muratori knows the folks at Cyan, and put me in touch with them. They’re going to send me a copy, so problem solved. My huge thanks go to Cyan and to Casey, who has just provided yet more proof that I have the best readers in the world.

 

Shogun

One of the generation of male writers forged in the crucible of World War II, James Clavell had a much harder war of it than such peers as Norman Mailer, James Jones, Herman Wouk, Gore Vidal, J.D. Salinger, and James Michener. As a young man of barely twenty years, he found himself facing the Japanese onslaught on the Malay Peninsula at the onset of hostilities in the Pacific Theater. Following the most humiliating British defeat of the entire war, he spent the next three and a half years in prisoner-of-war camps, watching as more than nine out of every ten of his fellow soldiers succumbed to malnutrition, disease, and random acts of violence. Somehow he survived it all and made it home.

In 1953, he emigrated from his native England to Hollywood in the hope of becoming a film director, despite knowing only as much about how movies were made as his actress wife had deigned to tell him. He gradually established himself there as a director and screenwriter by dint of pluck and sheer stubbornness. Clavell claimed he learned how to write stories with mass appeal in Hollywood, developing a style that would preclude more than the merest flirtations with the sort of literary respectability enjoyed by the list of names that opened this article. To hear him tell it, that was just fine with him: “The first time you write a novel you go into ecstasy with the purple prose — how the clouds look, what the sunset is like. All bullshit. What happens? Who does what to whom? That’s all you need.”

If one James Clavell novel was going to please serious students of the literary arts, it would have to be his first, a very personal book in comparison to the epic doorstops for which he would later become known. Holding true to the old adage that everyone’s first novel is autobiographical, King Rat was a lightly fictionalized account of Clavell’s grim experience as a prisoner-of-war. Published in 1962, its success, combined with his difficulty finding sufficient screenwriting gigs, led him to gradually shift his focus from screenplays to novels. The next book he wrote, Tai-Pan (1966), was a much longer, more impersonal, wider-angle historical novel of the early years of Hong Kong. Four similar epics would follow at widely spaced intervals over the next thirty years or so, all chronicling the experiences of Westerners in the Asia of various historical epochs.

James Clavell’s fiction was in many ways no more thoughtful than the majority of the books clogging up the airport bestseller racks then and now. His were novels of adventure, excitement, and titillation, not introspection. Yet there is one aspect of his work that still stands out as surprising, even a little noble. Despite the three and a half years of torture and privation he had endured at the hands of his Japanese captors, he was genuinely fascinated by Asian and especially Japanese culture and history; one might even say he came to love it. And nowhere was that love more evident than in Clavell’s third novel, his most popular of all and the one that most of his fans agree stands as his best: 1975’s Shogun.

The star of Shogun is a typical Clavell hero, a Capable Man whose inner life doesn’t seem to run much deeper than loving queen and country and hating Papists. John Blackthorne is the English pilot — i.e., navigator — of the Erasmus, the first Dutch vessel to discover Japan, circa 1600. Unfortunately, the Spanish and Portuguese are already there when the Erasmus arrives, a situation from which will spring much of the drama of this very lengthy tale of 1100-plus pages. Blackthorne becomes Clavell’s reader surrogate, our window into the strangeness, wonder, mystery, and beauty of feudal Japan.

While Blackthorne’s adventures in Japan are (very) roughly based on those of an actual English adventurer named William Adams, Clavell plays up the violence and the sex for all it’s worth. Many a youthful reader went to bed at night dreaming fever dreams of inscrutable and lovely geishas and the boxes of toys they kept to hand: “The beads are carefully placed in the back passage and then, at the moment of the Clouds and the Rain, the beads are pulled out slowly, one by one.” Read by adults, such passages… er, extracts are still riotously entertaining in the way that only truly committed Bad Writing can be. My wife Dorte and I used Shogun as our bedtime reading recently. While it didn’t do much to encourage conjugal sexy times, it certainly did make us laugh; Dorte still thinks “pillowing,” Shogun’s favorite Japanese euphemism for sex, is unaccountably hilarious, and is forever going on about pillowing this and pillowing that. (She also loves the notion of a “poop deck,” but I suppose I can’t blame Clavell for that.)

Unsubtle prose and dodgy euphemisms aside, the first 25 to 30 percent of Shogun is by far the most compelling. Long enough to form a novel of reasonable length in their own right, the early chapters detail the arrival of Blackthorne and his Dutch cohorts in Japan, upon whose shores they literally wash up, starving and demoralized after their long voyage across the Pacific. I’ve occasionally heard the beginning of Shogun described as one of the finest stories of first contact between two alien cultures ever written, worthy of careful study by any science-fiction author who proposes to tell of a meeting between even more far-flung cultures than those of Europe and Japan. To that suggestion I can only heartily concur. As Blackthorne and his cohorts pass from honored guests to condemned prisoners and back again, struggling all the while to figure out what these people want from them, what they want from each other, and how to communicate at all, the story is compulsively readable, the tension at times nearly unbearable. (One suspects that some of the most horrific scenes, like the ones after Blackthorne and the crew are cast into a tiny hole and left to languish there in sweltering heat and their own bodily filth, once again draw from Clavell’s own prisoner-of-war experiences.) While I admit to being far from intimately familiar with the whole of the James Clavell oeuvre, I’d be very surprised if he ever wrote anything better than this.

After Blackthorne, stalwart Capable Man that he is, manages to negotiate a reprieve for the crew and a place for himself as a trusted advisor to a powerful daimyo named Toranaga, the book takes on a different, to my mind less satisfying character. It ceases to focus so much on Blackthorne’s personal plight as a stranger in a strange land in favor of a struggle for control of the entire country, once again based loosely on actual history, that is taking place between Toranaga, very broadly speaking the good guy (or at least the one with whom our hero Blackthorne allies himself), and another daimyo named Ishido. At the same time, the Portuguese Jesuits are trying to stake out a space in the middle that will preserve their influence regardless of who wins, whilst also working righteously to find some way to do away with Blackthorne and the Dutch sailors, who if allowed to return to Europe with information on exactly where Japan lies represent an existential threat to everything they’ve built there. Plot piles on counter-plot on conspiracy on counter-conspiracy, interspersed with regular action-movie set-pieces, as all of the various factions maneuver toward the inevitable civil war that will decide the fate of all Japan for decades or centuries to come.

In the meantime, Blackthorne, apparently deciding his life isn’t already dangerous enough, is carrying on an illicit romance with the beautiful Mariko, wife of one of Toranaga’s most highly placed samurai. Their relationship was much discussed in Shogun‘s first bloom of popularity as being the key to the book’s considerable attraction for female readers; very unusually for such a two-fisted tale of war, adventure, and history, Shogun supposedly enjoyed more female readers than male. True to Clavell’s roots, however, Blackthorne and Mariko’s is a depressingly conventional Hollywood romance. We’re expected to believe that these two characters are wildly, passionately in love with one another simply because Clavell tells us they are, according to the Hollywood logic that two attractive people of the opposite sex thrown into proximity with one another must automatically fall in love — and of course lots of sex must follow.

The plot continues to grow ever more byzantine as the remaining page-count dwindles, and one goes from wondering how Clavell is ever going to wrap all this up to checking Amazon to be sure there isn’t a direct sequel. And then it all just… stops, leaving more loose threads dangling than my most raggedy tee-shirt. I’ve read many books with unsatisfying endings, but I don’t know if I’ve ever read an ending as half-baked as this one. It’s all finally come down to the war that’s been looming throughout the previous 1100-plus pages. We’re all ready for the bloody climax. Instead Clavell gives us a three-page summary of what might have happened next if he’d actually bothered to write it. It’s for all the world like Clavell, who admitted that he wrote his novels with no plan whatsoever, simply got tired of this one, decided 1100 pages was more than enough and just stopped in medias res. Shogun manages the feat, perhaps unique in the annals of anticlimax, of feeling massively bloated and half-finished at the same time. This is a Lord of the Rings that ends just as Frodo and Sam arrive in Mordor; a Tale of Two Cities that ends just as Carton is about to make his final sacrifice. I’ve never felt so duped by a book as this one.

But I must admit that I seem to be the exception here. Whether because of the masterfully taut beginning of the story, the torrid love affair, or the lurid portrayal of Japanese culture that pokes always through the tangled edifice of plot, few readers then or now seem to share my reservations. Shogun became an instant bestseller. In 1980, a television miniseries of the book was aired in five parts, filling more than nine hours sans commercials. It became the most-watched show ever aired on NBC and the second most popular in the history of American television, its numbers exceeded only by those of Roots, another miniseries event which had aired on ABC in 1977. When many people think of Blackthorne today, they still picture Richard Chamberlain, the dashing actor who played him on television. Together the book and the miniseries ignited a craze for Japanese culture in the West that, however distorted or exaggerated it may have been, did serve as a useful counterbalance to lingering resentments over World War II and, increasingly, fears that Japan’s exploding technological and industrial base was about to usurp the United States’s place at the head of the world’s economy.

At this point, at last, Shogun’s huge popularity on page and screen brings us in our roundabout way to Infocom — or, more accurately, to their corporate masters Mediagenic.[1] (If the preface to the real point of this article seemed crazily extended, I can only plead that, with Shogun the game having little identity of its own apart from the novel on which it’s based, it’s hard to discuss it through any other framework.)

Shogun the game at least looks pretty good.

Mediagenic’s absolute mania for licensed games following the accession of Bruce Davis to the CEO’s chair has been well-established in other articles by now. Infocom was able to find some excuse to head off most of the ideas in that vein that Mediagenic proposed, but Shogun was an exception. When Mediagenic came to Infocom with a signed deal already in place in late 1987 to base a game on this literary property — from Bruce Davis’s perspective, the idea was right in Infocom’s wheelhouse — their problem child of a subsidiary just wasn’t in any position to say no. Dave Lebling, having recently finished The Lurking Horror and being without an active project, drew the short straw.

Shogun the game was a misbegotten, unloved project from the start, a project for which absolutely no one in the Infocom, Mediagenic, or Clavell camps had the slightest creative passion. The deal had been done entirely by Clavell’s agent; the author seemed barely aware of the project’s existence, and seemed to care about it still less. It was a weird choice even in the terms of dollars and cents upon which Bruce Davis was always so fixated. Yes, Shogun had been massively popular on page and screen years earlier, and still generated strong catalog sales every year. It was hard to imagine, however, that there was a huge crowd of computer gamers dying to relive the adventures of John Blackthorne interactively. Why this of all licenses? Why now?

Dave Lebling was duly dispatched to visit Clavell for a few days at his chalet in the Swiss Alps to discuss ideas for the adaptation; he got barely more than a few words of greeting out of the man. His written requests for guidance were answered with the blunt reply that Clavell had written the book more than a decade ago and didn’t remember that much about it; the subtext was that he couldn’t be bothered with any of it, that to him Lebling’s game represented just another check arranged by his agent. Lebling was left entirely on his own to adapt another author’s work, with no idea of where the boundaries to his own creative empowerment might lie. In the past, Infocom had always taken care to avoid just this sort of collaboration-in-name-only. Now they’d had it imposed upon them.

Lebling chose to structure his version of Shogun as a series of Reader’s Digest “scenes from” the novel, cutting and pasting unwieldy chunks of Clavell’s prose into the game and demanding that the player respond by doing exactly what Blackthorne did in the novel in order to advance to the next canned scene. The player who has read the novel will find little interest or challenge in pantomiming her way through a re-creation of same, while the player who hasn’t will have no idea whatsoever what’s expected of her at any given juncture. It’s peculiar to see such a threadbare design from a company as serious about the craft of interactive fiction as Infocom had always been. Everyone there, not least Lebling himself, understood all too well the problems inherent in this approach to adaptation; these very same problems were the main reason Infocom had so steadfastly avoided literary licenses that didn’t come with their authors attached in earlier years. One can only presume that Lebling, unsure of how far his creative license extended and bored to death with the whole project anyway, either couldn’t come up with anything better or just couldn’t be bothered to try.

Shogun includes one graphical puzzle reminiscent of those in Zork Zero, a maze representing the tangled alleys of Osaka.

Consider the game’s handling of an early scene from the novel: the first time Blackthorne meets Yabu and Omi, respectively the daimyo and his samurai henchman who have dominion over Anjiro, the small fishing village where the Erasmus has washed up. Also present as translator is a Portuguese priest, Blackthorne’s sworn enemy, who would like nothing better than to see him condemned and executed on the spot. In the book, Blackthorne’s observations of the priest’s interactions with the two samurai convince him that there is no love lost between him and them, that Yabu and Omi hate and mistrust the priest almost as much as Blackthorne does. Blackthorne wants to communicate that he shares their sentiment, but of course all of his words are being translated into Japanese by the priest himself — obviously a highly unreliable means of communication in this situation. Desperate to show his captors that he’s different from this other foreigner, he lunges at the priest, grabs his crucifix, and breaks it in two, a deadly sin for a Catholic but a good day’s work for a Protestant like him. Yabu and especially Omi are left curious and more than a little impressed; Blackthorne’s action quite possibly staves off his imminent execution.

In the book, this all hangs together well enough, based on what we know and what we soon learn of the personalities, histories, and cultures involved. But for the game to expect the player to come up with such a seemingly random action as lunging for the crucifix and breaking it is asking an awful lot of anyone unfamiliar with the novel. It’s not impossible to imagine the uninitiated player eventually coming up with it on her own, especially as Lebling is good enough to drop some subtle hints about the crucifix “on its long chain waving mockingly before your face,” but she’ll likely do so only by dying and restoring many times.

Shogun is the only Infocom game outside of Leather Goddesses of Phobos in which you have to “make love to” someone — or type another euphemism, if you like — in order to score points. (Unfortunately, you can’t use “pillow” as a verb. This Dorte finds deeply disappointing.) It’s also, needless to say, the only one with nudity. Too bad Blackthorne is covering up his manly member, whose size is a constant point of discussion in the book.

And this is far from the worst of Lebling’s “read James Clavell’s mind” moments. In their announcement of the game in their newsletter, Infocom noted that “the key to success in the interactive Shogun is the ability to act as the British pilot-major Blackthorne would.” For the player who hasn’t read the book and thus doesn’t know Blackthorne, this is quite a confusing proposition. For the player who has, the game falls into a rote pattern. Remember (or look up) what Blackthorne did in the book, figure out how and when to phrase it to the parser, and you get some points and get to live a little longer. Do anything else, and you die or get a message saying “this scene is no longer winnable” and get to try again. In between, you do a lot of waiting and examining, and lots of reading of textual cut scenes — called “interludes” by the game — that grow steadily lengthier as the story progresses and Blackthorne’s part in it becomes more and more ancillary.

In a telling indication of how the times had changed for Infocom, by far the most impressive aspect of Shogun is its visual presentation. Promoted, like the earlier Zork Zero, as “graphical interactive fiction,” it and the simultaneously released Journey are the first Infocom games to unabashedly indulge in pictures for their own sake, abandoning Steve Meretzky’s insistence that his game’s graphics always serve a practical gameplay function. Shogun’s pictures, drawn in the style of classical Japanese woodcuts by Donald Langosy, are lovely to look at and perfectly suit the atmosphere of the novel. The game’s one truly innovative aspect is the same pictures’ presentation onscreen. Rather than being displayed in a static window, they’re scattered around and within the scrolling text in various positions, giving the game the look of an unfurling illustrated scroll. Infocom had had their share of trouble figuring out the graphics thing, but Shogun demonstrates that, clever bunch that they were, they were learning quickly. Already Infocom’s visual palette was far more sophisticated than that of competitors like Magnetic Scrolls and Level 9 who had been doing text adventures with pictures for years. Pity they wouldn’t have much more time to experiment.

“The socks stay on, Mariko!”[2]

But of course, as Infocom’s vintage advertisements loved to tell us, visuals alone do not a great game make. Shogun stands today as the most unloved and unlovable of all Infocom’s games, a soulless exercise in pure commerce that didn’t make a whole lot of sense even on that basis. Released in March of 1989, its sales were, like those of all of this final run of graphical games, minuscule. In my opinion and, I would venture, that of a substantial number of others, it represents the absolute nadir of Infocom’s 35-game catalog. It is, needless to say, the merest footnote to the bestselling catalog of James Clavell, who died in 1994. And, indeed, it’s little more worthy of discussion in the context of Infocom’s history; the words I’ve devoted to it already are far more than it deserves. I have two more Infocom games to discuss in future articles, each with problems of their own, but we can take consolation in one thing: it will never, ever get as bad as this again. This, my friends, is what the bottom of the barrel looks like.

(Sources: As usual with my Infocom articles, much of this one is drawn from the full Get Lamp interview archives which Jason Scott so kindly shared with me. Some of it is also drawn from Jason’s “Infocom Cabinet” of vintage documents. And the very last issue of Infocom’s The Status Line newsletter, from Spring 1989.)

Footnotes

1 Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article.
2 Al and Peg

Peter Molyneux’s Kingdom in a Box

Peter Molyneux, circa 1990.

I have this idea of a living world, which I have never achieved. It’s based upon this picture in my head, and I can see what it’s like to play that game. Every time I do it, then it maybe gets closer to that ideal. But it’s an ambitious thing.

— Peter Molyneux

One day as a young boy, Peter Molyneux stumbled upon an ant hill. He promptly did what young boys do in such situations: he poked it with a stick, watching the inhabitants scramble around as destruction rained down from above. But then, Molyneux did something that set him apart from most young boys. Feeling curious and maybe a little guilty, he gave the ants some sugar for energy and watched quietly as they methodically undid the damage to their home. Just like that, he woke up to the idea of little living worlds with lots of little living inhabitants — and to the idea that he himself, the outsider, could affect the lives of those inhabitants. The blueprint had been laid for one of the most prominent and influential careers in the history of game design. “I have always found this an interesting mechanic, the idea that you influence the game as opposed to controlling the game,” he would say years later. “Also, the idea that the game can continue without you.” When Molyneux finally grew bored and walked away from the ant hill on that summer day in his childhood, it presumably did just that, the acts of God that had nearly destroyed it quickly forgotten. Earth — and ants — abide.

Peter Molyneux was born in the Surrey town of Guildford (also hometown of, read into it what you will, Ford Prefect) in 1959, the son of an oil-company executive and a toy-shop proprietor. To hear him tell it, he was qualified for a career in computer programming largely by virtue of being so hopeless at everything else. Being dyslexic, he found reading and writing extremely difficult, a handicap that played havoc with his marks at Bearwood College, the boarding school in the English county of Berkshire to which his family sent him for most of his teenage years. Meanwhile his less than imposing physique boded ill for a career in the military or manual labor. Thankfully, near the end of his time at Bearwood the mathematics department acquired a Commodore PET,  while the student union almost simultaneously installed a Space Invaders machine. Seeing a correspondence between these two pieces of technology that eluded his fellow students, Molyneux set about trying to program his own Space Invaders on the PET, using crude character glyphs to represent the graphics that the PET, being a text-only machine, couldn’t actually draw. No matter. A programmer had been born.

These events, followed shortly by Molyneux’s departure from Bearwood to face the daunting prospect of the adult world, were happening at the tail end of the 1970s. Like so many of the people I’ve profiled on this blog, Molyneux was thus fortunate enough to be born not only into a place and circumstances that would permit a career in games, but at seemingly the perfect instant to get in on the ground floor as well. But, surprisingly for a fellow who would come to wear his huge passion for the medium on his sleeve — often almost as much to the detriment as to the benefit of his games and his professional life — Molyneux took a meandering path that filled fully another decade before he rose to prominence in the field. Or, to put it less kindly: he failed, repeatedly and comprehensively, at every venture he tried for most of the 1980s before he finally found the one that clicked.

Perhaps inspired by his mother’s toy shop, his original dream was to be not so much a game designer as a computer entrepreneur. After earning a degree in computer science from Southampton University, he found himself a job working days as a systems analyst for a big company. By night, he formed a very small company called Vulcan in his hometown of Guildford to implement a novel scheme for selling blank disks. He wrote several simple programs: a music creator, some mathematics drills, a business simulator, a spelling quiz. (The last, having been created by a dyslexic and terrible speller in general, was a bit of a disaster.) For every ten disks you bought for £10, you would get one of the programs for free along with your blank disks. After placing his tiny advertisement in a single magazine, Molyneux was so confident of the results that he told his local post office to prepare for a deluge of mail, and bought a bigger mailbox for his house to hold it all. He got five orders in the first ten days, less than fifty in the scheme’s total lifespan — along with about fifty more inquiries from people who had no interest in the blank disks but just wanted to buy his software.

Taking their interest to heart, Molyneux embarked on Scheme #2. He improved the music creator and the business simulator and tried to sell them as products in their own right. Even years later he would remain proud of the latter in particular — his first original game, which he named Entrepreneur: “I really put loads of features into it. You ran a business and you could produce anything you liked. You had to do things like keep the manufacturing line going, set the price for your product, decide what advertising you wanted, and these random events would happen.” With contests all the rage in British games at the time, he offered £100 to the first person to make £1 million in Entrepreneur. The prize went unclaimed; the game sold exactly two copies despite being released near the zenith of the early-1980s British mania for home computers. “Everybody around me was making an absolute fortune,” Molyneux remembers. “You had to be a complete imbecile in those days not to make a fortune. Yet here I was with Entrepreneur and Composer, making nothing.” He wasn’t, it appeared, very good at playing his own game of entrepreneurship; his own £1 million remained far out of reach. Nevertheless, he moved on to the next scheme.

Scheme #3 was to crack the business and personal-productivity markets via a new venture called Taurus, initiated by Molyneux and his friend Les Edgar, who were later joined by one Kevin Donkin. Molyneux having studied accounting at one time in preparation for a possible career in the field (“the figures would look so messy that no one would ever employ me”), it was decided that Taurus would initially specialize in financial software with exciting names like Taurus Accounts, Taurus Invoicing, and Taurus Stock Control. Those products, like all the others Molyneux had created, went nowhere. But now came a bizarre story of mistaken identity that… well, it wouldn’t make Molyneux a prominent game designer just yet, but it would move him further down the road to that destination.

Commodore was about to launch the Amiga in Britain, and, this being early on when they still saw it as potential competition for the IBMs of the world, was looking to convince makers of productivity software to write for the machine.  They called up insignificant little Taurus of all people to request a meeting to discuss porting the “new software” the latter had in the works to the Amiga. Molyneux and Edgar assumed Commodore must have somehow gotten wind of a database program they were working on. In a state of no small excitement, they showed up at Commodore UK’s headquarters on the big day and met a representative. Molyneux:

He kept talking about “the product,” and I thought they were talking about the database. At the end of the meeting, they say, “We’re really looking forward to getting your network running on the Amiga.” And it suddenly dawned on me that this guy didn’t know who we were. Now, we were called Taurus, as in the star sign. He thought we were Torus, a company that produced networking systems. I suddenly had this crisis of conscience. I thought, “If this guy finds out, there go my free computers down the drain.” So I just shook his hand and ran out of that office.

An appropriately businesslike advertisement for Taurus’s database manager gives no hint of what actually lies in the company’s future…

By the time Commodore figured out they had made a terrible mistake, Taurus had already been signed as official Amiga developers and given five free Amigas. They parlayed those things into a two-year career as makers of somewhat higher-profile but still less than financially successful productivity software for the Amiga. After the database, which they named Acquisition and declared “the most complete database system conceived on any microcomputer” — Peter Molyneux’s habit of over-promising, which gamers would come to know all too well, was already in evidence — they started on a computer-aided-design package called X-CAD Designer. Selling in the United States for the optimistic prices of $300 and $500 respectively, both programs got lukewarm reviews; they were judged powerful but kind of incomprehensible to actually use. But even had the reviews been better, high-priced productivity software was always going to be a hard sell on the Amiga. There were just three places to really make money in Amiga software: in personal-creativity software like paint programs, in video-production tools, and, most of all, in games. In spite of all of Commodore’s earnest efforts to the contrary, the Amiga had by now become known first and foremost as the world’s greatest gaming computer.

The inspiration for the name of Bullfrog Software.

Molyneux and his colleagues therefore began to wind down their efforts in productivity software in favor of a new identity. They renamed their company Bullfrog after a ceramic figurine they had lying around in the “squalor” of what Molyneux describes as their “absolutely shite” office in a Guildford pensioner’s attic. Under the new name, they planned to specialize in games — Scheme #4 for Peter Molyneux. “We had a simple choice of hitting our head against a brick wall with business software,” he remembers, “or doing what I really wanted to do with my life anyway, which was write games.” Having chosen to make Bullfrog a game developer, they nevertheless released as their first product not a game at all but a simple drum sequencer for the Amiga called A-Drum. Hobgoblins and little minds and all the rest. When A-Drum duly flopped, they finally got around to games.

A friend of Molyneux’s had written a budget-priced action-adventure for the Commodore 64 called Druid II: Enlightenment, and was looking for someone to do an Amiga conversion. Bullfrog jumped at the chance, even though Molyneux, who would always persist in describing himself as a “rubbish” programmer, had very little idea how to program an action game. When asked by Enlightenment‘s publisher Firebird whether he could do the game in one frame — i.e., whether he could update everything onscreen within a single pass of the electron gun painting the screen to maintain the impression of smooth, fluid movement — an overeager Molyneux replied, “Are you kidding me? I can do it in ten frames!” It wasn’t quite the answer Firebird was looking for. But in spite of it all, Bullfrog somehow got the job, producing what Molyneux describes as a “technically rather poor” port of what had been a rather middling game in the first place. (Molyneux’s technique for getting everything drawn in one frame was to simply keep shrinking the size of the display until even his inefficient routines could do the job.) And then, as usual for everything Molyneux touched, it flopped. But Bullfrog did get two important things out of the project: they learned much about game programming, and they recruited as artist for the project one Glenn Corpes, who was not only a talented pixel pusher but also a talented programmer and fount of ideas almost the equal of Molyneux.

Despite the promising addition of Corpes, the first original game conjured up by the slowly expanding Bullfrog fared little better than Enlightenment. Corpes and Kevin Donkin turned out a very of-its-time top-down shoot-em-up called Fusion, which Electronic Arts agreed to release. Dismissed as “a mixture of old ideas presented in a very unexciting manner” by reviewers, Fusion was even less impressive technically than had been the Enlightenment port, being plagued by clashing colors and jittery scrolling — not at all the sort of thing to impress the notoriously audiovisually-obsessed Amiga market. Thus Fusion flopped as well, keeping Molyneux’s long record of futility intact. But then, unexpectedly from this group who’d shown so little sign of ever rising above mediocrity, came genius.

To describe Populous as a stroke of genius would be misleading. It was rather a game that grew slowly into its genius over a considerable period of time, a game that Molyneux himself considers more an exercise in evolution than conscious design. “It wasn’t an idea that suddenly went ‘Bang!’” he says. “It was an idea that grew and grew.” And its genesis had as much to do with Glenn Corpes as it did with Peter Molyneux.

Every Populous world is built out of combinations of just 56 blocks.

It all began when Corpes started showing off a routine he had written which let him build isometric landscapes out of three-dimensional blocks, like a virtual Lego set. You could move the viewpoint about the landscape, raising and lowering the land by left-clicking to add new blocks, right-clicking to remove them. Molyneux was immediately sure there was a game in there somewhere. His childhood memory of the ant farm leaping to mind, he said, “Let’s have a thousand people running around on it.”

Populous thus began with those little people in lieu of ants, wandering independently over Corpes’s isometric landscapes in real time. When they found a patch they liked, they would settle down, building little huts. Since, this being a computer game, the player would obviously need something to do as well, Molyneux started adding ways for you, as a sort of God on high, to influence the people’s behavior in indirect ways. He added something he called a “Papal Magnet,” a huge ankh you could place in the world to draw your people toward a given spot. But there would come a problem if the way to the ankh happened to be blocked by, say, a lake. Molyneux claims he added Populous‘s most basic mechanic, the thing you spend by far the most time doing when playing the game, as a response to his “incompetence” as a coder and resulting inability to write a proper path-finding algorithm: when your people get stuck somewhere, you can, subject to your mana reserves — even gods have limits — raise or lower the land to help them out. With that innovation, Populous from the player’s perspective became largely an exercise in terraforming, creating smooth, even landscapes on which your people can build their huts, villages, and eventually castles. As your people become fruitful and multiply, their prayers fuel your mana reserves.
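
The core of that loop is simple enough to sketch in a few lines of modern code. To be clear, this is only an illustration — every name and number below is invented rather than Bullfrog’s — but it shows how little machinery the business of raising land and checking for buildable ground really requires:

```python
# A minimal sketch (Python, hypothetical names) of a Populous-style
# heightmap: heights live on the corners of tiles, a tile can host a
# hut only when all four of its corners are level, and raising a
# corner costs mana -- even gods have limits.

class World:
    def __init__(self, size=8):
        self.size = size
        self.height = [[0] * (size + 1) for _ in range(size + 1)]
        self.mana = 20

    def is_buildable(self, x, y):
        """True when tile (x, y) is flat enough to settle on."""
        h = self.height
        return h[y][x] == h[y][x + 1] == h[y + 1][x] == h[y + 1][x + 1]

    def raise_corner(self, x, y, cost=1):
        """Raise one corner point, if the mana reserves allow it."""
        if self.mana < cost:
            return False
        self.height[y][x] += 1
        self.mana -= cost
        return True

world = World()
print(world.is_buildable(2, 2))   # True: virgin ground is flat
world.raise_corner(3, 3)          # the surrounding tiles now slope...
print(world.is_buildable(2, 2))   # ...False: no flat ground, no hut
```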

Next, Molyneux added warfare to the picture. Now you would be erecting mountains and lakes to protect your people from their enemies, who start out walking about independently on the other side of the world. The ultimate goal of the game, of course, is to use your people to wipe out your enemy’s people before they do the same to you; this is a very Old Testament sort of religious experience. To aid in that goal, Molyneux gradually added lots of other godly powers to your arsenal, more impressive than the mere raising and lowering of land if also far more expensive in terms of precious mana: flash floods, earthquakes, volcanic eruptions, etc. You know, all your standard acts of God, as found in the Bible and insurance claims.

Lego Populous. Bullfrog had so much fun with this implementation of the idea that they seriously discussed trying to turn it into a commercial board game.

Parts of Populous were prototyped on the tabletop. Bullfrog used Lego bricks to represent the landscapes, a handy way of implementing the raising-and-lowering mechanic in a physical space. They went so far as to discuss a license with Lego, only to be told that Lego didn’t support “violent games.” Molyneux admits that the board game, while playable, was very different from the computerized Populous, playing out as a slow-moving, chess-like exercise in strategy. The computer Populous, by contrast, can get as frantic as any action game, especially in the final phase when all the early- and mid-game maneuvering and feinting comes down to the inevitable final genocidal struggle between Good and Evil.

Bullfrog. From left: Glenn Corpes (artist and programmer), Shaun Cooper (artist and tester), Peter Molyneux (designer and programmer), Kevin Donkin (designer and programmer), Les Edgar (office manager), Andy Jones (artist and tester).

Ultimately far more important to the finished product than Bullfrog’s Lego Populous were the countless matches Molyneux played on the computer against Glenn Corpes. Apart from all of its other innovations in helping to invent the god-game and real-time-strategy genres, Populous was also a pioneering effort in online gaming. Multi-player games — the only way to play Populous for many months — took place between two people seated at two separate Amigas, connected together via modem or, if together in the same room as Molyneux and Corpes were, via a cable. Vanishingly few other designers were working in this space at the time, for understandable reasons: even leaving aside the fact that the majority of computer owners didn’t own modems, running a multi-player game in real time over a connection as slow as 1200 baud was a programming challenge not for the faint-hearted. The fact that it works at all in Populous rather puts the lie to Molyneux’s self-deprecating description of himself as a “rubbish” coder.
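
How do you run a real-time game at all over 1200 baud, a link that moves at best about 120 bytes per second? The time-honored answer — and I should stress that I don’t know the details of Bullfrog’s own protocol, so the sketch below illustrates the general technique rather than Populous itself — is to transmit not the world state but only each player’s commands, letting two identical, deterministic simulations tick forward in lockstep on both machines:

```python
import struct

def encode(cmd):
    # Pack a hypothetical (action, x, y) command into three bytes. At
    # roughly 120 bytes per second, even 1200 baud can carry several
    # such commands per second each way -- while the full world state
    # could never hope to fit.
    return struct.pack("3B", *cmd)

def decode(data):
    return struct.unpack("3B", data)

def lockstep_tick(sim, local_cmd, link):
    """One tick of deterministic lockstep over a slow serial link.
    'sim' and 'link' are stand-ins for the game simulation and the
    modem or null-modem connection."""
    link.send(encode(local_cmd))        # send only our own input...
    remote_cmd = decode(link.recv(3))   # ...and wait for the peer's
    sim.advance(local_cmd, remote_cmd)  # both sides compute the same new state
```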

You draw your people toward different parts of the map by placing the Papal Magnet. The first one to touch it becomes the leader. There are very few words in the game, which only made it that much easier for Electronic Arts to localize and popularize across Europe. Everything is instead done using the initially incomprehensible suite of icons near the bottom of the screen. Populous does become intuitive in time, but it’s not without a learning curve.

Development of Populous fell into a comfortable pattern. Molyneux and Corpes would play together for several hours every evening, then nip off to the pub to talk about their experiences. Next day, they’d tweak the game, then they’d go at it again. It’s here that we come to the beating heart of Molyneux’s description of Populous as a game evolved rather than designed. Almost everything in the finished game beyond the basic concept was added in response to Molyneux and Corpes’s daily wars. For instance, Molyneux initially added knights, super-powered individuals who can rampage through enemy territory and cause a great deal of havoc in a very short period of time, to prevent their games from devolving into endless stalemates. “A game could get to the point where both players had massive populations,” he says, “and there was just no way to win.” With knights, the stronger player “could go and massacre the other side and end the game at a stroke.”

A constant theme of all the tweaking was to make a more viscerally exciting game that played more quickly. For commercial as well as artistic reasons — Amiga owners weren’t particularly noted for their patience with slow-paced, cerebral games — this was considered a priority. Over the course of development, the length of the typical game Molyneux played with Corpes shrank from several hours to well under one.

Give them time, and your people will turn their primitive huts into castles.

Even tweaked to play quickly and violently, Populous was quite a departure from the tried-and-true Amiga fare of shoot-em-ups, platformers, and action-adventures. The unenviable task of trying to sell the thing to a publisher was given to Les Edgar. After visiting about a dozen publishers, he convinced Electronic Arts to take a chance on it. Bullfrog promised EA a finished Populous in time for Christmas 1988. By the time that deadline arrived, however, it was still an online multiplayer-only game, a prospect EA knew to be commercially untenable. Molyneux and his colleagues thus spent the next few months creating Populous‘s single-player “Conquest Mode.”

In addition to the green and pleasant land of the early levels, there are also worlds of snow and ice, desert worlds, and even worlds of fire and lava to conquer.

Perilously close to being an afterthought to the multi-player experience though it was, Conquest Mode would be the side of the game that the vast majority of its eventual players would come to know best if not exclusively. Rather than design a bunch of scenarios by hand, Bullfrog wrote an algorithm to procedurally generate 500 different “worlds” for play against a computer opponent whose artificial intelligence also had to be created from scratch during this period. This method of content creation, used most famously by Ian Bell and David Braben in Elite, was something of a specialty and signpost of British game designers, who, plagued by hardware limitations far more stringent than their counterparts in the United States, often used it as a way to minimize the space their games consumed in memory and on disk. Most recently, Geoff Crammond’s hit game The Sentinel, published by Firebird, had used a similar scheme. Glenn Corpes believes it may have been an EA executive named Joss Ellis who first suggested it to Bullfrog.

Populous‘s implementation is fairly typical of the form. Each of the 500 worlds except the first is protected by a password that is, like everything else, itself procedurally generated. When you win at a given level, you’re given the password to a higher, harder level; whether and how many levels you get to skip is determined by how resounding a victory you’ve just managed. It’s a clever scheme, packing a hell of a lot of potential gameplay onto a single floppy disk and even making an effort to avoid boring the good player — and all without forcing Bullfrog to deal with the complications of actually storing any state whatsoever onto disk.
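
The shape of such a scheme is easy to illustrate. Nothing below reflects Bullfrog’s actual generator or password format, which I don’t know; the essential trick is simply that both the world and its password are pure functions of the level number, so the game never has to write a byte to disk:

```python
import random

def make_world(level, size=32):
    """Deterministically generate a heightmap: the level number is the seed."""
    rng = random.Random(level)
    return [[rng.randint(0, 4) for _ in range(size)] for _ in range(size)]

def make_password(level):
    """Derive a pronounceable password from the level number alone."""
    rng = random.Random(level * 7919)   # any fixed mixing constant will do
    consonants, vowels = "BLMRSTK", "AEIOU"
    return "".join(rng.choice(consonants) + rng.choice(vowels) for _ in range(3))

# Win level 12 convincingly and the game might hand you the password to
# level 17; typing it back in later regenerates that world from scratch.
print(make_password(17))
```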

It inevitably all comes down to a frantic final free-for-all between your people and those of your enemy.

Given their previous failures, Bullfrog understandably wasn’t the most confident group when a well-known British games journalist named Bob Wade, who had already played a pre-release version of the game, came by for a visit. For hours, Molyneux remained too insecure to actually ask Wade the all-important question of what he thought of the game. At last, after Wade had joined the gang for “God knows how many” pints at their local, Molyneux worked up the courage to pop the question. Wade replied that it was the best game he’d ever played, and he couldn’t wait to get back to it — prompting Molyneux to think he must have made some sort of mistake, and that under no circumstances should he be allowed to play another minute of it in case his opinion should change. It was Wade and the magazine he was writing for at the time, ACE (Advanced Computer Entertainment), who coined the term “god game” in the glowing review that followed, the first trickle of a deluge of praise from the gaming press in Britain and, soon enough, much of the world.

Bullfrog’s first royalty check for Populous was for a modest £13,000. Their next was for £250,000, prompting a naive Les Edgar to call Electronic Arts about it, sure it was a mistake. It was no mistake; Populous alone reportedly accounted for one-third of EA’s revenue during its first year on the market. That Bullfrog wasn’t getting even bigger checks was a sign only of the extremely unfavorable deal they’d signed with EA from their position of weakness. Populous finally and definitively ended the now 30-year-old Peter Molyneux’s long run of obscurity and failure at everything he attempted. In his words, he went overnight from “urinating in the sink” and “owing more money than I could ever imagine paying back” to “an incredible life” in games. Port after port came out for the next couple of years, each of them becoming a bestseller on its platform. Populous was selected to become one of the launch titles for the Super Nintendo console in Japan, spawning a full-blown fad there that came to encompass comic books, tee-shirts, collectibles, and even a symphony concert. When they visited Japan for the first time on a promotional tour, Molyneux and Les Edgar were treated like… well, appropriately enough, like gods. Populous sold 3 million copies in all according to some reports, an almost inconceivable figure for a game during this period.

Amidst all its other achievements, Populous was also something of a pioneer in the realm of e-sports: The One magazine and Electronic Arts hosted a tournament to find the best Populous player in Britain.

While a relatively small percentage of Populous players played online, those who did became pioneers of sorts in their own right. Some bulletin-board systems set up matchmaking services to pair up players looking for a game, any time, day or night; the resulting connections sometimes spanned national borders or even oceans. The matchmakers were aided greatly by Bullfrog’s forward-thinking decision to make all versions of Populous compatible with one another in terms of online play. In making it so quick and easy to find an online opponent, these services prefigured the modern world of Internet-enabled online gaming. Molyneux pronounced them “pretty amazing,” and at the time they really were. In 1992, he spoke excitedly of a recent trip to Japan, where he’d seen a town “with 10,000 homes all linked together. You can play games with anybody in the place. It’s enormous, really enormous, and it’s growing.” If only he’d known what online gaming would grow into in the next decade or two…

A youngster named Andrew Reader wound up winning the tournament, only to get trounced in an exhibition match by the master, Peter Molyneux himself. There was talk of televising a follow-up tournament on Sky TV, but it doesn’t appear to have happened.

The original Amiga version of Populous had been released all but simultaneously with the Amiga version of SimCity. Press and public alike immediately linked the two games together; AmigaWorld magazine, for instance, went so far as to review them jointly in a single article. Both Will Wright of SimCity fame and Peter Molyneux were repeatedly asked in interviews whether they’d played the other’s game. Wright was polite but, one senses, a little uninterested in Populous, saying he “liked the idea of playing God and having a population follow you,” but “sort of wish they’d gone for a slightly more educational angle.” Molyneux was much more enthusiastic about his American counterpart’s work, repeatedly floating a scheme to somehow link the two games together in more literal fashion for online play. He claimed at one point that Maxis (developers of SimCity) and his own Bullfrog had agreed on a liaison “to go backwards and forwards” between their two companies to work on linking their games. The liaison, he claimed, had “the Populous landscape moving to and from SimCity,” and a finished product would be out sometime in 1992. Like quite a number of the more unbelievable schemes Molyneux has floated over the years, it never happened.

The idea of a linkage between SimCity and Populous, whether taking place online or in the minds of press and public, can seem on the face of it an exceedingly strange one today. How would the online linkage actually work anyway? Would the little medieval warriors from Populous suddenly start attacking SimCity‘s peaceful modern utopias? Or would Wright’s Sims plop themselves down in the middle of Molyneux’s apocalyptic battles and start building stadiums and power plants? These were very different games: Wright’s a noncompetitive, peaceful exercise in urban planning with strong overtones of edutainment; Molyneux’s a zero-sum game of genocidal warfare that aspired to nothing beyond entertainment. Knowing as we do today the future paths of these two designers — i.e., ever further in the directions laid down by these, their first significant works — only heightens the seeming dichotomy.

That said, there actually were and are good reasons to think of SimCity and Populous as two sides of the same coin. For us today, the first of those reasons is simple historical concordance: each marks the coming-out party of one of the most important game designers of all time, and they occurred within bare weeks of one another.

But of course the long-term importance of these two designers to their field wasn’t yet evident in 1989; obviously players were responding to something else in associating their games with one another. Once you stripped away their very different surface trappings and personalities, the very similar set of innovations at the heart of each was laid bare. AmigaWorld said it very well in that joint review: “The real joy of these programs is the interlocking relationships. Sure, you’re a creator, but even more a facilitator, influencer, and stage-setter for little computer people who act on your wishes in their own time and fashion.” It’s no coincidence that, just as Peter Molyneux was partly inspired by an ant hill to create Populous, one of Will Wright’s projects of the near future would be the virtual ant farm SimAnt. In creating the first two god games, the two were indeed implementing a very similar core idea, albeit each in his own very different way.

Joel Billings of SSI, the king of American strategy games, had founded his company back in 1979 with the explicit goal of making computerized versions of the board games he loved. SimCity and Populous can be seen as the point when computer strategy games transcended that traditional approach. The real-time nature of these games makes them impossible to conceive of as anything other than computer-based works, while their emergent complexity makes them objects of endless fascination for their designers as much as for their players — perhaps even more so.

In winning so many awards and entrancing so many players for so long, SimCity and Populous undoubtedly benefited hugely from their sheer novelty. Their flaws stand out more clearly today. With its low-resolution graphics and without the aid of modern niceties like tool tips and graphical overlays, SimCity struggles to find ways to communicate vital information about what your city is really doing and why, making the game into something of an unsatisfying black box unless and until you devote a lot of time and effort to understanding what affects what. Populous has many of the same interface frustrations, along with other problems that feel still more fundamental and intractable, especially if you, like the vast majority of players back in its day, experience it through its single-player Conquest Mode. Clever as they are, the procedurally generated levels combined with the fairly rudimentary artificial intelligence of your computer opponent introduce a lot of infelicities. Eventually you begin to realize that one level is pretty much the same as any other; you just need to execute the same set of strategies and tactics more efficiently to have success at the higher levels.

Both Will Wright and Peter Molyneux are firm adherents of the experimental, boundary-pushing school of game design — an approach that yields innovative games but not necessarily holistically good games every time out. And indeed, over his long career each of them has produced at least as many misses as hits, even if we dismiss the complaints of curmudgeons like me and lump SimCity and Populous into the category of the hits. Both designers have often fallen into the trap, if trap it be, of making games that are more interesting for creators and commentators than they are fun for actual players. And certainly both have, like all of us, their own blind spots: in relying so heavily on scientific literature to inform his games, Wright has often produced end results with something of the feel of a textbook, while Molyneux has often lacked the discipline and gravitas to fully deliver on his most grandiose schemes.

But you know what? It really doesn’t matter. We need our innovative experimentalists to blaze new trails, just as we need our more sober, holistically-minded designers to exploit the terrain they discover. SimCity and Populous would be followed by decades of games that built on the possibilities they revealed — many of which I’d frankly prefer to play today over these two original ground-breakers. But, again, that reality doesn’t mean we should celebrate SimCity and Populous one iota less, for both resoundingly pass the test of historical significance. The world of gaming would be a much poorer place without Will Wright and Peter Molyneux and their first living worlds inside a box.

(Sources: The Official Strategy Guide for Populous and Populous II by Laurence Scotford; Master Populous: Blueprints for World Power by Clayton Walnum; Amazing Computing of October 1989; Next Generation of November 1998; PC Review of July 1992; The One of April 1989, September 1989, and May 1991; Retro Gamer 44; AmigaWorld of December 1987, June 1989, and November 1989; The Games Machine of November 1988; ACE of April 1989; the bonus content to the film From Bedrooms to Billions. Archived online sources include features on Peter Molyneux and Bullfrog for Wired Online, GameSpot, and Edge Online. Finally, Molyneux’s postmortem on Populous at the 2011 Game Developers Conference.

Populous is available for purchase from GOG.com.)

 


Acorn and Amstrad

…he explains to her that Sinclair, the British inventor, had a way of getting things right, but also exactly wrong. Foreseeing the market for affordable personal computers, Sinclair decided that what people would want to do with them was to learn programming. The ZX81, marketed in the United States as the Timex 1000, cost less than the equivalent of a hundred dollars, but required the user to key in programs, tapping away on that little motel keyboard-sticker. This had resulted both in the short market-life of the product and, in Voytek’s opinion, twenty years on, in the relative preponderance of skilled programmers in the United Kingdom. They had had their heads turned by these little boxes, he believes, and by the need to program them. “Like hackers in Bulgaria,” he adds, obscurely.

“But if Timex sold it in the United States,” she asks him, “why didn’t we get the programmers?”

“You have programmers, but America is different. America wanted Nintendo. Nintendo gives you no programmers…”

— William Gibson, Pattern Recognition

A couple of years ago I ventured out of the man cave to give a talk about the Amiga at a small game-development conference in Oslo. I blazed through as much of the platform’s history as I could in 45 minutes or so, emphasizing for my audience of mostly young students from a nearby university the Amiga’s status as the preeminent gaming platform in Europe for a fair number of years. They didn’t take much convincing; even this crowd, young as they were, had their share of childhood memories involving Amiga 500s and 1200s. Mostly they seemed surprised that the Amiga hadn’t ever been all that terribly popular in the United States. During the question-and-answer session, someone asked a question that stopped me short: if American kids hadn’t been playing games on their Amigas, just what the hell had they been playing on?

The answer itself wasn’t hard to arrive at: the sorts of kids who migrated from 8-bit Sinclairs, Acorns, Amstrads, and Commodores to 16-bit Amigas and Atari STs in Britain made a much more lateral move in the United States, migrating to the 8-bit Nintendo Entertainment System.

More complex and interesting are the ramifications of these trends. Because the Atari VCS console was never a major presence in Britain and the rest of Europe during its heyday, and because Nintendo arrived only very belatedly, for many years videogames played in the home there meant games played on home computers. One could say much about how it affected the market across Europe to have a device useful for creation as well as consumption as most people’s favored gaming platform. The magazines were filled with stories of bedroom gamers who had become bedroom coders and finally Software Stars. Such stories make a marked contrast to an American console-gaming magazine like Nintendo Power, all about consumption without the accompanying ethos of creation.

But most importantly for our purposes today, the relative neglect of Britain in particular by the big computing powers in the United States and Japan — for many years, Commodore was the only company of either nation to make a serious effort to sell their machines into British homes — gave space for a flourishing domestic trade in homegrown machines. When Britain became the nation with the most computers per capita on the planet at mid-decade, most of the computers in question bore the logo of either Acorn or Sinclair, the two great rivals at the heart of the young British microcomputer industry.

Acorn, co-founded by Clive Sinclair’s former right-hand man Chris Curry and an Austrian academic named Hermann Hauser, was an archetypal example of an engineering-driven company. Their machines were a little more baroque, a little better built, and consequently a little more expensive than they needed to be, while their public persona was reserved and just a little condescending, much like that of the BBC that had given its official imprimatur to Acorn’s most popular machine, the BBC Micro. Despite “Uncle Clive’s” public reputation as the British Inspector Gadget, Sinclair was just the opposite: cheap and cheerful, they had the common touch. Acorns sold to the educators, to the serious hobbyists, and to the posh, while Sinclairs dominated with the masses.

Yet Acorn and Sinclair were similar in one important respect: they were both in their own ways very poorly managed companies. When the British home-computer market hit an iceberg in 1985, both were caught in untenable positions, drowning in excess inventory. Acorn — quintessentially British, based in the storied heart of Britain’s “Silicon Fen” of Cambridge — was faced with a choice between dissolution and selling themselves to the Italian typewriter manufacturer Olivetti; after some hand-wringing, they chose the latter course. Sinclair also sold out: to the new kid on the block of British computing, Amstrad, owned by a gruff Cockney with a penchant for controversy named Alan Sugar who was well on his way to becoming the British Donald Trump.

Ever mindful of the practical concerns of their largely working-class customers, Amstrad made much of the CPC’s bundled monitor in their advertising, noting that Junior could play on the CPC without tying up the family television.

Amstrad had already been well-established as a maker of inexpensive stereo equipment and other consumer electronics when their first computers, the CPC (“Colour Personal Computer”) line, debuted in June of 1984. The CPC range was created and sold as a somewhat more capable Sinclair Spectrum. It consisted of well-built and smartly priced if technically unimaginative computers that were fine choices for gaming, boasting as they did reasonably good if hardly revolutionary graphics and sound. Like most Amstrad products, they strained to be as easy to use as possible, shipping as complete units — tape or disk drive and monitor included — at a time when virtually all of their rivals had to be assembled piece by piece via separate purchases.

The CPC line did very well from the outset, even as Acorn and Sinclair were soon watching their own sales implode. Pundits attributed the line’s success to what they called “the Amstrad Effect”: Alan Sugar’s instinct for delivering practical products at a good price at the precise instant when the technology behind them was ready for the mass market — i.e., was about to become desirable to his oft-stated target demographic of “the truck driver and his wife.” Sugar preferred to let others advance the technical state of the art, then swoop in to reap the rewards of their innovations when the time was right. The CPC line was a great example of him doing just that.

But the most dramatic and surprising iteration of the Amstrad Effect didn’t just feed the existing market for colorful game machines; it found an entirely new market segment, one that Amstrad’s competitors had completely missed until now. The story of the creation of the Amstrad PCW line is a classic tale of Alan Sugar, a man who knew almost nothing about computers but knew all he needed to about the people who bought them.

One day just a few months after the release of the first CPC machines, Sugar found himself in an airplane over Asia with Bob Watkins, one of his most trusted executives. A restless Sugar asked Watkins for a piece of paper, and proceeded to draw on it a contraption that included a computer, a monitor, a disk drive, and a printer, all in one unit. Looking at the market during the run-up to the CPC launch, Sugar had recognized that the only true mainstream uses for the current generation of computers in the home were as game machines and word processors. With the CPC, he had the former application covered. But what about the latter? All of the inexpensive machines currently on the market, like the Sinclair Spectrum, were oriented toward playing games rather than word processing, trading the possibility of displaying crisp 80-column text for colorful graphics in lower resolutions. Meanwhile all of the more expensive ones, like the BBC Micro, were created by and for hardcore techies rather than Sugar’s truck drivers. If they could apply their patented technology-for-the-masses approach to a word processor for the home and small business — making a cheap, well-built, all-in-one design emphasizing ease of use for the common person — Amstrad might just have another hit on their hands, this time in a market of their own utterly without competition. Internally, the project was named after Sugar’s secretary Joyce, since it would hopefully make her job and those of many like her much easier. It would eventually come to market as the “PCW,” or “Personal Computer Word Processor.”

The first Amstrad PCW machine, complete with bundled printer. Note how the disk drive and the computer itself are built into the same case as the monitor, a very unusual design for the period.

Even more so than the CPC, the PCW was a thoroughly underwhelming package for technophiles. It was built around the tried-and-true Z80 8-bit CPU and ran CP/M, an operating system already considered obsolete by big business, MS-DOS having become the standard in the wake of the IBM PC. The bundled word-processing software, contracted out to a company called Locomotive Software, wasn’t likely to impress power users of WordStar or WordPerfect overmuch — but it was, in keeping with the Amstrad philosophy, unusually friendly and easy to use. Sugar knew his target customers, knew that they “didn’t give a shit whether there was an elastic band or an 8086 or a 286 driving the thing. They wouldn’t know what you were talking about.”

As usual, most of Amstrad’s hardware-engineering efforts went into packaging and cost-cutting. It was decided that the printer would have to be housed separately from the system unit for technical reasons, but otherwise the finished machine conformed remarkably well to Sugar’s original vision. Best of all, it had a price of just £399. By way of comparison, Acorn’s most recent BBC Micro Model B+ had half as much memory and no disk drive, monitor, or printer included — and was priced at £499.

Nervous as ever about intimidating potential customers, Amstrad was at pains to market the PCW first and foremost as a turnkey word-processing solution for homes and small businesses, as a general-purpose computer only secondarily if at all. “It’s more than a word processor for less than most typewriters,” ran their tagline. At the launch event in the heart of the City in August of 1985, three female secretaries paraded across the stage: a snooty one who demanded one of the competition’s expensive computer systems; a tarty one who said a typewriter was more than good enough; and a smart, reasonable one who naturally preferred the PCW. Man-of-the-people Sugar crowed extravagantly that Amstrad had “brought word-processing within the reach of every small business, one-man band, home-worker, and two-finger typist in the country.” Harping on one of his favorite themes, he noted that once again Amstrad had “produced what the customer wants and not a boffin’s ego trip.”

Sugar’s aggressive manner may have grated on many buttoned-down trade journalists, but few could deny that he might just open up a whole new market for computers with the PCW. Electrical Retailer and Trader was typical, calling the PCW “a grown-up computer that does something people want, packaged and sold in a way they can understand, at a price they’ll accept.” But even that note of optimism proved far too mild for the reality of the machine’s success. The PCW exploded out of the gate, selling 350,000 units in the first eight months. It probably could have sold a lot more than that, but Amstrad, caught off-guard by the sales numbers despite their founder’s own bullishness on the product, couldn’t make and ship them fast enough.

Level 9’s Time and Magik text adventure running on a PCW.

Surprisingly for such a utilitarian package, the PCW garnered considerable loyalty and even love among the millions in Britain and all across Europe who eventually bought one. Their enthusiasm was enough to sustain a big, glossy newsstand magazine dedicated to the PCW alone — an odd development indeed for this machine that seemed on the face of it to be anything but a hacker’s darling. A thriving software ecosystem that reached well beyond word processing sprang up around the machine. Despite the PCW’s monochrome display and virtually nonexistent animation and sound capabilities, even games were far from unheard of on the platform. For obvious reasons, text adventures in particular became big favorites of PCW owners; with its comfortable full-travel keyboard, its fast disk drive, its relatively cavernous 256 K of memory, and its 80-column text display, a PCW was actually a far better fit for the genre than the likes of a Sinclair Spectrum. The PCW market for text adventures was strong enough to quite possibly allow companies like Magnetic Scrolls and Level 9 to hang on a year or two longer than they might otherwise have managed.

So, Amstrad was already soaring on the strength of the CPC and especially the PCW when they shocked the nation and cemented their position as the dominant force in mainstream British computing with the acquisition of Sinclair in April of 1986. Eminently practical man of business that he was, Sugar bought Sinclair partly to eliminate a rival, but also because he realized that, home-computer slump or no, the market for a machine as popular as the Sinclair Spectrum wasn’t likely to just disappear overnight. He could pick up right where Uncle Clive had left off, selling the existing machine just as it was to new buyers who wanted access to the staggering number of cheap games available for the platform. Sugar thought he could make a hell of a lot of money this way while needing to expend very little effort.

Once again, time proved him more correct than even he had ever imagined. Driven by that huge base of games, demand for new Spectrums persisted into the 1990s. Amstrad repackaged the technology from time to time and, perhaps most importantly, dramatically improved on Sinclair’s infamously shoddy quality control. But they never seriously re-imagined the Spectrum. It was now what Sugar liked to call “a commodity product.” He compared it to suntan lotion of all things: the department stores “put it in their window in July and August and they take it away in the winter.” The Spectrum’s version of July and August was of course November and December; every Christmas sparked a new rush of sales to the parents of a new group of youngsters just coming of age and discovering the magic of videogames.

A battered and uncertain Acorn, now a subsidiary of Olivetti, faced a formidable rival indeed in Alan Sugar’s organization. In a sense, the fundamental dichotomies hadn’t changed that much since Amstrad took Sinclair’s place as the yin to Acorn’s yang. Acorn remained as technology-driven as ever, while Amstrad was all about giving the masses what they craved in the form of cheap computers that were technically just good enough. Amstrad, however, was a much more dangerous form of people’s computer company than had been their predecessor in the role. After releasing some notoriously shoddy stereo equipment under the Amstrad banner in the 1970s and paying the price in returns and reputation, Alan Sugar had learned a lesson that continued to elude Clive Sinclair: that selling well-built, reliable products, even at a price of a few more quid on the final price tag and/or a few less in the profit margin, pays off more than corner-cutting in the long run. Unlike Uncle Clive, who had bumbled and stumbled his way to huge success and just as quickly back to failure, Sugar was a seasoned businessman and a master marketer. The diffident boffins of Acorn looked destined to have a hard time against a seasoned brawler like Sugar, raised on the mean streets of the cutthroat Tottenham Court Road electronics trade. It hardly seemed a fair fight at all.

But then, in the immediate wake of the acquisition by Olivetti, little boded well for Acorn. New hardware releases were limited to enhanced versions of the 1981-vintage, 8-bit BBC Micro line that were little more ambitious than Amstrad’s re-packagings of the Spectrum. It was an open secret that Acorn was putting much effort into designing a new CPU in-house to serve as the heart of their eventual next-generation machine, an unprecedented step in an industry where CPU-makers and computer-makers had always been separate entities. For many, it seemed yet one more example of Acorn’s boffinish tendencies getting the best of them, causing them to laboriously reinvent the wheel rather than do what the rest of the microcomputer world was doing: grabbing a 68000 from Motorola or an 80286 from Intel and just getting on with the 16-bit machine their customers were clamoring for. While Acorn dithered with their new chip, they continued to fall further and further behind Amstrad, who in the wake of the Sinclair acquisition had now gone from a British home-computer market share of 0 to 60 percent in less than two years. Acorn was beginning to look downright irrelevant to many Britons in the market for the sorts of affordable, practical computer systems Amstrad was happily providing them with by the bucketful.

Measured in terms of public prominence, Acorn’s best days were indeed already behind them; they would never recapture those high-profile halcyon days of the early 1980s, when the BBC Micro had first been anointed as the British establishment’s officially designated choice for those looking to get in on the ground floor of the computer revolution. Yet the new CPU they were now in the midst of creating, far from being a pointless boondoggle, would ultimately have a far greater impact than anything they’d done before — and not just in Britain but over the entire world. For the CPU architecture Acorn was creating in those uncertain mid-1980s was the one that has gone on to become the most popular ever: the ubiquitous ARM. “ARM” originally stood for “Acorn RISC Machine”; only later would it be retrofitted into “Advanced RISC Machine.” Needless to say, no one at Acorn had any idea of the monster they were creating. How could they?

ARM, the chip that changed the world.

“RISC” stands for “Reduced Instruction Set Computer.” The idea didn’t originate with Acorn, but had already been kicking around American university and corporate engineering departments for some time. (As Hermann Hauser later wryly noted, “Normally British people invent something, and the exploitation is in America. But this is a counterexample.”) Still, the philosophy behind ARM was adhered to by only a strident minority before Acorn first picked it up in 1983.

The overwhelming trend in commercial microprocessor design up to that point had been for chips to offer ever larger and more complex instruction sets. By making “opcodes” — single instructions issued directly to the CPU — capable of doing more in a single step, machine-level code could be made more comprehensible for programmers and the programs themselves more compact. RISC advocates came to call this traditional approach to CPU architecture “CISC,” or “Complex Instruction Set Computing.” They believed that CISC was becoming increasingly counterproductive with each new generation of microprocessors. Seeing how the price and size of memory chips continued to drop significantly almost every year, they judged — in the long term, correctly — that memory usage would become much less important than raw speed in future computers. They therefore also judged that it would be more than acceptable in the future to trade smaller programs for faster ones. And they judged that they could accomplish exactly that trade-off by traveling directly against the prevailing winds in CPU design — by making a CPU that offered a radically reduced instruction set of extremely simple opcodes that were each ruthlessly optimized to execute very, very quickly.

A program written for a RISC processor might need to execute far more opcodes than the same program written for a CISC processor, but those opcodes would execute so quickly that the end result would still be a dramatic increase in throughput. Yes, it would use more memory, and, yes, it would be harder to read as machine code — but already fewer and fewer people were programming computers at such a low level anyway. The trend, which they judged likely only to accelerate, was toward high-level languages that abstracted away the details of processor design. In this prediction again, time would prove the RISC advocates correct. Programs might not even grow as much as one might think; RISC advocates argued, with some evidence to back up their claims, that few programs really took full advantage of the more esoteric opcodes of the CISC chips — that the CISC chips were in effect being programmed as if they were RISC chips much of the time anyway. In short, then, a small but not insubstantial minority of academic and corporate researchers were beginning to believe that the time was ripe to replace CISC with RISC.

And now Acorn was about to act on their belief. In typical boffinish fashion, their ARM project was begun as essentially a personal passion project by Roger Wilson [1]Roger Wilson now lives as Sophie Wilson. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times. and Steve Furber, two key engineers behind the original BBC Micro. Hermann Hauser admits that for quite some time he gave them “no people” and “no money” to help with the work, making ARM “the only microprocessor ever to be designed by just two people.” When talks began with Olivetti in early 1985, ARM remained such a back-burner long-shot that Acorn never even bothered to tell their potential saviors about it. But as time went on the ARM chip came more and more to the fore as potentially the best thing Acorn had ever done. Having, almost perversely in the view of many, refused to produce a 16-bit replacement for the BBC Micro line for so long, Acorn now proposed to leapfrog that generation entirely; the ARM, you see, was a 32-bit chip. Early tests of the first prototype in April of 1985 showed that at 8 MHz it yielded an average throughput of about 3.5 MIPS, compared to 2.5 MIPS at 10 MHz for the 68020, the first 32-bit entry in Motorola’s popular 68000 line of CISC processors. And the ARM was much, much cheaper and simpler to produce than the 68020. It appeared that Wilson and Furber’s shoestring project had yielded a world-class microprocessor.
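
Those benchmark numbers reward a moment’s arithmetic. Dividing throughput by clock speed gives instructions completed per clock cycle, and by that measure the two-man ARM was doing nearly twice the work per cycle of Motorola’s flagship:

```python
# A back-of-the-envelope reading of the benchmark figures quoted above.
arm_ipc    = 3.5e6 / 8e6     # ARM prototype: 3.5 MIPS at 8 MHz  -> ~0.44 instructions/clock
m68020_ipc = 2.5e6 / 10e6    # 68020:         2.5 MIPS at 10 MHz -> 0.25 instructions/clock
print(arm_ipc / m68020_ipc)  # ~1.75
```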

ARM made its public bow via a series of little-noticed blurbs that appeared in the British trade press around October of 1985, even as the stockbrokers in the City and BBC Micro owners in their homes were still trying to digest the news of Acorn’s acquisition by Olivetti. Acorn was testing a new “super-fast chip,” announced the magazine Acorn User, which had “worked the first time”: “It is designed to do a limited set of tasks very quickly, and is the result of the latest thinking in chip design.” From such small seeds are great empires sown.

The Acorn Archimedes

The machine that Acorn designed as a home for the new chip was called the Acorn Archimedes — or at times, because Acorn had been able to retain the official imprimatur of the BBC, the BBC Archimedes. It was on the whole a magnificent piece of kit, in a different league entirely from the competition in terms of pure performance. It was, for instance, several times faster than a 68000-based Amiga, Macintosh, or Atari ST in many benchmarks despite running at a clock speed of just 8 MHz, roughly the same as all of the aforementioned competitors. Its graphic capabilities were almost as impressive, offering 256 colors onscreen at once from a palette of 4096 at resolutions as high as 640 X 512. So, Acorn had the hardware side of the house well in hand. The problem was the software.

Graphical user interfaces being all the rage in the wake of the Apple Macintosh’s 1984 debut, Acorn judged that the Archimedes as well had to be so equipped. Deciding to go to the source of the world’s very first GUI, they opened a new office for operating-system development a long, long way from their Cambridge home: right next door to Xerox’s famed Palo Alto Research Center, in the heart of California’s Silicon Valley. But the operating-system team’s progress was slow. Communication and coordination were difficult over such a distance, and the team seemed to be infected with the same preference for abstract research over practical product development that had always marked Xerox’s own facility in Palo Alto. The new operating system, to be called ARX, lagged far behind hardware development. “It became a black hole into which we poured effort,” remembers Wilson.

At last, with the completed Archimedes hardware waiting only on some software to make it run, Acorn decided to replace ARX with something they called Arthur, a BASIC-based operating environment very similar to the old BBC BASIC with a rudimentary GUI stuck on top. “All operating-system geniuses were firmly working on ARX,” says Wilson, “so we couldn’t actually spare any of the experts to work on Arthur.” The end result did indeed look like something put together by Acorn’s B team. Parts of Arthur were actually written in interpreted BASIC, which Acorn was able to get away with thanks to the blazing speed of the Archimedes hardware. Still, running Arthur on hardware designed for a cutting-edge Unix-like operating system with preemptive multitasking and the whole lot was rather like dropping a two-speed gearbox into a Lamborghini; it got the job done, after a fashion, but felt rather against the spirit of the thing.

When the Archimedes debuted in August of 1987, its price tag of £975 and up along with all of its infelicities on the software side gave little hope to those not blinded with loyalty to Acorn that this extraordinary machine would be able to compete with Amstrad’s good-enough models. The Archimedes was yet another Acorn machine for the boffins and the posh. Most of all, though, it would be bought by educators who were looking to replace aging BBC Micros and might still be attracted by the BBC branding and the partial compatibility of the new machine with the old, thanks to software emulators and the much-loved BBC BASIC still found as the heart of Arthur.

Even as Amstrad continued to dominate the mass market, a small but loyal ecosystem sprang up around the Archimedes, enough to support a software scene strong on educational software and technical tools for programming and engineering, all a natural fit for the typical Acorn user. And, while the Archimedes was never likely to become the first choice for pure game lovers, a fair number of popular games did get ported. After all, even boffins and educators — or, perhaps more likely, their students — liked to indulge in a bit of pure fun sometimes.

In April of 1989, after almost two long, frustrating years of delays, Acorn released a revision of Arthur comprehensive enough to be given a whole new name. The new RISC OS incorporated many if not all of the original ambitions for ARX, at last providing the Archimedes with an attractive modern operating system worthy of its hardware. But by then, of course, it was far too late to capture the buzz a more complete Archimedes package might have garnered at its launch back in 1987.

Much to the frustration of many of their most loyal customers, Acorn still seemed not so much inept at marketing their wares to the common person as completely uninterested in doing so. It was as if they felt themselves somehow above it all. Perhaps they had taken a lesson from their one earlier attempt to climb down from their ivory tower and sell a computer for the masses. That attempt had taken the form of the Acorn Electron, a cut-down version of the BBC Micro released in 1983 as a direct competitor to the Sinclair Spectrum. Poor sales and overproduction of the Electron had been the biggest single contributor to Acorn’s mid-decade financial collapse and the loss of their independence to Olivetti. Having survived that trauma (after a fashion), Acorn seemed content to tinker away with technology for its own sake and to let the chips fall where they would when it came to actually selling the stuff that resulted.

Alan Sugar shows off the first of his new line of PC clones.

If it provided any comfort to frustrated Acorn loyalists, Amstrad also began to seem more and more at sea after their triumphant first couple of years in the computer market. In September of 1986, they added a fourth line of computers to their catalog with the release of the PC — as opposed to PCW — range. The first IBM clones targeted at the British mass market, the Amstrad PC line might have played a role in its homeland similar to that of the Tandy 1000 in the United States, popularizing these heretofore business-centric machines among home users. As usual with Amstrad, the price certainly looked right for the task. The cheapest Amstrad PC model, with a generous 512 K of memory but no hard drive, cost £399; the most expensive, which included a 20 MB hard drive, £949. Before the Amstrad PC’s release, the cheapest IBM clone on the British market had retailed for £1429.

But, while not a flop, the PC range never took off quite as meteorically as some had expected. For months the line was dogged by reports of overheating brought on by the machine’s lack of a fan (shades of the Apple III fiasco) that may or may not have had a firm basis in fact. Alan Sugar himself was convinced that the reports could be traced back to skulduggery by IBM and other clone manufacturers trying to torpedo his cheaper machines. When he finally bowed to the pressure to add a fan, he did so as gracelessly as imaginable.

I’m a realistic person and we are a marketing organization, so if it’s the difference between people buying the machine or not, I’ll stick a bloody fan in it. And if they say they want bright pink spots on it, I’ll do that too. What is the use of me banging my head against a brick wall and saying, “You don’t need the damn fan, sunshine?”

But there were other problems as well, problems that were less easily fixed. Amstrad struggled to source hard disks, which had proved a far more popular option than expected, resulting in huge production backlogs on many models. And, worst of all, they found that they had finally overreached themselves by setting the prices too low to be realistically sustainable; prices began to creep upward almost immediately.

For that matter, prices were creeping upward across Amstrad’s entire range of computers. In 1986, after years of controversy over the alleged dumping of memory chips into the international market on the part of the Japanese semiconductor industry, the United States pressured Japan into signing a trade pact that would force them to throttle back their production and increase their prices. Absent the Japanese deluge, however, there simply weren’t enough memory chips being made in the world to fill an ever more voracious demand. By 1988, the situation had escalated into a full-blown crisis for volume computer manufacturers like Amstrad, who couldn’t find enough memory chips to build all the computers their customers wanted — and certainly not at the prices their customers were used to paying for them. Amstrad’s annual sales declined for the first time in a long time in 1988 after they were forced to raise prices and cut production dramatically due to the memory shortage. Desperate to secure a steady supply of chips so he could ramp up production again, Sugar bought into Micron Technology, one of only two American firms making memory chips, in October of 1988 to the tune of £45 million. But within a year the memory-chip crisis, anticipated by virtually everyone at the time of the Micron buy-in to go on for years yet, petered out when factories in other parts of Asia began to come online with new technologies to produce memory chips more cheaply and quickly than ever. Micron’s stock plummeted, another major loss for Amstrad. The buy-in hadn’t been “the greatest deal I’ve ever done,” admitted Sugar.

Many saw in the Amstrad of these final years of the 1980s an all too typical story in business: that of a company that had been born and grown wildly as a cult of personality around its founder, until one day it got too big for any one man to oversee. The founder’s vision seemed to bleed away as the middle managers and the layers of bureaucracy moved in. Seduced by the higher profit margins enjoyed by business computers, Amstrad strayed ever further from Sugar’s old target demographic. New models in the PC range crept north of £1000, even £2000 for the top-of-the-line machines, while the more truck-driver-focused PCW and CPC lines were increasingly neglected. The CPC line would be discontinued entirely in 1990, leaving only the antique Spectrum to soldier on for Amstrad for a couple more years in the role of general-purpose home computer. At some fundamental level, Amstrad didn’t seem to know how to go about producing a brand-new machine in the spirit of the CPC in an era when making a new home computer was much more complicated than plugging together some off-the-shelf chips and hiring a few hackers to knock out a BASIC for the thing. Amstrad would continue to make computers for many years to come, but by the time the 1990s dawned their short-lived glory days of 60 percent market share were already fading into the rosy glow of nostalgia.

For all their very real achievements over the course of a very remarkable decade in British computing, Acorn and Amstrad each had their own unique blind spot that kept them from achieving even more. In the Archimedes, Acorn had a machine that was a match for any other microcomputer in the world in any application you cared to name, from games to business to education. Yet they released it in half-baked form at too high a price, then failed to market it properly. In their various ranges, Amstrad had the most comprehensive lineup of computers of anyone in Britain during the mid- to late 1980s. Yet they lacked the corporate culture to imagine what people would want five years down the road in addition to what they wanted today. The world needs visionaries and commodifiers alike. What British computing lacked in the 1980s was a company capable of integrating the two.

That lack left a huge gap in the market wide open: space for a next-generation home computer with a lot more power and much better graphics and sound than the likes of the old Sinclair Spectrum, but that still wouldn’t cost a fortune. Packaged, priced, and marketed differently, the Archimedes might have been that machine. As it was, buyers looked to foreign companies to provide it. Neglected as Europe still was by the console makers of Japan, the British punters’ choice largely came down to one of two American imports, the Commodore Amiga and the Atari ST. Both — especially the former — would live very well for years in this gap that neither Acorn nor Amstrad deigned to fill. Acorn did belatedly try with the release of the Archimedes A3000 model in mid-1989 — laid out in the all-in-one-case, disk-drive-on-the-side fashion of an Amiga 500, styled to resemble the old BBC Micro, and priced at a more reasonable if still not quite reasonable enough £745. But by that time the Archimedes’s fate as a boutique computer for the wealthy, the dedicated, and the well-connected was already decided. As the decade ended, an astute observer could already detect that the wild and woolly days of British computing as a unique culture unto itself were numbered.

The Archimedes A3000 marked the end of an era, the last Acorn machine to bear the BBC logo.

And that would be that, but for one detail: the fairly earth-shattering detail of ARM. The ARM CPU’s ability to get extraordinary performance out of a relatively low clock speed had a huge unintended benefit that Acorn barely even noticed when designing it. In the world of computer engineering, higher clock speeds translate quite directly into higher power usage (a relationship sketched in rough numbers below). Thus the ARM chip could do more with less power, a quality that, along with its cheapness and simplicity, made it the ideal choice for an emerging new breed of mobile computing devices. In 1990 Apple Computer, hard at work on a revolutionary “personal digital assistant” called the Newton, came calling on Acorn. That November a new spinoff was formed, a partnership among Acorn, Apple, and the semiconductor firm VLSI Technology, who had been fabricating Acorn’s ARM chips from the start. Called simply ARM Holdings, it was intended as a way to popularize the ARM architecture, particularly in the emerging mobile space, among end-user computer manufacturers like Apple who might be leery of buying ARM chips from a direct competitor like Acorn.
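To put the relationship between clock speed and power in rough quantitative terms, here is a back-of-the-envelope sketch. The dynamic power dissipated by a CMOS chip such as the ARM is conventionally approximated by the first-order formula

\[ P_{\text{dynamic}} \approx \alpha \, C \, V^{2} \, f \]

where α is the fraction of the transistors switching in a given cycle, C the capacitance they drive, V the supply voltage, and f the clock frequency. Power thus grows linearly with clock speed; better still, a chip that can get by with a slower clock can usually also run at a lower voltage, and power grows with the square of that voltage, so the savings compound. Taking purely illustrative figures rather than actual ARM specifications: halving f while dropping V from 5 volts to 3.3 volts cuts dynamic power to (1/2) × (3.3/5)² ≈ 0.22 of its original value, roughly a fifth of the power for half the clock speed.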

And popularize it has. To date about ten ARM CPUs have been made for every man, woman, and child on the planet, and the numbers look likely to keep soaring for many years to come. ARM CPUs are found today in more than 95 percent of all mobile phones. Throw in laptops (even laptops built around Intel processors usually boast several ARM chips as well), tablets, music players, cameras, GPS units… well, you get the picture. If it’s portable and it’s vaguely computery, chances are there’s an ARM inside. The most successful CPU architecture the world has ever known, ARM stands as a classic example of unintended benefits in engineering. Not a bad legacy for an era, is it?

(Sources: the book Sugar: The Amstrad Story by David Thomas; Acorn User of July 1985, October 1985, March 1986, September 1986, November 1986, June 1987, August 1987, September 1987, October 1988, November 1988, December 1988, February 1989, June 1989, and December 1989; Byte of November 1984; 8000 Plus of October 1986; Amstrad Action of November 1985; interviews with Hermann Hauser, Sophie Wilson, and Steve Furber at the Computer History Museum.)

Footnotes

1 Roger Wilson now lives as Sophie Wilson. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.
 
