
The View from the Trenches (or, Some Deadly Sins of CRPG Design)

From the beginning of this project, I’ve worked to remove the nostalgia factor from my writing about old games, to evaluate each game strictly on its own merits and demerits. I like to think that this approach has made my blog a uniquely enlightening window into gaming history. Still, one thing my years as a digital antiquarian have taught me is that you tread on people’s nostalgia at your peril. Some of what I’ve written here over the years has certainly generated its share of heat as well as light, not so much among those of you who are regular readers and commenters — you remain the most polite, thoughtful, insightful, and just plain nice readers any writer could hope to have — as among the ones who fire off nasty emails from anonymous addresses, who post screeds on less polite sites to which I’m occasionally pointed, or who offer up their drive-by comments right here every once in a while.

A common theme of these responses is that I’m not worthy of writing about this stuff, whether because I wasn’t there at the time — actually, I was, but whatever — or because I’m just not man enough to take my lumps and power through the really evil, unfair games. This rhetoric of inclusion and exclusion is all too symptomatic of the uglier sides of gaming culture. Just why so many angry, intolerant personalities are so attracted to computer games is a fascinating question, but must remain a question for another day. For today I will just say that, even aside from their ugliness, I find such sentiments strange. As far as I know, there’s zero street cred to be gained in the wider culture from being good at playing weird old videogames — or for that matter from being good at playing videogames of any stripe. What an odd thing to construct a public persona around. I’ve made a job out of analyzing old games, and even I sometimes want to say, “Dude, they’re just old games! Really, truly, they’re not worth getting so worked up over.”

That said, there do remain some rays of light amidst all this heat. It’s true that my experience of these games today — of playing them in a window on this giant monitor screen of mine, or playing them on the go on a laptop — must be in some fairly fundamental ways different from the way the same games were experienced all those years ago. One thing that gets obviously lost is the tactile, analog side of the vintage experience: handling the physical maps and manuals and packages (I now reference that stuff as PDF files, which isn’t quite the same); drawing maps and taking notes using real pen and paper (I now keep programs open in separate windows on that aforementioned giant monitor for those purposes); listening to the chuck-a-chunk of disk drives loading in the next bit of text or scenery (replacing the joy of anticipation is the instant response of my modern supercomputer). When I allow myself to put on my own nostalgia hat, just for a little while, I recognize that all these things are intimately bound up with my own memories of playing games back in the day.

And I also recognize that the discrepancies between the way I play now and the way I played back then go even further. Some of the most treasured of vintage games weren’t so much single works to be played and completed as veritable lifestyle choices. Ultima IV, to name a classic example, was huge enough and complicated enough that a kid who got it for Christmas in 1985 might very well still be playing it by the time Ultima V arrived in 1988; rinse and repeat for the next few entries in the series. From my jaded perspective, I wouldn’t brand any of these massive CRPGs as especially well-designed in the sense of being reasonably soluble games that can be completed in a reasonable amount of time, but then that wasn’t quite what most of the people who played them way back when were looking for in them. Actually solving the games became almost irrelevant for a kid who wanted to live in the world of Britannia.

I get that. I really do. No matter how deep a traveler in virtual time delves into the details of any era of history, there are some things he can never truly recapture. Were I to try, I would have to go away to spend a year or two disconnected from the Web and playing no other game — or at least no other CRPG — than the Ultima I planned to write about next. That, as I hope you can all appreciate, wouldn’t be a very good model for a blog like this one.

When I think in the abstract about this journey through gaming history I’ve been on for so long now, I realize that I’ve been trying to tell at least three intertwining stories.

One story is a critical design history of games. When I come to a game I judge worthy of taking the time to write about in depth — a judgment call that only becomes harder with every passing year, let me tell you — I play it and offer you my thoughts on it, trying to judge it not only in the context of our times but also in the context of its own times, and in the context of its peers.

A second story is that of the people who made these games, and how they went about doing so — the inevitable postmortems, as it were.

Doing these first two things is relatively easy. What’s harder is the third leg of the stool: what was it like to be a player of computer games all those years ago? Sometimes I stumble upon great anecdotes in this area. For instance, did you know about Clancy Shaffer?

In impersonal terms, Shaffer was one of the slightly dimmer stars among the constellation of adventure-game superfans — think Roe Adams III, Shay Addams, Computer Gaming World’s indomitable Scorpia — who parlayed their love of the genre and their talent for solving games quickly into profitable sidelines if not full-on careers as columnists, commentators, play-testers, occasionally even design consultants; for his part, Shaffer contributed his long experience as a player to the much-loved Sir-Tech title Jagged Alliance.

Most of the many people who talked with Shaffer via post, via email, or via telephone assumed he was pretty much like them, an enthusiastic gamer and technology geek in his twenties or thirties. One of these folks, Rich Heimlich, has told of a time when a phone conversation turned to the future of computer technology in the longer view. “Frankly,” said Shaffer, “I’m not sure I’ll even be here to see it.” He was, he explained to his stunned interlocutor, 84 years old. He credited his hobby for the mental dexterity that caused so many to assume he was in his thirties at the oldest. Shaffer believed that puzzling his way through so many games had kept him sharp, while he needed only to look at the schedule of upcoming releases in a magazine to have something to which to look forward in life. Many of his friends who, like him, had retired twenty years before were dead or senile, a situation Shaffer blamed on their having failed to find anything to do with themselves after leaving the working world behind.

Shaffer died in 2010 at age 99. Only after his passing, after reading his obituary, did Heimlich and other old computer-game buddies realize what an extraordinary life Shaffer had actually led, encompassing an education from Harvard University, a long career in construction and building management, 18 patents in construction engineering, an active leadership role in the Republican party, a Golden Gloves championship in heavyweight boxing, and a long and successful run as a yacht racer and sailor of the world’s oceans. And yes, he had also loved to play computer games, parlaying that passion into more than 500 published articles.

But great anecdotes like this one from the consumption side of the gaming equation are the exception rather than the rule, not because they aren’t out there in spades in theory — I’m sure there have been plenty of other fascinating characters like Clancy Shaffer who have also made a passion for games a part of their lives — but because they rarely get publicized. The story of the players of vintage computer games is that of a huge, diffuse mass of millions of people whose individual stories almost never stretch beyond their immediate families and friends.

The situation becomes especially fraught when we try to zero in on the nitty-gritty details of how games were played and judged in their day. Am I as completely out of line as some have accused me of being in harping so relentlessly on the real or alleged design problems of so many games that others consider to be classics? Or did people back in the day, at least some of them, also get frustrated and downright angry at betrayals of their trust in the form of illogical puzzles and boring busywork? I know that I certainly did, but I’m only one data point.

One would think that the magazines, that primary link between the people who made games and those who played them, would be the best way of finding out what players were really thinking. In truth, though, the magazines rarely provided skeptical coverage of the games industry. The companies whose games they were reviewing were of course the very same companies that were helping to pay their bills by buying advertising — an obvious conflict of interest if ever there was one. More abstractly but no less significantly, there was a sense among those who worked for the magazines and those who worked for the game publishers that they were all in this together, living as they all were off the same hobby. Criticizing individual games too harshly, much less entire genres, could damage that hobby, ultimately damaging the magazines as much as the publishers. Thus when the latest heavily hyped King’s Quest came down the pipe, littered with that series’s usual design flaws, there was little incentive for the magazines to note that this monarch had no clothes.

So, we must look elsewhere to find out what average players were really thinking. But where? Most of the day-to-day discussions among gamers back in the day took place over the telephone, on school playgrounds, on computer bulletin boards, or on the early commercial online services that preceded the World Wide Web. While Jason Scott has done great work snarfing up a tiny piece of the online world of the 1980s and early 1990s, most of it is lost, presumably forever. (In this sense at least, historians of later eras of gaming history will have an easier time of it, thanks to archive.org and the relative permanence of the Internet.) The problem of capturing gaming as gamers knew it thus remains one without a comprehensive solution. I must confess that this is one reason I’m always happy when you, my readers, share your experiences with this or that game in the comments section — even, or perhaps especially, when you disagree with my own judgments on a game.

Still, relying exclusively on first-hand accounts from decades later to capture what it was like to be a gamer in the old days can be problematic in the same way that it can be problematic to rely exclusively on interviews with game developers to capture how and why games were made all those years ago: memories can fade, personal agendas can intrude, and those rose-colored glasses of nostalgia can be hard to take off. Pretty soon we’re calling every game from our adolescence a masterpiece and dumping on the brain-dead games played by all those stupid kids today — and get off my lawn while you’re at it. The golden age of gaming, like the golden age of science fiction, will always be twelve or somewhere thereabouts. All that’s fine for hoisting a beer with the other old-timers, but it can be worse than useless for doing serious history.

Thankfully, every once in a while I stumble upon another sort of cracked window into this aspect of gaming’s past. As many of you know, I’ve spent a couple of weeks over the last couple of years trolling through the voluminous (and growing) game-history archives of the Strong Museum of Play. Most of this material, hugely valuable to me though it’s been and will doubtless continue to be, focuses on the game-making side of the equation. Some of the archives, though, contain letters from actual players, giving that unvarnished glimpse into their world that I so crave. Indeed, these letters are among my favorite things in the archives. They are, first of all, great fun. The ones from the youngsters are often absurdly cute; it’s amazing how many liked to draw pictures to accompany their missives.

But it’s when I turn to the letters from older writers that I’m gratified and, yes, made to feel a little validated to read that people were in fact noticing that games weren’t always playing fair with them. I’d like to share a couple of the more interesting letters of this type with you today.

We’ll begin with a letter from one Wes Irby of Plano, Texas, describing what he does and especially what he doesn’t enjoy in CRPGs. At the time he sent it to the Questbusters adventure-game newsletter in October of 1988, Irby was a self-described “grizzled computer adventurer” of age 43. Shay Addams, Questbusters’s editor, found the letter worthy enough to spread around among publishers of CRPGs. (Perhaps tellingly, he didn’t choose to publish it in his newsletter.)

Irby titles his missive “Things I Hate in a Fantasy-Role-Playing Game.” Taken on its own, it serves very well as a companion piece to a similar article I once wrote about graphic adventures. But because I just can’t shut up, and because I can’t resist taking the opportunity to point out places where Irby is unusually prescient or insightful, I’ve inserted my own comments into the piece; they appear in italics in the text that follows. Otherwise, I’ve only cleaned up the punctuation and spelling a bit here and there. The rest is Irby’s original letter from 1988.


I hate rat killing!!! In Shard of Spring, I had to kill dozens of rats, snakes, kobolds, and bats before I could get back to the tower after a Wind Walk to safety. In Wizardry, the rats were Murphy’s ghosts, which I pummeled for hours when developing a new character. Ultima IV was perhaps the ultimate rat-killing game of all time; hour upon hour was spent in tedious little battles that I could not possibly lose and that offered little reward for victory. Give me a good battle to test my mettle, but don’t sentence me to rat killing!

Amen. The CRPG genre became the victim of an expectation which took hold early on that the games needed to be really, really long, needed to consume dozens if not hundreds of hours, in order for players to get their money’s worth. With disk space precious and memory space even more so on the computers of the era, developers had to pad out their games with a constant stream of cheap low-stakes random encounters to reach that goal. Amidst the other Interplay materials hosted at the Strong archive are several mentions of a version of Wasteland, prepared specially for testers in a hurry, in which the random encounters were left out entirely. That’s the version of Wasteland I’d like to play.

I hate being stuck!!! I enjoy the puzzles, riddles, and quests as a way to give some story line to the real heart of the game, which is killing bad guys. Just don’t give me any puzzles I can’t solve in a couple of hours. I solved Rubik’s Cube in about thirty hours, and that was nothing compared to some of the puzzles in The Destiny Knight. The last riddle in Knight of Diamonds delayed my completion (and purchase of the sequel) for nearly six months, until I made a call to Sir-Tech.

I haven’t discussed the issue of bad puzzle design in CRPGs to the same extent as I have the same issue in adventure games, but suffice to say that just about everything I’ve written in the one context applies equally in the other. Certainly riddles remain among the laziest — they require almost no programming effort to implement — and most problematic — they rely by definition on intuition and external cultural knowledge — forms of puzzle in either genre. Riddles aren’t puzzles at all really; the answer either pops into your head right away or it doesn’t, meaning the riddle turns into either a triviality or a brick wall. A good puzzle, by contrast, is one you can experiment with on your way to the correct solution. And as for the puzzles in The Bard’s Tale II: The Destiny Knight… much more on them a little later.

Perhaps the worst aspect of being stuck is the clue-book dilemma. Buying a clue book is demeaning. In addition, buying clue books could encourage impossible puzzles to boost the aftermarket for clue books. I am a reformed game pirate (that is how I got hooked), and I feel it is just as unfair for a company to charge me to finish the game I bought as it was for me to play the games (years ago) without paying for them. Multiple solutions, a la Might and Magic, are very nice. That game also had the desirable feature of allowing you to work on several things simultaneously so that being stuck on one didn’t bring the whole game to a standstill.

Here Irby brings up an idea I’ve also touched on once or twice: that the very worst examples of bad design can be read as not just good-faith disappointments but actual ethical lapses on the part of developers and publishers. Does selling consumers a game with puzzles that are insoluble except through hacking or the most tedious sort of brute-force approaches equate to breaching good faith by knowingly selling them a defective product? I tend to feel that it does.

As part of the same debate, the omnipresent clue books became a locus of much dark speculation and conspiracy theorizing back in the day. Did publishers, as Irby suggests, intentionally release games that couldn’t be solved without buying the clue book, thereby to pick up additional sales? The profit margins on clue books, not incidentally, tended to be much higher than those enjoyed by the games themselves. Still, the answer is more complicated than it may first appear. Based on my research into the industry of the time, I don’t believe that any publishers or developers made insoluble games with the articulated motive of driving clue-book sales. To the extent that there was an ulterior motive surrounding the subject of clue books, it was the hope that the books would allow publishers to make money off some of the people who pirated their games. (Rumors — almost certainly false, but telling by their very presence — occasionally swirled around the industry about this or that popular title whose clue-book sales had allegedly outstripped the number of copies of the actual game which had been sold.) Yet the fact does remain that even the hope of using clue books as a way of getting money out of pirates required games that would be difficult enough to cause many pirates to go out and buy the book. The human mind is a funny place, and the clue-book business likely did create certain almost unconscious pressures on game designers to design less soluble games.

I hate no-fault life insurance! If there is no penalty, there is no risk, there is no fear — translate that to no excitement. The adrenaline actually surged a few times during play of the Wizardry series when I encountered a group of monsters that might defeat me. In Bard’s Tale II, death was so painless that I committed suicide several times because it was the most expedient way to return to the Adventurer’s Guild.

When you take the risk of loss out of the game, it might as well be a crossword puzzle. The loss of possessions in Ultima IV and the loss of constitution in Might and Magic were tolerable compromises. The undead status in Phantasie was very nice. Your character was unharmed except for the fact that no further advancement was possible. Penalties can be too severe, of course. In Shard of Spring, loss of one battle means all characters are permanently lost. Too tough.

Here Irby hits on one of the most fraught debates in CRPG design, stretching from the days of the original Wizardry to today: what should be the penalty for failure? There’s no question that the fact that you couldn’t save in the dungeon was one of the defining aspects of Wizardry, the game that did more than any other to popularize the budding genre in the very early 1980s. Exultant stories of escaping the dreaded Total Party Loss by the skin of one’s teeth come up again and again when you read about the game. Andrew Greenberg and Bob Woodhead, the designers of Wizardry, took a hard-line stance on the issue, insisting that the lack of an in-dungeon save function was fundamental to an experience they had carefully crafted. They went so far as to issue legal threats against third-party utilities designed to mitigate the danger.

Over time, though, the mainstream CRPG industry moved toward the save-often, save-anywhere model, leaving Wizardry’s approach only to a hardcore sub-genre known as roguelikes. It seems clear that the change had some negative effects on encounter design; designers, assuming that players were indeed saving often and saving everywhere, felt they could afford to worry less about hitting players with impossible fights. Yet it also seems clear that many or most players, given the choice, would rather forgo the exhilaration of escaping near-disasters in Wizardry than suffer the consequences of the disasters they fail to escape. The best solution, it seems to me, is to make limited or unlimited saving a player-selectable option. Failing that, it strikes me as better to err on the side of generosity; after all, hardcore players can still capture the exhilaration and anguish of an iron-man mode by simply imposing their own rules for when they allow themselves to save. All that said, the debate will doubtless continue to rage.

I hate being victimized. Loss of life, liberty, etc., in a situation I could have avoided through skillful play is quite different from a capricious, unavoidable loss. The Amulet of Skill in Knight of Diamonds was one such situation. It was not reasonable to expect me to fail to try the artifacts I found — a fact I soon remedied with my backup disk!!! The surprise attacks of the mages in Wizardry were another such example. Each of the Wizardry series seems to have one of these, but the worst was the teleportation trap on the top level of Wizardry III, which permanently encased my best party in stone.

Beyond rather putting the lie to some of Greenberg and Woodhead’s claims of having exhaustively balanced the Wizardry games, these criticisms again echo those I’ve made in the context of adventure games. Irby’s examples are the CRPG equivalents of the dreaded adventure-game Room of Sudden Death — except that in CRPGs like Wizardry with perma-death, their consequences are much more dire than just having to go back to your last save.

I hate extraordinary characters! If everyone is extraordinary then extraordinary becomes extra (extremely) ordinary and uninteresting. The characters in Ultima III and IV and Bard’s Tale I and II all had the maximum ratings for all stats before the end of the game. They lose their personalities that way.

This is one of Irby’s subtler complaints, but also I think one of his most insightful. Characters in CRPGs are made interesting, as he points out, through a combination of strengths and weaknesses. I spent considerable time in a recent article describing how the design standards of SSI’s “Gold Box” series of licensed Dungeons & Dragons CRPGs declined over time, but couldn’t find a place for the example of Pools of Darkness, the fourth and last game in the series that began with Pool of Radiance. Most of the fights in Pools of Darkness are effectively unwinnable if you don’t have “extraordinary” characters, in that they come down to quick-draw contests to find out whether your party or the monsters can fire off devastating area-effect magic first. Your entire party needs to have a maxed-out dexterity score of 18 to hope to consistently survive these battles. Pools of Darkness thus rewards cheaters and punishes honest players; it represents a cruel betrayal of players who had played through the entire series honestly to that point, without availing themselves of character editors or the like. CRPGs should strive not to make the extraordinary ordinary, and they should certainly not demand extraordinary characters that the player can only come by through cheating.

There are several more features which I find undesirable, but are not sufficiently irritating to put them in the “I hate” category. One such feature is the inability to save the game in certain places or situations. It is miserable to find yourself in a spot you can’t get out of (or don’t want to leave because of the difficulty in returning) at midnight (real time). I have continued through the wee hours on occasion, much to my regret the next day. At other times it has gotten so bad I have dozed off at the keyboard. The trek from the surface to the final set of riddles in Ultima IV takes nearly four hours. Without the ability to save along the way, this doesn’t make for good after-dinner entertainment. Some of the forays in the Phantasie series are also long and difficult, with no provision to save. This problem is compounded when you have an old machine like mine that locks up periodically. Depending on the weather and the phase of the moon, sometimes I can’t rely on sessions that average over half an hour.

There’s an interesting conflict here, one which I sense the usually insightful Irby may not have fully grasped, between his demand that death have consequences in CRPGs and his belief that he should be able to save anywhere. At the same time, though, it’s not an irreconcilable conflict. Roguelikes have traditionally made it possible to save anywhere by quitting the game, but they immediately delete the save when you start to play again, thus making it impossible to use later on as a fallback position.
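(For the technically curious, here’s a minimal sketch in Python of that save-on-quit, consume-on-resume convention. The file name and game-state structure are purely illustrative rather than drawn from any particular roguelike.)

```python
import os
import pickle

SAVE_PATH = "savegame.dat"  # hypothetical location for the single allowed save

def save_and_quit(game_state):
    """Write the current state to disk, then exit; the player can resume later."""
    with open(SAVE_PATH, "wb") as f:
        pickle.dump(game_state, f)
    raise SystemExit

def new_game():
    """Start a fresh character, roguelike-style."""
    return {"dungeon_level": 1, "hit_points": 12, "inventory": []}

def resume_or_start_new():
    """On launch, consume any existing save so it can never serve as a fallback."""
    if os.path.exists(SAVE_PATH):
        with open(SAVE_PATH, "rb") as f:
            state = pickle.load(f)
        os.remove(SAVE_PATH)  # the crucial step: the save exists only between sessions
        return state
    return new_game()
```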

Still, it should always raise a red flag when a given game’s designers claim that something which just happened to be the easier choice from a technical perspective was in fact a considered design choice. This skepticism should definitely be applied to Wizardry. Were the no-save dungeons that were such an integral part of the Wizardry experience really a considered design choice or a (happy?) accident arising from technical affordances? It’s very difficult to say this many years on. What is clear is that saving state in any sort of comprehensive way was a daunting challenge for 8-bit CRPGs spread over multiple disk sides. Wizardry and The Bard’s Tale didn’t really even bother to try; literally the only persistent data in these games and many others like them is the state of your characters, meaning not only that the dungeons are completely reset every time you enter them but that it’s possible to “win” them over and over again by killing the miraculously resurrected big baddie again and again. The 8-bit Ultima games did a little better, saving the state of the world map but not that of the cities or the dungeons. (I’ve nitpicked the extreme cruelty of Ultima IV’s ending, which Irby also references, enough on earlier occasions that I won’t belabor it any more here.) Only quite late in the day for the 8-bit CRPG did games like Wasteland work out ways to create truly, comprehensively persistent environments — in the case of Wasteland, by rewriting all of the data on each disk side on the fly as the player travels around the world (a very slow process, particularly in the case of the Commodore 64 and its legendarily slow disk drive).
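(Again for the technically curious, here’s a rough Python sketch of the two approaches to persistence described above: character-only persistence versus a world whose current state is written back as you travel. The data structures are invented for illustration and don’t correspond to any actual game’s disk format.)

```python
# Character-only persistence, in the spirit of Wizardry or The Bard's Tale:
# nothing but the party survives between visits, so the dungeon is rebuilt
# from its static data (and the big baddie resurrected) every time you enter.
def enter_dungeon(party, build_dungeon):
    dungeon = build_dungeon()  # freshly generated from unchanging source data
    return party, dungeon

# Comprehensive persistence, closer to what Wasteland does per disk side:
# the current map's state is written back to storage whenever the player
# leaves it, so every change sticks (slowly, on a Commodore 64).
def travel(world_store, current_id, current_state, destination_id, load_pristine):
    world_store[current_id] = current_state   # flush changes before leaving
    if destination_id in world_store:         # visited before: use the changed state
        return world_store[destination_id]
    return load_pristine(destination_id)      # first visit: untouched map data
```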

Tedium is a killer. In Bard’s Tale there was one battle with 297 berserkers that always took fifteen or twenty minutes with the same results (this wasn’t rat-killing because the reward was significant and I could lose, maybe). The process of healing the party in the dungeon in Wizardry and the process of identifying discovered items in Shard of Spring are laborious. How boring it was in Ultima IV to stand around waiting for a pirate ship to happen along so I could capture it. The same can be said of sitting there holding down a key in Wasteland or Wrath of Denethenor while waiting for healing to occur. At least give me a wait command so I can read a book until something interesting happens.

I’m sort of ambivalent toward most aspects of mapping. A good map is satisfying and a good way to be sure nothing has been missed. Sometimes my son will use my maps (he hates mapping) in a game and find he is ready to go to the next level before his characters are. Mapping is a useful way to pace the game. The one irritating aspect of mapping is running off the edge of the paper. In Realms of Darkness mapping was very difficult because there was no “locater” or “direction” spell. More bothersome to me, though, was the fact that I never knew where to start on my paper. I had the same problem with Shard of Spring, but in retrospect that game didn’t require mapping.

Mapping is another area where the technical affordances of the earliest games had a major effect on their designs. The dungeon levels in most 8-bit CRPGs were laid out on grids of a consistent number of squares across and down; such a template minimized memory usage and simplified the programmer’s task enormously. Unrealistic though it was, it was also a blessing for mappers. Wizardry, a game that was oddly adept at turning its technical limitations into player positives, even included sheets of graph paper of exactly the right size in the box. Later games like Dungeon Master, whose levels sprawl everywhere, run badly afoul of the problem Irby describes above — that of maps “running off the edge of the paper.” In the case of Dungeon Master, it’s the one glaring flaw in what could otherwise serve as a masterclass in designing a challenging yet playable dungeon crawl.
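(Here’s a small illustrative Python sketch of just how economical such a fixed grid could be, storing one byte of flags per square; the grid size and flag names are made up for the example rather than taken from any specific game.)

```python
# A fixed-size dungeon level in the spirit of the 8-bit grid crawlers: every
# floor is the same 20x20 template, so one byte of flags per square describes
# an entire level in 400 bytes.
GRID_SIZE = 20
WALL_NORTH, WALL_EAST, WALL_SOUTH, WALL_WEST = 0x01, 0x02, 0x04, 0x08
STAIRS_DOWN, SPECIAL = 0x10, 0x20

level = bytearray(GRID_SIZE * GRID_SIZE)  # one compact byte per square

def set_flags(x, y, flags):
    level[y * GRID_SIZE + x] |= flags

def has_flag(x, y, flag):
    return bool(level[y * GRID_SIZE + x] & flag)

# The same regularity that saved memory also helped mappers: a level of known,
# fixed dimensions can never run off the edge of a sheet of graph paper.
set_flags(0, 0, WALL_NORTH | WALL_WEST | STAIRS_DOWN)
assert has_flag(0, 0, STAIRS_DOWN)
```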

I don’t like it when a program doesn’t take advantage of my second disk drive, and I would feel that way about my printer if I had one. I don’t like junk magic (spells you never use), and I don’t like being stuck forever with the names I pick on the spur of the moment. A name that struck my fancy one day may not on another.

Another problem similar to “junk magic” that only really began to surface around the time that Irby was writing this letter is junk skills. Wasteland is loaded with skills that are rarely or never useful, along with others that are essential, and there’s no way for the new player to identify which are which. It’s a more significant problem than junk magic usually is because you invest precious points into learning and advancing your skills; there’s a well-nigh irreversible opportunity cost to your choices. All of what we might call the second generation of Interplay CRPGs, which began with Wasteland, suffer at least somewhat from this syndrome. Like the sprawling dungeon levels in Dungeon Master, it’s an example of the higher ambitions and more sophisticated programming of later games impacting the end result in ways that are, at best, mixed in terms of playability.

I suppose you are wondering why I play these stupid games if there is so much about them I don’t like. Actually, there are more things I do like, particularly when compared to watching Gilligan’s Island or whatever the current TV fare is. I suppose it would be appropriate to mention a few of the things I do like.

In discussing the unavoidably anachronistic experience we have of old games today, we often note how many other games are at our fingertips — a luxury a kid who might hope to get one new game every birthday and Christmas most definitely didn’t enjoy. What we perhaps don’t address as much as we should is how much the entertainment landscape in general has changed. It can be a little tough even for those of us who lived through the 1980s to remember what a desert television was back then. I remember a television commercial — and from the following decade at that — in which a man checked into a hotel of the future, and was told that every movie ever made was available for viewing at the click of a remote control. Back then, this was outlandish science fiction. Today, it’s reality.

I like variety and surprises. Give me a cast of thousands over a fixed party anytime. Of course, the game designer has to force the need for multiple parties on me, or I will stick with the same group throughout because that is the best way to “win” the game. The Minotaur Temple in Phantasie I and the problems men had in Portsmouth in Might and Magic and the evil and good areas of Wizardry III were nice. More attractive are party changes for strategic reasons. What good are magic users in no-magic areas or a bard in a silent room? A rescue mission doesn’t need a thief and repetitive battles with many small opponents don’t require a fighter that deals heavy damage to one bad guy.

I like variety and surprises in the items found, the map, the specials encountered, in short in every aspect of the game. I like figuring out what things are and how they work. What a delight the thief’s dagger in Wizardry was! The maps in Wasteland are wonderful because any map may contain a map. The countryside contains towns and villages, the towns contain buildings, some buildings contain floors or secret passages. What fun!!!

I like missions and quests to pursue as I proceed. Some of these games are so large that intermediate goals are necessary to keep you on track. Might and Magic, Phantasie, and Bard’s Tale do a good job of creating a path with the “missions.” I like self-contained clues about the puzzles. In The Return of Heracles the sage was always there to provide an assist (for money, of course)  if you got stuck. The multiple solutions or sources of vital information in Might and Magic greatly enhanced the probability of completing the missions and kept the game moving.

I like the idea of recruiting new characters, as opposed to starting over from scratch. In Galactic Adventures your crew could be augmented by recruiting survivors of a battle, provided they were less experienced than your leader. Charisma (little used in most games) could impact recruiting. Wasteland provides for recruiting of certain predetermined characters you encounter. These NPCs can be controlled almost like your characters and will advance with experience. Destiny Knight allows you to recruit (with a magic spell) any of the monsters you encounter, and requires that some specific characters be recruited to solve some of the puzzles, but these NPCs can’t be controlled and will not advance in level, so they are temporary members. They will occasionally turn on you, an interesting twist!!!

I like various skills, improved by practice or training for various characters. This makes the characters unique individuals, adding to the variety. This was implemented nicely in both Galactic Adventures and Wasteland.

Eternal growth for my characters makes every session a little different and intriguing. If the characters “top out” too soon that aspect of the game loses its fascination. Wizardry was the best at providing continual growth opportunities because of the opportunity to change class and retain some of the abilities of the previous class. The Phantasie series seemed nicely balanced, with the end of the quest coming just before/as my characters topped out.

Speaking of eternal, I have never in all of my various adventures had a character retire because of age. Wizardry tried, but it never came into play because it was cheaper to heal at the foot of the stairs while identifying loot (same trip or short run to the dungeon for that purpose). Phantasie kept up with age, but it never affected play. I thought Might and Magic might, but I found the Fountain of Youth. The only FRPG I have played where you had to beat the clock is Tunnels of Doom, a simple hack-and-slash on my TI 99/4A that takes about ten hours for a game. Of course, it is quite different to spend ten hours and fail because the king died than it is to spend three months and fail by a few minutes. I like for time to be a factor to prevent me from being too conservative.

This matter of time affecting play really doesn’t fit into the “like” or the “don’t like” because I’ve never seen it effectively implemented. There are a couple of other items like that on my wish list. For example, training of new characters by older characters should take the place of slugging it out with Murphy’s ghost while the newcomers watch from the safety of the back row.

The placing of time limits on a game sounds to me like a very dangerous proposal. It was tried in 1989, the year after Irby wrote this letter, by The Magic Candle, a game that I haven’t played but that is quite well-regarded by the CRPG cognoscenti. That game was, however, kind enough to offer three difficulty levels, each with its own time limit, and the easiest level was generous enough that most players report that time never became a major factor. I don’t know of any game, even from this much crueler era of game design in general, that was cruel enough to let you play 100 hours or more and then tell you you’d lost because the evil wizard had finished conquering the world, thank you very much. Such an approach might have been more realistic than the alternative, where the evil wizard cackles and threatens occasionally but doesn’t seem to actually do much, but, as Sid Meier puts it, fun ought to trump realism every time in game design.

A very useful feature would be the ability to create my own macro consisting of a dozen or so keystrokes. Set up Control-1 through Control-9 and give me a simple way to specify the keystrokes to be executed when one is pressed.

Interestingly, this exact feature showed up in Interplay’s CRPGs very shortly after Irby wrote this letter, beginning with the MS-DOS version of Wasteland in March of 1989. And we do know that Interplay was one of the companies to which Shay Addams sent the letter. Is this a case of a single gamer’s correspondence being responsible for a significant feature in later games? The answer is likely lost forever to the vagaries of time and the inexactitude of memory.
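(For those who like to see such things spelled out, here’s a tiny Python sketch of the sort of keystroke-macro facility Irby is asking for. The key names and macro contents are invented for the example, not taken from Wasteland or any other game.)

```python
from collections import deque

# Hypothetical bindings: each macro is just a canned sequence of ordinary
# keystrokes spliced into the input stream when its trigger is pressed.
macros = {
    "ctrl+1": "aafr",   # e.g. attack, attack, fight, run -- purely illustrative
    "ctrl+2": "ddh\n",  # e.g. defend, defend, heal
}

input_queue = deque()

def handle_key(key):
    """Expand a macro trigger into its keystrokes; pass everything else through."""
    if key in macros:
        input_queue.extend(macros[key])
    else:
        input_queue.append(key)

def next_keystroke():
    """What the game loop would consume instead of reading the keyboard directly."""
    return input_queue.popleft() if input_queue else None

handle_key("ctrl+1")
assert "".join(iter(next_keystroke, None)) == "aafr"
```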

A record of sorts of what has happened during the game would be nice. The chevron in Wizardry and the origin in Phantasie are the most I’ve ever seen done with this. How about a screen that told me I had 93 sessions, 4 divine interventions (restore backup), completed 12 quests, raised characters from the dead 47 times, and killed 23,472 monsters? Cute, huh?

Another crazily prescient proposal. These sorts of meta-textual status screens would become commonplace in CRPGs in later years. In this case, though, “later years” means much later. Thus, rather than speculating on whether he actively drove the genre’s future innovations, we can credit Irby this time merely with predicting them.

One last suggestion for the manufacturers: if you want that little card you put in each box back, offer me something I want. For example, give me a list of all the other nuts in my area code who have purchased this game and returned their little cards.

Enough of this, Wasteland is waiting.


With some exceptions — the last suggestion, for instance, would be a privacy violation that would make even the NSA raise an eyebrow — I agree with most of Irby’s positive suggestions, just as I do his complaints. It strikes me as I read through his letter that my own personal favorite among 8-bit CRPGs, Pool of Radiance, manages to avoid most of Irby’s pitfalls while implementing much from his list of desirable features — further confirmation of just what a remarkable piece of work that game, and to an only slightly lesser extent its sequel Curse of the Azure Bonds, really were. I hope Wes Irby got a chance to play them.

I have less to say about the second letter I’d like to share with you, and will thus present it without in-line commentary. This undated letter was sent directly to Interplay by its writer: Thomas G. Gutheil, an associate professor at the Harvard Medical School Department of Psychiatry, on whose letterhead it’s written. Its topic is The Bard’s Tale II: The Destiny Knight, a game I’ve written about only in passing but one with some serious design problems in the form of well-nigh insoluble puzzles. Self-serving though it may be, I present Gutheil’s letter to you today as one more proof that players did notice the things that were wrong with games back in the day — and that my perspective on them today therefore isn’t an entirely anachronistic one. More importantly, Gutheil’s speculations are still some of the most cogent I’ve ever seen on how bad puzzles make their way into games in the first place. For this reason alone, it’s eminently worthy of being preserved for posterity.


I am writing you a combination fan letter and critique in regard to the two volumes of The Bard’s Tale, of which I am a regular and fanatic user.

First, the good news: this is a TERRIFIC game, and I play it with addictive intensity, approximately an hour almost every day. The richness of the graphics, the cute depictions of the various characters, monsters, etc., and rich complexity and color of the mazes, tasks, issues, as well as the dry wit that pervades the program, make it a superb piece and probably the best maze-type adventure product on the market today. I congratulate you on this achievement.

Now, the bad news: the one thing I feel represents a defect in your program (and I only take your time to comment on it because it is so central) and one which is perhaps the only area where the Wizardry series (of which I am also an avid player and expert) is superior, is the notion of the so-called puzzles, a problem which becomes particularly noticeable in the “snares of death” in the second scenario. In all candor, speaking as an old puzzle taker and as a four-time grand master of the Boston Phoenix Puzzle Contest, I must say that these puzzles are simply too personal and idiosyncratic to be fair to the player. I would imagine you are doing a booming business in clue books since many of the puzzles are simply not accomplishable otherwise without hours of frustrating work, most of it highly speculative.

Permit me to try to clarify this point, since I am aware of the sensitive nature of these comments, given that I would imagine you regard the puzzles as being the “high art” of the game design. There should be an organic connection between the clues and the puzzles. For example, in Wizardry (sorry to plug the competition), there is a symbolic connection between the clue and its function. As one simplistic example, at the simplest level a bear statuette gets you through a gate guarded by a bear, a key opens a particular door, and a ship-in-a-bottle item gets you across an open expanse of water.

Let me try to contrast this with some of the situations in your scenarios. You may recall that in one of the scenarios the presence of a “winged one” in the party was necessary to get across a particular chasm. The Winged One introduces himself to the party as one of almost a thousand individual wandering creatures that come and offer to join the party, to be attacked, or to be left in peace. This level of dilution and the failure to separate out the Winged One in some way makes it practically unrecallable much later on when you need it, particularly since there are several levels of dungeon (and in real life perhaps many interposing days and weeks) between the time you meet the Winged One (who does not stand out among the other wandering characters in any particular way) and the time you actually need him. Even if (as I do) you keep notes, there would be no particular reason to record this creature out of all the others. Moreover, to have this added character stuck in your party for long periods of time, when you could instead have the many-times more effective demons, Kringles, and salamanders, etc., would seem strategically self-defeating and therefore counter-intuitive for the normal strategy of game play AS IT IS ACTUALLY PLAYED.

This is my point: in many ways your puzzles in the scenarios seem to have been designed by someone who is not playing the game in the usual sequence, but designed as it were from the viewpoint of the programmer, who looks at the scenario “from above” — that is, from omniscient knowledge. In many situations the maze fails to take into account the fact that parties will not necessarily explore the maze in the predictable direct sequence you have imagined. The flow of doors and corridors does not appropriately guide a player so that they will take the puzzles in a meaningful sequence. Thus, when one gets a second clue before a first clue, only confusion results, and it is rarely resolved as the play advances.

Every once in a while you do catch on, and that is when something like the rock-scissors-paper game is invoked in your second scenario. That’s generally playing fair, although not everyone has played that game or would recognize it in the somewhat cryptic form in which it is presented. Thus the player does not gain the satisfaction of use of intellect in problem solving; instead, it’s the frustration of playing “guess what I’m thinking” with the author.

Despite all of the above criticism, the excitement and the challenge of playing the game still make it uniquely attractive; as you have no doubt caught on, I write because I care. I have had to actively fight the temptation to simply hack my way through the “snares of death” by direct cribbing from the clue books, so that I could get on to the real interest of the game, which is working one’s way through the dungeons and encountering the different items, monsters, and challenges. I believe that this impatience with the idiosyncratic (thus fundamentally unfair) design of these puzzles represents an impediment, and I would be interested to know if others have commented on this. Note that it doesn’t take any more work for the programmer, but merely a shift of viewpoint to make the puzzles relevant and fair to the reader and also proof against being taken “out of order,” which largely confuses the meaning. A puzzle that is challenging and tricky is fair; a puzzle that is idiosyncratically cryptic may not be.

Thank you for your attention to this somewhat long-winded letter; it was important to me to write. Given how much I care for this game and how devoted I am to playing it and to awaiting future scenarios, I wanted to call your attention to this issue. You need not respond personally, but I would of course be interested in any of your thoughts on this.


I conclude this article as a whole by echoing Gutheil’s closing sentiments; your feedback is the best part of writing this blog. I hope you didn’t find my musings on the process of doing history too digressive, and most of all I hope you found Wes Irby’s and Thomas Gutheil’s all too rare views from the trenches as fascinating as I did.

 


From Wingleader to Wing Commander

No one at Origin had much time to bask in the rapturous reception accorded to Wingleader at the 1990 Summer Consumer Electronics Show. Their end-of-September deadline for shipping the game was now barely three months away, and there remained a daunting amount of work to be done.

At the beginning of July, executive producer Dallas Snell called the troops together to tell them that crunch time was beginning in earnest; everyone would need to work at least 55 hours per week from now on. Most of the people on the project only smiled bemusedly at the alleged news flash. They were already working those kinds of hours, and knew all too well that a 55-hour work week would probably seem like a part-timer’s schedule before all was said and done.

Dallas Snell

At the beginning of August, Snell unceremoniously booted Chris Roberts, the project’s founder, from his role as co-producer, leaving him with only the title of director. Manifesting a tendency anyone familiar with his more recent projects will immediately recognize, Roberts had been causing chaos on the team by approving seemingly every suggested addition or enhancement that crossed his desk. Snell, the brutal pragmatist in this company full of dreamers, appointed himself as Warren Spector’s new co-producer. His first action was to place a freeze on new features in favor of getting the game that currently existed finished and out the door. Snell:

The individuals in Product Development are an extremely passionate group of people, and I love that. Everyone is here because, for the most part, they love what they’re doing. This is what they want to do with their lives, and they’re very intense about it and very sensitive to your messing around with what they’re trying to accomplish. They don’t live for getting it done on time or having it make money. They live to see this effect or that effect, their visions, accomplished.

It’s always a continual antagonistic relationship between the executive producer and the development teams. I’m always the ice man, the ogre, or something. It’s not fun, but it gets the products done and out. I guess that’s why I have the room with the view. Anyway, at the end of the project, all of Product Development asked me not to get that involved again.

One problem complicating Origin’s life enormously was the open architecture of MS-DOS, this brave new world they’d leaped into the previous year. Back in the Apple II days, they’d been able to write their games for a relatively static set of hardware requirements, give or take an Apple IIGS running in fast mode or a Mockingboard sound card. The world of MS-DOS, by contrast, encompassed a bewildering array of potential hardware configurations: different processors, different graphics and sound cards, different mice and game controllers, different amounts and types of memory, different floppy-disk formats, different hard-disk capacities. For a game like Wingleader, surfing the bleeding edge of all this technology but trying at the same time to offer at least a modicum of playability on older setups, all of this variance was the stuff of nightmares. Origin’s testing department was working 80-hour weeks by the end, and, as we’ll soon see, the final result would still leave plenty to be desired from a quality-control perspective.

As the clock was ticking down toward release, Origin’s legal team delivered the news that it probably wouldn’t be a good idea after all to call the game Wingleader — already the company’s second choice for a name — thanks to a number of existing trademarks on the similar “Wingman.” With little time to devote to yet another naming debate, Origin went with their consensus third choice of Wing Commander, which had lost only narrowly to Wingleader in the last vote. This name finally stuck. Indeed, today it’s hard to imagine Wing Commander under any other name.

The game was finished in a mad frenzy that stretched right up to the end; the “installation guide” telling how to get it running was written and typeset from scratch in literally the last five hours before the whole project had to be packed into a box and shipped off for duplication. That accomplished, everyone donned their new Wing Commander baseball caps and headed out to the front lawn for Origin’s traditional ship-day beer bash. There Robert Garriott climbed onto a picnic table to announce that all of Chris Roberts’s efforts in creating by far the most elaborate multimedia production Origin had ever released had been enough to secure him, at long last, an actual staff job at the company. “As of 5 P.M. this afternoon,” said Garriott, “Chris is Origin’s Director of New Technologies. Congratulations, Chris, and welcome to the Origin team.” The welcome was, everyone had to agree, more than a little belated.

We’ll turn back to Roberts’s later career at Origin in future articles. At this point, though, this history of the original Wing Commander must become the story of the people who played it rather than that of the people who created it. And, make no mistake, play it people did. Gamers rushed to embrace what had ever since that Summer CES show been the most anticipated title in the industry. Roberts has claimed that Wing Commander sold 100,000 copies in its first month, a figure that would stand as ridiculous if applied to just about any other computer game of the era, but which might just be ridiculous enough to be true in the case of Wing Commander. While hard sales figures for the game or the franchise it would spawn have never to my knowledge been made public, I can feel confident enough in saying that sales of the first Wing Commander soared into the many, many hundreds of thousands of units. The curse of Ultima was broken; Origin now had a game which had not just become a hit in spite of Ultima’s long shadow but which threatened to do the unthinkable — to overshadow Ultima in their product catalog. Certainly all indications are that Wing Commander massively outsold Ultima VI, possibly by a factor of two to one or more. It would take a few years, until the release of Doom in 1993, for any other name to begin to challenge that of Wing Commander as the most consistent money spinner in American computer gaming.

But why should that have been? Why should this particular game of all others have become such a sensation? Part of the reason must be serendipitous timing. During the 1990s as in no decade before or since, the latest developments in hardware would drive sales of games that could show them off to best effect, and Wing Commander set the stage for this trend. Released at a time when 80386-based machines with expanded memory, sound cards, and VGA graphics were just beginning to enter American homes in numbers, Wing Commander took advantage of all those things like no other game on the market. It benefited enormously from this singular status among those who already owned the latest hardware setups, while prompting many more jealous gamers who hadn’t heretofore seen a need to upgrade to invest in hot machines of their own — the kind of virtuous circle to warm any capitalist’s heart.

Yet there was also something more going on with Wing Commander than just a cool-looking game for showing off the latest hardware, else it would have suffered the fate of the slightly later bestseller Myst: that of being widely purchased but only rarely actually, seriously played. Unlike the coolly cerebral Myst, Wing Commander was a crowd-pleaser from top to bottom, with huge appeal, even beyond its spectacular audiovisuals, to anyone who had ever thrilled to the likes of a Star Wars film. It was, in other words, computerized entertainment for the mainstream rather than for a select cognoscenti. Just as all but the most incorrigible snobs could have a good time at a Star Wars showing, few gamers of any stripe could resist the call of Wing Commander. In an era when the lines of genre were being drawn more and more indelibly, one of the most remarkable aspects of Wing Commander’s reception is the number of genre lines it was able to cross. Whether they normally preferred strategy games or flight simulators, CRPGs or adventures, everybody wanted to play Wing Commander.

At a glance, Chris Roberts’s gung-ho action movie of a game would seem to be rather unsuited for the readership of Computer Gaming World, a magazine that had been born out of the ashes of the tabletop-wargaming culture of the 1970s and was still beholden most of all to computer games in the old slow-paced, strategic grognard tradition. Yet the magazine and its readers loved Wing Commander. In fact, they loved Wing Commander as they had never loved any other game before. After reaching the number-one position in Computer Gaming World’s readers’ poll in February of 1991, it remained there for an unprecedented eleven straight months, attaining already in its second month on top the highest aggregate score ever recorded for a game. When it was finally replaced at number one in January of 1992, the replacement was none other than the new Wing Commander II. Wing Commander I then remained planted right there behind its successor at number two until April, when the magazine’s editors, needing to make room for other games, felt compelled to “retire” it to their Hall of Fame.

In other places, the huge genre-blurring success of Wing Commander prompted an identity crisis. Shay Addams, adventure-game solver extraordinaire, publisher of the Questbusters newsletter and the Quest for Clues series of books, received so many requests to cover Wing Commander that he reported he had been “on the verge of scheduling a brief look” at it. But in the end, he had decided a little petulantly, it “is just a shoot-em-up-in-space game in which the skills necessary are vastly different from those required for completing a quest. (Then again, there is always the possibility of publishing Simulationbusters.)” The parenthetical may have sounded like a joke, but Addams apparently meant it seriously — or, at least, came to mean it seriously. The following year, he started publishing a sister newsletter to Questbusters called Simulations!. It’s hard to imagine him making such a decision absent the phenomenon that was Wing Commander.

So, there was obviously much more to Wing Commander than a glorified tech demo. If we hope to understand what its secret sauce might have been, we need to look at the game itself again, this time from the perspective of a player rather than a developer.

One possibility can be excised immediately. The “space combat simulation” part of the game — i.e., the game part of the game — is fun today and was graphically spectacular back in 1990, but it’s possessed of neither huge complexity nor the sort of tactical or strategic interest that would seem to be required of a title that hoped to spend eleven months at the top of the Computer Gaming World readers’ charts. Better graphics and embodied approach aside, it’s a fairly commonsense evolution of Elite’s combat engine, complete with its lack of true inertia, its sounds in the vacuum of space, and all the other space-fantasy trappings of Star Wars. If we hope to find the real heart of the game’s appeal, it isn’t here that we should look, but rather to the game’s fiction — to the movie Origin Systems built around Chris Roberts’s little shoot-em-up-in-space game.

Wing Commander casts you as an unnamed young pilot, square-jawed and patriotic, who has just been assigned to the strike carrier Tiger’s Claw, out on the front lines of humanity’s war against the vicious Kilrathi, a race of space-faring felines. (Cat lovers should approach this game with caution!) Over the course of the game, you fly a variety of missions in a variety of star systems, affecting the course of the wider war as you do so in very simple, hard-branching ways. Each mission is introduced via a briefing scene, and concluded, if you make it back alive, with a debriefing. (If you don’t make it back alive, you at least get the rare pleasure of watching your own funeral.) Between missions, you can chat with your fellow pilots and a friendly bartender in the Tiger’s Claw’s officers’ lounge, play on a simulator in the lounge that serves as the game’s training mode, and keep track of your kill count along with that of the other pilots on the squadron blackboard. As you fly missions and your kill count piles up, you rise through the Tiger’s Claw’s hierarchy from an untested rookie to the steely-eyed veteran on whom everyone else in your squadron depends. You also get the chance to fly several models of space-borne fighters, each with its own flight characteristics and weapons loadouts.

A mission briefing.

The inspirations for Wing Commander as a piece of fiction aren’t hard to find in either the game itself or the many interviews Chris Roberts has given about it over the years. Leaving aside the obvious influence of Star Wars on the game’s cinematic visuals, Wing Commander fits most comfortably into the largely book-bound sub-genre of so-called “military science fiction.” A tradition which has Robert Heinlein’s 1959 novel Starship Troopers as its arguable urtext, military science fiction is less interested in the exploration of strange new worlds, etc., than it is in the exploration of possible futures of warfare in space.

There isn’t much doubt where Wing Commander‘s historical inspiration lies.

Because worldbuilding is hard and extrapolating the nitty-gritty details of future modes of warfare is even harder, much military science fiction is built out of thinly veiled stand-ins for the military and political history of our own little planet. So, for example, David Weber’s long-running Honor Harrington series transports the Napoleonic Wars into space, while Joe Haldeman’s The Forever War — probably the sub-genre’s best claim to a work of real, lasting literary merit — is based largely on the author’s own experiences in Vietnam. Hewing to this tradition, Wing Commander presents a space-borne version of the grand carrier battles which took place in the Pacific during World War II — entirely unique events in the history of human warfare and, as this author can well attest, sheer catnip to any young fellow with a love of ships and airplanes and heroic deeds and things that go boom. Wing Commander shares this historical inspiration with another of its obvious fictional inspirations, the fun if terminally cheesy 1978 television series Battlestar Galactica. (Come to think of it, much the same description can be applied to Wing Commander.)

Sparkling conversationalists these folks aren’t.

Wing Commander is also like Battlestar Galactica in another respect: it’s not so much interested in constructing a detailed technological and tactical framework for its vision of futuristic warfare — leave that stuff to the books! — as it is in choosing whatever thing seems coolest at any given juncture. We know nothing really about how or why any of the stuff in the game works, just that it’s our job to go out and blow stuff up with it. Nowhere is that failing, if failing it be, more evident than in the very name of the game. “Wing Commander” is a rank in the Royal Air Force and the air forces of other Commonwealth nations denoting an officer in charge of several squadrons of aircraft. It’s certainly not an appropriate designation for the role you play here, that of a rookie fighter pilot who commands only a single wingman. This Wing Commander is called Wing Commander strictly because it sounds cool.

In time, Origin’s decision to start hiring people to serve specifically in the role of writer would have a profound effect on the company’s games, but few would accuse this game, one of Origin’s first with an actual, dedicated “lead writer,” of being deathless fiction. To be fair to Jeff George, it does appear that he spent the majority of his time drawing up the game’s 40 missions, serving in a role that would probably be dubbed “scenario designer” or “level designer” today rather than “writer.” And it’s not as if Chris Roberts’s original brief gave him a whole lot to work with. This is, after all, a game where you’re going to war against a bunch of anthropomorphic house cats. (Our cat told me she thought about conquering the galaxy once or twice, but she wasn’t sure she could fit it into the three hours per day she spends awake.) The Kilrathi are kind of… well, there’s just no getting around it, is there? The whole Kilrathi thing is pretty stupid, although it does allow your fellow pilots to pile on epithets like “fur balls,” “fleabags,” and, my personal favorite, “Killie-cats.”

Said fellow pilots are themselves a collection of ethnic stereotypes so over-the-top that they would verge on the offensive if it weren’t so obvious that Origin just didn’t have a clue. Spirit is Japanese, so of course she suffixes every name with “-san” or “-sama” even when speaking English, right? And Angel is French, so of course she says “bonjour” a lot, right? Right?

My second favorite Wing Commander picture comes from the manual rather than the game proper. Our cat would look precisely this bitchy if I shoved her into a spacesuit.

Despite Chris Roberts’s obvious and oft-stated desire to put you into an interactive movie, there’s little coherent narrative arc to Wing Commander, even by action-movie standards. Every two to four missions, the Tiger’s Claw jumps to some other star system and some vague allusion is made to the latest offensive or defensive operation, but there’s nothing to really hang your hat on in terms of a clear unfolding narrative of the war. A couple of cut scenes do show good or bad events taking place elsewhere, based on your performance in battle — who knew one fighter pilot could have so much effect on the course of a war? — but, again, there’s just not enough detail to give a sense of the strategic situation. One has to suspect that Origin didn’t know what was really going on any better than the rest of us.

My favorite Wing Commander pictures, bar none. What I love best about these and the picture above is the ears on the helmets. And what I love best about the ears on the helmets is that there’s no apparent attempt to be cheeky or funny in placing them there. (One thing this game is totally devoid of is deliberate humor. Luckily, there’s plenty of non-deliberate humor to enjoy.) Someone at Origin said, “Well, they’re cats, so they have to have space in their helmets for their ears, right?” and everyone just nodded solemnly and went with it. If you ask me, nothing illustrates Wing Commander‘s charming naivete better than this.

In its day, Wing Commander was hugely impressive as a technological tour de force, but it’s not hard to spot the places where it really suffered from the compressed development schedule. There’s at least one place, for example, where your fellow pilots talk about an event that hasn’t actually happened yet, presumably due to last-minute juggling of the mission order. More serious are the many and varied glitches that occur during combat, from sound drop-outs to the occasional complete lock-up. Most bizarrely of all to our modern sensibilities, Origin didn’t take the time to account for the speed of the computer running the game. Wing Commander simply runs flat-out all the time, as fast as the hosting computer can manage. This delivered a speed that was just about perfect on a top-of-the-line 80386-based machine of 1990, but that made it effectively unplayable on the next generation of 80486-based machines that started becoming popular just a couple of years later; this game was definitely not built with any eye to posterity. Wing Commander would wind up driving the development of so-called “slowdown” programs that throttled back later hardware to keep games like this one playable.

Still, even today Wing Commander remains a weirdly hard nut to crack in this respect. For some reason, presumably involving subtle differences between real and emulated hardware, it’s impossible to find an entirely satisfactory speed setting for the game in the DOSBox emulator. A setting which seems perfect when flying in open space slows down to a crawl in a dogfight; a setting which delivers a good frame rate in a dogfight is absurdly fast when fewer other ships surround you. The only apparent solution to the problem is to adjust the DOSBox speed settings on the fly as you’re trying not to get shot out of space by the Kilrathi — or, perhaps more practically, to just find something close to a happy medium and live with it. One quickly notices when reading about Wing Commander the wide variety of opinions about its overall difficulty, from those who say it’s too easy to those who say it’s way too hard to those who say it’s just right. I wonder whether this disparity is down to the fact that, thanks to the lack of built-in throttling, everyone is playing a slightly different version of the game.
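
For those wrestling with the game in DOSBox today, the usual workaround is worth sketching out. Rather than letting the emulator auto-detect a speed, you can pin it to a fixed cycle count in its configuration file and then nudge that count during play with DOSBox's standard Ctrl-F11 (slower) and Ctrl-F12 (faster) hotkeys. Something like the following is a plausible starting point; the exact number is only a guess on my part, and will vary with your machine and your tolerance for sluggish dogfights:

    [cpu]
    core=normal
    cputype=auto
    cycles=fixed 4000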

The only thing worse than being a cat lover in this game is being a pacifist. And everyone knows cats don’t like water, Shotglass… sheesh.

It becomes clear pretty quickly that the missions are only of a few broad types, encompassing patrols, seek-and-destroy missions, and escort missions (the worst!), but the context provided by the briefings keeps things more interesting than they might otherwise be, as do the variety of spacecraft you get to fly and fight against. The mission design is pretty good, although the difficulty does ebb and spike a bit more than it ideally might. In particular, one mission found right in the middle of the game — the second Kurosawa mission, for those who know the game already — is notorious for being all but impossible. Chris Roberts has bragged that the missions in the finished game “were exactly the ones that Jeff George designed on paper — we didn’t need to do any balancing at all!” In truth, I suspect that the lack of balancing is more a bug than a feature.

Um, yes. I’m standing here, aren’t I? Should this really be a judgment call?

Roberts’s decision to allow you to take your lumps and go on even when you fail at a mission was groundbreaking at the time. Yet, having made this very progressive decision, he then proceeded to implement it in the most regressive way imaginable. When you fail in Wing Commander, the war as a whole goes badly, thanks again to that outsize effect you have upon it, and you get punished by being forced to fly against even more overwhelming odds in inferior fighters. Imagine, then, what it’s like to play Wing Commander honestly, without recourse to save games, as a brand new player. Still trying to get your bearings as a rookie pilot, you don’t perform terribly well in the first two or three missions. In response, your commanding officer delivers a constant drumbeat of negative feedback, while the missions just keep getting harder and harder at what feels like an almost exponential pace, ensuring that you continue to suck every time you fly. By the time you’ve failed at 30 missions and your ineptitude has led to the Tiger’s Claw being chased out of the sector with its (striped?) tail between its legs, you might just need therapy to recover from the experience.

What ought to happen, of course, is that failing at the early missions should see you assigned to easier rather than harder ones — no matter the excuse; Origin could make something up on the fly, as they so obviously did with so much of the game’s fiction — that give you a chance to practice your skills. Experienced, hardcore players could still have their fun by trying to complete the game in as few missions as possible, while newcomers wouldn’t have to feel like battered spouses. Or, if such an elegant solution wasn’t possible, Origin could at least have given us player-selectable difficulty levels.

As it is, the only practical way to play as a newcomer is to ignore all of Origin’s exhortations to play honestly and just keep reloading until you successfully complete each mission; only in this way can you keep the escalating difficulty manageable. (The one place where I would recommend that you take your lumps and continue is in the aforementioned second Kurosawa mission. Losing here will throw you briefly off-track, but the missions that follow aren’t too difficult, and it’s easier to play your way to victory through them than to try to beat Mission Impossible.) This approach, it should be noted, drove Chris Roberts crazy; he considered it nothing less than a betrayal of the entire premise around which he’d designed his game. Yet he had only himself to blame. Like much in Wing Commander, the discrepancy between the game Roberts wants to have designed and the one he’s actually designed speaks to the lack of time to play it extensively before its release, and thereby to shake all these problems out.

And yet. And yet…

Having complained at such length about Wing Commander, I find myself at something of an impasse, in that my overall verdict on the game is nowhere near as negative as these complaints would imply. It’s not even a case of Wing Commander being, like, say, most of the Ultima games, a groundbreaking work in its day that’s a hard sell today. No, Wing Commander is a game I continue to genuinely enjoy despite all its obvious problems.

In writing about all these old games over the years, I’ve noticed that those titles I’d broadly brand as classics and gladly recommend to contemporary players tend to fall into two categories. There are games like, say, The Secret of Monkey Island that know exactly what they’re trying to do and proceed to do it all almost perfectly, making all the right choices; it’s hard to imagine how to improve these games in any but the tiniest of ways within the context of the technology available to their developers. And then there are games like Wing Commander that are riddled with flaws, yet still manage to be hugely engaging, hugely fun, almost in spite of themselves. Who knows, perhaps trying to correct all the problems I’ve spent so many words detailing would kill something ineffably important in the game. Certainly the many sequels and spinoffs to the original Wing Commander correct many of the failings I’ve described in this article, yet I’m not sure any of them manage to be a comprehensively better game. Like so many creative endeavors, game design isn’t a matter of simple arithmetic, of adding up strengths and subtracting flaws. Much as I loathe the lazy critic’s cliché “more than the sum of its parts,” it feels hard to avoid it here.

It’s true that many of my specific criticisms have an upside to serve as a counterpoint. The fiction may be giddy and ridiculous, but it winds up being fun precisely because it’s so giddy and ridiculous. This isn’t a self-conscious homage to comic-book storytelling of the sort we see so often in more recent games from this Age of Irony of ours. No, this game really does think this stuff it’s got to share with you is the coolest stuff in the world, and it can’t wait to get on with it; it lacks any form of guile just as much as it does any self-awareness. In this as in so many other senses, Wing Commander exudes the personality of its creator, and helps you to understand why it was that everyone at Origin Systems so liked to have this high-strung, enthusiastic kid around them. There’s an innocence about the game that leaves one feeling happy that Chris Roberts was steered away from his original plans for a “gritty” story full of moral ambiguity; one senses that he wouldn’t have been able to do that anywhere near as well as he does this. Even the Kilrathi enemies, silly as they are, take some of the sting out of war; speciesist though the sentiment may be, at least it isn’t people you’re killing out there. Darned if the fiction doesn’t win me over in the end with its sheer exuberance, all bright primary emotions to match the bright primary colors of the VGA palette. Sometimes you’re cheering along with it, sometimes you’re laughing at it, but you’re always having a good time. The whole thing is just too gosh-darned earnest to annoy me like most bad writing does.

Even the rogue’s gallery of ethnic stereotypes that is your fellow pilots doesn’t grate as much as it might. Indeed, Origin’s decision to include lots of strong, capable women and people of color among the pilots should be applauded. Whatever else you can say about Wing Commander, its heart is almost always in the right place.

Winning a Golden Sun for “surviving the destruction of my ship.” I’m not sure, though, that “sacrificing my vessel” was really an act of bravery, under the circumstances. Oh, well, I’ll take whatever hardware they care to give me.

One thing Wing Commander understands very well is the value of positive reinforcement — the importance of, as Sid Meier puts it, making sure the player is always the star of the show. In that spirit, the kill count of even the most average player will always advance much faster on the squadron’s leader board than that of anyone else in the squadron. As you play through the missions, you’re given promotions and occasionally medals, the latter delivered amidst the deafening applause of your peers in a scene lifted straight from the end of the first Star Wars film (which was in turn aping the Nuremberg Rally shown in Triumph of the Will, but no need to think too much about that in this giddy context). You know at some level that you’re being manipulated, just as you know the story is ridiculous, but you don’t really care. Isn’t this feeling of achievement a substantial part of the reason that we play games?

Another thing Wing Commander understands — or perhaps stumbled into accidentally thanks to the compressed development schedule — is the value of brevity. Thanks to the tree structure that makes it impossible to play all 40 missions on any given run-through, a typical Wing Commander career spans no more than 25 or 30 missions, most of which can be completed in half an hour or so, especially if you use the handy auto-pilot function to skip past all the point-to-point flying and just get to the places where the shooting starts. (Personally, I prefer the more organic feel of doing all the flying myself, but I suspect I’m a weirdo in this as in so many other respects.) The relative shortness of the campaign means that the game never threatens to run the flight engine’s rather limited box of tricks into the ground. It winds up leaving you wanting more rather than trying your patience. For all these reasons, and even with all its obvious problems technical and otherwise, Wing Commander remains good fun today.

Which doesn’t of course mean that any self-respecting digital antiquarian can afford to neglect its importance to gaming history. Wing Commander was the first blockbuster of the 1990s, and it remained the most commercially dominant franchise in computer gaming until the arrival of Doom in 1993 shook everything up yet again; it can be read as either a cause or a symptom of the changing times. There was a sense even in 1990 that Wing Commander’s arrival, coming so appropriately at the beginning of a new decade, marked a watershed moment, and time has only strengthened that impression. Chris Crawford, this medium’s eternal curmudgeon — every creative field needs one of them to serve as a corrective to the hype-merchants — has accused Wing Commander of nothing less than ruining the culture of gaming for all time. By raising the bar so high on ludic audiovisuals, runs his argument, Wing Commander dramatically raised the financial investment necessary to produce a competitive game. This in turn made publishers, reluctant to risk all that capital on anything but a sure bet, more conservative in the sorts of projects they were willing to approve, causing more experimental games with only niche appeal to disappear from the market. “It became a hit-driven industry,” Crawford says. “The whole marketing strategy, economics, and everything changed, in my opinion, much for the worse.”

There’s some truth to this assertion, but it’s also true that publishers had been growing more conservative and budgets had been creeping upward for years before Wing Commander. By 1990, Infocom’s literary peak was years in the past, as were Activision’s experimental period and Electronic Arts’s speculations on whether computers could make you cry. In this sense, then, Wing Commander can be seen as just one more point on a trend line, not the dramatic break which Crawford would claim it to be. Had it not come along when it did to raise the audiovisual bar, something else would have.

Where Wing Commander does feel like a cleaner break with the past is in its popularizing of the use of narrative in a traditionally non-narrative-driven genre. This, I would assert, is the real source of the game’s appeal, then and now. The shock and awe of seeing the graphics and hearing the sound and music for the first time inevitably faded even back in the day, and today of course the whole thing looks garish and a little kitschy with those absurdly big pixels. And certainly the space-combat game alone wasn’t enough to sustain obsessive devotion back in the day, while today the speed issues can at times make it more than a little exasperating to actually play Wing Commander at all. But the appeal of, to borrow from Infocom’s old catch-phrase, waking up inside a story — waking up inside a Star Wars movie, if you like — and being swept along on a rollicking, semi-interactive ride is, it would seem, eternal. It may not have been the reason most people bought Wing Commander in the early 1990s — that had everything to do with those aforementioned spectacular audiovisuals — but it was the reason they kept playing it, the reason it remained the best single computer game in the country according to Computer Gaming World‘s readers for all those months. Come for the graphics and sound, stay for the story. The ironic aspect of all this is that, as I’ve already noted, Wing Commander‘s story barely qualified as a story at all by the standards of conventional fiction. Yet, underwhelming though it was on its own merits, it worked more than well enough in providing structure and motivation for the individual missions.

The clearest historical antecedent to Wing Commander must be the interactive movies of Cinemaware, which had spent the second half of the 1980s trying, with somewhat mixed success, to combine cinematic storytelling with modes of play that departed from traditional adventure-game norms. John Cutter, a designer at Cinemaware, has described how Bob Jacob, the company’s founder and president, reacted to his first glimpse of Wing Commander: “I don’t think I’ve ever seen him look so sad.” With his company beginning to fall apart around him, Jacob had good reason to feel sad. He least of all would have imagined Origin Systems — they of the aesthetically indifferent CRPG epics — as the company that would carry the flag of cinematic computer gaming forward into the new decade, but the proof was right there on the screen in front of him.

There are two accounts, both of them true in their way, to explain how the adventure game, a genre that in the early 1990s was perhaps the most vibrant and popular in computer gaming, ended the decade an irrelevancy to gamers and publishers alike. One explanation, which I’ve gone into a number of times already on this blog, focuses on a lack of innovation and, most of all, a lack of good design practices among far too many adventure developers; these lacks left the genre identified primarily with unfun pixel hunts and illogical puzzles in the minds of far too many players. But another, more positive take on the subject says that adventure games never really went away at all: their best attributes were rather merged into other genres. Did adventure games disappear or did they take over the world? As in so many cases, the answer depends on your perspective. If you focus on the traditional mechanics of adventure games — exploring landscapes and solving puzzles, usually non-violently — as their defining attributes, the genre did indeed go from thriving to all but dying in the course of about five years. If, on the other hand, you choose to see adventure games more broadly as games where you wake up inside a story, it can sometimes seem like almost every game out there today has become, whatever else it is, an adventure game.

Wing Commander was the first great proof that many more players than just adventure-game fans love story. Players love the way a story can make them feel a part of something bigger as they play, and, more prosaically but no less importantly, they love the structure it can give to their play. One of the dominant themes of games in the 1990s would be the injection of story into genres which had never had much use for it before: the unfolding narrative of discovery built into the grand-strategy game X-Com, the campaign modes of the real-time-strategy pioneers Warcraft and Starcraft, the plot that gave meaning to all the shooting in Half-Life. All of these are among the most beloved titles of the decade, spawning franchises that remain more than viable to this day. One has to assume this isn’t a coincidence. “The games I made were always about narrative because I felt that was missing for me,” says Chris Roberts. “I wanted that sense of story and progression. I felt like I wasn’t getting that in games. That was one of my bigger drives when I was making games, was to get that, that I felt like I really wanted and liked from other media.” Clearly many others agreed.

(Sources: the books Wing Commander I and II: The Ultimate Strategy Guide by Mike Harrison and Game Design Theory and Practice by Richard Rouse III; Retro Gamer 59 and 123; Questbusters of July 1989, August 1990, and April 1991; Computer Gaming World of September 1989 and November 1992; Amiga Computing of December 1988. Online sources include documents hosted at the Wing Commander Combat Information Center, US Gamer‘s profile of Chris Roberts, The Escapist‘s history of Wing Commander, Paul Dean’s interview with Chris Roberts, and Matt Barton’s interview with George “The Fat Man” Sanger. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Wing Commander I and II can be purchased in a package together with all of their expansion packs from GOG.com.)

 
 


From Squadron to Wingleader

Chris Roberts and Richard Garriott, 1988

At the Summer Consumer Electronics Show in June of 1989, Origin Systems and Brøderbund Software announced that they wouldn’t be renewing the distribution contract the former had signed with the latter two years before. It was about as amicable a divorce as has ever been seen in the history of business; in this respect, it could hardly have stood in greater contrast to the dust-up that had ended Origin’s relationship with Electronic Arts, their previous distributor, in 1987. Each company was full of rosy praise and warm wishes for the other at a special “graduation party” Brøderbund threw for Origin at the show. “Brøderbund has been one of the few affiliated-label programs that truly helps a small company grow to a size where it can stand on its own and enter the real world,” said Origin’s Robert Garriott, making oblique reference to the more predatory approach of Electronic Arts. In response, Brøderbund’s Gary Carlston toasted that “it’s been rewarding to have helped Origin pursue its growth, and it’s exciting to see the company take this step,” confirming yet one more time Brøderbund’s well-earned reputation as the nice guys of their industry who somehow kept managing to finish first. And so, with a last slap on the rump and a final chorus of “Kumbaya,” Brøderbund sent Origin off to face the scary “world of full-service software publishing” alone.

It was a bold step for Origin, especially given that they still hadn’t solved a serious problem that had dogged them since their founding in the Garriott brothers’ family garage six years earlier. The first two games released by the young company back in 1983 had been Ultima III, the latest installment in Richard Garriott’s genre-defining CRPG series, and Caverns of Callisto, an action game written by Richard’s high-school buddy Chuck Bueche. Setting the frustrating pattern for what was to come, Ultima III soared up the bestseller charts, while Caverns of Callisto disappeared without a trace. In the years that followed, Origin released some non-Ultima games that were moderately successful, but never came close to managing a full-on hit outside of their signature franchise. This failure left them entirely dependent for their survival on Richard Garriott coming up with a new and groundbreaking Ultima game every couple of years, and on that game then proceeding to sell over 200,000 copies. Robert Garriott, as shrewd a businessman as any in his industry, knew that staking his company’s entire future on a single game every two years was at best a risky way to run things. Yet, try as he might, he couldn’t seem to break the pattern.

Origin had a number of factors working against them in their efforts to diversify, but the first and most ironic among them must be the very outsize success of Ultima itself. The company had become so identified with Ultima that many gamers barely realized that they did anything else. As for other folks working in the industry, they had long jokingly referred to Origin Systems as “Ultima Systems.” Everyone knew that the creator of Ultima was also the co-founder of Origin, and the brother of the man who directed its day-to-day operations. In such a situation, there must be a real question of whether any other game project, even a potentially great one, could avoid being overshadowed by the signature franchise, could find enough oxygen to thrive. Added to these concerns, which would be applicable to any company in such a situation, must be the unique nature of the cast of characters at Origin. Richard Garriott’s habit of marching around trade-show floors in full Lord British regalia, his entourage in tow, didn’t always endear him to the rest of the industry. There were, it sometimes seemed, grounds to question whether Richard himself knew that he wasn’t actually a monarch, just a talented kid from suburban Houston with nary a drop of royal blood coursing through his veins. At times, Origin Systems could feel perilously close to a cult of personality. Throw in the company’s out-of-the-way location in Austin, Texas, and attracting really top-flight projects became quite a challenge for them.

So, when it came to games that weren’t Ultima, Origin had had to content themselves with projects one notch down from the top tier — projects which, whether because they weren’t flashy enough or were just too nichey, weren’t of huge interest to the bigger publishers. Those brought in enough revenue to justify their existence but not much more, and thus Robert Garriott continued to bet the company every two years on his brother’s latest Ultima. It was a nerve-wracking way to live.

And then, in 1990, all that changed practically overnight. This article and the one that follows will tell the story of how the house that Ultima built found itself with an even bigger franchise on its hands.


Chris Roberts

By the end of the 1980s, the North American and European computer-game industries, which had heretofore existed in almost total isolation from one another, were becoming slowly but steadily more interconnected. The major American publishers were setting up distribution arms in Europe, and the smaller ones were often distributing their wares through the British importer U.S. Gold. Likewise, the British Firebird and Rainbird labels had set up offices in the United States, and American publishers like Cinemaware were doing good business importing British games for American owners of the Commodore Amiga, a platform that was a bit neglected by domestic developers. But despite these changes, the industry as a whole remained a stubbornly bifurcated place. European developers remained European, American developers remained American, and the days of a truly globalized games industry remained far in the future. The exceptions to these rules stand out all the more thanks to their rarity. And one of these notable exceptions was Chris Roberts, the young man who would change Origin Systems forever.

With a British father and an American mother, Chris Roberts had been a trans-Atlantic sort of fellow right from the start. His father, a sociologist at the University of Manchester, went with his wife to Guatemala to do research shortly after marrying, and it was there that Chris was conceived in 1967. The mother-to-be elected to give birth near her family in Silicon Valley. (From the first, it seems, computers were in the baby’s blood.) After returning for a time to Guatemala, where Chris’s father was finishing his research, the little Roberts clan settled back in Manchester, England. A second son arrived to round out the family in 1970.

His first international adventure behind him, Chris Roberts grew up as a native son of Manchester, developing the distinct Mancunian intonation he retains to this day along with his love of Manchester United football. When first exposed to computers thanks to his father’s position at Manchester University, the boy was immediately smitten. In 1982, when Chris was 14, his father signed him up for his first class in BASIC programming and bought a BBC Micro for him to practice on at home. As it happened, the teacher of that first programming class became a founding editor of the new magazine BBC Micro User. Hungry for content, the magazine bought two of young Chris’s first simple BASIC games to publish as type-in listings. Just like that, he was a published game developer.

Britain at the time was going absolutely crazy for computers and computer games, and many of the new industry’s rising stars were as young or younger than Roberts. It thus wasn’t overly difficult for him to make the leap to designing and coding boxed games to be sold in stores. Imagine Software published his first such, a platformer called Wizadore, in 1985; Superior Software published a second, a side-scrolling shooter called Stryker’s Run, in 1986. But the commercial success these titles could hope to enjoy was limited by the fact that they ran on the BBC Micro, a platform which was virtually unknown outside of Britain and even inside of its home country was much less popular than the Sinclair Spectrum as a gaming machine. Being amply possessed of the contempt most BBC Micro owners felt toward the cheap and toy-like “Speccy,” Roberts decided to shift his attention instead to the Commodore 64, the most popular gaming platform in the world at the time. This decision, combined with another major decision made by his parents, set him on his unlikely collision course with Origin Systems in far-off Austin, Texas.

In early 1986, Roberts’s father got an offer he couldn’t refuse in the form of a tenured professorship at the University of Texas. After finishing the spring semester that year, he, his wife, and his younger son thus traded the gray skies of Manchester for the sunnier climes of Austin. Chris was just finishing his A-Levels at the time. Proud Mancunian that he was, he declared that he had no intention of leaving England — and certainly not for a hick town in the middle of Texas. But he had been planning all along to take a year off before starting at the University of Manchester, and his parents convinced him to at least join the rest of the family in Austin for the summer. He agreed, figuring that it would give him a chance to work free of distractions on a new action/adventure game he had planned as his first project for the Commodore 64. Yet what he actually found in Austin was lots of distractions — eye-opening distractions to warm any young man’s heart. Roberts:

The weather was a little nicer in Austin. The American girls seemed to like the English accent, which wasn’t bad, and there was definitely a lot… everything seemed like it was cheaper and there was more of it, especially back then. Now, the world’s become more homogenized so there’s not things you can only get in America that you don’t get in England as well. Back then it was like, the big American movies would come out in America and then they would come out in England a year later and stuff. So I came over and was like, “Ah, you know, this is pretty cool.”

There were also the American computers to consider; these tended to be much more advanced than their British counterparts, sporting disk drives as universal standard equipment at a time when most British games — including both of Roberts’s previous games — were still published on cassette tapes. In light of all these attractions, it seems doubtful whether Roberts would have kept his resolution to return to Manchester in any circumstances. But there soon came along the craziest of coincidences to seal the deal.

Roberts had decided that he really needed to find an artist to help him with his Commodore 64 game-in-progress. Entering an Austin tabletop-gaming shop one day, he saw a beautiful picture of a gladiator hanging on the wall. The owner of the shop told him the picture had been drawn by a local artist, and offered to call the artist for him right then and there if Roberts was really interested in working with him. Roberts said yes, please do. The artist in question was none other than Denis Loubet, whose professional association with Richard Garriott stretched back to well before Origin Systems had existed, to when he’d drawn the box art for the California Pacific release of Akalabeth in 1980.

Denis Loubet

After years of working as a contractor, Loubet was just about to be hired as Origin’s first regular in-house artist. Nevertheless, he liked Roberts and thought his game had potential, and agreed to do the art for it as a moonlighting venture. Loubet soon showed what he was working on to Richard Garriott and Dallas Snell, the latter of whom tended to serve as a sort of liaison between the business side of the company, in the person of Robert Garriott, and the creative side, in the person of Richard. All three parties were as impressed by the work-in-progress as Loubet had been, and they invited Chris to Origin’s offices to ask if he’d be interested in publishing it through them. Prior to this point, Roberts had never even heard of Origin Systems or the Ultima series; he’d grown up immersed in the British gaming scene, where neither had any presence whatsoever. But he liked the people at Origin, liked the atmosphere around the place, and perhaps wasn’t aware enough of what the company represented to be leery of it in the way of other developers who were peddling promising projects around the industry. “After my experiences in England, which is like swimming in a big pool of sharks,” he remembers, “I felt comfortable dealing with Origin.”

Times of Lore

All thoughts of returning to England had now disappeared. Working from Origin’s offices, albeit still as a contracted outside developer rather than an employee, Roberts finished his game, which came to be called Times of Lore. In the course of its development, the game grew considerably in scope and ambition, and, as seemed only appropriate given the company that was to publish it, took on some light CRPG elements as well. In much of this, Roberts was inspired by David Joiner’s 1987 action-CRPG The Faery Tale Adventure. American influences aside, though, Times of Lore still fit best of all into the grand British tradition of free-scrolling, free-roaming 8-bit action/adventures, a sub-genre that verged on completely unknown to American computer gamers. Roberts made sure the whole game could fit into the Commodore 64’s memory at once to facilitate a cassette-based version for the European market.

Unfortunately, his game got to enjoy only a middling level of sales success in return for all his efforts. As if determined to confirm the conventional wisdom that had caused so many developers to steer clear of them, Origin released Times of Lore almost simultaneously with the Commodore 64 port of Ultima V in 1988, leaving Roberts’s game overshadowed by Lord British’s latest. And in addition to all the baggage that came with the Origin logo in the United States, Times of Lore suffered all the disadvantages of being a pioneer of sorts in Europe, the first Origin title to be pushed aggressively there via a new European distribution contract with MicroProse. While that market would undoubtedly have understood the game much better had they given it a chance, no one there yet knew what to make of the company whose logo was on the box. Despite its strengths, Times of Lore thus failed to break the pattern that had held true for Origin for so long. It turned into yet another non-Ultima that was also a non-hit.

Times of Lore

But whatever the relative disappointments, Times of Lore at least wasn’t a flop, and Chris Roberts stayed around as a valued member of the little Origin family. Part of the reason the Origin people wanted to keep him around was simply because they liked him so much. He nursed the same passions for fantasy and science fiction as most of them, with just enough of a skew provided by his British upbringing to make him interesting. And he positively radiated energy and enthusiasm. He’s never hard to find in Origin group shots of the time. His face stands out like that of a nerdy cherub — he had never lost his facial baby fat, making him look pudgier in pictures than he was in real life — as he beams his thousand-kilowatt smile at all and sundry. Still, it was hardly his personality alone that made him such a valued colleague; the folks at Origin also came to have a healthy respect for his abilities. Indeed, and as we’ve already seen in an earlier article, the interface of Times of Lore had a huge influence on that of no less vital an Origin game than Ultima VI.

Alas, Roberts’s own next game for Origin would be far less influential. After flirting for a while with the idea of doing a straightforward sequel to Times of Lore, he decided to adapt the engine to an even more action-oriented post-apocalyptic scenario. Roberts’s first game for MS-DOS, Bad Blood, was created in desultory fits and starts, one of those projects that limps to completion more out of inertia than passion. Released at last in 1990, it was an ugly flop on both sides of the Atlantic. Roberts blames marketplace confusion at least partially for its failure: “People who liked arcade-style games didn’t buy it because they thought Bad Blood would be another fantasy-role-play-style game. It was the worst of both worlds, a combination of factors that contributed to its lack of success.” In reality, though, the most telling factor of said combination was just that Bad Blood wasn’t very good, evincing little of the care that so obviously went into Times of Lore. Reviewers roundly panned it, and buyers gave it a wide berth. Thankfully for Chris Roberts’s future in the industry, the game that would make his name was already well along at Origin by the time Bad Blood finally trickled out the door.

Bad Blood

Had it come to fruition in its original form, Roberts’s third game for Origin would have marked even more of a departure for him than the actual end result would wind up being. Perhaps trying to fit in better with Origin’s established image, he had the idea of doing, as he puts it, “a space-conquest game where you take over star systems, move battleships around, and invade planets. It was going to be more strategic than my earlier games.” But Roberts always craved a little more adrenaline in his designs than such a description would imply, and it didn’t take him long to start tinkering with the formula. The game moved gradually from strategic battles between slow-moving dreadnoughts in space to manic dogfights between fighter planes in space. In other words, to frame the shift the way the science-fiction-obsessed Roberts might well have chosen, his inspiration for his space battles changed from Star Trek to Star Wars. He decided “it would be more fun flying around in a fighter than moving battleships around the screen”; note the (unconscious?) shift in this statement from the player as a disembodied hand “moving” battleships around to the player as an embodied direct participant “flying around” herself in fighters. Roberts took to calling his work-in-progress Squadron.

To bring off his idea for an embodied space-combat experience, Roberts would have to abandon the overhead views used by all his games to date in favor of a first-person out-the-cockpit view, like that used by a game he and every other BBC Micro veteran knew well, Ian Bell and David Braben’s Elite. “It was the first space game in which I piloted a ship in combat,” says Roberts of Elite, “and it opened my eyes to the possibilities of where it could go.” On the plus side, Roberts knew that this and any other prospective future games he might make for Origin would be developed on an MS-DOS machine with many times the processing power of the little BBC Micro (or, for that matter, the Commodore 64). On the negative side, Roberts wasn’t a veritable mathematics genius like Ian Bell, the mastermind behind Elite‘s 3D graphics. Nor could he get away in the current marketplace with the wire-frame graphics of Elite. So, he decided to cheat a bit, both to simplify his life and to up the graphics ante. Inspired by the graphics of the Lucasfilm Games flight simulator Battlehawks 1942, he used pre-rendered bitmap images showing ships from several different sides and angles, which could then be scaled to suit the player’s out-the-cockpit view, rather than making a proper, mathematically rigorous 3D engine built out of polygons. As becomes clear all too quickly to anyone who plays the finished game, the results could be a little wonky, with views of the ships suddenly popping into place rather than smoothly rotating. Nevertheless, the ships themselves looked far better than anything Roberts could possibly have hoped to achieve on the technology of the time using a more honest 3D engine.
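
To make the trick a little more concrete, here is a minimal sketch, in Python, of the general approach described above: snap the view of a ship to the nearest pre-rendered bitmap, then scale that bitmap by distance. This is purely illustrative rather than Origin's actual code, and every constant and name in it is invented for the example:

    ANGLE_STEPS = 16            # hypothetical: one bitmap rendered every 22.5 degrees around the ship
    BASE_SPRITE_SIZE = 64       # hypothetical: width of the source bitmap in pixels
    PROJECTION_CONSTANT = 4000  # hypothetical tuning value standing in for a real perspective divide

    def pick_sprite_frame(relative_bearing_deg):
        """Snap the ship's bearing relative to the camera to the nearest pre-rendered frame."""
        step = 360.0 / ANGLE_STEPS
        return int(round(relative_bearing_deg / step)) % ANGLE_STEPS

    def projected_size(distance):
        """Scale the bitmap by distance rather than projecting true 3D polygons."""
        return max(1, int(BASE_SPRITE_SIZE * PROJECTION_CONSTANT / max(distance, 1)))

    # An enemy fighter 1,000 units away, seen 95 degrees off its nose, would be drawn
    # from frame 4 of its sprite sheet at roughly 256 pixels across.
    frame = pick_sprite_frame(95)    # -> 4
    size = projected_size(1000)      # -> 256

The popping that players noticed falls straight out of that rounding step: as a ship rotates, its image jumps from one pre-rendered frame to the next instead of turning smoothly.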

Denis Loubet, Roberts’s old partner in crime from the early days of Times of Lore, agreed to draw a cockpit as part of what would have to become yet another moonlighting gig for both of them; Roberts was officially still supposed to be spending his days at Origin on Bad Blood, while Loubet was up to his eyebrows in Ultima VI. Even at this stage, they were incorporating little visceral touches into Squadron, like the pilot’s hand moving the joystick around in time with what the player was doing with her own joystick in front of the computer screen. As the player’s ship got shot up, the damage was depicted visually there in the cockpit. Like the sparks and smoke that used to burst from the bridge controls on the old Star Trek episodes, it might not have made much logical sense — haven’t any of these space-faring societies invented fuses? — but it served the purpose of creating an embodied, visceral experience. Roberts:

It really comes from wanting to put the player in the game. I don’t want you to think you’re playing a simulation, I want you to think you’re really in that cockpit. When I visualized what it would be like to sit in a cockpit, those are the things I thought of.

I took the approach that I didn’t want to sacrifice that reality due to the game dynamics. If you would see wires hanging down after an explosion, then I wanted to include it, even if it would make it harder to figure out how to include all the instruments and readouts. I want what’s taking place inside the cockpit to be as real as what I’m trying to show outside it, in space. I’d rather show you damage as if you were there than just display something like “damage = 20 percent.” That’s abstract. I want to see it.

Squadron, then, was already becoming an unusually cinematic space-combat “simulation.” Because every action-movie hero needs a sidekick, Roberts added a wingman to the game, another pilot who would fly and fight at the player’s side. The player could communicate with the wingman in the midst of battle, passing him orders, and the wingman in turn would communicate back, showing his own personality; he might even refuse to obey orders on occasion.

As a cinematic experience, Squadron felt very much in tune with the way things in general were trending at Origin, to such an extent that one might well ask who was influencing whom. Like so many publishers in this era in which CD-ROM and full-motion video hovered alluringly just out of view on the horizon, Origin had begun thinking of themselves more and more in the terms of Hollywood. The official “product development structure” that was put in place around this time by Dallas Snell demanded an executive producer, a producer, an assistant producer, a director, an assistant director, and a lead writer for every game; of all the positions on the upper rungs of the chart, only those of lead artist and lead programmer wouldn’t have been listed in the credits of a typical Hollywood film. Meanwhile, Origin’s recent hire Warren Spector, who came to them with a master’s degree in film studies, brought his own ideas about games as interactive dramas that were less literal than Snell’s, but that would if anything prove even more of an influence on his colleagues’ developing views of just what it was Origin Systems really ought to be about. Just the previous year, Origin had released a game called Space Rogue, another of that long line of non-Ultima middling sellers, that had preceded Squadron in attempting to do Elite one better. A free-form player-directed game of space combat and trading, Space Rogue was in some ways much more ambitious than the more railroaded experience Roberts was now proposing. Yet there was little question of which game fit better with the current zeitgeist at Origin.

All of which does much to explain the warm reception accorded to Squadron when Chris Roberts, with Bad Blood finally off his plate, pitched it to Origin’s management formally in very early 1990. Thanks to all those moonlighting hours — as well as, one suspects, more than a few regular working hours — Roberts already had a 3D space-combat game that looked and played pretty great. A year or two earlier, that likely would have been that; Origin would have simply polished it up a little and shipped it. But now Roberts had the vision of building a movie around the game. Between flying a series of scripted missions, you would get to know your fellow pilots and follow the progress of a larger war between humanity and the Kilrathi, a race of savage cats in space.

Having finally made the hard decision to abandon the 8-bit market at the beginning of 1989, Origin was now pushing aggressively in the opposite direction from their old technological conservatism, being determined to create games that showed what the very latest MS-DOS machines could really do. Like Sierra before them, they had decided that if the only way to advance the technological state of the art among ordinary consumers was to release games whose hardware requirements were ahead of the curve — a reversal of the usual approach among game publishers, who had heretofore almost universally gone where the largest existing user base already was — then that’s what they would do. Squadron could become the first full expression of this new philosophy, being unapologetically designed to run well only on a cutting-edge 80386-based machine. In what would be a first for the industry, Chris Roberts even proposed demanding expanded memory beyond the traditional 640 K for the full audiovisual experience. For Roberts, stepping up from a Commodore 64, it was a major philosophical shift indeed. “Sod this, trying to make it work for the lowest common denominator—I’m just going to try and push it,” he said, and Origin was happy to hear it.

Ultima VI had just been completed, freeing personnel for another major project. Suspecting that Squadron might be the marketplace game changer he had sought for so long for Origin, Robert Garriott ordered a full-court press in March of 1990. He wanted his people to help Chris Roberts build his movie around his game, and he wanted them to do it in less than three months. They should have a preview ready to go for the Summer Consumer Electronics Show at the beginning of June, with the final product to ship very shortly thereafter.

Jeff George

Responsibility for the movie’s script was handed to Jeff George, one of the first of a number of fellow alumni of the Austin tabletop-game publisher Steve Jackson Games who followed Warren Spector to Origin. George was the first Origin employee hired explicitly to fill the role of “writer.” This development, also attributable largely to the influence of Spector, would have a major impact on Origin’s future games.

Obviously inspired by the ethical quandaries the Ultima series had become so known for over its last few installments, Chris Roberts had imagined a similarly gray-shaded world for his game, with scenarios that would cause the player to question whether the human empire she was fighting for was really any better than that of the Kilrathi. But George, to once again frame the issue in terms Roberts would have appreciated, pushed the game’s fiction toward the clear-cut good guys and bad guys of Star Wars, away from the more complicated moral universe of Star Trek. All talk of a human “empire,” for one thing, would have to go; everyone at Origin knew what their players thought of first when they thought of empires in space. Jeff George:

In the context of a space opera, empire had a bad connotation that would make people think they were fighting for the bad guys. The biggest influence I had on the story was to make it a little more black and white, where Chris had envisioned something grittier, with more shades of gray. I didn’t want people to worry about moral dilemmas while they were flying missions. That’s part of why it worked so well. You knew what you were doing, and knew why you were doing it. The good guys were really good, the bad guys were really bad.

The decision to simplify the political situation and sand away the thorny moral dilemmas demonstrates, paradoxical though it may first seem, a more sophisticated approach to narrative rather than the opposite. Some interactive narratives, like some non-interactive ones, are suited to exploring moral ambiguity. In others, though, the player just wants to fight the bad guys. While one can certainly argue that gaming has historically had far too many of the latter type and far too few of the former, there nevertheless remains an art to deciding which games are best suited for which.

Glen Johnson

Five more programmers and four more artists would eventually join what had been Chris Roberts and Denis Loubet’s little two-man band. With the timetable so tight, the artists were left to improvise large chunks of the narrative along with the game’s visuals. By imagining and drawing the “talking head” portraits of the various other pilots with which the player would interact, artist Glen Johnson wound up playing almost as big a role as Jeff George in crafting the fictional context for the game’s dogfights in space. Johnson:

I worked on paper first, producing eleven black-and-white illustrations. In most games, I would work from a written description of the character’s likes, dislikes, and personality. In this case, I just came up with the characters out of thin air, although I realized they wanted a mixture of men and women pilots. I assigned a call sign to each portrait.

Despite the lack of time at their disposal, the artists were determined to fit the movements of the characters’ mouths to the words of dialog that appeared on the screen, using techniques dating back to classic Disney animation. Said techniques demanded that all dialog be translated into its phonetic equivalent, something that could only be done by hand. Soon seemingly half the company was doing these translations during snatches of free time. Given that many or most players never even noticed the synchronized speech in the finished game, whether it was all worth it is perhaps a valid question, but the determination to go that extra mile in this regard does say much about the project’s priorities.

The music wound up being farmed out to a tiny studio specializing in videogame audio, one of vanishingly few of its kind at the time, which was run by a garrulous fellow named George Sanger, better known as “The Fat Man.” (No, he wasn’t terribly corpulent; that was sort of the joke.) Ever true to his influences, Chris Roberts’s brief to Sanger was to deliver something “between Star Wars and Star Trek: The Motion Picture.” Sanger and his deputy Dave Govett delivered in spades. Hugely derivative of John Williams’s work though the soundtrack was — at times it threatens to segue right into Williams’s famous Star Wars theme — it contributed enormously to the cinematic feel of the game. Origin was particularly proud of the music that played in the background when the player was actually flying in space; the various themes ebbed and swelled dynamically in response to the events taking place on the computer screen. It wasn’t quite the first time anyone had done something like this in a game, but no one had ever managed to do it in quite this sophisticated a way.

The guiding theme of the project remained the determination to create an embodied experience for the player. Chris Roberts cites the interactive movies of Cinemaware, which could be seen as the prototypes for the sort of game he was now trying to perfect, as huge influences in this respect as in many others. Roberts:

I didn’t want anything that made you sort of… pulled you out of being in this world. I didn’t want that typical game UI, or “Here’s how many lives you’ve got, here’s what high score you’ve got.” I always felt that broke the immersion. If you wanted to save the game you’d go to the barracks and you’d click on the bunk. If you wanted to exit, you’d click on the airlock. It was all meant to be in that world and so that was what the drive was. I love story and narrative and I think you can use that story and narrative to tie your action together and that will give your action meaning and context in a game. That was my idea and that was what really drove what I was doing.

The approach extended to the game’s manual. Harking back to the beloved scene-setting packaging of Infocom, the manual, which was written by freelancer Aaron Allston, took the form of Claw Marks, “The Onboard Magazine of TCS Tiger’s Claw” — the Tiger’s Claw being the name of the spaceborne aircraft carrier from which the player would be flying all of the missions. Like the artists, Allston would wind up almost inadvertently creating vital pieces of the game as a byproduct of the compressed schedule. “I couldn’t really determine everything at that point in development,” he remembers, “so, in some cases, specifically for the tactics information, we made some of it up and then retrofitted it and adjusted the code in the game to make it work.”

Once again in the spirit of creating a cohesive, embodied experience for the player, Roberts wanted to get away from the save-and-restore dance that was so typical of ludic narratives of the era. Therefore, instead of structuring the game’s 40 missions as a win-or-go-home linear stream, he created a branching mission tree in which the player’s course through the narrative would be dictated by her own performance. There would, in other words, be no way to definitively lose other than by getting killed. Roberts would always beg players to play the game “honestly,” beg them not to reload and replay each mission until they flew it perfectly. Only in this way would they get the experience he had intended for them to have.
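To make the shape of such a structure concrete, here is a minimal sketch in C. It is purely illustrative and emphatically not Origin’s actual code; the mission names and the win/lose flag are invented for the example. Each node simply names one follow-up mission for success and another for failure, so a poor showing shunts the player onto a losing branch of the campaign rather than ending the game outright.

    #include <stdio.h>

    /* One node of a branching mission tree: which mission the player flies
       next depends on how the current one went. */
    typedef struct mission {
        const char *name;
        const struct mission *on_success;
        const struct mission *on_failure;
    } mission;

    static const mission retreat = { "Covering the retreat", NULL, NULL };
    static const mission counter = { "Counteroffensive",     NULL, NULL };
    static const mission opening = { "Opening patrol", &counter, &retreat };

    int main(void)
    {
        const mission *current = &opening;
        int player_won_it = 0;   /* imagine this coming from the flight engine */

        while (current != NULL) {
            printf("Now flying: %s\n", current->name);
            current = player_won_it ? current->on_success : current->on_failure;
        }
        return 0;
    }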

Warren Spector

As the man responsible for tying all of the elements together to create the final experience, Roberts bore the titles of director and producer under Origin’s new cinematic nomenclature. He worked under the watchful eye of Squadron‘s co-producer Warren Spector, who, being older and in certain respects wiser, was equipped to handle the day-to-day administrative tasks that Roberts wasn’t. Spector:

When I came on as producer, Chris was really focused on the direction he wanted to take with the game. He knew exactly where he was going, and it would have been hard to deflect him from that course. It would have been crazy to even want to, so Chris and I co-produced the game. Where his talent dropped out, mine started, and vice versa. We did a task breakdown, and I ended up updating, adjusting, and tracking scheduling and preparing all the documentation. He handled the creative and qualitative issues. We both juggled the resources.

In implying that his own talent “dropped out” when it came to creative issues, Spector is selling himself about a million dollars short. He was a whirling dervish of creative energy throughout the seven years he spent with Origin, if anything even more responsible than Richard Garriott for the work that came out of the company under the Ultima label during this, the franchise’s most prolific period. But another of the virtues which allowed him to leave such a mark on the company was an ability to back off, to defer to the creative visions of others when it was appropriate. Recognizing that no one knew Chris Roberts’s vision like Chris Roberts, he was content in the case of Squadron to act strictly as the facilitator of that vision. In other words, he wasn’t too proud to just play the role of organizer when it was appropriate.

Still, it became clear early on that no combination of good organization and long hours would allow Squadron to ship in June. The timetable slipped to an end-of-September ship date, perfect to capitalize on the Christmas rush.

Although Squadron wouldn’t ship in June, the Summer Consumer Electronics Show loomed with as much importance as ever as a chance to show off the game-to-be and to drum up excitement that might finally end the sniggering about Ultima Systems. Just before the big show, Origin’s lawyers delivered the sad news that calling the game Squadron would be a bad idea thanks to some existing trademarks on the name. After several meetings, Wingleader emerged as the consensus choice for a new name, narrowly beating out Wing Commander. It was thus under the former title that the world at large got its first glimpse of what would turn into one of computer gaming’s most iconic franchises. Martin Davies, Origin’s Vice President of Sales:

I kicked hard to have a demo completed for the show. It was just a gut reaction, but I knew I needed to flood retail and distribution channels with the demo. Before the release of the game, I wanted the excitement to grow so that the confidence level would be extremely high. If we could get consumers beating a path in and out of the door, asking whether the game was out, distribution would respond.

With Wingleader still just a bunch of art and sound assets not yet wired up to the core game they were meant to complement, an interactive demo was impossible. Instead Chris Roberts put together a demo on videotape, alternating clips of the battles in space with clips of whatever other audiovisual elements he could assemble from what the artists and composers had managed to complete. Origin brought a big screen and a booming sound system out to Chicago for the show; the latter prompted constant complaints from other exhibitors. The noise pollution was perfect for showing the world that there was now more to Origin Systems than intricate quests and ethical dilemmas — that they could do aesthetic maximalism as well as anyone in their industry, pushing all of the latest hardware to its absolute limit in the process. It was a remarkable transformation for a company that just eighteen months before had been doing all development on the humble little 8-bit Apple II and Commodore 64. Cobbled together though it was, the Wingleader demo created a sensation at CES.

Indeed, one can hardly imagine a better demonstration of how the computer-game industry as a whole was changing than the game that had once been known as Squadron, was now known as Wingleader, and would soon go on to fame as Wing Commander. In my next article, I’ll tell the story of how the game would come to be finished and sold, along with the even more important story of what it would mean for the future of digital entertainment.

(Sources: the books Wing Commander I and II: The Ultimate Strategy Guide by Mike Harrison and Game Design Theory and Practice by Richard Rouse III; Retro Gamer 59 and 123; Questbusters of July 1989, August 1990, and April 1991; Computer Gaming World of September 1989 and November 1992; Amiga Computing of December 1988. Online sources include documents hosted at the Wing Commander Combat Information Center, US Gamer‘s profile of Chris Roberts, The Escapist‘s history of Wing Commander, Paul Dean’s interview with Chris Roberts, and Matt Barton’s interview with George “The Fat Man” Sanger. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Wing Commander I and II can be purchased in a package together with all of their expansion packs from GOG.com.)

 
 


The 640 K Barrier

There was a demon in memory. They said whoever challenged him would lose. Their programs would lock up, their machines would crash, and all their data would disintegrate.

The demon lived at the hexadecimal memory address A0000, 655,360 in decimal, beyond which no more memory could be allocated. He lived behind a barrier beyond which they said no program could ever pass. They called it the 640 K barrier.

— with my apologies to The Right Stuff[1]

The idea that the original IBM PC, the machine that made personal computing safe for corporate America, was a hastily slapped-together stopgap has been vastly overstated by popular technology pundits over the decades since its debut back in August of 1981. Whatever the realities of budgets and scheduling with which its makers had to contend, there was a coherent philosophy behind most of the choices they made that went well beyond “throw this thing together as quickly as possible and get it out there before all these smaller companies corner the market for themselves.” As a design, the IBM PC favored robustness, longevity, and expandability, all qualities IBM had learned the value of through their many years of experience providing businesses and governments with big-iron solutions to their most important data–processing needs. To appreciate the wisdom of IBM’s approach, we need only consider that today, long after the likes of the Commodore Amiga and the original Apple Macintosh architecture, whose owners so loved to mock IBM’s unimaginative beige boxes, have passed into history, most of our laptop and desktop computers — including modern Macs — can trace the origins of their hardware back to what that little team of unlikely business-suited visionaries accomplished in an IBM branch office in Boca Raton, Florida.

But of course no visionary has 20-20 vision. For all the strengths of the IBM PC, there was one area where all the jeering by owners of sexier machines felt particularly well-earned. Here lay a crippling weakness, born not so much of the hardware found in that first IBM PC as the operating system the marketplace chose to run on it, that would continue to vex programmers and ordinary users for two decades, not finally fading away until Microsoft’s release of Windows XP in 2001 put to bed the last legacies of MS-DOS in mainstream computing. MS-DOS, dubbed the “quick and dirty” operating system during the early days of its development, is likely the piece of software in computing history with the most lopsided contrast between the total number of hours put into its development and the total number of hours it spent in use, on millions and millions of computers all over the world. The 640 K barrier, the demon all those users spent so much time and energy battling for so many years, was just one of the more prominent consequences of corporate America’s adoption of such a blunt instrument as MS-DOS as its standard. Today we’ll unpack the problem that was memory management under MS-DOS, and we’ll also examine the problem’s multifarious solutions, all of them to one degree or another ugly and imperfect.


 

The original IBM PC was built around an Intel 8088 microprocessor, a cost-reduced and somewhat crippled version of an earlier chip called the 8086. (IBM’s decision to use the 8088 instead of the 8086 would have huge importance for the expansion buses of this and future machines, but the differences between the two chips aren’t important for our purposes today.) Despite functioning as a 16-bit chip in most ways, the 8088 had a 20-bit address space, meaning it could address a maximum of 1 MB of memory. Let’s consider why this limitation should exist.

Memory, whether in your brain or in your computer, is of no use to you if you can’t keep track of where you’ve put things so that you can retrieve them again later. A computer’s memory is therefore indexed by bytes, with every single byte having its own unique address. These addresses, numbered from 0 to the upper limit of the processor’s address space, allow the computer to keep track of what is stored where. The biggest number that can be represented in 20 bits is 1,048,575; counting address 0, that makes 1,048,576 distinct addresses, or exactly 1 MB worth of bytes. Thus 1 MB is the maximum amount of memory which the 8088, with its 20-bit address bus, can handle. Such a limitation hardly felt like a deal breaker to the engineers who created the IBM PC. Indeed, it’s difficult to overemphasize what a huge figure 1 MB really was when they released the machine in 1981, in which year the top-of-the-line Apple II had just 48 K of memory and plenty of other competing machines shipped with no more than 16 K.

A processor needs to address other sorts of memory besides the pool of general-purpose RAM which is available for running applications. There’s also ROM memory — read-only memory, burned inviolably into chips — that contains essential low-level code needed for the computer to boot itself up, along with, in the case of the original IBM PC, an always-available implementation of the BASIC programming language. (The rarely used BASIC in ROM would be phased out of subsequent models.) And some areas of RAM as well are set aside from the general pool for special purposes, like the fully 128 K of addresses given to video cards to keep track of the onscreen display in the original IBM PC. All of these special types of memory must be accessed by the CPU, must be given their own unique addresses to facilitate that, and must thus be subtracted from the address space available to the general pool.

IBM’s engineers were quite generous in drawing the boundary between their general memory pool and the area of addresses allocated to special purposes. Focused on expandability and longevity as they were, they reserved big chunks of “special” memory for purposes that hadn’t even been imagined yet. In all, they reserved the upper three-eighths of the available addresses for specialized purposes actual or potential, leaving the lower five-eighths — 640 K — to the general pool. In time, this first 640 K of memory would become known as “conventional memory,” the remaining 384 K — some of which would be ROM rather than RAM — as “high memory.” The official memory map which IBM published upon the debut of the IBM PC looked like this:

It’s important to understand when looking at a memory map like this one that the existence of a logical address therein doesn’t necessarily mean that any physical memory is connected to that address in any given real machine. The first IBM PC, for instance, could be purchased with as little as 16 K of conventional memory installed, and even a top-of-the-line machine had just 256 K, leaving most of the conventional-memory space vacant. Similarly, early video cards used just 32 K or 64 K of the 128 K of address space offered to them in high memory. The 640 K barrier was thus only a theoretical limitation early on, one few early users or programmers ever even noticed.

That blissful state of affairs, however, wouldn’t last very long. As IBM’s creations — joined, soon enough, by lots of clones — became the standard for American business, more and more advanced applications appeared, craving more and more memory alongside more and more processing power. Already by 1984 the 640 K barrier had gone from a theoretical to a very real limitation, and customers were beginning to demand that IBM do something about it. In response, IBM that year released the PC/AT, built around Intel’s new 80286 microprocessor, which boasted a 24-bit address space good for 16 MB of memory. To unlock all that potential extra memory, IBM made the commonsense decision to extend the memory map above the specialized high-memory area that ended at 1 MB, making all addresses beyond 1 MB a single pool of “extended memory” available for general use.

Problem solved, right? Well, no, not really — else this would be a much shorter article. Due more to software than hardware, all of this potential extended memory proved not to be of much use for the vast majority of people who bought PC/ATs. To understand why this should be, we need to examine the deadly embrace between the new processor and the old operating system people were still running on it.

The 80286 was designed to be much more than just a faster version of the old 8086/8088. Developing the chip before IBM PCs running MS-DOS had come to dominate business computing, Intel hadn’t allowed the need to stay compatible with that configuration to keep them from designing a next-generation chip that would help to take computing to where they saw it as wanting to go. Intel believed that microcomputers were at the stage at which the big institutional machines had been a couple of decades earlier, just about ready to break free of what computer scientist Brian L. Stuart calls the “Triangle of Ones”: one user running one program at a time on one machine. At the very least, Intel believed, the second leg of the Triangle must soon fall; everyone recognized that multitasking — running several programs at a time and switching freely between them — was a much more efficient way to do complex work than laboriously shutting down and starting up application after application. But unfortunately for MS-DOS, the addition of multitasking complicates the life of an operating system to an absolutely staggering degree.

Operating systems are of course complex subjects worthy of years or a lifetime of study. We might, however, collapse their complexities down to a few fundamental functions: to provide an interface for the user to work with the computer and manage her programs and files; to manage the various tasks running on the computer and allocate resources among them; and to act as a buffer or interface between applications and the underlying hardware of the computer. That, anyway, is what we expect at a minimum of our operating systems today. But for a computer ensconced within the Triangle of Ones, the second and third functions were largely moot: with only one program allowed to run at a time, resource-management concerns were nonexistent, and, without the need for a program to be concerned about clashing with other programs running at the same time, bare-metal programming — manipulating the hardware directly, without passing requests through any intervening layer of operating-system calls — was often considered not only acceptable but the expected approach. In this spirit, MS-DOS provided just 27 function calls to programmers, the vast majority of them dealing only with disk and file management. (Compare that, my fellow programmers, with the modern Windows or OS X APIs!) For everything else, banging on the bare metal was fine.
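To make “banging on the bare metal” concrete, consider the difference between politely asking the operating system (by way of the C library) to print some text and simply writing characters straight into the video card’s text buffer. The sketch below assumes a real-mode DOS compiler of the era, such as Borland’s Turbo C, whose dos.h supplies far pointers and the MK_FP macro; the 0xB800 segment is the standard home of the color text screen. Nothing in MS-DOS prevented, or could even detect, the second approach.

    #include <stdio.h>
    #include <dos.h>   /* far pointers and MK_FP; era-appropriate, not modern C */

    int main(void)
    {
        /* The polite route: the C library passes the request on to MS-DOS. */
        printf("Hello by way of the operating system\r\n");

        /* The bare-metal route: poke character/attribute pairs straight into
           the text-mode screen buffer at segment 0xB800. Nothing stops a
           real-mode program from doing this, or from scribbling anywhere else. */
        {
            char far *screen = (char far *) MK_FP(0xB800, 0);
            const char *msg = "Hello by way of the bare metal";
            int i;
            for (i = 0; msg[i] != '\0'; i++) {
                screen[i * 2]     = msg[i];   /* character byte */
                screen[i * 2 + 1] = 0x07;     /* attribute: light grey on black */
            }
        }
        return 0;
    }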

We can’t even begin here to address all of the complications that are introduced when we add multitasking into the equation, asking the operating system in the process to fully embrace all three of the core functions listed above. Memory management alone, the one aspect we will look deeper into today, becomes complicated enough. A program which is sharing a machine with other programs can no longer have free run of the memory map, placing whatever it wants to wherever it wants to; to do so risks overwriting the code or data of another program running on the system. Instead the operating system must demand that individual programs formally request the memory they’d like to use, and then must come up with a way to keep a program, whether due to bugs or malice, from running roughshod over areas of memory that it hasn’t been granted.

Or perhaps not. The Commodore Amiga, the platform which pioneered multitasking on personal computers in 1985, didn’t so much solve the latter part of this problem as punt it away. An application program is expected to request from the Amiga’s operating system any memory that it requires. The operating system then returns a pointer to a block of memory of the requested size, and trusts the application not to write to memory outside of these bounds. Yet nothing besides the programmer’s skill and good nature absolutely prevents such unauthorized memory access from happening. Every application on the Amiga, in other words, can write to any address in the machine’s memory, whether that address be properly allocated to it or not. Screen memory, free memory, another program’s data, another program’s code — all are fair game to the errant program. Such unauthorized memory access will almost always eventually result in a total system crash. A non-malicious programmer who wants her program to be a good citizen would of course never intentionally write to memory she hasn’t properly requested, but bugs of this nature are notoriously easy to create and notoriously hard to track down, and on the Amiga a single instance of one can bring down not only the offending program but the entire operating system. With all due respect to the Amiga’s importance as the first multitasking personal computer, this is obviously not the ideal way to implement it.
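A minimal sketch of that trust-based arrangement, written in portable C rather than with the actual Amiga system calls (the names here are invented for the illustration): a single flat pool is shared by every “task,” the allocator merely hands out regions of it, and nothing afterward stops one task’s off-by-one error from landing in its neighbor’s data.

    #include <stdio.h>
    #include <string.h>

    /* One flat, unprotected memory pool shared by every "task" in the system. */
    static unsigned char pool[1024];
    static size_t next_free = 0;

    /* A trusting allocator: it hands back a region and simply assumes the
       caller will stay inside it. Nothing records or enforces the boundary. */
    static unsigned char *trusting_alloc(size_t size)
    {
        unsigned char *block = &pool[next_free];
        next_free += size;
        return block;
    }

    int main(void)
    {
        unsigned char *task_a = trusting_alloc(16);  /* task A's buffer */
        unsigned char *task_b = trusting_alloc(16);  /* task B's data, right next door */

        strcpy((char *)task_b, "task B's data");

        /* Task A means to fill its own 16 bytes but miscounts by one... */
        memset(task_a, 0xAA, 17);

        /* ...and task B's first byte is silently corrupted. On a machine with
           no memory protection, the same stray write could just as easily hit
           screen memory, the operating system, or another program's code. */
        printf("task B now sees: %s\n", (char *)task_b);
        return 0;
    }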

A far more sustainable approach is to take the extra step of tracking and protecting the memory that has been allocated to each program. Memory protection is usually accomplished using what’s known as virtual memory: when a program requests memory, it’s returned not a true address within the system’s memory pool but rather a virtual address that’s translated back into the real address to which it corresponds every time the program accesses its data. Each program is thus effectively sandboxed from everything else, allowed to read from and write to only its own data. Only the lowest levels of the operating system have global access to the memory pool as a whole.
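Here is a toy model of the idea in C, with the translation step written out in software purely for clarity; on real hardware the equivalent lookup happens in the processor’s circuitry, invisibly to the running program, and the page size and table layout below are invented for the example. The crucial point is that a program can only ever name virtual addresses, and the table it has been given contains no entries pointing at anyone else’s memory.

    #include <stdio.h>
    #include <stdlib.h>

    #define PAGE_SIZE 256   /* toy page size */

    /* Each program gets its own table mapping virtual page numbers to the
       real pages it has been granted; it cannot name anyone else's pages. */
    typedef struct {
        int real_page[4];    /* virtual pages 0-3 -> real page numbers */
        int pages_granted;
    } page_table;

    static unsigned char real_memory[16 * PAGE_SIZE];  /* the machine's whole pool */

    /* Translate a program's virtual address into a real one, refusing any
       address that falls outside the pages that program owns. */
    static unsigned char *translate(page_table *pt, unsigned int vaddr)
    {
        unsigned int vpage  = vaddr / PAGE_SIZE;
        unsigned int offset = vaddr % PAGE_SIZE;
        if (vpage >= (unsigned int)pt->pages_granted) {
            fprintf(stderr, "access violation at virtual address %u\n", vaddr);
            exit(1);
        }
        return &real_memory[pt->real_page[vpage] * PAGE_SIZE + offset];
    }

    int main(void)
    {
        page_table prog_a = { {2, 7}, 2 };  /* program A owns real pages 2 and 7 */
        page_table prog_b = { {3},    1 };  /* program B owns real page 3 only   */

        *translate(&prog_a, 0) = 42;                  /* A writes to "its" address 0... */
        printf("%d\n", real_memory[2 * PAGE_SIZE]);   /* ...which lands in real page 2  */

        *translate(&prog_b, 700) = 1;   /* B strays beyond its one page: refused */
        return 0;
    }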

Implementing such memory protection in software alone, however, would have been an untenable drain on the limited processing resources of 1980s hardware — a fact which goes a long way toward explaining its absence from the Amiga. Intel therefore decided to give software a leg up via hardware. They built into the 80286 a memory-management unit that could automatically translate from virtual to real memory addresses and vice versa, making this constantly ongoing process fairly transparent even to the operating system.

Nevertheless, the operating system must know about this capability, must in fact be written very differently if it’s to take advantage of a CPU with memory protection built into its circuitry. Intel recognized that it would take time for such operating systems to be created for the new chip, and recognized that compatibility with the earlier 8086/8088 chips would be a very good thing to have in the meantime. They therefore built two possible operating modes into the 80286. In “protected mode” — the mode they hoped would eventually come to be used almost universally — the chip’s full potential would be realized, including memory protection and the ability to address up to 16 MB of memory. In “real mode,” the 80286 would function essentially like a turbocharged 8086/8088, with no memory-protection capabilities and with the old limitation on addressable memory of 1 MB still in place. Assuming that in the early days at least the new chip would need to run on operating systems with no knowledge of its full capabilities, Intel made the 80286 default to real mode on startup. An operating system which did know about the 80286 and wanted to bring out its full potential could switch it to protected mode at boot-up and be off to the races.

It’s at the intersection between the 80286 and the operating system that Intel’s grand plans for the future of their new chip went awry. An overwhelming percentage of the early 80286s were used in IBM PC/ATs and clones, and an overwhelming percentage of those machines were running MS-DOS. Microsoft’s erstwhile “quick and dirty” operating system knew nothing of the 80286’s full capabilities. Worse, trying to give it knowledge of those capabilities would have to entail a complete rewrite which would break compatibility with all existing MS-DOS software. Yet the whole reason MS-DOS was popular in the first place — it certainly wasn’t because of a generous feature set, a friendly interface, or any aesthetic appeal — was that very same huge base of business software. Getting users to make the leap to some hypothetical new operating system in the absence of software to run on it would be as difficult as getting developers to write programs for an operating system with no users. It was a chicken-or-the-egg situation, and neither chicken nor egg was about to stick its neck out anytime soon.

IBM was soon shipping thousands upon thousands of PC/ATs every month, and the clone makers were soon shipping even more 80286-based machines of their own. Yet at least 95 percent of those machines were idling along at only a fraction of their potential, thanks to the already creakily archaic MS-DOS. For all these users, the old 640 K barrier remained as high as ever. They could stuff their machines full of extended memory if they liked, but they still couldn’t access it. And of course the multitasking that the 80286 was supposed to have enabled remained as foreign a concept to MS-DOS as a GPS unit to a Model T. The only solution IBM offered those who complained about the situation was to run another operating system. And indeed, there were a number of alternatives to MS-DOS available for the PC/AT and other 80286-based machines, including several variants of the old institutional-computing favorite Unix — one of them even from Microsoft — and new creations like Digital Research’s Concurrent DOS, which struggled with mixed results to wedge in some degree of MS-DOS compatibility. Still, the only surefire way to take full advantage of MS-DOS’s huge software base was to run the real — in more ways than one now! — MS-DOS, and this is what the vast majority of people with 80286-equipped machines wound up doing.

Meanwhile the very people making the software which kept MS-DOS the only viable choice for most users were feeling the pinch of being confined to 640 K more painfully almost by the month. Finally Lotus Corporation —  makers of the Lotus 1-2-3 spreadsheet package that ruled corporate America, the greatest single business-software success story of their era — decided to use their clout to do something about it. They convinced Intel to join them in devising a scheme for breaking the 640 K barrier without abandoning MS-DOS. What they came up with was one mother of an ugly kludge — a description the scheme has in common with virtually all efforts to break through the 640 K barrier.

Looking through the sparsely populated high-memory area which the designers of the original IBM PC had so generously carved out, Lotus and Intel realized it should be possible on almost any extant machine to identify a contiguous 64 K chunk of those addresses which wasn’t being used for anything. This chunk, they decided, would be the gateway to potentially many more megabytes installed elsewhere in the machine. Using a combination of software and hardware, they implemented what’s known as a bank-switching scheme. The 64 K chunk of high-memory addresses was divided into four segments of 16 K, each of which could serve as a lens focused on a 16 K segment of additional memory above and beyond 1 MB. When the processor accessed the addresses in high memory, the data it would actually access would be the data at whatever sections of the additional memory their lenses were currently pointing to. The four lenses could be moved around at will, giving access, albeit in a roundabout way, to however much extra memory the user had installed. The additional memory unlocked by the scheme was dubbed “expanded memory.”  The name’s unfortunate similarity to “extended memory” would cause much confusion over the years to come; from here on, we’ll call it by its common acronym of “EMS.”
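The mechanics are easier to see in miniature. The sketch below is a simulation of the idea in C, not the real EMS driver interface, and all the names in it are invented: a store of “expanded” pages sits outside the directly addressable space, the 64 K page frame is divided into four 16 K windows, and remapping a window is simply a matter of changing which page it currently points at.

    #include <stdio.h>
    #include <string.h>

    #define PAGE_SIZE   (16 * 1024)   /* each window and each logical page is 16 K */
    #define NUM_WINDOWS 4             /* the 64 K page frame holds four windows    */
    #define NUM_PAGES   64            /* 1 MB of simulated expanded memory         */

    /* Memory the processor cannot see directly: the expanded-memory board. */
    static unsigned char expanded[NUM_PAGES][PAGE_SIZE];

    /* Which logical page each window of the page frame currently shows. */
    static int mapping[NUM_WINDOWS] = { 0, 1, 2, 3 };

    /* "Move the lens": point a window at a different 16 K page. */
    static void map_page(int window, int logical_page)
    {
        mapping[window] = logical_page;
    }

    /* What the program sees when it touches an address inside the page frame. */
    static unsigned char *frame_address(int window, int offset)
    {
        return &expanded[mapping[window]][offset];
    }

    int main(void)
    {
        /* Stash a string in logical page 40, far beyond anything addressable directly. */
        map_page(0, 40);
        strcpy((char *)frame_address(0, 0), "data living out in expanded memory");

        /* Repoint the same window elsewhere, then back again. */
        map_page(0, 7);
        map_page(0, 40);
        printf("%s\n", (char *)frame_address(0, 0));
        return 0;
    }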

All those gobs of extra memory wouldn’t quite come for free: applications would have to be altered to check for the existence of EMS memory and make use of it, and there would remain a distinct difference between conventional memory and EMS memory with which programmers would always have to reckon. Likewise, the overhead of constantly moving those little lenses around made EMS memory considerably slower to access than conventional memory. On the brighter side, though, EMS worked under MS-DOS with only the addition of a single device driver during startup. And, since the hardware mechanism for moving the lenses around was completely external to the CPU, it would even work on machines that weren’t equipped with the new 80286.

This diagram shows the different types of memory available on PCs of the mid-1980s. In blue, we see the original 1 MB memory map of the IBM PC. In green, we see a machine equipped with additional extended memory. And in orange we see a machine equipped with additional expanded memory.

Shortly before the scheme made its official debut at a COMDEX trade show in May of 1985, Lotus and Intel convinced a crucial third partner to come aboard: Microsoft. “It’s garbage! It’s a kludge!” said Bill Gates. “But we’re going to do it.” With the combined weight of Lotus, Intel, and Microsoft behind it, EMS took hold as the most practical way of breaking the 640 K barrier. Imperfect and kludgy though it was, software developers hurried to add support for EMS memory to whatever programs of theirs could practically make use of it, while hardware manufacturers rushed EMS memory boards onto the market. EMS may have been ugly, but it was here today and it worked.

At the same time that EMS was taking off, however, extended memory wasn’t going away. Some hardware makers — most notably IBM themselves — didn’t want any part of EMS’s ugliness. Software makers therefore continued to probe at the limits of machines equipped with extended memory, still looking for a way to get at it from within the confines of MS-DOS. What if they momentarily switched the 80286 into protected mode, just for as long as they needed to manipulate data in extended memory, then went back into real mode? It seemed like a reasonable idea — except that Intel, never anticipating that anyone would want to switch modes on the fly like this, had neglected to provide a way to switch an 80286 in protected mode back into real mode. So, proponents of extended memory had to come up with a kludge even uglier than the one that allowed EMS memory to function. They could force the 80286 back into real mode, they realized, by resetting it entirely, just as if the user had rebooted her computer. The 80286 would go through its self-check again — a process that admittedly absorbed precious milliseconds — and then pick back up where it left off. It was, as Microsoft’s Gordon Letwin memorably put it, like “turning off the car to change gears.” It was staggeringly kludgy, it was horribly inefficient, but it worked in its fashion. Given the inefficiencies involved, the scheme was mostly used to implement virtual disks stored in the extended memory, which didn’t need to be accessed nearly as constantly as an application’s own working data.

In 1986, the 32-bit 80386, Intel’s latest and greatest chip, made its public bow at the heart of the Compaq Deskpro 386 rather than an IBM machine, a landmark moment signaling the slow but steady shift of business computing’s power center from IBM to Microsoft and the clone makers using their operating system. While working on the new chip, Intel had had time to see how the 80286 was actually being used in the wild, and had faced the reality that MS-DOS was likely destined to be cobbled onto for years to come rather than replaced in its entirety with something better. They therefore made a simple but vitally important change to the 80386 amidst its more obvious improvements. In addition to being able to address an inconceivable total of 4 GB of memory in protected mode thanks to its 32-bit address space, the 80386 could be switched between protected mode and real mode on the fly if one desired, without needing to be constantly reset.

In freeing programmers from that massive inefficiency, the 80386 cracked open the door that much further to making practical use of extended memory in MS-DOS. In 1988, the old EMS consortium of Lotus, Intel, and Microsoft came together once again, this time with the addition to their ranks of the clone manufacturer AST; the absence of IBM is, once again, telling. Together they codified a standard approach to extended memory on 80386 and later processors, which corresponded essentially to the scheme I’ve already described in the context of the 80286, but with a simple command to the 80386 to switch back to real mode replacing the resets. They called it the eXtended Memory Specification; memory accessed in this way soon became known universally as “XMS” memory. Under XMS as under EMS, a new device driver would be loaded into MS-DOS. Ordinary real-mode programs could then call this driver to access extended memory; the driver would do the needful switching to protected mode, copy blocks of data from extended memory into conventional memory or vice versa, then switch the processor back to real mode when it was time to return control to the program. It was still inelegant, still a little inefficient, and still didn’t use the capabilities of Intel’s latest processors in anything like the way Intel’s engineers had intended them to be used; true multitasking still remained a pipe dream somewhere off in a shadowy future. Owners of sexier machines like the Macintosh and Amiga, in other words, still had plenty of reason to mock and scoff. In most circumstances, working with XMS memory was actually slower than working with EMS memory. The primary advantage of XMS was that it let programs work with much bigger chunks of non-conventional memory at one time than the four 16 K chunks that EMS allowed. Whether any given program chose EMS or XMS came to depend on which set of advantages and disadvantages best suited its purpose.
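Again as a simulation in C rather than the real XMS driver calls, with invented names: the essential contrast with EMS is that under XMS the program never addresses extended memory directly at all. It hands the driver a block to park out there and asks for it back later, paying the cost of a full copy in each direction.

    #include <stdio.h>
    #include <string.h>

    #define EXTENDED_SIZE (64 * 1024)

    /* Memory the real-mode program cannot touch directly; only the driver,
       briefly flipping the processor into protected mode, can reach it. */
    static unsigned char extended[EXTENDED_SIZE];

    /* Stand-ins for the driver's services: copy a block out to extended
       memory, or bring a block back into conventional memory. */
    static void xms_store(unsigned int ext_offset, const void *src, size_t len)
    {
        memcpy(&extended[ext_offset], src, len);   /* switch modes, copy, switch back */
    }

    static void xms_fetch(void *dst, unsigned int ext_offset, size_t len)
    {
        memcpy(dst, &extended[ext_offset], len);
    }

    int main(void)
    {
        char conventional[64];   /* the only memory the program works in directly */

        /* Park some data out in extended memory to free up conventional memory... */
        xms_store(0, "level data parked in extended memory", 37);

        /* ...and copy it back in one big block when it is needed again. */
        xms_fetch(conventional, 0, 37);
        printf("%s\n", conventional);
        return 0;
    }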

The arrival of XMS along with the ongoing use of EMS memory meant that MS-DOS now had two competing memory-management solutions. Buyers now had to figure out not only whether they had enough extra memory to run a program but whether they had the right kind of extra memory. Ever accommodating, hardware manufacturers began shipping memory boards that could be configured as either EMS or XMS memory — whatever the application you were running at the moment happened to require.

The next stage in the slow crawl toward parity with other computing platforms in the realm of memory management would be the development of so-called “DOS extenders,” software to allow applications themselves to run in protected mode, thus giving them direct access to extended memory without having to pass their requests through an inefficient device driver. An application built using a DOS extender would only need to switch the processor to real mode when it needed to communicate with the operating system. The development of DOS extenders was driven by Microsoft’s efforts to turn Windows, which like seemingly everything else in business computing ran on top of MS-DOS, into a viable alternative to the command line and a viable challenger to the Macintosh. That story is thus best reserved for a future article, when we look more closely at Windows itself. As it is, the story that I’ve told so far today moves us nicely into the era of computer-gaming history we’ve reached on the blog in general.

In said era, the MS-DOS machines that had heretofore been reserved for business applications were coming into homes, where they were often used to play a new generation of games taking advantage of the VGA graphics, sound cards, and mice sported by the latest systems. Less positively, all of the people wanting to play these new games had to deal with the ramifications of a 640 K barrier that could still be skirted only imperfectly. As we’ve seen, both EMS and XMS imposed to one degree or another a performance penalty when accessing non-conventional memory. What with games being the most performance-sensitive applications of all, that made that first 640 K of lightning-fast conventional memory all the more precious for them.

In the first couple of years of MS-DOS’s gaming dominance, developers dealt with all of the issues that came attached to using memory beyond 640 K by the simple expedient of not using any memory beyond 640 K. But that solution was compatible neither with developers’ growing ambitions for their games nor with the gaming public’s growing expectations of them.

The first harbinger of what was to come was Origin Systems’s September 1990 release Wing Commander, which in its day was renowned — and more than a little feared — for pushing the contemporary state of the art in hardware to its limits. Even Wing Commander didn’t go so far as to absolutely require memory beyond 640 K, but it did use it to make the player’s audiovisual experience snazzier if it was present. Setting a precedent future games would largely follow, it was quite inflexible in its approach, demanding EMS — as opposed to XMS — memory. In the future, gamers would have to become all too familiar with the differences between the two standards, and how to configure their machines to use one or the other. Setting another precedent, Wing Commander‘s “installation guide” included a section on “memory usage” that was required reading in order to get things working properly. In the future, such sections would only grow in length and complexity, and would need to be pored over by long-suffering gamers with far more concentrated attention than anything in the manual that actually explained how to play the games they had purchased.

In Accolade’s embarrassing Leisure Suit Larry knockoff Les Manley in: Lost in LA, the title character explains EMS and XMS memory to some nubile companions. The ironic thing was that anyone who wished to play the latest games on an MS-DOS machine really did need to know this stuff, or at least have a friend who did.

Thus began the period of almost a decade, remembered with chagrin but also often with an odd sort of nostalgia by old-timers today, in which gamers spent hours monkeying about with MS-DOS’s “config.sys” and “autoexec.bat” files and swapping in and out various third-party utilities in the hope of squeezing out that last few kilobytes of conventional memory that Game X needed to run. The techniques they came to employ were legion.

In the process of developing Windows, Microsoft had discovered that the kernel of MS-DOS itself, a fairly tiny program thanks to its sheer age, could be stashed into the first 64 K of memory beyond 1 MB and still accessed like conventional memory on an 80286 or later processor in real mode thanks to what was essentially an undocumented technical glitch in the design of those processors. Gamers thus learned to include the line “DOS=HIGH” in their configuration files, freeing up a precious block of conventional memory. Likewise, there was enough unused space scattered around in the 384 K of high memory on most machines to stash many or all of MS-DOS’s device drivers there instead of in conventional memory. Thus “DOS=HIGH” soon became “DOS=HIGH,UMB,” the second parameter telling the computer to make use of these so-called “upper-memory blocks” and thereby save that many kilobytes more.
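Put together, a typical configuration of this sort from the MS-DOS 5 era might have looked something like the following. The paths and the particular drivers chosen are only examples, and the details varied maddeningly from machine to machine, which was of course much of the problem.

    REM CONFIG.SYS
    REM Load the driver that manages extended memory and the area above 1 MB:
    DEVICE=C:\DOS\HIMEM.SYS
    REM Use some of that extended memory to provide EMS and upper-memory blocks:
    DEVICE=C:\DOS\EMM386.EXE RAM
    REM Move the DOS kernel high and make the upper-memory blocks available:
    DOS=HIGH,UMB
    REM Stash a device driver in an upper-memory block instead of conventional memory:
    DEVICEHIGH=C:\DRIVERS\MOUSE.SYS

    REM AUTOEXEC.BAT
    REM LOADHIGH (or LH for short) does the same for memory-resident programs:
    LOADHIGH C:\DOS\DOSKEY.COM

The RAM switch on EMM386.EXE provides both EMS for the games that wanted it and the upper-memory blocks that DEVICEHIGH and LOADHIGH depend on; swapping it for NOEMS freed a bit more upper memory at the cost of giving up EMS, which is exactly the sort of trade-off these files forced their owners to puzzle over.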

These were the most basic techniques, the starting points. Suffice to say that things got a lot more complicated from there, turning into a baffling tangle of tweaks, some saving mere bytes rather than kilobytes of conventional memory, but all of them important if one was to hope to run games that by 1993 would be demanding 604 K of 640 K for their own use. That owners of machines which by that point typically contained memories in the multi-megabytes should have to squabble with the operating system over mere handfuls of bytes was made no less vexing by being so comically absurd. And every new game seemed to up the ante, seemed to demand that much more conventional memory. Those with a sunnier disposition or a more technical bent of mind treated the struggle to get each successive purchase running as the game before the game, as it were. Everyone else gnashed their teeth and wondered for the umpteenth time if they might not have been better off buying a console where games Just Worked. The only thing that made it all worthwhile was the mixture of relief, pride, and satisfaction that ensued when you finally got it all put together just right and the title screen came up and the intro music sprang to life — if, that is, you’d managed to configure your sound card properly in the midst of all your other travails. Such was the life of the MS-DOS gamer.

Before leaving the issue of the 640 K barrier behind in exactly the way that all those afflicted by it for so many years were so conspicuously unable to do, we have to address Bill Gates’s famous claim, allegedly made at a trade show in 1981, that “640 K ought to be enough for anybody.” The quote has been bandied about for years as computer-industry legend, seeming to confirm as it does the stereotype of Bill Gates as the unimaginative dirty trickster of his industry, as opposed to Steve Jobs the guileless visionary (the truth is, needless to say, far more complicated). Sadly for the stereotypers, however, the story of the quote is similar to all too many legends in the sense that it almost certainly never happened. Gates himself, for one, vehemently denies ever having said any such thing. Fred Shapiro, for another, editor of The Yale Book of Quotations, conducted an exhaustive search for a reputable source for the quote in 2008, going so far as to issue a public plea in The New York Times for anyone possessing knowledge of such a source to contact him. More than a hundred people did so, but none of them could offer up the smoking gun Shapiro sought, and he was left more certain than ever that the comment was “apocryphal.” So, there you have it. Blame Bill Gates all you want for the creaky operating system that was the real root cause of all of the difficulties I’ve spent this article detailing, but don’t ever imagine he was stupid enough to say that. “No one involved in computers would ever say that a certain amount of memory is enough for all time,” said Gates in 2008. Anyone doubting the wisdom of that assertion need only glance at the history of the IBM PC.

(Sources: the books Upgrading and Repairing PCs, 3rd edition by Scott Mueller and Principles of Operating Systems by Brian L. Stuart; Computer Gaming World of June 1993; Byte of January 1982, November 1984, and March 1992; Byte‘s IBM PC special issues of Fall 1985 and Fall 1986; PC Magazine of May 14 1985, January 14 1986, May 30 1989, June 13 1989, and June 27 1989; the episode of the Computer Chronicles television show entitled “High Memory Management”; the online article “The ‘640K’ quote won’t go away — but did Gates really say it?” on Computerworld.)

Footnotes

1 Yes, that is quite possibly the nerdiest thing I’ve ever written.
 
 


Ultima VI

After Richard Garriott and his colleagues at Origin Systems finished each Ultima game — after the manic final crunch of polishing and testing, after the release party, after the triumphant show appearances and interviews in full Lord British regalia — there always arose the daunting question of what to do next. Garriott had set a higher standard for the series than that of any of its competitors almost from the very beginning, when he’d publicly declared that no Ultima would ever reuse the engine of its predecessor, that each new entry in the series would represent a significant technological leap over what had come before. And just to add to that pressure, starting with Ultima IV he’d begun challenging himself to make each new Ultima a major thematic statement that also built on what had come before. Both of these bars became harder and harder to meet as the series advanced.

As if that didn’t present enough of a burden, each individual entry in the series came with its own unique psychological hurdles for Garriott to overcome. For example, by the time he started thinking about what Ultima V should be he’d reached the limits of what a single talented young man like himself could design, program, write, and draw all by himself on his trusty Apple II. It had taken him almost a year — a rather uncomfortable year for his brother Robert and the rest of Origin’s management — to accept that reality and to begin to work in earnest on Ultima V with a team of others.

The challenge Garriott faced after finishing and releasing that game in March of 1988 was in its way even more emotionally fraught: the challenge of accepting that, just as he’d reached the limits of what he could do alone on the Apple II a couple of years ago, he’d now reached the limits of what any number of people could do on Steve Wozniak’s humble little 8-bit creation. Ultima V still stands today as one of the most ambitious things anyone has ever done on an Apple II; it was hard at the time and remains hard today to imagine how Origin could possibly push the machine much further. Yet that wasn’t even the biggest problem associated with sticking with the platform; the biggest problem could be seen on each monthly sales report, which showed the Apple II’s numbers falling off even faster than those of the Commodore 64, the only other viable 8-bit computer remaining in the American market.

After serving as the main programmer on Ultima V, John Miles contributed only one major piece to Ultima VI: the opening sequence. The creepy poster of a pole-dancing centaur hanging on the Avatar’s wall back on Earth has provoked much comment over the years…

Garriott was hardly alone at Origin in feeling hugely loyal to the Apple II, the only microcomputer he’d ever programmed. While most game developers in those days ported their titles to many platforms, almost all had one which they favored. Just as Epyx knew the Commodore 64 better than anyone else, Sierra had placed their bets on MS-DOS, and Cinemaware was all about the Commodore Amiga, Origin was an Apple II shop through and through. Of the eleven games they’d released from their founding in 1983 through to the end of 1988, all but one had been born and raised on an Apple II.

Reports vary on how long and hard Origin tried to make Ultima VI work on the Apple II. Richard Garriott, who does enjoy a dramatic story even more than most of us, has claimed that Origin wound up scrapping nine or even twelve full months of work; John Miles, who had done the bulk of the programming for Ultima V and was originally slated to fill the same role for the sequel, estimated to me that “we probably spent a few months on editors and other utilities before we came to our senses.” At any rate, by March of 1989, the one-year anniversary of Ultima V‘s release, the painful decision had been made to switch not only Ultima VI but all of Origin’s ongoing and future projects to MS-DOS, the platform that was shaping up as the irresistible force in American computer gaming. A slightly petulant but nevertheless resigned Richard Garriott slapped an Apple sticker over the logo of the anonymous PC clone now sitting on his desk and got with the program.

Richard Garriott with an orrery, one of the many toys he kept at the recently purchased Austin house he called Britannia Manor.

Origin was in a very awkward spot. Having frittered away a full year recovering from the strain of making the previous Ultima, trying to decide what the next Ultima should be, and traveling down the technological cul de sac that was now the Apple II, they simply had to have Ultima VI finished — meaning designed and coded from nothing on an entirely new platform — within one more year if the company was to survive. Origin had never had more than a modestly successful game that wasn’t an Ultima; the only way their business model worked was if Richard Garriott every couple of years delivered a groundbreaking new entry in their one and only popular franchise and it sold 200,000 copies or more.

John Miles, lacking a strong background in MS-DOS programming and the C language in which all future Ultimas would be coded, was transferred off the team to get himself up to speed and, soon enough, to work on middleware libraries and tools for the company’s other programmers. Replacing him on the project in Origin’s new offices in Austin, Texas, were Herman Miller and Cheryl Chen, a pair of refugees from the old offices in New Hampshire, which had finally been shuttered completely in January of 1989. It was a big step for both of them to go from coding what until quite recently had been afterthought MS-DOS versions of Origin’s games to taking a place at the center of the most critical project in the company. Fortunately, both would prove more than up to the task.

Just as Garriott had quickly learned to like the efficiency of not being personally responsible for implementing every single aspect of Ultima V, he soon found plenty to like about the switch to MS-DOS. The new platform had four times the memory of the Apple II machines Origin had been targeting before, along with (comparatively) blazing-fast processors, hard drives, 256-color VGA graphics, sound cards, and mice. A series that had been threatening to burst the seams of the Apple II now had room to roam again. For the first time with Ultima VI, time rather than technology was the primary restraint on Garriott’s ambitions.

But arguably the real savior of Ultima VI was not a new computing platform but a new Origin employee: one Warren Spector, who would go on to join Garriott and Chris Roberts — much more on him in a future article — as one of the three world-famous game designers to come out of the little collective known as Origin Systems. Born in 1955 in New York City, Spector had originally imagined for himself a life in academia as a film scholar. After earning his Master’s from the University of Texas in 1980, he’d spent the next few years working toward his PhD and teaching undergraduate classes. But he had also discovered tabletop gaming at university, from Avalon Hill war games to Dungeons & Dragons. When a job as a research archivist which he’d thought would be his ticket to the academic big leagues unexpectedly ended after just a few months, he wound up as an editor and eventually a full-fledged game designer at Steve Jackson Games, maker of card games, board games, and RPGs, and a mainstay of Austin gaming circles. It was through Steve Jackson, like Richard Garriott a dedicated member of Austin’s local branch of the Society for Creative Anachronism, that Spector first became friendly with the gang at Origin; he also discovered Ultima IV, a game that had a profound effect on him. He left Austin in March of 1987 for a sojourn in Wisconsin with TSR, the makers of Dungeons & Dragons, but, jonesing for the warm weather and good barbecue of the city that had become his adopted hometown, he applied for a job with Origin two years later. Whatever role his acquaintance with Richard Garriott and some of the other folks there played in getting him an interview, it certainly didn’t get him a job all by itself; Spector claims that Dallas Snell, Robert Garriott’s right-hand man running the business side of the operation, grilled him for an incredible nine hours before judging him worthy of employment. (“May you never have to live through something like this just to get a job,” he wishes for all and sundry.) Starting work at Origin on April 12, 1989, he was given the role of producer on Ultima VI, the high man on the project totem pole excepting only Richard Garriott himself.

Age 33 and married, Spector was one of the oldest people employed by this very young company; he realized to his shock shortly after his arrival that he had magazine subscriptions older than Origin’s up-and-coming star Chris Roberts. A certain wisdom born of his age, along with a certain cultural literacy born of all those years spent in university circles, would serve Origin well in the seven years he would remain there. Coming into a company full of young men who had grand dreams of, as their company’s tagline would have it, “creating worlds,” but whose cultural reference points didn’t usually reach much beyond Lord of the Rings and Star Wars, Spector was able to articulate Origin’s ambitions for interactive storytelling in a way that most of the others could not, and in time would use his growing influence to convince management of the need for a real, professional writing team to realize those ambitions. In the shorter term — i.e., in the term of the Ultima VI project — he served as some badly needed adult supervision, systematizing the process of development by providing everyone on his team with clear responsibilities and by providing the project as a whole with the when and what of clear milestone goals. The project was so far behind that everyone involved could look forward to almost a year of solid crunch time as it was; Spector figured there was no point in making things even harder by letting chaos reign.

On the Ultima V project, it had been Dallas Snell who had filled the role of producer, but Snell, while an adept organizer and administrator, wasn’t a game designer or a creative force by disposition. Spector, though, proved himself capable of tackling the Ultima VI project from both sides, hammering out concrete design documents from the sometimes abstracted musings of Richard Garriott, then coming up with clear plans to bring them to fruition. In the end, the role he would play in the creation of Ultima VI was as important as that of Garriott himself. Having learned to share the technical burden with Ultima V — or by now to pass it off entirely; he never learned C and would never write a single line of code for any commercial game ever again — Garriott was now learning to share the creative burden as well, another necessary trade-off if his ever greater ambitions for his games were to be realized.

If you choose not to import an Ultima V character into Ultima VI, you go through the old Ultima IV personality test, complete with gypsy soothsayer, to come up with your personal version of the Avatar. By this time, however, with the series getting increasingly plot-heavy and the Avatar’s personality ever more fleshed-out within the games, the personality test was starting to feel a little pointless. Blogger Chet Bolingbroke, the “CRPG Addict,” cogently captured the problems inherent in insisting that all of these disparate Ultima games had the same hero:
 
Then there’s the Avatar. Not only is it unnecessary to make him the hero of the first three games, as if the Sosarians and Britannians are so inept they always need outside help to solve their problems, but I honestly think the series should have abandoned the concept after Ultima IV. In that game, it worked perfectly. The creators were making a meta-commentary on the very nature of playing role-playing games. The Avatar was clearly meant to be the player himself or herself, warped into the land through the “moongate” of his or her computer screen, represented as a literal avatar in the game window. Ultima IV was a game that invited the player to act in a way that was more courageous, more virtuous, more adventurous than in the real world. At the end of the game, when you’re manifestly returned to your real life, you’re invited to “live as an example to thine own people”–to apply the lesson of the seven virtues to the real world. It was brilliant. They should have left it alone.
 
Already in Ultima V, though, they were weakening the concept. In that game, the Avatar is clearly not you, but some guy who lives alone in his single-family house of a precise layout. But fine, you rationalize, all that is just a metaphor for where you actually do live. By Ultima VI, you have some weird picture of a pole-dancing centaur girl on your wall, you’re inescapably a white male with long brown hair.

Following what had always been Richard Garriott’s standard approach to making an Ultima, the Ultima VI team concentrated on building their technology and then building a world around it before adding a plot or otherwise trying to turn it all into a real game with a distinct goal. Garriott and others at Origin would always name Times of Lore, a Commodore 64 action/CRPG hybrid written by Chris Roberts and published by Origin in 1988, as the main influence on the new Ultima VI interface, the most radically overhauled version of same ever to appear in an Ultima title. That said, it should be noted that Times of Lore itself lifted many or most of its own innovations from The Faery Tale Adventure, David Joiner’s deeply flawed but beautiful and oddly compelling Commodore Amiga action/CRPG of 1987. By way of completing the chain, much of Times of Lore’s interface was imported wholesale into Ultima VI; even many of the onscreen icons looked exactly the same. The entire game could now be controlled, if the player liked, with a mouse, with all of the keyed commands duplicated as onscreen buttons; this forced Origin to reduce the “alphabet soup” of commands that had characterized previous Ultima interfaces — by Ultima V it had grown to include every letter in the alphabet plus some additional key combinations — to just ten buttons, with the generic “use” serving as the workhorse in place of a multitude of specifics.

Another influence, one which Origin was for obvious reasons less eager to publicly acknowledge than that of Times of Lore, was FTL’s landmark 1987 CRPG Dungeon Master, a game whose influence on its industry can hardly be overstated. John Miles remembers lots of people at Origin scrambling for time on the company’s single Atari ST in order to play it soon after its release. Garriott himself has acknowledged being “ecstatic,” during his first few hours of playing it, at all the “neat new things I could do.” Origin co-opted Dungeon Master’s graphical approach to inventory management, including the soon-to-be ubiquitous “paper doll” method of showing what characters were wearing and carrying.

Taking a cue from theories about good interface design dating back to Xerox PARC and Apple’s Macintosh design team, The Faery Tale Adventure, Times of Lore, and Dungeon Master had all abandoned “modes”: different interfaces — in a sense entirely different programs — which take over as the player navigates through the game. The Ultima series, like most 1980s CRPGs, had heretofore been full of these modes. There was one mode for wilderness travel; another for exploring cities, towns, and castles; another, switching from a third-person overhead view to a first-person view like Wizardry (or, for that matter, Dungeon Master), for dungeon delving. And when a fight began in any of these modes, the game switched to yet another mode for resolving the combat.

Ultima VI collapsed all of these modes down into a single unified experience. Wilderness, cities, and dungeons now all appeared on a single contiguous map on which combat also occurred, alongside everything else possible in the game; Ultima‘s traditionally first-person dungeons were now displayed using an overhead view like the rest of the game. From the standpoint of realism, this was a huge step back; speaking in strictly realistic terms, either the previously immense continent of Britannia must now be about the size of a small suburb or the Avatar and everyone else there must now be giants, building houses that sprawled over dozens of square miles. But, as we’ve had plenty of occasion to discuss in previous articles, the most realistic game design doesn’t always make the best game design. From the standpoint of creating an immersive, consistent experience for the player, the new interface was a huge step forward.

As the world of Britannia had grown more complex, the need to give the player a unified window into it had grown to match, in ways that were perhaps more obvious to the designers than they might have been to the players. The differences between the first-person view used for dungeon delving and the third-person view used for everything else had become a particular pain. Richard Garriott had this to say about the problems that were already dogging him when creating Ultima V, and the changes he thus chose to make in Ultima VI:

Everything that you can pick up and use [in Ultima V] has to be able to function in 3D [i.e., first person] and also in 2D [third person]. That meant I had to either restrict the set of things players can use to ones that I know I can make work in 3D or 2D, or make them sometimes work in 2D but not always work in 3D or vice versa, or they will do different things in one versus the other. None of those are consistent, and since I’m trying to create an holistic world, I got rid of the 3D dungeons.

Ultima V had introduced the concept of a “living world” full of interactive everyday objects, along with characters who went about their business during the course of the day, living lives of their own. Ultima VI would build on that template. The world was still constructed, jigsaw-like, from piles of tile graphics, an approach dating all the way back to Ultima I. Whereas that game had offered 16 tiles, however, Ultima VI offered 2048, all or almost all of them drawn by Origin’s most stalwart artist, Denis Loubet, whose association with Richard Garriott stretched all the way back to drawing the box art for the California Pacific release of Akalabeth. Included among these building blocks were animated tiles of several frames — so that, for instance, a water wheel could actually spin inside a mill and flames in a fireplace could flicker. Dynamic, directional lighting of the whole scene was made possible by the 256 colors of VGA. While Ultima V had already had a day-to-night cycle, in Ultima VI the sun actually rose in the east and set in the west, and torches and other light sources cast a realistic glow onto their surroundings.

256 of the 2048 tiles from which the world of Ultima VI was built.
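
For readers who like the nuts and bolts, here is a tiny sketch in C of the two tricks just described: tiles that cycle through a handful of animation frames, and an ambient light level that follows the hour of the day. It’s my own illustration rather than anything from Origin’s source; the frame count, the tile index, and the light formula are all invented for the example.

```c
/* Purely illustrative sketch, not Origin's code: an animated tile picks
   its frame from the current game tick, and a crude ambient-light value
   follows the in-game hour. All constants here are invented. */
#include <stdio.h>

#define FRAMES_PER_TILE 4            /* assumed frame count for an animated tile */

/* Which frame of an animated tile should be drawn on this tick? */
static int tile_frame(int base_tile, int frame_count, int tick)
{
    return base_tile + (tick % frame_count);
}

/* Brightest at noon, darkest at midnight; returns 0..12. */
static int ambient_light(int hour)   /* hour runs 0..23 */
{
    int distance_from_noon = (hour > 12) ? hour - 12 : 12 - hour;
    return 12 - distance_from_noon;
}

int main(void)
{
    int water_wheel = 1000;          /* hypothetical index of the wheel's first frame */
    for (int tick = 0; tick < 6; tick++)
        printf("tick %d: draw tile %d\n",
               tick, tile_frame(water_wheel, FRAMES_PER_TILE, tick));
    printf("light at 6 a.m. = %d, at noon = %d, at midnight = %d\n",
           ambient_light(6), ambient_light(12), ambient_light(0));
    return 0;
}
```

A renderer working along these lines only has to ask each visible tile which frame to show and how brightly to draw it on every screen refresh, which is about all the spinning water wheels and flickering fireplaces require.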

In a clear signal of where the series’s priorities now lay, other traditional aspects of CRPGs were scaled back, moving the series further from its roots in tabletop Dungeons & Dragons. Combat, having gotten as complicated and tactical as it ever would with Ultima V, was simplified, with a new “auto-combat” mode included for those who didn’t want to muck with it at all; the last vestiges of distinct character races and classes were removed; ability scores were boiled down to just three numbers for Strength, Dexterity, and Intelligence. The need to mix reagents in order to cast spells, one of the most mind-numbingly boring aspects of a series that had always made you do far too many boring things, was finally dispensed with; I can’t help but imagine legions of veteran Ultima players breathing a sigh of relief when they read in the manual that “the preparation of a spell’s reagents is performed at the moment of spellcasting.” The dodgy parser-based conversation system of the last couple of games, which had required you to try typing in every noun mentioned by your interlocutor on the off chance that it would elicit vital further information, was made vastly less painful by the simple expedient of highlighting in the text those subjects into which you could inquire further.
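
The conversation highlighting, at least, is simple enough to picture in code. The fragment below is my own minimal illustration, with an invented topic list and square brackets standing in for on-screen highlighting; it just scans an NPC’s reply for any word the player can inquire into further.

```c
/* Purely illustrative: mark any word in an NPC's reply that matches a
   known conversation topic. The topic list and the bracket "highlighting"
   are my own stand-ins, not Origin's data. */
#include <stdio.h>
#include <string.h>

static const char *topics[] = { "gargoyles", "shrine", "moonstone" };

static int is_topic(const char *word)
{
    for (size_t i = 0; i < sizeof topics / sizeof topics[0]; i++)
        if (strcmp(word, topics[i]) == 0)
            return 1;
    return 0;
}

int main(void)
{
    char reply[] = "the gargoyles have seized the shrine of compassion";
    for (char *word = strtok(reply, " "); word != NULL; word = strtok(NULL, " "))
        printf(is_topic(word) ? "[%s] " : "%s ", word);  /* brackets mark askable keywords */
    printf("\n");
    return 0;
}
```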

Inevitably, these changes didn’t always sit well with purists, then or now. Given the decreasing interest in statistics and combat evinced by the Ultima series as time went on, as well as the increasing emphasis on what we might call solving the puzzles of its ever more intricate worlds, some have accused later installments of the series of being gussied-up adventure games in CRPG clothing; “the last real Ultima was Ultima V” isn’t a hard sentiment to find from a vocal minority on the modern Internet. What gives the lie to that assertion is the depth of the world modeling, which makes these later Ultimas flexible in ways that adventure games aren’t. Everything found in the world has, at a minimum, a size, a weight, and a strength. Say, then, that you’re stymied by a locked door. There might be a set-piece solution for the problem in the form of a key you can find, steal, or trade for, but it’s probably also possible to beat the door down with a sufficiently big stick and a sufficiently strong character, or if all else fails to blast it open with a barrel of dynamite. Thus your problems can almost never become insurmountable, even if you screw up somewhere else. Very few other games from Ultima VI’s day made any serious attempt to venture down this path. Infocom’s Beyond Zork tried, somewhat halfheartedly, and largely failed at it; Sierra’s Hero’s Quest was much more successful at it, but on nothing like the scale of an Ultima. Tellingly, almost all of the “alternate solutions” to Ultima VI’s puzzles emerge organically from the simulation, with no designer input whatsoever. Richard Garriott:

I start by building a world which you can interact with as naturally as possible. As long as I have the world acting naturally, if I build a world that is prolific enough, that has as many different kinds of natural ways to act and react as possible, like the real world does, then I can design a scenario for which I know the end goal of the story. But exactly whether I have to use a key to unlock the door, or whether it’s an axe I pick up to chop down the door, is largely irrelevant.
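
That door example translates almost directly into code. The sketch below is my own illustration of the principle, with invented property names and numbers rather than anything taken from Ultima VI’s real data structures; the point is simply that once an object carries a generic strength value, the axe solution falls out of the simulation alongside the scripted key.

```c
/* Illustrative only: a door described by a few generic properties can be
   opened with the intended key or broken down by anything strong enough. */
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    const char *name;
    int strength;            /* how hard the object is to break */
    bool locked;
    bool broken;
} Door;

/* Set-piece solution: the matching key simply unlocks the door. */
static bool use_key(Door *door, bool have_matching_key)
{
    if (have_matching_key)
        door->locked = false;
    return have_matching_key;
}

/* Emergent solution: a hard enough blow breaks it instead. */
static bool bash(Door *door, int attacker_strength)
{
    if (attacker_strength >= door->strength)
        door->broken = true;
    return door->broken;
}

static bool is_passable(const Door *door)
{
    return !door->locked || door->broken;
}

int main(void)
{
    Door cell_door = { "oaken door", 20, true, false };
    if (!use_key(&cell_door, false))      /* no key in hand, so improvise */
        bash(&cell_door, 25);             /* a strong character with a big axe */
    printf("Can we pass the %s? %s\n",
           cell_door.name, is_passable(&cell_door) ? "yes" : "no");
    return 0;
}
```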

The complexity of the world model was such that Ultima VI became the first installment that would let the player get a job to earn money in lieu of the standard CRPG approach of killing monsters and taking their loot. You can buy a sack of grain from a local farmer, take the grain to a mill and grind it into flour, then sell the flour to a baker — or sneak into his bakery at night to bake your own bread using his oven. Even by the standards of today, the living world inside Ultima VI is a remarkable achievement — not to mention a godsend to those of us bored with killing monsters; you can be very successful in Ultima VI whilst doing very little killing at all.

A rare glimpse of Origin’s in-house Ultima VI world editor, which looks surprisingly similar to the game itself.

Plot spoilers begin!

It wasn’t until October of 1989, just five months before the game absolutely, positively had to ship, that Richard Garriott turned his attention to the Avatar’s reason for being in Britannia this time around. The core idea behind the plot came to him during a night out on Austin’s Sixth Street: he decided he wanted to pitch the Avatar into a holy war against enemies who, in classically subversive Ultima fashion, turn out not to be evil at all. In two or three weeks spent locked together alone in a room, subsisting on takeout Chinese food, Richard Garriott and Warren Spector created the “game” part of Ultima VI from this seed, with Spector writing it all down in a soy-sauce-bespattered notebook. Here Spector proved himself more invaluable than ever. He could corral Garriott’s sometimes unruly thoughts into a coherent plan on the page, whilst offering plenty of contributions of his own. And he, almost uniquely among his peers at Origin, commanded enough of Garriott’s respect — was enough of a creative force in his own right — that he could rein in the bad and/or overambitious ideas that in previous Ultimas would have had to be attempted and proved impractical to their originator. Given the compressed development cycle, this contribution too was vital. Spector:

An insanely complicated process, plotting an Ultima. I’ve written a novel, I’ve written [tabletop] role-playing games, I’ve written board games, and I’ve never seen a process this complicated. The interactions among all the characters — there are hundreds of people in Britannia now, hundreds of them. Not only that, but there are hundreds of places and people that players expect to see because they appeared in five earlier Ultimas.

Everybody in the realm ended up being a crucial link in a chain that adds up to this immense, huge, wonderful, colossal world. It was a remarkably complicated process, and that notebook was the key to keeping it all under control.

The chain of information you follow in Ultima VI is, it must be said, far clearer than in any of the previous games. Solving this one must still be a matter of methodically talking to everyone and assembling a notebook full of clues — i.e., of essentially recreating Garriott and Spector’s design notebook — but there are no outrageous intuitive leaps required this time out, nor any vital clues hidden in outrageously out-of-the-way locations. For the first time since Ultima I, a reasonable person can reasonably be expected to solve this Ultima without turning it into a major life commitment. The difference is apparent literally from your first moments in the game: whereas Ultima V dumps you into a hut in the middle of the wilderness — you don’t even know where in the wilderness — with no direction whatsoever, Ultima VI starts you in Lord British’s castle, and your first conversation with him immediately provides you with your first leads to run down. From that point forward, you’ll never be at a total loss for what to do next as long as you do your due diligence in the form of careful note-taking. Again, I have to attribute much of this welcome new spirit of accessibility and solubility to the influence of Warren Spector.

Ultima VI pushes the “Gargoyles are evil!” angle hard early on, going so far as to have the seemingly demonic beasts nearly sacrifice you to whatever dark gods they worship. This of course only makes the big plot twist, when it arrives, all the more shocking.

At the beginning of Ultima VI, the Avatar — i.e., you — is called back to Britannia from his homeworld of Earth yet again by the remarkably inept monarch Lord British to deal with yet another crisis which threatens his land. Hordes of terrifyingly demonic-looking Gargoyles are pouring out of fissures which have opened up in the ground everywhere and making savage war upon the land. They’ve seized and desecrated the eight Shrines of Virtue, and are trying to get their hands on the Codex of Ultimate Wisdom, the greatest symbol of your achievements in Ultima IV.

But, in keeping with the shades of gray the series had begun to layer over the Virtues with Ultima V, nothing is quite as it seems. In the course of the game, you discover that the Gargoyles have good reason to hate and fear humans in general and you the Avatar in particular, even if those reasons are more reflective of carelessness and ignorance on the part of you and Lord British’s peoples than they are of malice. To make matters worse, the Gargoyles are acting upon a religious prophecy — conventional religion tends to take a beating in Ultima games — and have come to see the Avatar as nothing less than the Antichrist in their own version of the Book of Revelation. As your understanding of their plight grows, your goal shifts from that of ridding the land of the Gargoyle scourge by violent means to that of walking them back from attributing everything to a foreordained prophecy and coming to a peaceful accommodation with them.

Ultima VI’s subtitle, chosen very late in the development process, is as subtly subversive as the rest of the plot. Not until very near the end of the game do you realize that The False Prophet is in fact you, the Avatar. As the old cliché says, there are two sides to every story. Sadly, the big plot twist was already spoiled by Richard Garriott in interviews before Ultima VI was even released, so vanishingly few players have ever gotten to experience its impact cold.

When discussing the story of Ultima VI, we shouldn’t ignore the real-world events that were showing up on the nightly news while Garriott and Spector were writing it. Mikhail Gorbachev had just made the impossibly brave decision to voluntarily dissolve the Soviet empire and let its vassal states go their own way, and just like that the Cold War had ended, not in the nuclear apocalypse so many had anticipated as its only possible end game but rather in the most blessed of all anticlimaxes in human history. For the first time in a generation, East was truly meeting West again, and each side was discovering that the other wasn’t nearly as demonic as they had been raised to believe. On November 10, 1989, just as Garriott and Spector were finishing their design notebook, an irresistible tide of mostly young people burst through Berlin’s forbidding Checkpoint Charlie to greet their counterparts on the other side, as befuddled guards, the last remnants of the old order, looked on and wondered what to do. It was a time of extraordinary change and hope, and the message of Ultima VI resonated with the strains of history.

Plot spoilers end.

When Garriott and Spector emerged from their self-imposed quarantine, the first person to whom they gave their notebook was an eccentric character with strong furry tendencies who had been born as David Shapiro, but who was known to one and all at Origin as Dr. Cat. Dr. Cat had been friends with Richard Garriott for almost as long as Denis Loubet, having first worked at Origin for a while when it was still being run out of Richard’s parents’ garage in suburban Houston. A programmer by trade — he had done the Commodore 64 port of Ultima V — Dr. Cat was given the de facto role of head writer for Ultima VI, apparently because he wasn’t terribly busy with anything else at the time. Over the next several months, he wrote most of the dialog for most of the many characters the Avatar would need to speak with in order to finish the game, parceling the remainder of the work out among a grab bag of other programmers and artists, whoever had a few hours or days to spare.

Origin Systems was still populating its games with jokey cameos drawn from Richard Garriott’s friends, colleagues, and family as late as Ultima VI. Thankfully, this, along with other aspects of the “programmer text” syndrome, would finally end with the next installment in the series, for which a real professional writing team would come aboard. More positively, do note the keyword highlighting in the screenshot above, which spared players untold hours of aggravating noun-guessing.

Everyone at Origin felt the pressure by now, but no one carried a greater weight on his slim shoulders than Richard Garriott. If Ultima VI flopped, or even just wasn’t a major hit, that was that for Origin Systems. For all that he loved to play His Unflappable Majesty Lord British in public, Garriott was hardly immune to the pressure of having dozens of livelihoods dependent on what was at the end of the day, no matter how much help he got from Warren Spector or anyone else, his game. His stress tended to go straight to his stomach. He remembers being in “constant pain”; sometimes he’d just “curl up in the corner.” Having stopped shaving or bathing regularly, strung out on caffeine and junk food, he looked more like a homeless man than a star game designer — much less a regal monarch — by the time Ultima VI hit the homestretch. On the evening of February 9, 1990, with the project now in the final frenzy of testing, bug-swatting, and final-touch-adding, he left Origin’s offices to talk to some colleagues having a smoke just outside. When he opened the security door to return, a piece of the door’s apparatus — in fact, an eight-pound chunk of steel — fell off and smacked him in the head, opening up an ugly gash and knocking him out cold. His panicked colleagues, who at first thought he might be dead, rushed him to the emergency room. Once he had had his head stitched up, he set back to work. What else was there to do?

Ultima VI shipped on time in March of 1990, two years almost to the day after Ultima V, and Richard Garriott’s fears (and stomach cramps) were soon put to rest; it became yet another 200,000-plus-selling hit. Reviews were uniformly favorable if not always ecstatic; it would take Ultima fans, traditionalists that so many of them were, a while to come to terms with the radically overhauled interface that made this Ultima look so different from the Ultimas of yore. Not helping things was the welter of bugs, some of them of the potentially showstopping variety, that the game shipped with (in years to come Origin would become almost as famous for their bugs as for their ambitious virtual world-building). In time, most if not all old-school Ultima fans were comforted as they settled in and realized that at bottom you tackled this one pretty much like all the others, trekking around Britannia talking to people and writing down the clues they revealed until you put together all the pieces of the puzzle. Meanwhile Origin gradually fixed the worst of the bugs through a series of patch disks which they shipped to retailers to pass on to their customers, or to said customers directly if they asked for them. Still, both processes did take some time, and the reaction to this latest Ultima was undeniably a bit muted — a bit conflicted, one might even say — in comparison to the last few games. It perhaps wasn’t quite clear yet where or if the Ultima series fit on these newer computers in this new decade.

Both the muted critical reaction and that sense of uncertainty surrounding the game have to some extent persisted to this day. Firmly ensconced though it apparently is in the middle of the classic run of Ultimas, from Ultima IV through Ultima VII, that form the bedrock of the series’s legacy, Ultima VI is the least cherished of that cherished group today, the least likely to be named a favorite by any given fan. It lacks the pithy justification for its existence that all of the others can boast. Ultima IV was the great leap forward, the game that dared to posit that a CRPG could be about more than leveling up and collecting loot. Ultima V was the necessary response to its predecessor’s unfettered idealism; the two games together can be seen to form a dialog on ethics in the public and private spheres. And, later, Ultima VII would be the pinnacle of the series in terms not only of technology but also, and even more importantly, in terms of narrative and thematic sophistication. But where does Ultima VI stand in this group? Its plea for understanding rather than extermination is as important and well-taken today as it’s ever been, yet its theme doesn’t follow as naturally from Ultima V as that game’s had from Ultima IV, nor is it executed with the same sophistication we would see in Ultima VII. Where Ultima VI stands, then, would seem to be in a somewhat uncertain no man’s land.

Indeed, it’s hard not to see Ultima VI first and foremost as a transitional work. On the surface, that’s a distinction without a difference; every Ultima, being part of a series that was perhaps more than any other in the history of gaming always in the process of becoming, is a bridge between what had come before and what would come next. Yet in the case of Ultima VI the tautology feels somehow uniquely true. The graphical interface, huge leap though it is over the old alphabet soup, isn’t quite there yet in terms of usability. It still lacks a drag-and-drop capability, for instance, to make inventory management and many other tasks truly intuitive, while the cluttered onscreen display combines vestiges of the old, such as a scrolling textual “command console,” with this still imperfect implementation of the new. The prettier, more detailed window on the world is welcome, but winds up giving such a zoomed-in view in the half of a screen allocated to it that it’s hard to orient yourself. The highlighted keywords in the conversation engine are also welcome, but are constantly scrolling off the screen, forcing you either to lawnmower through the same conversations again and again to be sure not to miss any of them or to jot them down on paper as they appear. There’s vastly more text in Ultima VI than in any of its predecessors, but perhaps the kindest thing to be said about Dr. Cat as a writer is that he’s a pretty good programmer. All of these things would be fixed in Ultima VII, a game — or rather games; there were actually two of them, for reasons we’ll get to when the time comes — that succeeded in becoming everything Ultima VI had wanted to be. To use the old playground insult, everything Ultima VI can do Ultima VII can do better. One thing I can say, however, is that the place the series was going would prove so extraordinary that it feels more than acceptable to me to have used Ultima VI as a way station en route.

But in the even more immediate future for Origin Systems was another rather extraordinary development. This company that the rest of the industry jokingly referred to as Ultima Systems would release the same year as Ultima VI a game that would blow up even bigger than this latest entry in the series that had always been their raison d’être. I’ll tell that improbable story soon, after a little detour into some nuts and bolts of computer technology that were becoming very important — and nowhere more so than at Origin — as the 1990s began.

(Sources: the books Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland, The Official Book of Ultima, Second Edition by Shay Addams, and Ultima: The Avatar Adventures by Rusel DeMaria and Caroline Spector; ACE of April 1990; Questbusters of November 1989, January 1990, March 1990, and April 1990; Dragon of July 1987; Computer Gaming World of March 1990 and June 1990; Origin’s in-house newsletter Point of Origin of August 7, 1991. Online sources include Matt Barton’s interviews with Dr. Cat and Warren Spector’s farewell letter from the Wing Commander Combat Information Center’s document archive. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Ultima VI is available for purchase from GOG.com in a package that also includes Ultima IV and Ultima V.)

 
