
Blade Runner

Blade Runner has set me thinking about the notion of a “critical consensus.” Why should we have such a thing at all, and why should it change over time?

Ridley Scott’s 1982 film Blade Runner is an adaptation of Philip K. Dick’s 1968 novel Do Androids Dream of Electric Sheep?, about a police officer cum bounty hunter — a “blade runner” in street slang — of a dystopian near-future whose job is to “retire” android “replicants” of humans whose existence on Earth is illegal. The movie had a famously troubled gestation, full of time and budget overruns, disputes between Scott and his investors, and an equally contentious relationship between the director and his leading man, Harrison Ford. When it was finally finished, the first test audiences were decidedly underwhelmed, such that Scott’s backers demanded that the film be recut, with the addition of a slightly hammy expository voice-over and a cheesy happy-ending epilogue which was cobbled together quickly using leftover footage from, of all movies, Stanley Kubrick’s The Shining.

It didn’t seem to help. The critical consensus on the released version ranged over a continuum from ambivalence to outright hostility. Roger Ebert’s faint praise was typically damning: “I was never really interested in the characters in Blade Runner. I didn’t find them convincing. What impressed me in the film was the special effects, the wonderful use of optical trickery to show me a gigantic imaginary Los Angeles, which in the vision of this movie has been turned into sort of a futuristic Tokyo. It’s a great movie to look at, but a hard one to care about. I didn’t appreciate the predictable story, the standard characters, the cliffhanging clichés… but I do think the special effects make Blade Runner worth going to see.” Pauline Kael was less forgiving of what she saw as a cold, formless, ultimately pointless movie: “If anybody comes around with a test to detect humanoids, maybe Ridley Scott and his associates should hide. With all the smoke in this movie, you feel as if everyone connected with it needs to have his flue cleaned.” Audiences do not always follow the critics’ lead, but in this case they largely did. During its initial theatrical run, Blade Runner fell well short of earning back the $30 million it had cost to make.

Yet remarkably soon after it had disappeared from theaters, its rehabilitation got underway in fannish circles. In 1984, William Gibson published his novel Neuromancer, the urtext of a new “cyberpunk” movement in science fiction that began in printed prose but quickly spiraled out from there into comics, television, and games. Whereas Blade Runner‘s dystopic Los Angeles looked more like Tokyo than any contemporary American city, Gibson’s book actually began in Japan, before moving on to a similarly over-urbanized United States. The two works’ neon-soaked nighttime cityscapes were very much of a piece. The difference was that Gibson added to the equation a computer-enabled escape from reality known as cyberspace, creating a combination that would prove almost irresistibly alluring to science-fiction fans as the computer age around them continued to evolve apace.

Blade Runner‘s rehabilitation spread to the mainstream in 1992, when a “director’s cut” of the film was re-released in theaters, lacking the Captain Obvious voice-over or the tacked-on happy ending but sporting a handful of new scenes that added fresh layers of nuance to the story. Critics — many of them the very same critics who had dismissed the movie a decade earlier — now rushed to praise it as a singular cinematic vision and a science-fiction masterpiece. They found many reasons for its box-office failure on the first go-round, even beyond the infelicitous changes that Ridley Scott had been forced by his backers to make to it. For one thing, it had been unlucky enough to come out just one month after E.T. the Extra-Terrestrial, the biggest box-office smash of all time to that point, whose long shadow was as foreboding and unforgiving a place to dwell as any of Blade Runner‘s own urban landscapes. Then, too, the audience was conditioned back then to see Harrison Ford as Han Solo or Indiana Jones — a charming rogue with a heart of gold, not the brooding, morally tormented cop Rick Deckard, who has a penchant for rough sex and a habit of shooting women in the back. In light of all this, surely the critics too could be forgiven for failing to see the film’s genius the first time they were given the chance.

Whether we wish to forgive them or not, I find it fascinating that a single film could generate such polarized reactions only ten years apart from people who study the medium for a living. The obvious riposte to my sense of wonder is, of course, that the Blade Runner of 1992 really wasn’t the same film at all as the one that had been seen in 1982. Yet I must confess to considerable skepticism about this as a be-all, end-all explanation. It seems to me that, for all that the voice-over and forced happy ending did the movie as a whole no favors, they were still a long way from destroying the qualities that made Blade Runner distinct.

Some of my skepticism may arise from the fact that I’m just not onboard with the most vaunted aspect of the director’s cut, its subtle but undeniable insinuation that Deckard is himself a replicant with implanted memories, no different from the androids he hunts down and kills. This was not the case in Philip K. Dick’s novel, nor was it the original intention of the film’s scriptwriters. I rather suspect, although I certainly cannot prove it, that even Ridley Scott’s opinion on the subject was more equivocal during the making of the film than it has since become. David Peoples, one of the screenwriters, attributes the genesis of the idea in Scott’s mind to an overly literal reading on his part of a philosophical meditation on free will and the nature of human existence in an early draft of the script. Peoples:

I invented a kind of contemplative voice-over for Deckard. Here, let me read it to you:

“I wondered who designs the ones like me and what choices we really have, and which ones we just think we have. I wondered which of my memories were real and which belonged to someone else. The great Tyrell [the genius inventor and business magnate whose company made the replicants] hadn’t designed me, but whoever had hadn’t done so much better. In my own modest way, I was a combat model.”

Now, what I’d intended with this voice-over was mostly metaphysical. Deckard was supposed to be philosophically questioning himself about what it was that made him so different from Rachael [a replicant with whom he falls in love or lust] and the other replicants. He was supposed to be realizing that, on the human level, they weren’t so different. That Deckard wanted the same things the replicants did. The “maker” he was referring to wasn’t Tyrell. It was supposed to be God. So, basically, Deckard was just musing about what it meant to be human.

But then, Ridley… well, I think Ridley misinterpreted me. Because right about this period of time, he started announcing, “Ah-ha! Deckard’s a replicant! What brilliance!” I was sort of confused by this response, because Ridley kept giving me all this praise and credit for this terrific idea. It wasn’t until many years later, when I happened to be browsing through this draft, that I suddenly realized the metaphysical material I had written could just as easily have been read to imply that Deckard was a replicant, even though it wasn’t what I meant at all. What I had meant was, we all have a maker, and we all have an incept date [a replicant’s equivalent to a date of birth]. We just can’t address them. That’s one of the similarities we had to the replicants. We couldn’t go find Tyrell, but Tyrell was up there somewhere. For all of us.

So, what I had intended as kind of a metaphysical speculation, Ridley had read differently, but now I realize there was nothing wrong with this reading. That confusion was my own fault. I’d written this voice-over so ambiguously that it could indeed have meant exactly what Ridley took it to mean. And that, I think, is how the whole idea of Deckard being a replicant came about.

The problem I have with Deckard being a replicant is that it undercuts the thematic resonance of the story. In the book and the movie, the quality of empathy, or a lack thereof, is described as the one foolproof way to distinguish real from synthetic humans. To establish which is which, blade runners like Deckard use something called the Voight-Kampff test, in which suspects are hooked up to a polygraph-like machine which measures their emotional response to shockingly transgressive statements, starting with stuff like “my briefcase is made out of supple human-baby skin” and getting steadily worse from there. Real humans recoil, intuitively and immediately. Replicants can try to fake the appropriate emotional reaction — might even be programmed to fake it to themselves, such that even they don’t realize what they are — but there is always a split-second delay, which the trained operator can detect.

The central irony of the film is that cops like Deckard are indoctrinated to have absolutely no empathy for the replicants they track down and murder, even as many of the replicants we meet evince every sign of genuinely caring for one another, leading one to suspect that the Voight-Kampff test may not be measuring pure, unadulterated empathy in quite the way everyone seems to think it is. The important transformation that Deckard undergoes, which eventually brings his whole world down around his head, is that of allowing himself to feel the pain and fear of those he hunts. He is a human who rediscovers and re-embraces his own humanity, who finally begins to understand that meting out suffering and death to other feeling creatures is no way to live, no matter how many layers of justification and dogma his actions are couched within.

But in Ridley Scott’s preferred version of the film, the central theme falls apart, to be replaced with psychological horror’s equivalent of a jump scare: “Deckard himself is really a replicant, dude! What a mind fuck, huh?” For this reason, it’s hard for me to see the director’s cut as a holistically better movie than the 1982 cut, which at least leaves some more room for debate about the issue.

This may explain why I’m lukewarm about Blade Runner as a whole, why none of the cuts — and there have been a lot of them by now — quite works for me. As often happens in cases like this one, I find that my own verdict on Blade Runner comes down somewhere between the extremes of then and now. There’s a lot about Roger Ebert’s first hot-take that still rings true to me all these years later. It’s a stunning film in terms of atmosphere and audiovisual composition; I defy anyone to name a movie with a more breathtaking opening shot than the panorama of nighttime Tokyo… er, Los Angeles that opens this one. Yet it’s also a distant and distancing, emotionally displaced film that aspires to a profundity it doesn’t completely earn. I admire many aspects of its craft enormously and would definitely never discourage anyone from seeing it, but I just can’t bring myself to love it as much as so many others do.

The opening shot of Blade Runner the movie.

These opinions of mine will be worth keeping in mind as we move on now to the 1997 computer-game adaptation of Blade Runner. For, much more so than is the case even with most licensed games, your reaction to this game might be difficult to separate from your reaction to the movie.


Thanks to the complicated, discordant circumstances of its birth, Blade Runner had an inordinate number of vested interests even by Hollywood standards, such that a holding company known as The Blade Runner Partnership was formed just to administer them. When said company started to shop the property around to game publishers circa 1994, the first question on everyone’s lips was what had taken them so long. The film’s moody, neon-soaked aesthetic if not its name had been seen in games for years by that point, so much so that it had already become something of a cliché. Just among the games I’ve written about on this site, Rise of the Dragon, Syndicate, System Shock, Beneath a Steel Sky, and the Tex Murphy series all spring to mind as owing more than a small debt to the movie. And there are many, many more that I haven’t written about.

Final Fantasy VII is another on the long list of 1990s games that owes more than a little something to Blade Runner. It’s hard to imagine its perpetually dark, polluted, neon-soaked city of Midgar ever coming to exist without the example of Blade Runner’s Los Angeles. Count it as just one more way in which this Japanese game absorbed Western cultural influences and then reflected them back to their point of origin, much as the Beatles once put their own spin on American rock and roll and sold it back to the country of its birth.

Meanwhile the movie itself was still only a cult classic in the 1990s; far more gamers could recognize and enjoy the gritty-cool Blade Runner aesthetic than had actually seen its wellspring. Blade Runner was more of a state of mind than it was a coherent fictional universe in the way of other gaming perennials like Star Trek and Star Wars. Many a publisher therefore concluded that they could have all the Blade Runner they needed without bothering to pay for the name.

Thus the rights holders worked their way down through the hierarchy of publishers, beginning with the prestigious heavy hitters like Electronic Arts and Sierra and continuing into the ranks of the mid-tier imprints, all without landing a deal. Finally, they found an interested would-be partner in the financially troubled Virgin Interactive.

The one shining jewel in Virgin’s otherwise tarnished crown was Westwood Studios, the pioneer of the real-time-strategy genre that was on the verge of becoming one of the two hottest in all of gaming. And one of the founders of Westwood was a fellow named Louis Castle, who listed Blade Runner as his favorite movie of all time. His fandom was such that Westwood probably did more than they really needed to in order to get the deal. Over a single long weekend, the studio’s entire art department pitched in to meticulously recreate the movie’s bravura opening shots of dystopic Los Angeles. It did the trick; the Blade Runner contract was soon given to Virgin and Westwood. It also established, for better or for worse, the project’s modus operandi going forward: a slavish devotion not just to the film’s overall aesthetic but to the granular details of its shots and sets.

The opening shot of Blade Runner the game.

Thanks to the complicated tangle of legal rights surrounding the film, Westwood wasn’t given access to any of its tangible audiovisual assets. Undaunted, they endeavored to recreate almost all of them on the monitor screen for themselves by using pre-rendered 3D backgrounds combined with innovative real-time lighting effects; these were key to depicting the flashing neon and drifting rain and smoke that mark the film. The foreground actors were built from motion-captured human models, then depicted onscreen using voxels, collections of tiny cubes in a 3D space, essentially pixels with an added Z-dimension of depth.
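For the curious, the phrase “pixels with an added Z-dimension” can be pictured with a tiny sketch. The Python fragment below is purely illustrative (hypothetical names, a simple orthographic projection, and no claim to resemble Westwood’s actual renderer); it just shows how a cloud of colored voxels can be flattened onto a 2D frame, with the nearest cube at each screen position winning out.

```python
from dataclasses import dataclass

# Illustrative sketch only: each element of a voxel "sprite" is a tiny colored
# cube with an explicit depth, and drawing the model amounts to projecting
# every cube to a screen pixel while keeping the nearest one.

@dataclass
class Voxel:
    x: int      # horizontal position within the model
    y: int      # vertical position within the model
    z: int      # depth; smaller values are closer to the camera
    color: int  # palette index

def render_voxels(voxels, width, height):
    """Project a voxel model orthographically onto a 2D frame buffer."""
    frame = [[None] * width for _ in range(height)]          # color per pixel
    zbuf = [[float("inf")] * width for _ in range(height)]   # nearest depth seen so far
    for v in voxels:
        if 0 <= v.x < width and 0 <= v.y < height and v.z < zbuf[v.y][v.x]:
            zbuf[v.y][v.x] = v.z
            frame[v.y][v.x] = v.color
    return frame

# Two voxels landing on the same screen pixel; the closer one (z=2) wins.
model = [Voxel(1, 1, 5, color=3), Voxel(1, 1, 2, color=7)]
print(render_voxels(model, 4, 4)[1][1])  # -> 7
```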

At least half of what you see in the Blade Runner game is lifted straight from the movie, which Westwood pored over literally frame by frame in order to include even the tiniest details, the sorts of things that no ordinary moviegoer would ever notice. The Westwood crew took a trip from their Las Vegas offices to Los Angeles to measure and photograph the locations where the film had been shot, the better to get it all exactly correct. Even the icy, synth-driven soundtrack for the movie was deconstructed, analyzed, and then mimicked in the game, note by ominous note.

The two biggest names associated with the film, Ridley Scott and Harrison Ford, were way too big to bother with a project like this one, but a surprising number of the other actors agreed to voice their parts and to allow themselves to be digitized and motion-captured. Among them were Sean Young, who had played Deckard’s replicant love interest Rachael; Edward James Olmos, who had played his enigmatic pseudo-partner Gaff; and Joe Turkel, who had played Eldon Tyrell, the twisted genius who invented the replicants. Set designers and other behind-the-scenes personnel were consulted as well.

It wasn’t judged practical to clone the movie’s plot in the same way as its sights and sounds, if for no other reason than the absence of Harrison Ford; casting someone new in the role of Deckard would have been, one senses, more variance than Westwood’s dedication to re-creation would have allowed. Instead they came up with a new story that could play out in the seams of the old one, happening concurrently with the events of the film, in many of the same locations and involving many of the same characters. Needless to say, its thematic concerns too would be the same as those of the film — and, yes, its protagonist cop as well would eventually be given reason to doubt his own humanity. His name was McCoy, another jaded gumshoe transplanted from a Raymond Chandler novel into an equally noirish future. But was he a “real” McCoy?

Westwood promised great things in the press while Blade Runner was in development: a truly open-world game taking place in a living, breathing city, full of characters that went about their own lives and pursued their own agendas, whose response to you in the here and now would depend to a large degree on how you had treated them and their acquaintances and enemies in the past. There would be no fiddly puzzles for their own sake; this game would expect you to think and act like a real detective, not like the typical adventure-game hero with an inventory full of bizarre objects waiting to be put to use in equally bizarre ways. To keep you on your toes and add replay value — the lack of which was always the adventure genre’s Achilles heel as a commercial proposition — the guilty parties in the case would be randomly determined, so that no two playthroughs would ever be the same. And there would be action elements too; you would have to be ready to draw your gun at almost any moment. “There’s actually very little action in the film,” said Castle years later, “but when it happens, it’s violent, explosive, and deadly. I wanted to make a game where the uncertainty of what’s going to happen makes you quiver with anticipation every time you click the mouse.”

As we’ll soon see, most of those promises would be fulfilled only partially, but that didn’t keep Blade Runner from becoming a time-consuming, expensive project by the standards of its era, taking two years to make and costing about $2 million. It was one of the last times that a major, mainstream American studio swung for the fences with an adventure game, a genre that was soon to be relegated to niche status, with budgets and sales expectations to match.

In fact, Blade Runner’s commercial performance was among the reasons that down-scaling took place. Despite a big advertising push on Virgin Interactive’s part, it got lost in the shuffle among The Curse of Monkey Island, Riven, and Zork: Grand Inquisitor, three other swansongs of the AAA adventure game that all competed for a dwindling market share during the same holiday season of 1997. Reviews were mixed, often expressing a feeling I can’t help but share: what was ultimately the point of so slavishly re-creating another work of art if you weren’t going to add much of anything of your own to it? “The perennial Blade Runner images are here, including the winking woman in the Coca-Cola billboard and vehicles flying over the flaming smokestacks of the industrial outskirts,” wrote GameSpot. “Unfortunately, most of what’s interesting about the game is exactly what was interesting about the film, and not much was done to extend the concepts or explore them any further.” Computer and Video Games magazine aptly called it “more of a companion to the movie than a game.” Most gamers shrugged and moved on to the next title on the shelf; Blade Runner sold just 15,000 copies in the month of its release.[1]

As the years went by, however, a funny thing happened. Blade Runner never faded completely from the collective gamer consciousness like so many other middling efforts did. It continued to be brought up in various corners of the Internet, became a fixture of an “abandonware” scene whose rise preceded that of back-catalog storefronts like GOG.com, became the subject of retrospectives and think pieces on major gaming sites. Finally, in spite of the complications of its licensing deal, it went up for sale on GOG.com in 2019. Then, in 2022, Nightdive Studios released an “enhanced” edition. It seems safe to say today that many more people have played Westwood’s Blade Runner since the millennium than did so before it. The critical consensus surrounding it has shifted as well. As of this writing, Blade Runner is rated by the users of MobyGames as the 51st best adventure game of all time — a ranking that doesn’t sound so impressive at first, until you realize that it’s slightly ahead of such beloved icons of the genre as LucasArts’s Monkey Island 2 and Indiana Jones and the Fate of Atlantis.[2] I trust that I need not belabor the parallels with the reception history of Ridley Scott’s movie. In this respect as well as so many others, the film and the game seem joined at the hip. And the latter wouldn’t have it any other way.


In all my years of writing these histories, I’m not sure I’ve ever come across a game that combines extremes of derivation and innovation in quite the way of Westwood’s Blade Runner. While there is nary an original idea to be found in the fiction, the gameplay has if anything too many of them.

I’ve complained frequently in the past that most alleged mystery games aren’t what they claim to be at all, that they actually solve the mystery for you while you occupy your time with irrelevant lock-and-key puzzles and the like. Louis Castle and his colleagues at Westwood clearly had the same complaints; there are none of those irrelevancies here. Blade Runner really does let you piece together its clues for yourself. You feel like a real cop — or at least a television one — when you, say, pick out the license plate of a car on security-camera footage, then check the number in the database of the near-future’s equivalent to the Department of Motor Vehicles to get a lead. Even as it’s rewarding, the game is also surprisingly forgiving in its investigative aspects, not an adjective that’s frequently applied to adventures of this period. There are a lot of leads to follow, and you don’t need to notice and run down all of them to make progress in your investigation. At its best, then, this game makes you feel smart — one of the main reasons a lot of us play games, if we’re being honest.

Those problems that do exist here arise not from the developers failing to do enough, but rather from trying to do too much. There’s an impossibly baroque “clues database” that purports to aid you in tying everything together. This experiment in associative, cross-referenced information theory would leave even Ted Nelson scratching his head in befuddlement. Thankfully, it isn’t really necessary to engage with it at all. You can keep the relevant details in your head, or at worst in your trusty real-world notepad, easily enough.

If you can make any sense of this, you’re a better detective than I am.

Features like this one seem to be artifacts of that earlier, even more conceptually ambitious incarnation of Blade Runner that was promoted in the press while the game was still being made.[3] As I noted earlier, this was to have been a game that you could play again and again, with the innocent and guilty parties behind the crime you investigated being different each time. It appears that, under the pressure of time, money, and logistics, that concept got boiled down to randomizing which of the other characters are replicants and which are “real” humans, without changing their roles in the story in response to their status in anything but some fairly cosmetic ways. Then, too, the other characters were supposed to have had a great deal of autonomy, but, again, the finished product doesn’t live up to this billing. In practice, what’s left of this aspiration is more of an annoyance than anything else. While the other characters do indeed move around, they do so more like subway trains on a rigid schedule than independent human actors. When the person you need to speak to isn’t where you expect him to be, all you can do is go away and return later. This leads to tedious rounds of visiting the same locations again and again, hoping someone new will turn up to jog the plot forward. While this may not be all that far removed from the nature of much real police work, it’s more realism than I for one need.

This was also to have been an adventure game that you could reasonably play without relying on saving and restoring, taking your lumps and rolling with the flow. Early on, the game just about lives up to this ideal. At one point, you chase a suspect into a dark alleyway where a homeless guy happens to be rooting through a dumpster. It’s damnably easy in the heat of the moment to shoot the wrong person. If you do so — thus committing a crime that counts as murder, unlike the “retiring” of a replicant — you have the chance to hide the body and continue on your way; life on the mean streets of Los Angeles is a dirty business, regardless of the time period. Even more impressively, you might stumble upon your victim’s body again much later in the game, popping up out of the murk like an apparition from your haunted conscience. If you didn’t kill the hobo, on the other hand, you might meet him again alive.

But sadly, a lot of this sort of thing as well falls away as the game goes on. The second half is rife with learning-by-death moments that would have done the Sierra of the 1980s proud, all people and creatures jumping out of the shadows and killing you without warning. Hope you have a save file handy, says the game. The joke’s on you!

By halfway through, the game has just about exhausted the movie’s iconic set-pieces and is forced to lean more on its own invention, much though this runs against its core conviction that imitation trumps originality. Perhaps that conviction was justified after all: the results aren’t especially inspiring. What we see are mostly generic sewers, combined with characters who wouldn’t play well in the dodgiest sitcom. The pair of bickering conjoined twins — one smart and urbane, the other crude and rude — is particularly cringe-worthy.

Writers and other artists often talk about the need to “kill your darlings”: to cut out those scenes and phrases and bits and bobs that don’t serve the art, that only serve to gratify the vanity of the artist. This game is full of little darlings that should have died well before it saw release. Some of them are flat-out strange. For example, if you like, you can pre-pick a personality for McCoy: Polite, Normal, (don’t call me) Surly, or Erratic. Doing so removes the conversation menu from the interface; walk up to someone and click on her, and McCoy just goes off on his own tangent. I don’t know why anyone would ever choose to do this, unless it be to enjoy the coprolalia of Erratic McCoy, who jumps from Sheriff Andy Taylor to Dirty Harry and back again at a whipsaw pace, leaving everyone on the scene flummoxed.

Even when he’s ostensibly under your complete control, Detective McCoy isn’t the nimblest cowboy at the intellectual rodeo. Much of the back half of the game degenerates into trying to figure out how and when to intervene to keep him from doing something colossally stupid. When a mobster you’ve almost nailed hands him a drink, you’re reduced to begging him silently: Please, please, do not drink it, McCoy! And of course he does so, and of course it’s yet another Game Over. (After watching the poor trusting schmuck screw up this way several times, you might finally figure out that you have about a two-second window of control to make him draw his gun on the other guy — no other action will do — before he scarfs down the spiked cocktail.)

Bottoms up! (…sigh…)

All my other complaints aside, though, for me this game’s worst failing remains its complete lack of interest in standing on its own as either a piece of fiction or an aesthetic statement of any stripe. There’s an embarrassingly mawkish, subservient quality that dogs it even as it’s constantly trying to be cool and foreboding, with all its darkness and its smoke. Its brand of devotion is an aspect of fan culture that I just don’t get.

So, I’m left sitting here contemplating an argument that I don’t think I’ve ever had to make before in the context of game development: that you can actually love something too much to be able to make a good game out of it, that your fandom can blind you as surely as the trees of any forest. This game is doomed, seemingly by design, to play a distant second fiddle to its parent. You can almost hear the chants of “We’re not worthy!” in the background. When you visit Tyrell in his office, you know it can have no real consequences for your story because the resolution of that tycoon’s fate has been reserved for the cinematic story that stars Deckard; ditto your interactions with Rachael and Gaff and others. They exist here at all, one can’t help but sense, only because the developers were so excited at the prospect of having real live Blade Runner actors visit them in their studio that they just couldn’t help themselves. (“We’re not worthy!”) For the player who doesn’t live and breathe the lore of Blade Runner like the developers do, they’re living non sequiturs who have nothing to do with anything else that’s going on.

Even the endings here — there are about half a dozen major branches, not counting the ones where McCoy gets shot or stabbed or roofied midway through the proceedings — are sometimes in-jokes for the fans. One of them is a callback to the much-loathed original ending of the film — a callback that finds a way to be in much worse taste than its inspiration: McCoy can run away with one of his suspects, who happens to be a fourteen-year-old girl who’s already been the victim of adult molestation. Eww!

What part of “fourteen years old and already sexually traumatized” do you not understand, McCoy?

Even the options menu of this game has an in-joke that only fans will get. If you like, you can activate a “designer cut” here that eliminates all of McCoy’s explanatory voice-overs, a callback to the way that Ridley Scott’s director’s cut did away with the ones in the film. The only problem is that in this medium those voice-overs are essential for you to have any clue whatsoever what’s going on. Oh, well… the Blade Runner fans have been served, which is apparently the important thing.

I want to state clearly here that my objections to this game aren’t abstract objections to writing for licensed worlds or otherwise building upon the creativity of others. It’s possible to do great work in such conditions; the article I published just before this one praised The Curse of Monkey Island to the skies for its wit and whimsy, despite that game making absolutely no effort to bust out of the framework set up by The Secret of Monkey Island. In fact, The Curse of Monkey Island too is bursting at the seams with in-jokes and fan service. But it shows how to do those things right: by weaving them into a broader whole such that they’re a bonus for the people who get them but never distract from the experience of the people who don’t. That game illustrates wonderfully how one can simultaneously delight hardcore fans of a property and welcome newcomers into the fold, how a game can be both a sequel and fully realized in an Aristotelian sense. I’m afraid that this game is an equally definitive illustration of how to do fan service badly, such that it comes across as simultaneously elitist and creatively bankrupt.

Westwood always prided themselves on their technical excellence, and this is indeed a technically impressive game in many respects. But impressive technology is worth little on its own. If you’re a rabid fan of the movie in the way that I am not, I suppose you might be excited to live inside it here and see all those iconic sets from slightly different angles. If you aren’t, though, it’s hard to know what this game is good for. In its case, I think that the first critical consensus had it just about right.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: The book Future Noir: The Making of Blade Runner by Paul M. Sammon; Computer and Video Games of January 1998; PC Zone of May 1999; Next Generation of July 1997; Computer Gaming World of March 1998; Wall Street Journal of January 21 1998; New Yorker of July 1982; Retro Gamer 142.

Online sources include Ars Technica’s interview with Louis Castle, Game Developer‘s interview with Castle, Edge‘s feature on the making of the game, the original Siskel and Ebert review of the movie, an unsourced but apparently authentic interview with Philip K. Dick, and GameSpot’s vintage Blade Runner review.

Blade Runner is available for digital purchase at GOG.com, in both its original edition that I played for this article and the poorly received enhanced edition. Note that the latter actually includes the original game as well, as of this writing, and is often cheaper than buying the original alone…

Footnotes
1 Louis Castle has often claimed in later decades that Blade Runner did well commercially, stating at least once that it sold 1 million copies(!). I can’t see how this could possibly have been the case; I’ve learned pretty well over my years of researching these histories what a million-selling game looked like in the 1990s, and can say very confidently that it did not look like this one. Having said that, though, let me also say that I don’t blame him for inflating the figures. It’s not easy to pour your heart and soul into something and not have it do well. So, as the press of real data and events fades into the past, the numbers start to go up. This doesn’t make Castle dishonest so much as it just makes him human.
2 This chart in general is distorted greatly by the factor of novelty; many or most of the highest-ranking games are very recent ones, rated in the first blush of excitement following their release.
3 Louis Castle’s own testimony contradicts this notion as well. He has stated in various interviews that “Blade Runner is as close as I have ever come to realizing a design document verbatim.” I don’t wish to discount his words out of hand, but boy, does this game ever strike me, based on pretty long experience in studying these things, as being full of phantom limbs that never got fully wired into the greater whole. I decided in the end that I had to call it like I see it in this article.
 


A Dialog in Real Time (Strategy)

At the end of the 1990s, the two most popular genres in computer gaming were the first-person shooter and the real-time strategy game. They were so dominant that most of the industry’s executives seemed to want to publish little else. And yet at the beginning of the decade neither genre even existed.

The stories of how the two rose to such heady heights are a fascinating study in contrasts, of how influences in media can either go off like an explosion in a TNT factory or like the slow burn of a long fuse. Sometimes something appears and everyone knows instantly that it’s just changed everything; when the Beatles dropped Sgt. Pepper’s Lonely Hearts Club Band in 1967, there was no doubt that the proverbial goalposts in rock music had just been shifted. Other times, though, influence can take years to make itself felt, as was the case for another album of 1967, The Velvet Underground & Nico, about which Brian Eno would later famously say that it “only sold 10,000 copies, but everyone who bought it formed a band.”

Games are the same. Gaming’s Sgt. Pepper was DOOM, which came roaring up out of the shareware underground at the tail end of 1993 to sweep everything from its path, blowing away all of the industry’s extant conventional wisdom about what games would become and what role they would play in the broader culture. Gaming’s Velvet Underground, on the other hand, was the avatar of real-time strategy, which came to the world in the deceptive guise of a sequel in the fall of 1992. Dune II: The Building of a Dynasty sported its Roman numeral because its transnational publisher had gotten its transatlantic cables crossed and accidentally wound up with two separate games based on Frank Herbert’s epic 1965 science-fiction novel: one made in Paris, the other in Las Vegas. The former turned out to be a surprisingly evocative and playable fusion of adventure and strategy game, but it was the latter that would quietly — oh, so quietly in the beginning! — shift the tectonic plates of gaming.

For Dune II, which was developed by Westwood Studios and published by Virgin Games, really was the first recognizable implementation of the genre of real-time strategy as we have come to know it since. You chose one of three warring trading houses to play, then moved through a campaign made up of a series of set-piece scenarios, in which your first goal was always to make yourself an army by gathering resources and using them to build structures that could churn out soldiers, tanks, aircraft, and missiles, all of which you controlled by issuing them fairly high-level orders: “go here,” “harvest there,” “defend this building,” “attack that enemy unit.” Once you thought you were strong enough, you could launch your full-on assault on the enemy — or, if you weren’t quick enough, you might find yourself trying to fend off his attack. What made it so different from most of the strategy games of yore was right there in the name: in the fact that it all played out in real time, at a pace that ranged from the brisk to the frantic, making it a test of your rapid-fire mousemanship and your ability to think on your feet. Bits and pieces of all this had been seen before — perhaps most notably in Peter Molyneux and Bullfrog’s Populous and the Sega Genesis game Herzog Zwei — but Dune II was where it all came together to create a gaming paradigm for the ages.

That said, Dune II was very much a diamond in the rough, a game whose groundbreaking aspirations frequently ran up against the brick wall of its limitations. It’s likely to leave anyone who has ever played almost any other real-time-strategy game seething with frustration. It runs at a resolution of just 320 X 200, giving only the tiniest window into the battlefield; it only lets you select and control one unit at a time, making coordinated attacks and defenses hard to pull off; its scenarios are somewhat rote exercises, differing mainly in the number of enemy hordes they throw against you as you advance through the campaign rather than the nature of the terrain or your objectives. Even its fog of war is wonky: the whole battlefield is blank blackness until one of your units gets within visual range, after which you can see everything that goes on there forevermore, whether any of your units can still lay eyes on it or not. And it has no support whatsoever for the multiplayer free-for-alls that are for many or most players the biggest draw of the genre.
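To make the fog-of-war complaint concrete: the distinction later games drew is between terrain that has merely been explored at some point and terrain that is currently within some friendly unit’s sight radius. A minimal sketch, assuming a plain grid map and a made-up sight radius (this is illustrative only, not Westwood’s or anyone else’s actual code), might track the two sets separately like this:

```python
# Illustrative only: a toy grid map that separates "explored" terrain from
# terrain currently visible to a friendly unit. Dune II effectively kept only
# the first set, which is why areas stayed permanently revealed; later games
# show live enemy activity only on tiles in the second set.

SIGHT_RADIUS = 3  # made-up value for the sketch

def update_fog(units, width, height, explored):
    """Return the set of currently visible tiles and update 'explored' in place."""
    visible = set()
    for ux, uy in units:
        for x in range(max(0, ux - SIGHT_RADIUS), min(width, ux + SIGHT_RADIUS + 1)):
            for y in range(max(0, uy - SIGHT_RADIUS), min(height, uy + SIGHT_RADIUS + 1)):
                if (x - ux) ** 2 + (y - uy) ** 2 <= SIGHT_RADIUS ** 2:
                    visible.add((x, y))
    explored |= visible  # once seen, a tile stays "explored" forever
    return visible       # but only these tiles should show live enemy activity

# A single unit at (5, 5) reveals a small circle of tiles around itself.
explored = set()
currently_visible = update_fog([(5, 5)], 20, 20, explored)
```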

Certainly Virgin had no inkling that they had a nascent ludic revolution on their hands. They released Dune II with more of a disinterested shrug than a fulsome fanfare, having expended most of their promotional energies on the other Dune, which had come out just a few months earlier. It’s a testimony to the novelty of the gameplay experience that it did as well as it did. It didn’t become a massive hit, but it sold well enough to earn its budget back and then some on the strength of reasonably positive reviews — although, again, no reviewer had the slightest notion that he was witnessing the birth of what would be one of the two hottest genres in gaming six years in the future. Even Westwood seemed initially to regard Dune II as a one-and-done. They wouldn’t release another game in the genre they had just invented for almost three years.

But the gaming equivalent of all those budding bedroom musicians who listened to that Velvet Underground record was also out there in the case of Dune II. One hungry, up-and-coming studio in particular decided there was much more to be done with the approach it had pioneered. And then Westwood themselves belatedly jumped back into the fray. Thanks to the snowball that these two studios got rolling in earnest during the mid-1990s, the field of real-time strategy would be well and truly saturated by the end of the decade, the yin to DOOM‘s yang. This, then, is the tale of those first few years of these two studios’ competitive dialog, over the course of which they turned the real-time strategy genre from a promising archetype into one of gaming’s two biggest, slickest crowd pleasers.


Blizzard Entertainment is one of the most successful studios in the history of gaming, so much so that it now lends its name to the Activision Blizzard conglomerate, with annual revenues in the range of $7.5 billion. In 1993, however, it was Westwood, flying high off the hit dungeon crawlers Eye of the Beholder and Lands of Lore, that was by far the more recognizable name. In fact, Blizzard wasn’t even known yet as Blizzard.

The company had been founded in late 1990 by Allen Adham and Mike Morhaime, a couple of kids fresh out of university, on the back of a $15,000 loan from Morhaime’s grandmother. They called their venture Silicon & Synapse, setting it up in a hole-in-the-wall office in Costa Mesa, California. They kept the lights on initially by porting existing games from one platform to another for publishers like Interplay — the same way, as it happened, that Westwood had gotten off the ground almost a decade before. And just as had happened for Westwood, Silicon & Synapse gradually won opportunities to make their own games once they had proven themselves by porting those of others. First there was a little auto-racing game for the Super Nintendo called RPM Racing, then a pseudo-sequel to it called Rock ‘n’ Roll Racing, and then a puzzle platformer called The Lost Vikings, which appeared for the Sega Genesis, MS-DOS, and the Commodore Amiga in addition to the Super Nintendo. None of these titles took the world by storm, but they taught Silicon & Synapse what it took to create refined, playable, mass-market videogames from scratch. All three of those adjectives have continued to define the studio’s output for the past 30 years.

It was now mid-1993; Silicon & Synapse had been in business for more than two and a half years already. Adham and Morhaime wanted to do something different — something bigger, something that would be suitable for computers only rather than the less capable consoles, a real event game that would get their studio’s name out there alongside the Westwoods of the world. And here there emerged another of their company’s future trademarks: rather than invent something new from whole or even partial cloth, they decided to start with something that already existed, but make it better than ever before, polishing it until it gleamed. The source material they chose was none other than Westwood’s Dune II, now relegated to the bargain bins of last year’s releases, but a perennial after-hours favorite at the Silicon & Synapse offices. They all agreed as to the feature they most missed in Dune II: a way to play it against other people, like you could its ancestor Populous. The bane of most multiplayer strategy games was their turn-based nature, which left you waiting around half the time while your buddy was playing. Real-time strategy wouldn’t have this problem of downtime.

That became the design brief for Warcraft: Orcs & Humans: remake Dune II but make it even better, and then add a multiplayer feature. And then, of course, actually try to sell the thing in all the ways Virgin had not really tried to sell its inspiration.

To say that Warcraft was heavily influenced by Dune II hardly captures the reality. Most of the units and buildings to hand have a direct correspondent in Westwood’s game. Even the menu of icons on the side of the screen is a virtual carbon copy — or at least a mirror image. “I defensively joked that, while Warcraft was certainly inspired by Dune II, [our] game was radically different,” laughs Patrick Wyatt, the lead programmer and producer on the project. “Our radar mini-map was in the upper left corner of the screen, whereas theirs was in the bottom right corner.”

In the same spirit of change, Silicon & Synapse replaced the desert planet of Arrakis with a fantasy milieu pitting, as the subtitle would suggest, orcs against humans. The setting and the overall look of Warcraft owe almost as much to the tabletop miniatures game Warhammer as the gameplay does to Dune II; a Warhammer license was seriously considered, but ultimately rejected as too costly and potentially too restrictive. Years later, Wyatt’s father would give him a set of Warhammer miniatures he’d noticed in a shop: “I found these cool toys and they reminded me a lot of your game. You might want to have your legal department contact them because I think they’re ripping you off.”

Suffice to say, then, that Warcraft was even more derivative than most computer games. The saving grace was the same that it would ever be for this studio: that they executed their mishmash of influences so well. The squishy, squint-eyed art is stylized like a cartoon, a wise choice given that the game is still limited to a resolution of just 320 X 200, so that photo-realism is simply not on the cards. The overall look of Warcraft has more in common with contemporary console games than the dark, gritty aesthetic that was becoming so popular on computers. The guttural exclamations of the orcs and the exaggerated Monty Python and the Holy Grail-esque accents of the humans, all courtesy of regular studio staffers rather than outside voice actors, become a chorus line as you order them hither and yon, making Dune II seem rather stodgy and dull by comparison. “We felt too many games took themselves too seriously,” says Patrick Wyatt. “We just wanted to entertain people.”

Slavishly indebted though it is to Dune II in all the broad strokes, Warcraft doesn’t neglect to improve on its inspiration in those nitty-gritty details that can make the difference between satisfaction and frustration for the player. It lets you select up to four units and give them orders at the same time by simply dragging a box around them, a quality-of-life addition whose importance is difficult to overstate, one so fundamental that no real-time-strategy game from this point forward would dare not to include it. Many more keyboard shortcuts are added, a less technically impressive addition but one no less vital to the cause of playability when the action starts to heat up. There are now two resources you need to harvest, lumber and gold, in place of Dune II‘s all-purpose spice. Units are now a little more intelligent about interpreting your orders, such that they no longer blithely ignore targets of opportunity, or let themselves get mauled to death without counterattacking just because you haven’t explicitly told them to. Scenario design is another area of marked improvement: whereas every Dune II scenario is basically the same drill, just with ever more formidable enemies to defeat, Warcraft‘s are more varied and arise more logically out of the story of the campaign, including a couple of special scenarios with no building or gathering at all, where you must return a runaway princess to the fold (as the orcs) or rescue a stranded explorer (as the humans).

The orc on the right who’s stroking his “sword” looks so very, very wrong — and this screenshot doesn’t even show the animation…

And, as the cherry on top, there was multiplayer support. Patrick Wyatt finished his first, experimental implementation of it in June of 1994, then rounded up a colleague in the next cubicle over so that they could become the first two people ever to play a full-fledged real-time-strategy game online. “As we started the game, I felt a greater sense of excitement than I’d ever known playing any other game,” he says.

It was just this magic moment, because it was so invigorating to play against a human and know that it wasn’t some stupid AI. It was a player who was smart and doing his absolute best to crush you. I knew we were making a game that would be fun, but at that moment I knew the game would absolutely kick ass.

While work continued on Warcraft, the company behind it was going through a whirlwind of changes. Recognizing at long last that “Silicon & Synapse” was actually a pretty terrible name, Adham and Morhaime changed it to Chaos Studios, which admittedly wasn’t all that much better, in December of 1993. Two months later, they got an offer they couldn’t refuse: Davidson & Associates, a well-capitalized publisher of educational software that was looking to break into the gaming market, offered to buy the freshly christened Chaos for the princely sum of $6.75 million. It was a massive over-payment for what was in all truth a middling studio at best, such that Adham and Morhaime felt they had no choice but to accept, especially after Davidson vowed to give them complete creative freedom. Three months after the acquisition, the founders decided they simply had to find a decent name for their studio before releasing Warcraft, their hoped-for ticket to the big leagues. Adham picked up a dictionary and started leafing through it. He hit pay dirt when his eyes flitted over the word “blizzard.” “It’s a cool name! Get it?” he asked excitedly. And that was that.

So, Warcraft hit stores in time for the Christmas of 1994, with the name of “Blizzard Entertainment” on the box as both its developer and its publisher — the wheels of the latter role being greased by the distributional muscle of Davidson & Associates. It was not immediately heralded as a game that would change everything, any more than Dune II had been; real-time strategy continued to be more of a slowly growing snowball than the ton of bricks to the side of the head that the first-person shooter had been. Computer Gaming World magazine gave Warcraft a cautious four stars out of five, saying that “if you enjoy frantic real-time games and if you don’t mind a linear structure in your strategic challenges, Warcraft is a good buy.” At the same time, the extent of the game’s debt to Dune II was hardly lost on the reviewer: “It’s a good thing for Blizzard that there’s no precedent for ‘look and feel’ lawsuits in computer entertainment.”[1]This statement was actually not correct; makers of standup arcade games of the classic era and the makers of Tetris had successfully cowed the cloning competition in the courts.

Warcraft would eventually sell 400,000 units, bettering Dune II‘s numbers by a factor of four or more. As soon as it became clear that it was doing reasonably well, Blizzard started on a sequel.


Out of everyone who looked at Warcraft, no one did so with more interest — or with more consternation at its close kinship with Dune II — than the folks at Westwood. “When I played Warcraft, the similarities between it and Dune II were pretty… blatant, so I didn’t know what to think,” says the Westwood designer Adam Isgreen. Patrick Wyatt of Blizzard got the impression that his counterparts “weren’t exactly happy” at the slavish copying when they met up at trade shows, though he “reckoned they should have been pleased that we’d taken their game as a base for ours.” Only gradually did it become clear why Warcraft‘s existence was a matter of such concern for Westwood: because they themselves had finally decided to make another game in the style of Dune II.

The game that Westwood was making could easily have wound up looking even more like the one that Blizzard had just released. The original plan was to call it Command & Conquer: Fortress of Stone and to set it in a fantasy world. (Westwood had been calling their real-time-strategy engine “Command & Conquer” since the days of promoting Dune II.) “It was going to have goldmines and wood for building things. Sound familiar?” chuckles Westwood’s co-founder Louis Castle. “There were going to be two factions, humans and faerie folk… pretty fricking close to orcs versus humans.”

Some months into development, however, Westwood decided to change directions, to return to a science-fictional setting closer to that of Dune II. For they wanted their game to be a hit, and it seemed to them that fantasy wasn’t the best guarantee of such a thing: CRPGs were in the doldrums, and the most recent big strategy release with a fantasy theme, MicroProse’s cult-classic-to-be Master of Magic, hadn’t done all that well either. Foreboding near-future stories, however, were all the rage; witness the stellar sales of X-COM, another MicroProse strategy game of 1994. “We felt that if we were going to make something that was massive,” says Castle, “it had to be something that anybody and everybody could relate to. Everybody understands a tank; everybody understands a guy with a machine gun. I don’t have to explain to them what this spell is.” Westwood concluded that they had made the right decision as soon as they began making the switch in software: “Tanks and vehicles just felt better.” The game lost its subtitle to become simply Command & Conquer.

While the folks at Blizzard were plundering Warhammer for their units and buildings, those at Westwood were trolling the Jane’s catalogs of current military hardware and Soldier of Fortune magazine. “We assumed that anything that was talked about as possibly coming was already here,” says Castle, “and that was what inspired the units.” The analogue of Dune II‘s spice — the resource around which everything else revolved — became an awesomely powerful space-born element come to earth known as tiberium.

Westwood included most of the shortcuts and conveniences that Blizzard had built into Warcraft, but went one or two steps further more often than not. For example, they also made it possible to select multiple units by dragging a box around them, but in their game there was no limit to the number of units that could be selected in this way. The keyboard shortcuts they added not only let you quickly issue commands to units and buildings, but also jump around the map instantly to custom viewpoints you could define. And up to four players rather than just two could now play together at once over a local network or the Internet, for some true mayhem. Then, too, scenario design was not only more varied than in Dune II but was even more so than in Warcraft, with a number of “guerilla” missions in the campaigns that involved no resource gathering or construction. It’s difficult to say to what extent these were cases of parallel innovation and to what extent they were deliberate attempts to one-up what Warcraft had done. It was probably a bit of both, given that Warcraft was released a good nine months before Command & Conquer, giving Westwood plenty of time to study it.

But other innovations in Command & Conquer were without any precedent. The onscreen menus could now be toggled on and off, for instance, a brilliant stroke that gave you a better view of the battlefield when you really needed it. Likewise, Westwood differentiated the factions in the game in a way that had never been done before. Whereas the different houses in Dune II and the orcs and humans in Warcraft corresponded almost unit for unit, the factions in Command & Conquer reflected sharply opposing military philosophies, demanding markedly different styles of play: the establishment Global Defense Initiative had slow, strong, and expensive units, encouraging a methodical approach to building up and husbanding your forces, while the terroristic Brotherhood of Nod had weaker but faster and cheaper minions better suited to madcap kamikaze rushes than carefully orchestrated combined-arms operations.

Yet the most immediately obvious difference between Command & Conquer and Warcraft was all the stuff around the game. Warcraft had been made on a relatively small budget with floppy disks in mind. It sported only a brief opening cinematic, after which scenario briefings consisted of nothing but scrolling text and a single voice over a static image. Command & Conquer, by contrast, was made for CD-ROM from the outset, by a studio with deeper pockets that had invested a great deal of time and energy into both 3D animation and full-motion video, that trendy art of incorporating real-world actors and imagery into games. The much more developed story line of Command & Conquer is advanced by little between-mission movies that, if not likely to make Steven Spielberg nervous, are quite well-done for what they are, featuring as they do mostly professional performers — such as a local Las Vegas weatherman playing a television-news anchorman — who were shot by a real film crew in Westwood’s custom-built blue-screen studio. Westwood’s secret weapon here was Joseph Kucan, a veteran theater director and actor who oversaw the film shoots and personally played the charismatic Nod leader Kane so well that he became the very face of Command & Conquer in the eyes of most gamers, arguably the most memorable actual character ever associated with a genre better known for its hordes of generic little automatons. Louis Castle reckons that at least half of Command & Conquer‘s considerable budget went into the cut scenes.

The game was released with high hopes in August of 1995. Computer Gaming World gave it a pretty good review, four stars out of five: “The entertainment factor is high enough and the action fast enough to please all but the most jaded wargamers.”

The gaming public would take to it even more than that review might imply. But in the meantime…


As I noted in an earlier article, numbered sequels weren’t really commonplace for strategy games prior to the mid-1990s. Blizzard had originally imagined Warcraft as a strategy franchise of a different stripe: each game bearing the name would take the same real-time approach into a completely different milieu, as SSI was doing at the time with their “5-Star General” series of turn-based strategy games that had begun with Panzer General and continued with the likes of Fantasy General and Star General. But Blizzard soon decided to make their sequel a straight continuation of the first game, an approach to which real-time strategy lent itself much more naturally than more traditional styles of strategy game; the set-piece story of a campaign could, after all, always be continued using all the ways that Hollywood had long since discovered for keeping a good thing going. The only snafu was that either the orcs or the humans could presumably have won the war in the first game, depending on which side the player chose. No matter: Blizzard decided the sequel would be more interesting if the orcs had been the victors and ran with that.

Which isn’t to say that building upon its predecessor’s deathless fiction was ever the real point of Warcraft II: Tides of Darkness. Blizzard knew now that they had a competitor in Westwood, and were in any case eager to add to the sequel all of the features and ideas that time had not allowed them to include in the first game. There would be waterways and boats to sail on them, along with oil, a third resource, one that could only be mined at sea. Both sides would get new units to play with, while elves, dwarves, trolls, ogres, and goblins would join the fray as allies of one of the two main racial factions. The interface would be tweaked with another welcome shortcut: selecting a unit and right-clicking somewhere would cause it to carry out the most logical action there without having to waste time choosing from a menu. (After all, if you selected a worker unit and sent him to a goldmine, you almost certainly wanted him to start collecting gold. Why should you have to tell the game the obvious in some more convoluted fashion?)
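For the curious, here is a rough sketch in C of how that kind of context-sensitive right-click might be implemented; the enums and the default_order function are hypothetical stand-ins for illustration, not anything from Blizzard’s actual source.

typedef enum { TARGET_GROUND, TARGET_ENEMY, TARGET_GOLD_MINE, TARGET_FRIENDLY } TargetKind;

typedef enum { ORDER_MOVE, ORDER_ATTACK, ORDER_HARVEST, ORDER_FOLLOW } OrderKind;

typedef struct {
    int is_worker;   /* can this unit gather resources? */
} Unit;

/* Decide what a single right-click means for the selected unit,
   based on what lies under the cursor. */
OrderKind default_order(const Unit *u, TargetKind under_cursor)
{
    switch (under_cursor) {
        case TARGET_ENEMY:     return ORDER_ATTACK;
        case TARGET_GOLD_MINE: return u->is_worker ? ORDER_HARVEST : ORDER_MOVE;
        case TARGET_FRIENDLY:  return ORDER_FOLLOW;
        case TARGET_GROUND:
        default:               return ORDER_MOVE;
    }
}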

But perhaps the most vital improvement was in the fog of war. The simplistic implementations of same seen in the first Warcraft and Command & Conquer were inherited from Dune II: areas of the map that had been seen once by any of your units were revealed permanently, even if said units went away or were destroyed. Blizzard now made it so that you would see only a back-dated snapshot of areas currently out of your units’ line of sight, reflecting what was there the last time one of your units had eyes on them. This innovation, no mean feat of programming on the part of Patrick Wyatt, brought a whole new strategic layer to the game. Reconnaissance suddenly became something you had to think about all the time, not just once.
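A minimal sketch in C of how such a “last seen” fog of war might work follows; the FogMap structure and function names are my own invention, meant only to illustrate the idea of keeping a per-tile snapshot alongside the live map, not to reproduce Wyatt’s implementation.

#include <stdbool.h>

#define MAP_W 128
#define MAP_H 128

typedef struct {
    unsigned char terrain;   /* what is really there right now */
    unsigned char occupant;  /* 0 = empty, otherwise a unit/building id */
} Tile;

typedef struct {
    Tile actual[MAP_H][MAP_W];     /* ground truth, updated every game tick */
    Tile snapshot[MAP_H][MAP_W];   /* what the player last saw on each tile */
    bool explored[MAP_H][MAP_W];   /* has any friendly unit ever seen it? */
    bool visible[MAP_H][MAP_W];    /* is it in a friendly unit's sight right now? */
} FogMap;

/* Called once per tick, after visible[][] has been recomputed from unit sight radii. */
void update_fog(FogMap *m)
{
    for (int y = 0; y < MAP_H; y++)
        for (int x = 0; x < MAP_W; x++)
            if (m->visible[y][x]) {
                m->explored[y][x] = true;
                m->snapshot[y][x] = m->actual[y][x];  /* refresh the player's memory */
            }
}

/* What the renderer should draw for a given tile. */
const Tile *tile_for_display(const FogMap *m, int x, int y)
{
    if (m->visible[y][x])  return &m->actual[y][x];    /* live view */
    if (m->explored[y][x]) return &m->snapshot[y][x];  /* stale "last seen" view */
    return 0;                                          /* still black: never explored */
}

The point is simply that “explored” and “currently visible” become two different questions, which is exactly the distinction the first-generation implementations lacked.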

Other improvements were not so conceptually groundbreaking, but no less essential for keeping ahead of the Joneses (or rather the Westwoods). For example, Blizzard raised the screen-resolution stakes, from 320 × 200 to 640 × 480, even as they raised the number of people who could play together online from Command & Conquer‘s four to eight. And, while there was still a limit on the number of units you could select at one time using Blizzard’s engine, that limit at least got raised from the first Warcraft‘s four to nine.

The story and its presentation, however, didn’t get much more elaborate than last time out. While Westwood was hedging its bets by keeping one foot in the “interactive movie” space of games like Wing Commander III, Blizzard was happy to “just” make Warcraft a game. The two series were coming to evince very distinct personalities and philosophies, just as gamers were sorting themselves into opposing groups of fans — with a large overlap of less partisan souls in between them, of course.

Released in December of 1995, Warcraft II managed to shake Computer Gaming World free of some of its last reservations about the burgeoning genre of real-time strategy, garnering four and a half stars out of five: “If you enjoy fantasy gaming, then this is a sure bet for you.” It joined Command & Conquer near the top of the bestseller lists, becoming the game that well and truly made Blizzard a name to be reckoned with, a peer in every sense with Westwood.

Meanwhile, and despite the sometimes bitter rivalry between the two studios and their fans, Command & Conquer and Warcraft II together made real-time strategy into a commercial juggernaut. Both games became sensations, with no need to shrink from comparison even to DOOM in terms of their sales and impact on the culture of gaming. Each eventually sold more than 3 million copies, numbers that even the established Westwood, much less the upstart Blizzard, had never dreamed of reaching before, enough to enshrine both games among the dozen or so most popular computer games of the entire 1990s. More than three years after real-time strategy’s first trial run in Dune II, the genre had arrived for good and all. Both Westwood and Blizzard rushed to get expansion packs of additional scenarios for their latest entries in the genre to market, even as dozens of other developers dropped whatever else they were doing in order to make real-time-strategy games of their own. Within a couple of years, store shelves would be positively buckling under the weight of their creations — some good, some bad, some more imaginative, some less so, but all rendered just a bit anonymous by the sheer scale of the deluge. And yet even the most also-ran of the also-rans sold surprisingly well, which explained why they just kept right on coming. Not until well into the new millennium would the tide begin to slacken.


With Command & Conquer and Warcraft II, Westwood and Blizzard had arrived at an implementation of real-time strategy that even the modern player can probably get on with. Yet there is one more game that I just have to mention here, because it’s so loaded with a quality the genre is known for even less than it is for its characters: humor. Command & Conquer: Red Alert is as hilarious as it is unexpected, the only game of this style that’s ever made me laugh out loud.

Red Alert was first envisioned as a scenario pack that would move the action of its parent game to World War II. But two things happened as work progressed on it: Westwood decided it was different enough from the first game that it really ought to stand alone, and, as designer Adam Isgreen says, “we found straight-up history really boring for a game.” What they gave us instead of straight-up history is bat-guano insane, even by the standards of videogame fictions.

We’re in World War II, but in a parallel timeline, because Albert Einstein — why him? I have no idea! — chose to travel back in time on the day of the Trinity test of the atomic bomb and kill Adolf Hitler. Unfortunately, all that’s accomplished is to make world conquest easier for Joseph Stalin. Now Einstein is trying to save the democratic world order by building ever more powerful gadgets for its military. Meanwhile the Soviet Union is experimenting with the more fantastical ideas of Nikola Tesla, which in this timeline actually work. So, the battles just keep getting crazier and crazier as the game wears on, with teleporters sending units jumping instantly from one end of the map to the other, Tesla coils zapping them with lightning, and a fetching commando named Tanya taking out entire cities all by herself when she isn’t chewing the scenery in the cut scenes. Those actually display even better production values than the ones in the first game, but the script has become pure, unadulterated camp worthy of Mel Brooks, complete with a Stalin who ought to be up there singing and dancing alongside Der Führer in Springtime for Hitler. Even our old friend Kane shows up for a cameo. It’s one of the most excessive spectacles of stupidity I’ve ever seen in a game… and one of the funniest.

Joseph Stalin gets rough with an underling. When you don’t have the Darth Vader force grip, you have to do things the old-fashioned way…

Up there at the top is the killer commando Tanya, who struts across the battlefield with no regard for proportion.

Released in the dying days of 1996, Red Alert didn’t add that much that was new to the real-time-strategy template, technically speaking; in some areas such as fog of war, it still lagged behind the year-old Warcraft II. Nonetheless, it exudes so much joy that it’s by far my favorite of the games I’ve written about today. If you ask me, it would have been a better gaming world had the makers of at least a few of the po-faced real-time-strategy games that followed looked here for inspiration. Why not? Red Alert too sold in the multiple millions.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.



(Sources: the book Stay Awhile and Listen, Book I by David L. Craddock; Computer Gaming World of January 1995, March 1995, December 1995, March 1996, June 1996, September 1996, December 1996, March 1997, June 1997, and July 1997; Retro Gamer 48, 111, 128, and 148; The One of January 1993; the short film included with the Command & Conquer: The First Decade game collection. Online sources include Patrick Wyatt’s recollections at his blog Code of Honor, Dan Griliopoulos’s collection of interviews with Westwood alumni at Funambulism, Soren Johnson’s interview with Louis Castle for his Designer Notes podcast, and Richard Moss’s real-time-strategy retrospective for Ars Technica.

Warcraft: Orcs & Humans and Warcraft II: Tides of Darkness are available as digital purchases at GOG.com. The first Command & Conquer and Red Alert are available in remastered versions as a bundle from Steam.)

Footnotes
1 This statement was actually not correct; makers of standup arcade games of the classic era and the makers of Tetris had successfully cowed the cloning competition in the courts.
 


Life on the Grid

I’ve long been interested in the process by which new games turn into new gaming genres or sub-genres.

Most game designers know from the beginning that they will be working within the boundaries of an existing genre, whether due to their own predilections or to instructions handed down from above. A minority are brave and free enough to try something formally different from the norm, but few to none even of them, it seems safe to say, deliberately set out to create a new genre. Yet if the game they make turns into a success, it may be taken as the beginning of just that, even as — and this to me is the really fascinating part — design choices which were actually technological compromises with the Platonic ideal in the designer’s mind are taken as essential, positive parts of the final product.

A classic example of this process is a genre that’s near and dear to my heart: the text adventure. Neither of the creators of the original text adventure — they being Will Crowther and Don Woods — strikes me as a particularly literary sort. I suspect that, if they’d had the technology available to them to do it, they’d have happily made their game into a photorealistic 3D-rendered world to be explored using virtual-reality glasses. As it happened, though, all they had was a text-only screen and a keyboard connected to a time-shared DEC PDP-10. So, they made do, describing the environment in text and accepting input in the form of commands entered at the keyboard.

If we look at what happened over the ten to fifteen years following Adventure‘s arrival in 1977, we see a clear divide between practitioners of the form. Companies like Sierra saw the text-only format as exactly the technological compromise Crowther and Woods may also have seen, and ran away from it as quickly as possible. Others, however — most notably Infocom — embraced text, finding in it an expansive possibility space all its own, even running advertisements touting their lack of graphics as a virtue. The heirs to this legacy still maintain a small but vibrant ludic subculture to this day.

But it’s another, almost equally interesting example of this process that’s the real subject of our interest today: the case of the real-time grid-based dungeon crawler. After the release of Sir-Tech’s turn-based dungeon crawl Wizardry in 1981, it wasn’t hard to imagine what the ideal next step would be: a smooth-scrolling first-person 3D environment running in real time. Yet that was a tall order indeed for the hardware of the time — even for the next generation of 16-bit hardware that began to arrive in the mid-1980s, as exemplified by the Atari ST and the Commodore Amiga. So, when a tiny developer known as FTL decided the time had come to advance the state of the art over Wizardry, they compromised by going to real time but holding onto a discrete grid of locations inside the dungeon of Dungeon Master.

Gamers of today have come to refer to dungeon crawls on a grid as “blobbers,” which is as good a term as any. (The term arises from the way that these games typically “blob” together a party of four or six characters, moving them in lockstep and giving the player a single first-person — first-people? — view of the world.) The Dungeon Master lineage, then, are “real-time blobbers.”

By whatever name, this intermediate step between Wizardry and the free-scrolling ideal came equipped with its own unique set of gameplay affordances. Retaining the grid allowed you to do things that you simply couldn’t otherwise. For one thing, it allowed a game to combine the exciting immediacy of real time with what remains for some of us one of the foremost pleasures of the earlier, Wizardry style of dungeon crawl: the weirdly satisfying process of making your own maps — of slowly filling in the blank spaces on your graph paper, bringing order and understanding to what used to be the chaotic unknown.

This advertisement for the popular turn-based dungeon crawl Might and Magic makes abundantly clear how essential map-making was to the experience of these games. “Even more cartography than the bestselling fantasy game!” What a sales pitch…

But even if you weren’t among the apparent minority who enjoyed that sort of thing, the grid had its advantages, the most significant of which is implied by the very name of “blobber.” It was easy and natural in these games to control a whole party of characters moving in lockstep from square to square, thus retaining another of the foremost pleasures of turn-based games like Wizardry: that of building up not just a single character but a balanced team of them. In a free-scrolling, free-moving game, with its much more precise sense of embodied positioning, such a conceit would have been impossible to maintain. And much of the emergent interactivity of Dungeon Master‘s environment would also have been impossible without the grid. Many of us still recall the eureka moment when we realized that we could kill monsters by luring them into a gate square and pushing a button to bash them on the heads with the thing as it tried to descend, over and over again. Without the neat order of the grid, where a gate occupying a square fills all of that square as it descends, there could have been no eureka.

So, within a couple of years of Dungeon Master‘s release in 1987, the real-time blobber was establishing itself in a positive way, as its own sub-genre with its own personality, rather than the unsatisfactory compromise it may first have seemed. Today, I’d like to do a quick survey of this popular if fairly short-lived style of game. We can’t hope to cover all of the real-time blobbers, but we can hit the most interesting highlights.


Bloodwych running in its unique two-player mode.

Most of the games that followed Dungeon Master rely on one or two gimmicks to separate themselves from their illustrious ancestor, while keeping almost everything else the same. Certainly this rule applied to the first big title of the post-Dungeon Master blobber generation, 1989’s Bloodwych. It copies from FTL’s game not only the real-time approach but also its innovative rune-based magic system, and even the conceit of the player selecting her party from a diverse group of heroes who have been frozen in amber. By way of completing the facsimile, Bloodwych eventually got a much more difficult expansion disk, similar to Dungeon Master‘s famously difficult Chaos Strikes Back.

The unique gimmick here is the possibility for two players to play together on the same machine, either cooperatively or competitively, as they choose. A second innovation of sorts is the fact that, in addition to the usual Amiga and Atari ST versions, Bloodwych was also made for the Commodore 64, Amstrad CPC, and Sinclair Spectrum, much more limited 8-bit computers which still owned a substantial chunk of the European market in 1989.

Bloodwych was the work of a two-man team, one handling the programming, the other the graphics. The programmer, one Anthony Taglione, tells an origin story that’s exactly what you’d expect it to be:

Dungeon Master appeared on the ST and what a product it was! Three weeks later we’d played it to death, even taking just a party of short people. My own record is twelve hours with just two characters. I was talking with Mirrorsoft at the time and suggested that I could do a DM conversion for them on the C64. They ummed and arred a lot and Pete [the artist] carried on drawing screens until they finally said, “Yes!” and I said, “No! We’ve got a better design and it’ll be two-player-simultaneous.” They said, “Okay, but we want ST and Amiga as well.”

The two-player mode really is remarkable, especially considering that it works even on the lowly 8-bit systems. The screen is split horizontally, and both parties can roam about the dungeon freely in real time, even fighting one another if the players in control wish it. “An option allowing two players to connect via modem could only have boosted the game’s popularity,” noted Wizardry‘s designer Andrew Greenberg in 1992, in a review of the belated Stateside MS-DOS release. But playing Bloodwych in person with a friend had to be, if anything, even more fun.

Unfortunately, the game has little beyond its two-player mode and wider platform availability to recommend it over Dungeon Master. Ironically, many of its problems are down to the need to accommodate the two-player mode. In single-player mode, the display fills barely half of the available screen real estate, meaning that everything is smaller and harder to manipulate than in Dungeon Master. The dungeon design as well, while not being as punishing as some later entries in this field, is nowhere near as clever or creative as that of Dungeon Master, lacking the older game’s gradual, elegant progression in difficulty and complexity. As would soon become all too typical of the sub-genre, Bloodwych offered more levels — some forty of them in all, in contrast to Dungeon Master‘s twelve — in lieu of better ones.

So, played today, Bloodwych doesn’t really have a lot to offer. It was doubtless a more attractive proposition in its own time, when games were expensive and length was taken by many cash-strapped teenage gamers as a virtue unto itself. And of course the multiplayer mode was its wild card; it almost couldn’t help but be fun, at least in the short term. By capitalizing on that unique attribute and the fact that it was the first game out there able to satiate eager fans of Dungeon Master looking for more, Bloodwych did quite well for its publisher.


Captive has the familiar “paper doll” interface of Dungeon Master, but you’re controlling robots here. The five screens along the top will eventually be used for various kinds of telemetry and surveillance as you acquire new capabilities.

The sub-genre’s biggest hit of 1990 — albeit once again only in Europe — evinced more creativity in many respects than Bloodwych, even if its primary claim to fame once again came down to sheer length. Moving the action from a fantasy world into outer space, Captive is a mashup of Dungeon Master and Infocom’s Suspended, if you can imagine such a thing. As a prisoner accused of a crime he didn’t commit, you must free yourself from your cell using four robots which you control remotely. Unsurprisingly, the high-tech complexes they’ll need to explore bear many similarities to a fantasy dungeon.

The programmer, artist, and designer behind Captive was a lone-wolf Briton named Tony Crowther, who had cranked out almost thirty simple games for 8-bit computers before starting on this one, his first for the Amiga and Atari ST. Crowther created the entire game all by himself in about fourteen months, an impressive achievement by any standard.

More so even than for its setting and premise, Captive stands out for its reliance on procedurally-generated “dungeons.” In other words, it doesn’t even try to compete with Dungeon Master‘s masterful level design, but rather goes a different way completely. Each level is generated by the computer on the fly from a single seed number in about three seconds, meaning there’s no need to store any of the levels on disk. After completing the game the first time, the player is given the option of doing it all over again with a new and presumably more difficult set of complexes to explore. This can continue virtually indefinitely; the level generator can produce 65,535 unique levels in all. That should be enough, announced a proud Crowther, to keep someone playing his game for fifty years: “I wanted to create a role-playing game you wouldn’t get bored of — a game that never ends, so you can feasibly play it for years and years.”
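To illustrate the basic trick of seed-driven generation — though certainly not Crowther’s actual algorithm — here is a toy sketch in C in which a single 16-bit seed deterministically carves out a level, so that nothing ever needs to be stored on disk. The generator and the random-walk layout are invented purely for illustration.

#include <stdint.h>

#define LEVEL_W 32
#define LEVEL_H 32

/* A tiny self-contained pseudo-random generator (xorshift), so the same
   seed always yields the same level on every machine. */
static uint32_t rng_state;

static uint32_t rng_next(void)
{
    rng_state ^= rng_state << 13;
    rng_state ^= rng_state >> 17;
    rng_state ^= rng_state << 5;
    return rng_state;
}

/* 0 = wall, 1 = floor. Carves a crude random walk through solid rock. */
void generate_level(uint16_t seed, unsigned char map[LEVEL_H][LEVEL_W])
{
    rng_state = (uint32_t)seed + 1;  /* avoid the all-zero state */

    for (int y = 0; y < LEVEL_H; y++)
        for (int x = 0; x < LEVEL_W; x++)
            map[y][x] = 0;

    int x = LEVEL_W / 2, y = LEVEL_H / 2;
    for (int step = 0; step < 2000; step++) {
        map[y][x] = 1;
        switch (rng_next() % 4) {          /* wander one square at a time */
            case 0: if (x > 1)           x--; break;
            case 1: if (x < LEVEL_W - 2) x++; break;
            case 2: if (y > 1)           y--; break;
            case 3: if (y < LEVEL_H - 2) y++; break;
        }
    }
}

A 16-bit seed gives on the order of the 65,535 distinct levels quoted above, and regenerating any one of them costs only a few moments of computation rather than any disk space.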

Procedural generation tended to be particularly appealing to European developers like Tony Crowther, who worked in smaller groups with tighter budgets than their American counterparts, and whose target platforms generally lacked the hard drives that had become commonplace on American MS-DOS machines by 1990. Yet it’s never been a technique which I find very appealing as anything but a preliminary template generator for a human designer. In Captive as in most games that rely entirely on procedural generation, the process yields an endless progression of soulless levels which all too obviously lack the human touch of those found in a game like Dungeon Master. In our modern era, when brilliant games abound and can often be had for a song, there’s little reason to favor a game with near-infinite amounts of mediocre content over a shorter but more concentrated experience. In Captive‘s day, of course, the situation was very different, making it just one more example of an old game that was, for one reason or another, far more appealing in its own day than it is in ours.


This is the screen you’ll see most in Knightmare.

Tony Crowther followed up Captive some eighteen months later with Knightmare, a game based on a children’s reality show of sorts which ran on Britain’s ITV network from 1987 until 1994. The source material is actually far more interesting than this boxed-computer-game derivative. In an early nod toward embodied virtual reality, a team of four children were immersed in a computer-generated dungeon and tasked with finding their way out. It’s an intriguing cultural artifact of Britain’s early fascination with computers and the games they played, well worth a gander on YouTube.

The computer game of Knightmare, however, is less intriguing. Using the Captive engine, but featuring hand-crafted rather than procedurally-generated content this time around, it actually hews far closer to the Dungeon Master template than its predecessor. Indeed, like so many of its peers, it slavishly copies almost every aspect of its inspiration without managing to be quite as good — much less better — at any of it. This lineage has always had a reputation for difficulty, but Knightmare pushes that to the ragged edge, in terms of both its ridiculously convoluted environmental puzzles and the overpowered monsters you constantly face. Even the laddish staff of Amiga Format magazine, hardly a bastion of thoughtful design analyses, acknowledged that it “teeters on unplayably tough.” And even the modern blogger known as the CRPG Addict, whose name ought to say it all about his skill with these types of games, “question[s] whether it’s possible to win it without hints.”

Solo productions like this one, created in a vacuum, with little to no play-testing except by a designer who’s intimately familiar with every aspect of his game’s systems, often wound up getting the difficulty balance markedly wrong. Yet Knightmare is an extreme case even by the standards of that breed. If Dungeon Master is an extended explication of the benefits of careful level design, complete with lots of iterative feedback from real players, this game is a cautionary tale about the opposite extreme. While it was apparently successful in its day, there’s no reason for anyone who isn’t a masochist to revisit it in ours.


Eye of the Beholder‘s dependence on Dungeon Master is, as the CRPG Addict puts it, “so stark that you wonder why there weren’t lawsuits involved.” What it does bring new to the table is a whole lot more story and lore. Multi-page story dumps like this one practically contain more text than the entirety of Dungeon Master.

None of the three games I’ve just described was available in North America prior to 1992. Dungeon Master, having been created by an American developer, was for sale there, but only for the Amiga, Atari ST, and Apple IIGS, computers whose installed base in the country had never been overly large and whose star there dwindled rapidly after 1989. Thus the style of gameplay that Dungeon Master had introduced was either completely unknown or, at best, only vaguely known by most American gamers — this even as real-time blobbers had become a veritable gaming craze in Europe. But there was no reason to believe that American gamers wouldn’t take to them with the same enthusiasm as their European counterparts if they were only given the chance. There was simply a shortage of supply — and this, as any good capitalist knows, spells Opportunity.

The studio which finally walked through this open door is one I recently profiled in some detail: Westwood Associates. With a long background in real-time games already behind them, they were well-positioned to bring the real-time dungeon crawl to the American masses. Even better, thanks to a long-established relationship with the publisher SSI, they got the opportunity to do so under the biggest license in CRPGs, that of Dungeons & Dragons itself. With its larger development team and American-sized budget for art and sound, everything about Eye of the Beholder screamed hit, and upon its release in March of 1991 — more than half a year before Knightmare, actually — it didn’t disappoint.

It really is an impressive outing in many ways, the first example of its sub-genre that I can honestly imagine someone preferring to Dungeon Master. Granted, Westwood’s game lacks Dungeon Master‘s elegance: the turn-based Dungeons & Dragons rules are rather awkwardly kludged into real time; the environments still aren’t as organically interactive (amazingly, none of the heirs to Dungeon Master would ever quite live up to its example in this area); the controls can be a bit clumsy; the level design is nowhere near as fiendishly creative. But on the other hand, the level design isn’t pointlessly hard either, and the game is, literally and figuratively, a more colorful experience. In addition to the better graphics and sound, there’s far more story, steeped in the lore of the popular Dungeons & Dragons Forgotten Realms campaign setting. Personally, I still prefer Dungeon Master‘s minimalist aesthetic, as I do its cleaner rules set and superior level design. But then, I have no personal investment in the Forgotten Realms (or, for that matter, in elaborate fantasy world-building in general). Your mileage may vary.

Whatever my or your opinion of it today, Eye of the Beholder hit American gamers like a revelation back in the day, and Europe too got to join the fun via a Westwood-developed Amiga port which shipped there within a few months of the MS-DOS original’s American debut. It topped sales charts in both places, becoming the first game of its type to actually outsell Dungeon Master. In fact, it became almost certainly the best-selling single example of a real-time blobber ever; between North America and Europe, total sales likely reached 250,000 copies or more, huge numbers at a time when 100,000 copies was the line that marked a major hit.

Following the success of Eye of the Beholder, the dam well and truly burst in the United States. Before the end of 1991, Westwood had cranked out an Eye of the Beholder II, which is larger and somewhat more difficult than its predecessor, but otherwise shares the same strengths and weaknesses. In 1993, their publisher SSI took over to make an Eye of the Beholder III in-house; it’s generally less well-thought-of than the first two games. Meanwhile Bloodwych and Captive got MS-DOS ports and arrived Stateside. Even FTL, whose attitude toward making new products can most generously be described as “relaxed,” finally managed to complete and release their long-rumored MS-DOS port of Dungeon Master — whereupon its dated graphics were, predictably if a little unfairly, compared unfavorably with the more spectacular audiovisuals of Eye of the Beholder in the American gaming press.


Black Crypt‘s auto-map.

Another, somewhat more obscure title from this peak of the real-time blobber’s popularity was early 1992’s Black Crypt, the very first game from the American studio Raven Software, who would go on to a long and productive life. (As of this writing, they’re still active, having spent the last eight years or so making new entries in the Call of Duty franchise.) Although created by an American developer and published by the American Electronic Arts, one has to assume that Black Crypt was aimed primarily at European players, as it was made available only for the Amiga. Even in Europe, however, it failed to garner much attention in an increasingly saturated market; it looked a little better than Dungeon Master but not as good as Eye of the Beholder, and otherwise failed to stand out from the pack in terms of level design, interface, or mechanics.

With, that is, one exception. For the first time, Black Crypt added an auto-map to the formula. Unfortunately, it was needlessly painful to access, being available only through a mana-draining wizard’s spell. Soon, though, Westwood would take up and perfect Raven’s innovation, as the real-time blobber entered the final phase of its existence as a gaming staple.


Black Crypt may have been the first real-time blobber with an auto-map, but Lands of Lore perfected the concept. Like every other aspect of the game, the auto-map here looks pretty spectacular.

Released in late 1993, Westwood’s Lands of Lore: The Throne of Chaos was an attempt to drag the now long-established real-time-blobber format into the multimedia age, while also transforming it into a more streamlined and accessible experience. It comes very, very close to realizing its ambitions, but is let down a bit by some poor design choices as it wears on.

Having parted ways with SSI and the strictures of the Dungeons & Dragons license, Westwood at last got to enjoy the same freedom which had spawned the easy elegance of Dungeon Master; they were free to, as Westwood’s Louis Castle would later put it, create cleaner rules that “worked within the context of a digital environment,” making extensive use of higher-math functions that could never have been implemented in a tabletop game. These designers, however, took their newfound freedom in a very different direction from the hardcore logistical and tactical challenge that was FTL’s game. “We’re trying to make our games more accessible to everybody,” said Westwood’s Brett Sperry at the time, “and we feel that the game consoles offer a clue as to where we should go in terms of interface. You don’t really have to read a manual for a lot of games, the entertainment and enjoyment is immediate.”

Lands of Lore places you in control of just two or three characters at a time, who come in and out of your party as the fairly linear story line dictates. The magic system is similarly condensed down to just seven spells. In place of the tactical maneuvering and environmental exploitation that mark combat within the more interactive dungeons of Dungeon Master is a simple but satisfying rock-paper-scissors approach: monsters are more or less vulnerable to different sorts of attacks, requiring you to adjust your spells and equipment accordingly. And, most tellingly of all, an auto-map is always at your fingertips, even automatically annotating hidden switches and secret doors you might have overlooked in the first-person view.

Whether all of this results in a game that’s better than Dungeon Master is very much — if you’ll excuse the pun! — in the eye of the beholder. The auto-map alone changes the personality of the game almost enough to make it feel like the beginning of a different sub-genre entirely. Yet Lands of Lore has an undeniable charm all its own as a less taxing, more light-hearted sort of fantasy romp.

One thing at least is certain: at the time of its release, Lands of Lore was by far the most attractive blobber the world had yet seen. Abandoning the stilted medieval conceits of most CRPGs, its atmosphere is more fairy tale than Tolkien, full of bright cartoon-like tableaux rendered by veteran Hanna-Barbera and Disney animators. The music and voice acting in the CD-ROM version are superb, with none other than Patrick Stewart of Star Trek: The Next Generation fame acting as narrator.

Sadly, though, the charm does begin to evaporate somewhat as the game wears on. There’s an infamous one-level difficulty spike in the mid-game that’s all but guaranteed to run off the very newbies and casual players Westwood was trying to attract. Worse, the last 25 percent or so is clearly unfinished, a tedious slog through empty corridors with nothing of interest beyond hordes of overpowered monsters. When you get near the end and the game suddenly takes away the auto-map you’ve been relying on, you’re left wondering how the designers could have so completely lost all sense of the game they started out making. More so than any of the other games I’ve written about today, Lands of Lore: The Throne of Chaos, despite enjoying considerable commercial success which would lead to two sequels, feels like a missed opportunity.


Real-time blobbers would continue to appear for a couple more years after Lands of Lore. The last remotely notable examples are two 1995 releases: FTL’s ridiculously belated and rather unimaginative Dungeon Master II, which was widely and justifiably panned by reviewers; and Interplay’s years-in-the-making Stonekeep, which briefly dazzled some reviewers with such extraneous bells and whistles as an introductory cinematic that by at least one employee’s account cost ten times as much as the underwhelming game behind it. (If any other anecdote more cogently illustrates the sheer madness of the industry’s drunk-on-CD-ROM “interactive movie” period, I don’t know what it is.) Needless to say, neither game outdoes the original Dungeon Master where it counts.

At this point, then, we have to confront the place where the example I used in opening this article — that of interactive fiction and its urtext of Adventure — begins to break down when applied to the real-time blobber. Adventure, whatever its own merits, really was the launching pad for a whole universe of possibilities involving parsers and text. But the real-time blobber never did manage to transcend its own urtext, as is illustrated by the long shadow the latter has cast over this very article. None of the real-time blobbers that came after Dungeon Master was clearly better than it; arguably, none was ever quite as good. Why should this be?

Any answer to that question must, first of all, pay due homage to just how fully-realized Dungeon Master was as a game system, as well as to how tight its level designs were. It presented everyone who tried to follow it with one heck of a high bar to clear. Beyond that obvious fact, though, we must also consider the nature of the comparison with the text adventure, which at the end of the day is something of an apples-and-oranges proposition. The real-time blobber is a more strictly demarcated category than the text adventure; this is why we tend to talk about real-time blobbers as a sub-genre and text adventures as a genre. Perhaps there’s only so much you can do with wandering through grid-based dungeons, making maps, solving mechanical puzzles, and killing monsters. And perhaps Dungeon Master had already done it all about as well as it could be done, making everything that came after superfluous to all but the fanatics and the completists.

And why, you ask, had game developers largely stopped even trying to better Dungeon Master by the middle of the 1990s?[1] As it happens, there’s no mystery whatsoever about why the real-time blobber — or, for that matter, the blobber in general — disappeared from the marketplace. Even as the format was at its absolute peak of popularity in 1992, with Westwood’s Eye of the Beholder games selling like crazy and everything else rushing onto the bandwagon, an unassuming little outfit known as Blue Sky Productions gave notice to anyone who might have been paying attention that the blobber’s days were already numbered. This they did by taking a dungeon crawl off the grid. After that escalation in the gaming arms race, there was nothing for it but to finish whatever games in the old style were still in production and find a way to start making games in the new. Next time, then, we’ll turn our attention to the great leap forward that was Ultima Underworld.

(Sources: Computer Gaming World of April 1987, February 1991, June 1991, February 1992, March 1992, April 1992, November 1992, August 1993, November 1993, October 1994, October 1995, and February 1996; Amiga Format of December 1989, February 1992, March 1992, and May 1992; Questbusters of May 1991, March 1992, and December 1993; SynTax 22; The One of October 1990, August 1991, February 1992, October 1992, and February 1994. Online sources include Louis Castle’s interview for Soren Johnson’s Designer Notes podcast and Matt Barton’s interview with Peter Oliphant. Devotees of this sub-genre should also check out The CRPG Addict’s much more detailed takes on Bloodwych, Captive, Knightmare, Eye of the Beholder, Eye of the Beholder II, and Black Crypt.

The most playable of the games I’ve written about today, the Eye of the Beholder series and Lands of Lore: The Throne of Chaos, are available for purchase on GOG.com.)

Footnotes
1 If one takes the really long view, they didn’t, at least not forever. In 2012, as part of the general retro-revival that has resurrected any number of dead sub-genres over the past decade, a studio known as Almost Human released Legend of Grimrock, the first significant commercial game of this type to be seen in many years. It got positive reviews, and sold well enough to spawn a sequel in 2014. I’m afraid I haven’t played either of them, and so can’t speak to the question of whether either or both of them finally managed the elusive trick of outdoing Dungeon Master.
 
 


Controlling the Spice, Part 3: Westwood’s Dune

Brett Sperry and Louis Castle

Louis Castle first became friends with Brett Sperry in 1982, when the two were barely out of high school. Castle was selling Apple computers at the time at a little store in his native Las Vegas, and Sperry asked him to print out a file for him. “I owned a printer, so I invited him over,” remembers Castle, “and he looked at some animation and programming I was working on.”

They found they had a lot in common. They were both Apple II fanatics, both talented programmers, and both go-getters accustomed to going above and beyond what was expected of them. Through Castle’s contacts at the store — the home-computer industry was quite a small place back then — they found work as contract programmers, porters who moved software from one platform to another. It wasn’t the most glamorous job in the industry, but, at a time when the PC marketplace was fragmented into close to a dozen incompatible platforms, it was certainly a vital one. Sperry and Castle eventually came to specialize in the non-trivial feat of moving slick action games such as Dragonfire and Impossible Mission from the Commodore 64 to the far less audiovisually capable Apple II without sacrificing all of their original appeal.

In March of 1985, they decided to give up working as independent contractors and form a real company, which they named Westwood Associates. The “Westwood” came from the trendy neighborhood of Los Angeles, around the UCLA campus, where they liked to hang out when they drove down from Las Vegas of a weekend. “We chose Westwood as the company name,” says Castle, “to capture some of the feeling of youthful energy and Hollywood business.” The “Associates,” meanwhile, was nicely non-specific, meaning they could easily pivot into other kinds of software development if the games work should dry up for some reason. (The company would become known as Westwood Studios in 1992, by which time it would be pretty clear that no such pivot would be necessary.)

The story of Westwood’s very first project is something of a harbinger of their future. Epyx hired them to port the hoary old classic Temple of Apshai to the sexy new Apple Macintosh, and Sperry and Castle got a bit carried away. They converted the game from a cerebral turn-based CRPG to a frenetic real-time action-adventure, only to be greeted with howls of protest from their employers. “Epyx felt,” remembers Castle with no small sense of irony, “that gamers would not want to make complicated tactical and strategic decisions under pressure.” More sensibly, Epyx noted that Westwood had delivered not so much a port as a different game entirely, one they couldn’t possibly sell as representing the same experience as the original. So, they had to begrudgingly switch it back to turn-based.

This blind alley really does have much to tell us about Westwood’s personality. Asked many years later what common thread binds together their dizzily eclectic catalog of games, Louis Castle hit upon real-time gameplay as the one reasonable answer. This love of immediacy would translate, as we’ll soon see, into the invention of a whole new genre known as real-time strategy, which would become one of the most popular of them all by the end of the 1990s.

But first, there were more games to be ported. Having cut their teeth making Commodore 64 games work within the constraints of the Apple II, they now found themselves moving them in the other direction: “up-porting” Commodore 64 hits like Super Cycle and California Games to the Atari ST and Commodore Amiga. Up-porting was in its way as difficult as down-porting; owners of those more expensive 16-bit machines expected their capabilities to be used to good effect, even by games that had originated on more humble platforms, and complained loudly at straight, vanilla ports that still looked like they were running on an 8-bit computer. Westwood became one of the best in the industry at a very tricky task, not so much porting their source games in any conventional sense as remaking them, with dramatically enhanced graphics and sound. They acquired a reputation for technical excellence, particularly when it came to their compression systems, which allowed them to pack their impressive audiovisuals into very little space and stream them in quickly from disk. And they made good use of the fact that the Atari ST and Amiga were both built around the same Motorola 68000 CPU by developing a library for the Amiga which translated calls to the ST’s operating system into their Amiga equivalents on the fly; thus they could program a game for the ST and get the same code running on the Amiga with very few changes. If you wanted an 8-to-16-bit port done efficiently and well, you knew you could count on Westwood.
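Westwood’s real library reportedly translated ST system calls into their Amiga equivalents on the fly; purely to illustrate the general idea of hiding two operating systems behind one small interface, here is a conceptual sketch in C. The PlatformOS structure and load_resource function are my own hypothetical illustration, not Westwood’s code or either machine’s actual API.

/* Each machine supplies its own table of function pointers mapping these
   generic calls onto its native operating system. */
typedef struct {
    void *(*open_file)(const char *name);
    long  (*read_file)(void *handle, void *buffer, long length);
    void  (*close_file)(void *handle);
} PlatformOS;

/* Game code like this never needs to change between the ST and Amiga builds;
   only the PlatformOS table behind it does. */
long load_resource(const PlatformOS *os, const char *name, void *buffer, long max_length)
{
    void *fh = os->open_file(name);
    if (fh == 0) return -1;
    long got = os->read_file(fh, buffer, max_length);
    os->close_file(fh);
    return got;
}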

Although they worked with quite a number of publishers, Westwood cultivated a particularly close relationship with SSI, a publisher of hardcore wargames who badly needed whatever pizazz Sperry and Castle’s flashier aesthetic could provide. When SSI wanted to convince TSR to give them the hugely coveted Dungeons & Dragons license in 1987, they hired Westwood to create some of the graphics demos for their presentation. The pitch worked; staid little SSI shocked the industry by snatching the license right out from under the noses of heavier hitters like Electronic Arts. Westwood remained SSI’s most trusted partner thereafter. They ported the  “Gold Box” line of Dungeons & Dragons CRPGs to the Atari ST and Amiga with their usual flair, adding mouse support and improving the graphics, resulting in what many fans consider to be the best versions of all.

Unfortunately, Westwood’s technical excellence wasn’t always paired with equally good design sense when they occasionally got a chance to make an original game of their own. Early efforts like Mars Saga, Mines of Titan, Questron II, and BattleTech: The Crescent Hawk’s Inception all have a lot of ideas that aren’t fully worked through and never quite gel, along with third acts that fairly reek of, “We’re out of time and money, and now we just have to get ‘er done.” Ditto the first two original games they did for SSI under the Dungeons & Dragons license: the odd California Games/Gold Box mashup Hillsfar and the even odder dragon flight simulator Dragon Strike.

Still, Brett Sperry and Louis Castle were two very ambitious young men, and neither was willing to settle for the anonymous life of a strict porting house. Nor did such a life make good business sense: with the North American market at least slowly coalescing around MS-DOS machines, it looked like porting houses might soon have no reason to exist. The big chance came when Sperry and Castle convinced SSI to let them make a full-fledged Dungeons & Dragons CRPG of their own — albeit one that would be very different from the slow-paced, turn-based Gold Box line. Westwood’s take on the concept would run in — you guessed it — real time, borrowing much from FTL’s Dungeon Master, one of the biggest sensations of the late 1980s on the Atari ST and Amiga. The result was Eye of the Beholder.

At the time of the game’s release in February of 1991, FTL had yet to publish an MS-DOS port of Dungeon Master. Eye of the Beholder was thus the first real-time dungeon crawl worth its salt to become available on North America’s computer-gaming platform of choice, and this fact, combined with the Dungeons & Dragons logo on the box, yielded sales of 130,000 copies in the United States alone — a sales figure far greater than that of any previous original Westwood game, greater even than all but the first two of SSI’s flagship Gold Box line. The era of Westwood as primarily a porting house had passed.


Over at Virgin Games, the indefatigable Martin Alper, still looking to make a splash in the American market, liked what he saw in Westwood, this hot American developer who clearly knew how to make the sorts of games Americans wanted to buy. And yet they were also long-established experts at getting the most out of the Amiga, Europe’s biggest gaming computer; Westwood would do their own port of Eye of the Beholder to the Amiga, in which form it would sell in considerable numbers in Europe as well. Such a skill set made the little Las Vegas studio immensely attractive to this executive of Virgin, a company of truly global reach and vision.

Alper knew as soon as he saw Eye of the Beholder that he wanted to make Westwood a permanent part of the Virgin empire, but, not wanting to spook his target, he approached them initially only to ask them to develop a game for him. As far as Alper or anyone else outside Virgin’s French subsidiary knew at this point, the Cryo Dune game was dead. But Alper hadn’t gone to all the trouble of securing the license not to use it. In April of 1991 — just one month before the departure of Jean-Martial Lefranc from Virgin Loisirs, combined with a routine audit, would bring the French Dune conspiracy to light — Alper signed Westwood to make a Dune game of their own. It wasn’t hard to convince them to take it on; it turned out that Dune was Brett Sperry’s favorite novel of all time.

Even better, Westwood, perhaps influenced by their association with the turn-based wargame mavens at SSI, had already been playing around with ideas for a real-time (of course!) game of military conflict. “It was an intellectual puzzle for me,” says Sperry. “How can we take this really small wargame category, bring in some fresh ideas, and make it a fun game that more gamers can play?” The theme was originally to be fantasy. But, says Louis Castle, “when Virgin offered up the Dune license, that sealed our fate and pulled us away from a fantasy theme.”

Several months later, after Martin Alper reluctantly concluded that Cryo’s Dune had already cost too much money and had too much potential of its own to cancel, he found himself with quite a situation on his hands. Westwood’s Dune hadn’t been in development anywhere near as long as Cryo’s, but he was already loving what he had seen of it, and was equally unwilling to cancel that project. In an industry where the average game frankly wasn’t very good at all, having two potentially great ones might not seem like much of a problem. For Virgin’s marketers, however, it was a nightmare. Their solution, which pleased neither Cryo nor Westwood much at all, was to bill the latter’s game as a sequel to the former’s, naming it Dune II: The Building of a Dynasty.

Westwood especially had good reason to feel disgruntled. They were understandably concerned that saddling their fresh, innovative new game with the label of sequel would cause it to be overlooked. The fact was, the sequel billing made no sense whatsoever, no matter how you looked at it. While both games were, in whole or in part, strategy games that ran in real time, their personalities were otherwise about as different as it was possible for two games to be. By no means could one imagine a fan of Cryo’s plot-heavy, literary take on Dune automatically embracing Westwood’s action-heavy, militaristic effort. Nor did the one game follow on from the other in the sense of plot chronology; both games depict the very same events from the novel, albeit with radically different sensibilities.

The press too was shocked to learn that a sequel to Cryo’s Dune was due to be released the very same year as its predecessor. “This has got to be a new world record for the fastest ever followup,” wrote the British gaming magazine The One a few weeks after the first Dune‘s release. “Unlike the more adventure-based original, Dune II is expected to be more of a managerial experience comparable to (if anything) the likes of SimCity, as the two warring houses of Atreides and Harkonnen attempt to mine as much spice as possible and blow each other up at the same time.”

The Westwood Studios team who made Dune II. On the front row are Ren Olsen and Dwight Okahara; on the middle row are Judith Peterson, Joe Bostic, Donna Bundy, and Aaron Powell; on the back row are Lisa Ballan and Scott Bowen. Of this group, Bostic and Powell were the game’s official designers, and thus probably deserve the most credit for inventing the genre of real-time strategy. Westwood’s co-founder Brett Sperry also played a critical — perhaps the critical — conceptual role.

It was, on the whole, about as good a description of Dune II as any that appeared in print at the time. Not only was the new game dramatically different from its predecessor, but it wasn’t quite like anything at all which anyone had ever seen before, and coming to grips with it wasn’t easy. Legend has it that Brett Sperry started describing Dune II in shorthand as “real-time strategy” very early on, thus providing a new genre with its name. If so, though, Virgin’s marketers didn’t get the memo. They would struggle mightily to describe the game, and what they ended up with took unwieldiness to new heights: a “strategy-based resource-management simulation with a heavy real-time combat element.” Whew! “Real-time strategy” does have a better ring to it, doesn’t it?

These issues of early taxonomy, if you will, are made intensely interesting by Dune II‘s acknowledged status as the real-time-strategy urtext. That is to say that gaming histories generally claim, correctly on the whole in my opinion, that it was the first real-time strategy game ever.

Yet we do need to be careful with our semantics here. There were actually hundreds of computerized strategy games prior to Dune II which happened to be played in real time, not least among them Cryo’s Dune. The neologism of “real-time strategy” (“RTS”) — like, say, those of “interactive fiction” or even “CRPG” — has a specific meaning separate from the meanings of the individual words which comprise it. It has come to denote a very specific type of game — a game that, yes, runs in real time, but also one where players start with a largely blank slate, gather resources, and use them to build a variety of structures. These structures can in turn build military units who can carry out simple orders of the “attack there” or “defend this” stripe autonomously. The whole game plays on an accelerated time scale which yields bursts if not sustained plateaus of activity as frantic as any action game. This combination of qualities is what Westwood invented, not the abstract notion of a strategy game played in real time rather than turns.

Of course, all inventions stand on the shoulders of those that came before, and RTS is no exception. It can be challenging to trace the bits and pieces which would gel together to become Dune II only because there are so darn many of them.

Utopia

The earliest strategy game to replace turns with real time may have been Utopia, an abstract two-player game of global conquest designed and programmed by Don Daglow for the Intellivision console in 1982. The same year, Dan Bunten’s[1] science-fiction-themed Cytron Masters and Chris Crawford’s Roman-themed Legionnaire became the first computer-based strategy games to discard the comfortable round of turns for something more stressful and exciting. Two years later, Brøderbund’s very successful The Ancient Art of War exposed the approach to more players than ever before.

In 1989, journalists started talking about a new category of “god game” in the wake of Will Wright’s SimCity and Peter Molyneux’s Populous. The name derived from the way that these games cast you as a god able to control your people only indirectly, by altering their city’s infrastructure in SimCity or manipulating the terrain around them in Populous. This control was accomplished in real time. While, as we’ve seen, this in itself was hardly a new development, the other innovations of these landmark games were as important to the eventual RTS genre as real time itself. No player can possibly micromanage an army of dozens of units in real time — at least not if the clock is set to run at anything more than a snail’s pace. For the RTS genre as we’ve come to know it to function, units must have a degree of autonomous artificial intelligence, must be able to carry out fairly abstract orders and react to events on the ground in the course of doing so. SimCity and Populous demonstrated for the first time how this could work.

By 1990, then, god games had arrived at a place that already bore many similarities to the RTS games of today. The main things still lacking were resource collecting and building. And even these things had to some extent already been done in non-god games: a 1987 British obscurity called Nether Earth demanded that you build robots in your factory before sending them out against your enemy, although there was no way of building new structures beyond your starting factory. Indeed, even the multiplayer death matches that would come to dominate so much of the RTS genre a generation later had already been pioneered before 1990, perhaps most notably in Dan Bunten’s 1988 game Modem Wars.

Herzog Zwei

But the game most often cited as an example of a true RTS in form and spirit prior to Dune II, if such a thing is claimed to exist at all, is one called Herzog Zwei, created by the Japanese developer Technosoft and first published for the Sega Genesis console in Japan in 1989. And yet Herzog Zwei‘s status as an alternative RTS urtext is, at the very least, debatable.

Players each start the game with a single main base, and an additional nine initially neutral “outposts” are scattered over the map. Players “purchase” units in the form of Transformers-like flying robots, which they then use to try to conquer outposts; controlling more outposts yields more revenue, meaning one can buy more units more quickly. Units aren’t completely out of the player’s direct control, as in the case of SimCity and Populous, but are ordered about in a rather general way: stand and fight here, patrol this radius, retreat to this position or outpost. The details are then left to the unit-level artificial intelligence. For this reason alone, perhaps, Herzog Zwei subjectively feels more like an RTS than any game before it. But on the other hand, much that would come to mark the genre is still missing: resource collection is still abstracted away entirely, while there’s only one type of unit available to build, and no structures. In my opinion, Herzog Zwei is best seen as another of the RTS genre’s building blocks rather than an urtext.

The question of whether and to what extent Herzog Zwei influenced Dune II is a difficult one to answer with complete assurance. Brett Sperry and Louis Castle have claimed not to even have been aware of the Japanese game’s existence prior to making theirs. In fact, out of all of the widely acknowledged proto-RTS games I’ve just mentioned, they cite only Populous as a major influence. Their other three stated inspirations make for a rather counter-intuitive trio on the face of it: the 1984 Apple II game Rescue Raiders, a sort of Choplifter mated to a strategic wargame; the 1989 NEC TurboGrafx-16 game Military Madness, an abstract turn-based strategy game; and, later in the development process, Sid Meier’s 1991 masterpiece Civilization (in particular, the tech tree therein).

Muddying these waters, however, is an anecdote from Stephen Clarke-Willson, an executive in Virgin’s American offices during the early 1990s. He says that “everyone at the office was playing Herzog Zwei” circa April of 1991: “I was given the task of figuring out what to do with the Dune license since I’d read the book a number of times. I thought from a gaming point of view the real stress was the battle to control the spice, and that a resource-strategy game would be good.” Clarke-Willson further claims that from the outset “Westwood agreed to make a resource-strategy game based on Dune, and agreed to look at Herzog Zwei for design ideas.” Sperry and Castle, by contrast, describe a far more open-ended agreement that called for them simply to make something interesting out of the license, allowing the specifics of their eventual Dune to arise organically from the work they had already started on their fantasy-themed real-time wargame.

For what it’s worth, neither Sperry nor Castle has a reputation for dishonesty. Quite the opposite, in fact: Westwood throughout its life stood out as a bastion of responsibility and stability in an industry not much known for either. So, whatever the true facts may be, we’re better off ascribing these contradictory testimonies to the vagaries of memory than to disingenuousness. Certainly, regardless of the exact influences that went into it, Dune II has an excellent claim to the title of first RTS in the modern neologism’s sense. This really was the place where everything came together and a new genre was born.

In the novel of Dune, the spice is the key to everything. In the Westwood game, even in the absence of almost everything else that makes the novel memorable, the same thing is true. The spice was, notes Louis Castle, “very adaptable to this harvest, grow, build for war, attack gambit. That’s really how [Dune II] came about.” Thus was set up the gameplay loop that still defines the RTS genre to this day — all stemming from a novel published in 1965.

The overarching structure of Dune II is also far more typical of the games of today than those of its peers in the early 1990s. You play a “campaign” consisting of nine scenarios, linked by snippets of narrative, that grow progressively more difficult. There are three of these campaigns to choose from, depicting the war for Arrakis from the standpoint of House Atreides, House Harkonnen, and House Ordos — the last being a cartel of smugglers who don’t appear in the novel at all, having been invented for a non-canonical 1984 source book known as The Dune Encyclopedia. In addition to a different narrative, each faction has a slightly different slate of structures and units at its command.

There’s the suggestion of a more high-level strategic layer joining the scenarios together: between scenarios, the game lets you choose your next target for attack by clicking on a territory on a Risk-like map of the planet. Nothing you do here can change the fixed sequence of scenario goals and opposing enemy forces the game presents, but it does change the terrain on which the subsequent scenario takes place, thus adding a bit more replayability for the true completionists.

You begin a scenario with a single construction yard, a handful of pre-built units, and a sharply limited initial store of spice, that precious resource from which everything else stems. Fog of war is implemented; in the beginning, you can see only the territory that immediately surrounds your starting encampment. You’ll thus want to send out scouts immediately, to find deposits of spice ripe for harvesting and to learn where the enemy is.

While your scouts go about their business, you’ll want to get an economy of sorts rolling back at home. The construction yard with which you begin can build any structure available in a given scenario, although it’s advisable to first build a “concrete slab” to serve as its foundation atop the shifting sands of Arrakis. The first real structure you’re likely to build is a “wind trap” to provide power to those that follow. Then you’ll want a “spice refinery,” which comes complete with a unit known as a “harvester,” able to collect spice from the surrounding territory and return it to the refinery to become the stuff of subsequent building efforts. Next you’ll probably want an “outpost,” which not only lets you see much farther into the territory around your base without having to deploy units there but is a prerequisite for building any new units at all. After your outpost is in place, building each type of unit requires its own kind of structure, from a “barracks” for light infantry (read: cannon fodder) to a “high tech factory” for the ultimate weapon of airpower. Naturally, more powerful units are more expensive, both in terms of the spice required to build the structures that produce them and that required to build the units themselves afterward.
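To give a sense of how that chain of dependencies hangs together, here is a purely illustrative sketch in Python. The structure names come from the game itself, but the spice costs and the exact prerequisite links are simplified placeholders rather than Dune II’s real numbers.

    # Hypothetical costs and prerequisites, chosen only to illustrate the shape
    # of the build chain; they are not the game's actual values.
    BUILDINGS = {
        "concrete slab":     {"cost": 20,  "requires": ["construction yard"]},
        "wind trap":         {"cost": 300, "requires": ["construction yard"]},
        "spice refinery":    {"cost": 400, "requires": ["wind trap"]},
        "outpost":           {"cost": 400, "requires": ["spice refinery"]},
        "barracks":          {"cost": 300, "requires": ["outpost"]},
        "high tech factory": {"cost": 500, "requires": ["outpost"]},
    }

    def can_build(structure, built, spice):
        """A structure becomes available once its prerequisites exist and
        enough spice has been banked to pay for it."""
        entry = BUILDINGS[structure]
        return spice >= entry["cost"] and all(req in built for req in entry["requires"])

    built = {"construction yard"}    # every scenario starts with one
    spice = 1000                     # the sharply limited opening reserve

    for structure in BUILDINGS:
        status = "available" if can_build(structure, built, spice) else "locked"
        print(f"{structure}: {status}")

Every structure you add pushes the frontier of that table outward, which is why the opening minutes of any scenario tend to follow the same ritual of slab, wind trap, refinery, outpost.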

Your real goal, of course, is to attack and overwhelm the enemy — or, in some later scenarios, enemies — before he or they have the chance to do the same to you. There’s a balancing act here that one could describe as the central dilemma of the game. Just how long do you concentrate on building up your infrastructure and military before you throw your units into battle? Wait too long and the enemy could get overwhelmingly powerful before you cut him down to size; attack too soon and you could be defeated and left exposed to counterattack, having squandered the units you now need for defense. The amount of spice on the map is another stress point. The spice deposits are finite; once they’re gone, they’re gone, and it’s up to whatever units are left to battle it out. Do you stake your claim to that juicy spice deposit just over the horizon right now? Or do you try to eliminate that nearby enemy base first?

If you’ve played any more recent RTS games at all, all of this will sound thoroughly familiar. And, more so than anything else I could write here, it’s this sense of familiarity, clinging as it does to almost every aspect of Dune II, which crystallizes the game’s influence and importance. The only substantial piece of the RTS puzzle that’s entirely missing here is the multiplayer death match; this game is single-player only, lacking the element that for many is the most appealing of all about the RTS genre. Otherwise, though, the difference between this and more modern RTS games is in the details rather than the fundamentals. This anointed first example of an RTS is a remarkably complete example of the breed. All the pieces are here, and all the pieces fit together as we’ve come to expect them to.

So much for hindsight. As for foresight…

Upon its release in the fall of 1992, Dune II was greeted, like its predecessor from Cryo, with positive reviews, but with none of the fanfare one might expect for a game destined to go down in history as such a revolutionary genre-spawner. Computer Gaming World called it merely “a gratifying experience,” while The One was at least a bit more effusive, with the reviewer pronouncing it “one of the most absorbing games I’ve come across.” Yet everyone regarded it as just another fun game at bottom; no one had an inkling that it would in time birth a veritable new gaming subculture. It sold well enough to justify its development, but — very probably thanks in part to its billing as a sequel to a game with a completely different personality, which had itself only been on the market a few months — it never threatened Eye of the Beholder for the crown of Westwood’s biggest hit to date.

Nor did it prompt an immediate flood of games in the same mold, whether from Westwood or anyone else. The next notable example of the budding genre, Blizzard’s Warcraft, wouldn’t appear until late 1994. That title would be roundly mocked by the gaming intelligentsia for its similarities to Dune II — Computer Gaming World would call it “a perfect bit of creative larceny” — but it would sell much, much better, well and truly setting the flame to the RTS torch. To many Warcraft fans, Westwood would seem like the bandwagon jumpers when they belatedly returned to the genre they had invented with 1995’s Command & Conquer.

By the time that happened, Westwood would be a very different place. Just as they were finishing up Dune II, Louis Castle got a call from Richard Branson himself. “Hello, Louis, this is Richard. I’d like to buy your company.”

“I didn’t know it was for sale,” replied Castle.

“In my experience, everything is for sale!”

And, indeed, notwithstanding their unhappiness about Dune II‘s sequel billing, Brett Sperry and Louis Castle sold out to Virgin, with the understanding that their new parent company would stay out of their hair and let them make the games they wanted to make, holding them accountable only on the basis of the sales they generated. Unlike so many merger-and-acquisition horror stories, Westwood would have a wonderful relationship with Virgin and Martin Alper, who provided the investment they needed to thrive in the emerging new era of CD-ROM-based, multimedia-heavy gaming. We’ll doubtless be meeting Sperry, Castle, and Alper again in future articles.


Looked upon from the perspective of today, the two Dune games of 1992 make for an endlessly intriguing pairing, almost like an experiment in psychology or sociology. Not only did two development teams set out to make a game based on the same subject matter, but they each wound up with a strategy game running in real time. And yet the two games could hardly be more different.

In terms of historical importance, there’s no contest between the two Dunes. While Cryo’s Dune had no discernible impact on the course of gaming writ large, Westwood’s is one of the most influential games of the 1990s. A direct line can be traced from it to games played by tens if not hundreds of millions of people all over the world today. “He who controls the spice, controls the universe,” ran the blurb on the front cover of millions of Dune paperbacks and movie posters. Replace “spice” with the resource of any given game’s choice, and the same could be stated as the guiding tenet of the gaming genre Dune birthed.

And yet I’m going to make the perhaps-surprising claim that the less-heralded first Dune is the more enjoyable of the two to play today. Its fusion of narrative and strategy still feels bracing and unique. I’ve never seen another game which plays quite like this one, and I’ve never seen another ludic adaptation that does a better job of capturing the essential themes and moods of its inspiration.

Dune II, by contrast, can hardly be judged under that criterion at all, given that it’s just not much interested in capturing any of the subtleties of Herbert’s novel; it’s content to stop at “he who controls the spice controls the universe.” Judged on its own terms, meanwhile, strictly as a game rather than an adaptation, it’s become the ironic victim of its own immense influence. I noted earlier that all of the pieces of the RTS genre, with the exception only of the multiplayer death match, came together here for the first time, that later games would be left to worry only about the details. Yet it should also be understood that those details are important. The ability to give orders to groups of units; the ability to give more complex orders to units; ways to get around the map more quickly and easily; higher-resolution screens able to show more of the map at one time; a bigger variety of unit types, with greater variance between opposing factions; more varied and interesting scenarios and terrains; user-selectable difficulty levels (Dune II often seems to be stuck on “Brutal”)… later games would do all of this, and so much more besides. Again, these things do matter. Playing Dune II today is like playing your favorite RTS game stripped down to its most basic foundation. For a historian or a student of game design, that’s kind of fascinating. For someone who just wants to play a fun game, it’s harder to justify.

Still, none of this should detract from the creativity and sheer technical chops that went into realizing Dune II in its own time. Most gaming genres require some iteration to work out the kinks and hone the experience. The RTS genre in particular has been so honed by such a plethora of titles, all working within such a sharply demarcated set of genre markers, that Dune II is bound to seem like a blunt instrument indeed when we revisit it today.

So, there you have it: two disparate Dune games, both inspired and worthy, but in dramatically different ways. Dune as evocative storytelling experience or Dune as straightforward interactive ultra-violence? Take your pick. The choice seems appropriate for a novel that’s been pulled back and forth along much the same axis ever since its first publication in 1965. Does it have a claim to the mantle of High Literature or is it “just” an example of a well-crafted genre novel? Take your pick. The same tension shows itself in the troubled history of Dune as movie, in the way it could attract both filmmakers who pursued — or at least believed themselves to be pursuing — a higher artistic calling, like Alejandro Jodorowsky, and purveyors of the massiest of mass-market entertainments, like Arthur P. Jacobs. Dune as art film or Dune as blockbuster? Take your pick — but please, choose one or the other. Dino and Raffaella De Laurentiis, the first people to get an actual Dune film made, tried to split the difference, making it through a mainstream Hollywood studio with a blockbuster-sized budget, but putting all those resources in the hands of a director of art films. As we’ve seen, the result of that collision of sensibilities was unsatisfying to patrons of multiplexes and art-house theaters alike.

In that light, perhaps it really was for the best that Virgin wound up accidentally releasing two Dune games. Cryo’s Dune locked down the artsier side of Dune‘s split media personality, while Westwood’s was just good fun, satisfying the timeless urge of gamers to blow stuff up in entertaining ways. Thanks to a colossal bureaucratic cock-up at Virgin, there is, one might say, a Dune game for every Dune reader. Which one really is “better” is an impossible question to answer in the end. I’ve stated my opinion, but I have no doubt that plenty of you readers could make an equally compelling case in the other direction. So, vive la différence! With all due apologies to Frank Herbert, variety is the real spice of life.

(Sources: Computer Gaming World of April 1993, August 1993, and January 1995; Game Developer of June 2001; The One of October 1992, January 1993, and July 1993; Retro Gamer 90; Westwood Studios’s customer newsletter dated Fall 1992. Online sources include Louis Castle’s interview for Soren Johnson’s Designer Notes podcast, “Retro Throwback: Dune 2” by Cole Machin on CGM, “Build, gather, brawl, repeat: The history of real-time strategy games” by Richard Moss on Ars Technica, “A New Dawn: Westwood Studios 15th Anniversary” by Geoff Keighley with Amer Ajami on GameSpot, and “The Origin of Realtime Strategy Games on the PC” by Stephen Clarke-Willson on his blog Random Blts.

Feel free to download Dune II from right here, packaged so as to make it as easy as possible to get running using your chosen platform’s version of DOSBox.)

Footnotes

1 Dan Bunten died in 1998 as the woman Danielle Bunten Berry. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.


Opening the Gold Box, Part 5: All That Glitters is Not Gold

SSI entered 1989 a transformed company. What had been a niche maker of war games for grognards had now become one of the computer-game industry’s major players thanks to the first fruits of the coveted TSR Dungeons & Dragons license. Pool of Radiance, the first full-fledged Dungeons & Dragons CRPG and the first in a so-called “Gold Box” line of same, was comfortably outselling the likes of Ultima V and The Bard’s Tale III, and was well on its way to becoming SSI’s best-selling game ever by a factor of four. To accommodate their growing employee rolls, SSI moved in 1989 from their old offices in Mountain View, California, which had gotten so crowded that some people were forced to work in the warehouse using piles of boxed games for desks, to much larger, fancier digs in nearby Sunnyvale. Otherwise it seemed that all they had to do was keep on keeping on, keep on riding Dungeons & Dragons for all it was worth — and, yes, maybe release a war game here and there as well, just for old times’ sake.

One thing that did become clearer than ever over the course of the year, however, was that not all Dungeons & Dragons products were created equal. Dungeon Masters Assistant Volume II: Characters & Treasures sold just 13,516 copies, leading to the quiet ending of the line of computerized aids for the tabletop game that had been one of the three major pillars of SSI’s original plans for Dungeons & Dragons. A deviation from that old master plan called War of the Lance, an attempt to apply SSI’s experience with war games to TSR’s Dragonlance campaign setting, did almost as poorly, selling 15,255 copies. Meanwhile the “Silver Box” line of action-oriented games, the second of the three pillars, continued to perform well: its second entry, Dragons of Flame, sold 55,711 copies. Despite that success, though, 1989 would also mark the end of the line for the Silver Box, due to a breakdown in relations with the British developers behind those games. Going into the 1990s, then, Dungeons & Dragons on the computer would be all about the Gold Box line of turn-based traditional CRPGs, the only one of SSI’s three pillars still standing.

Thankfully, what Pool of Radiance had demonstrated in 1988 the events of 1989 would only confirm. What players seemed to hunger for most of all in the context of Dungeons & Dragons on the computer was literally Dungeons & Dragons on the computer: big CRPGs that implemented as many of the gnarly details of the rules as possible. Even Hillsfar, a superfluous and rather pointless sort of training ground for characters created in Pool of Radiance, sold 78,418 copies when SSI released it in March as a stopgap to give the hardcore something to do while they waited for the real Pool sequel.

Every female warrior knows that cleavage is more important than protection, right?

They didn’t have too long to wait. The big sequel dropped in June in the form of Curse of the Azure Bonds, and it mostly maintained the high design standard set by Pool of Radiance. Contrarians could and did complain that the free-roaming wilderness map of its predecessor had been replaced by a simple menu of locations to visit, but for this player anyway Pool‘s overland map always felt more confusing than necessary. A more notable loss in my view is the lack of any equivalent in Curse to the satisfying experience of slowly reclaiming the village of Phlan block by block from the forces of evil in Pool, but that brilliant design stroke was perhaps always doomed to be a one-off. Ditto Pool‘s unique system of quests to fulfill, some of them having little or nothing to do with the main plot.

What players did get in Curse of the Azure Bonds was the chance to explore a much wider area around Phlan with the same characters they had used last time, fighting a selection of more powerful and interesting monsters appropriate to their party’s burgeoning skills. At the beginning of the game, the party wakes up with a set of tattoos on their bodies —  the “azure bonds” of the title — and no memory of how they got there. (I would venture to guess that many of us have experienced something similar at one time or another…) It turns out that the bonds can be used to force the characters to act against their own will. Thus the quest is on to get them removed; each of the bonds has a different source, corresponding to a different area you will need to visit and hack and slash your way through in order to have it removed. By the end of Curse, your old Pool characters — or the new ones you created just for this game, who start at level 5 — will likely be in the neighborhood of levels 10 to 12, just about the point in Dungeons & Dragons where leveling up begins to lose much of its interest.

TSR was once again heavily involved in the making of Curse of the Azure Bonds, if not quite to the same extent as Pool of Radiance. As they had for Pool, they provided for Curse an official tie-in novel and tabletop adventure module. I can’t claim to have understood all of the nuances of the plot, such as they are, when I played the game; a paragraph book is once again used, but much of what I was told to read consisted of people I couldn’t remember or had never known babbling on about stuff I couldn’t remember or had never known about. But then, I know nothing about the Forgotten Realms setting other than what I learned in Pool of Radiance and never read the novel, so I’m obviously not the ideal audience. (Believe me, readers, I’ve done some painful things for this blog, but reading a Dungeons & Dragons novel was just a bridge too far…) Still, my cluelessness never interfered with my pleasure in mapping out each area and bashing things with my steadily improving characters; the standard of design in Curse remains as high as the writing remains breathlessly, entertainingly overwrought. Curse of the Azure Bonds did almost as well as its predecessor for SSI, selling 179,795 copies and mostly garnering the good reviews it deserved.

It was only with the third game of the Pool of Radiance series, 1990’s Secret of the Silver Blades, that some of the luster began to rub off of the Gold Box in terms of design, if not quite yet in that ultimate metric of sales. The reasons that Secret is regarded as such a disappointment by so many players — it remains to this day perhaps the least liked of the entire Gold Box line — are worth dwelling on for a moment.

One of the third game’s problems is bound up inextricably with the Dungeons & Dragons rules themselves. Secret of the Silver Blades allows you to take your old party from Pool of Radiance and/or Curse of the Azure Bonds up to level 15, but by this stage gaining a level is vastly less interesting than it was back in the day. Mostly you just get a couple of hit points, some behind-the-scenes improvements in to-hit scores, and perhaps another spell slot or two somewhere. Suffice to say that there’s no equivalent to, say, that glorious moment when you first gain access to the Fireball spell in Pool of Radiance.

The tabletop rules suggest that characters who reach such high levels should cease to concern themselves with dungeon delving in favor of building castles and becoming generals or political leaders. Scorpia, Computer Gaming World’s adventure and CRPG columnist, was already echoing these sentiments in the context of the Pool of Radiance series at the conclusion of her article on Curse of the Azure Bonds: “Characters have reached (by game’s end) fairly high levels, where huge amounts of experience are necessary to advance. If character transfer is to remain a part of the series (which I certainly hope it does), then emphasis needs to be placed on role-playing, rather than a lot of fighting. The true heart of AD&D is not rolling the dice, but the relationship between the characters and their world.” But this sort of thing, of course, the Gold Box engine was utterly unequipped to handle. In light of this, SSI probably should have left well enough alone, making Curse the end of the line for the Pool characters, but players were strongly attached to the parties they’d built up and SSI for obvious reasons wanted to keep them happy. In fact, they would keep them happy to the tune of releasing not just one but two more games which allowed players to use their original Pool of Radiance parties. By the time these characters finally did reach the end of the line, SSI would have to set them against the gods themselves in order to provide any semblance of challenge.

But by no means can all of the problems with Secret of the Silver Blades be blamed on high-level characters. The game’s other issues provide an interesting example of the unanticipated effects which technical affordances can have on game design, as well as a snapshot of changing cultures within both SSI and TSR.

A Gold Box map is built on a grid of exactly 16 by 16 squares, some of which can be “special” squares. When the player’s party enters one of the latter, a script runs to make something unusual happen — from something as simple as some flavor text appearing on the screen to something as complicated as an encounter with a major non-player character. The amount of special content allowed on any given map is restricted, however, by a limitation, stemming from the tiny memories of 8-bit machines like the Commodore 64 and Apple II, on the total size of all of the scripts associated with any given map.

One of the neat 16 by 16 maps found in Pool of Radiance and Curse of the Azure Bonds.

The need for each map to be no larger than 16 by 16 squares couldn’t help but have a major effect on the designs that were implemented with the Gold Box engine. In Pool of Radiance, for example, the division of the city of Phlan into a set of neat sections, to be cleared out and reclaimed one by one, had its origins as much in these technical restrictions as it did in design methodology. In that case it had worked out fantastically well, but by the time development began on Secret of the Silver Blades all those predictably uniform square maps had begun to grate on Dave Shelley, that game’s lead designer. Shelley and his programmers thus came up with a clever way to escape the system of 16 by 16 dungeons.

One of the things a script could do was to silently teleport the player’s party to another square on the map. Shelley and company realized that by making clever use of this capability they could create dungeon levels that gave the illusion of sprawling out wildly and asymmetrically, like real underground caverns would. Players who came into Secret of the Silver Blades expecting the same old 16 by 16 grids would be surprised and challenged; they could only assume that the Gold Box engine had gotten a major upgrade. From the point of view of SSI, this was the best kind of technology refresh: one that cost them nothing at all. Shelley sketched out a couple of enormous underground complexes for the player to explore, each almost an order of magnitude larger than anything that had been seen in a Gold Box game before.

A far less neat map from Secret of the Silver Blades. It may be more realistic in its way, but which would you rather try to draw on graph paper? It may help you to understand the scale to know that the large empty squares at the bottom and right side of this map each represent a conventional 16 by 16 area like the one shown above.

But as soon as the team began to implement the scheme, the unintended consequences began to ripple outward. Because the huge maps were now represented internally as a labyrinth of teleports, the hugely useful auto-map had to be disabled for these sections. And never had the auto-map been needed more, for the player who dutifully mapped the dungeons on graph paper could no longer count on them being a certain size; they were constantly spilling off the page, forcing her to either start over or go to work on a fresh page stuck onto the old with a piece of tape. Worst of all, placing all of those teleports everywhere used just about all of the scripting space that would normally be devoted to providing other sorts of special squares. So, what players ended up with was an enormous but mind-numbingly boring set of homogeneous caverns filled with the same handful of dull random-monster encounters, coming up over and over and over. This was not, needless to say, an improvement on what had come before. In fact, it was downright excruciating.
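A rough sketch may make the trade-off easier to see. The 16-by-16 grid is real enough, but the script budget, the class, and the data layout below are invented purely for illustration and have nothing to do with SSI’s actual tools.

    MAP_SIZE = 16        # every Gold Box map is a fixed 16 x 16 grid of squares
    SCRIPT_BUDGET = 24   # hypothetical cap on special-square scripts per map

    class GoldBoxMap:
        def __init__(self, name):
            self.name = name
            self.specials = {}    # (x, y) -> description of the scripted event

        def add_special(self, x, y, script):
            assert 0 <= x < MAP_SIZE and 0 <= y < MAP_SIZE
            if len(self.specials) >= SCRIPT_BUDGET:
                raise RuntimeError(f"{self.name}: out of script space")
            self.specials[(x, y)] = script

    caverns = GoldBoxMap("Silver Blades caverns")
    caverns.add_special(3, 7, "flavor text: water drips somewhere in the dark")
    caverns.add_special(9, 0, "scripted encounter with a major NPC")

    # The Secret of the Silver Blades trick: stitch many 16 x 16 maps into one
    # apparently sprawling cavern with silent teleports along the edges. Each
    # teleport eats a script slot that could otherwise have held an encounter
    # or a bit of flavor text.
    for y in range(MAP_SIZE):
        caverns.add_special(15, y, f"silent teleport to cavern segment {y}")

    print(f"{len(caverns.specials)} of {SCRIPT_BUDGET} script slots used")

Even in this toy version, most of the budget disappears into plumbing, which is more or less the fate that befell the real dungeons of Secret of the Silver Blades.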

At the same time that this clever technical trick was pushing the game toward a terminal dullness, other factors were trending in the same direction. Shelley himself has noted that certain voices within SSI were questioning whether all of those little extras found in Pool of Radiance and Curse of the Azure Bonds, like the paragraph books and the many scripted special encounters, were really necessary at all — or, at the least, perhaps it wasn’t necessary to do them with quite so much loving care. SSI was onto a good thing with these Gold Box games, said these voices — found mainly in the marketing department — and they ought to strike while the iron was hot, cranking them out as quickly as possible. While neither side would entirely have their way on the issue, the pressure to just make the games good enough rather than great in order to get them out there faster can be sensed in every Gold Box game after the first two. More and more graphics were recycled; fewer and fewer of those extra, special touches showed up. SSI never fully matched Pool of Radiance, much less improved on it, over the course of the ten Gold Box games that followed it. That SSI’s founder and president Joel Billings, as hardcore a gamer as any gaming executive ever, allowed this stagnation to take root is unfortunate, but isn’t difficult to explain. His passion was for the war games he’d originally founded SSI to make; all this Dungeons & Dragons stuff, while a cash cow to die for, was largely just product to him.

A similar complaint could be levied — and has been levied, loudly and repeatedly, by legions of hardcore Dungeons & Dragons fans over the course of decades — against Lorraine Williams, the wealthy heiress who had instituted a coup against Gary Gygax in 1985 to take over TSR. The idea that TSR’s long, slow decline and eventual downfall is due solely to Williams is more than a little dubious, given that Gygax and his cronies had already done so much to mismanage the company down that path before she ever showed up. Still, her list of wise strategic choices, at least after her very wise early decision to finally put Dungeons & Dragons on computers, is not a long one.

At the time they were signing the contract with SSI, TSR had just embarked on the most daunting project in the history of the company: a project to reorganize the Advanced Dungeons & Dragons rules, which had sprawled into eight confusing and sometimes contradictory hardcover books by that point, into a trio of books of relatively streamlined and logically organized information, all of it completely rewritten in straightforward modern English (as opposed to the musty diction of Gary Gygax, which read a bit like a cross of Samuel Johnson with H.P. Lovecraft). The fruits of the project appeared in 1989 in the form of a second-edition Player’s Handbook, Dungeon Master’s Guide, and Monstrous Compendium.

And then, right after expending so much effort to clean things up, TSR proceeded to muddy the second-edition waters even more indiscriminately than they had those of the first edition. Every single character class got its own book, and players with a hankering to play Dungeons & Dragons as a Viking or one of Charlemagne’s paladins were catered to. Indeed, TSR went crazy with campaign settings. By 1993, boxed sets were available to let you play in the Forgotten Realms, in the World of Greyhawk, or in Dragonlance‘s world of Krynn, or to play the game as a Jules Verne-esque science-fiction/fantasy hybrid called Spelljammer. You could also play Dungeons & Dragons as Gothic horror if you bought the Ravenloft set, as vaguely post-apocalyptic dark fantasy if you bought Dark Sun, as a set of tales from the Arabian Nights if you bought Al-Qadim, or as an exercise in surreal Expressionism worthy of Alfred Kubin if you bought Planescape.

Whatever the artistic merits behind all these disparate approaches — and some of them did, it should be said, have much to recommend them over the generic cookie-cutter fantasy that was vanilla Dungeons & Dragons — the commercial pressures that led Lorraine Williams to approve this glut of product aren’t hard to discern. The base of tabletop Dungeons & Dragons players hadn’t grown appreciably for many years. Just the opposite, in fact: it’s doubtful whether even half as many people were actively playing Dungeons & Dragons in 1990 as at the height of the brief-lived fad for the game circa 1982. After the existing player base had dutifully rushed out to buy the new second-edition core books, in other words, very few new players were discovering the game and thus continuing to drive their sales. Unless and until they could find a way to change that situation, the only way for TSR to survive was to keep generating gobs of new product to sell to their existing players. Luckily for them, hardcore Dungeons & Dragons players were tremendously loyal and tremendously dedicated to their hobby. Many would buy virtually everything TSR put out, even things that were highly unlikely ever to make it to their gaming tables, just out of curiosity and to keep up with the state of the art, as it were. It would take two or three years for players to start to evince some fatigue with the sheer volume of product pouring out of TSR’s Lake Geneva offices, much of it sorely lacking in play-testing and basic quality control, and to start giving large swathes of it a miss — and that, in turn, would spell major danger for TSR’s bottom line.

Lorraine Williams wasn’t unaware of the trap TSR’s static customer base represented; on the contrary, she recognized as plainly as anyone that TSR needed to expand into new markets if it was to have a bright long-term future. She made various efforts in that direction even as her company sustained itself by flooding the hardcore Dungeons & Dragons market. In fact, the SSI computer games might be described as one of these efforts — but even those, successful as they were on their own terms, were still playing at least partially to that same old captive market. In 1989, Williams opened a new TSR office on the West Coast in an attempt to break the company out of its nerdy ghetto. Run by Flint Dille, Williams’s brother, one of TSR West’s primary goals was to get Dungeons & Dragons onto television screens or, better yet, onto movie screens. Williams was ironically pursuing the same chimera that her predecessor Gary Gygax — now her sworn, lifetime arch-enemy — had so zealously chased. She was even less successful at it than he had been. Whereas Gygax had managed to get a Saturday morning cartoon on the air for a few seasons, Flint Dille’s operation managed bupkis in three long years of trying.

Another possible ticket to the mainstream, to be pursued every bit as seriously in Hollywood as a Dungeons & Dragons deal, was Buck Rogers, the source of the shared fortune of Lorraine Williams and Flint Dille. Their grandfather had been John F. Dille, owner of a newspaper syndicator known as the National Newspaper Service. In this capacity, the elder Dille had discovered the character that would become Buck Rogers — at the time, he was known as Anthony Rogers — in Armageddon 2419 A.D., a pulp novella written by Philip Francis Nowlan and published in Amazing Stories in 1928. Dille himself had come up with the nickname of “Buck” for the lead character, and convinced Nowlan to turn his adventures in outer space into a comic strip for his syndicator. It ended up running from 1929 until 1967 — only the first ten of those years under the stewardship of Nowlan — and was also turned into very popular radio and movie serials during the 1930s, the height of the character’s popularity. Having managed to secure all of the rights to Buck from a perhaps rather naive Nowlan, John Dille and his family profited hugely.

In marked contrast to her attitude toward TSR’s other intellectual properties, Lorraine Williams’s determination to return Buck Rogers to the forefront of pop culture was apparently born as much from a genuine passion for her family’s greatest legacy as it was from the dispassionate calculus of business. In addition to asking TSR West to lobby — once again fruitlessly, as it would transpire — for a Buck Rogers revival on television or film, she pushed a new RPG through the pipeline, entitled Buck Rogers XXVc and published in 1990. TSR supported the game fairly lavishly for several years in an attempt to get it to take off, releasing source books, adventure modules, and tie-in novels to little avail. With all due deference to Buck Rogers’s role as a formative influence on Star Wars among other beloved contemporary properties, in the minds of the Dungeons & Dragons generation it was pure cheese, associated mainly with the Dille family’s last attempt to revive the character, the hilariously campy 1979 television series Buck Rogers in the 25th Century. The game might have had a chance with some players had Williams been willing to recognize the cheese factor and let her designers play it up, but taken with a straight face? No way.

SSI as well was convinced — or coerced — to adapt the Gold Box engine from fantasy to science fiction for a pair of Buck Rogers computer games, 1990’s Countdown to Doomsday and 1992’s Matrix Cubed. SSI’s designers must have breathed a sigh of relief when they saw that the rules for the Buck Rogers tabletop RPG, much more so than any of TSR’s previous non-Dungeons & Dragons RPGs, had been based heavily on those of the company’s flagship game; thus the process of adaptation wasn’t quite so onerous as it might otherwise have been. That said, most agree that the end results are markedly less interesting than the other Gold Box games when it comes to combat, the very thing at which the engine normally excels; a combat system designed to include magic becomes far less compelling in its absence. Benefiting doubtless from its association with the Dungeons & Dragons Gold Box line, for which enthusiasm remained fairly high, the first Buck Rogers game sold a relatively healthy 51,528 copies; the second managed a somewhat less healthy 38,086 copies.

All of these competing interests do much to explain why TSR, after involving themselves so closely in the development of Pool of Radiance and Curse of the Azure Bonds, withdrew from the process almost entirely after those games and just left SSI to it. And that fact in turn is yet one more important reason why the Gold Box games not only failed to evolve but actually devolved in many ways. TSR’s design staff might not have had a great understanding of computer technology, but they did understand their settings and rules, and had pushed SSI to try to inject at least a little bit of what made for a great tabletop-role-playing experience into the computer games. Absent that pressure, SSI was free to fall back on what they did best — which meant, true to their war-game roots, lots and lots of combat. In both Pool and Curse, random encounters cease on most maps after you’ve had a certain number of them — ideally, just before they get boring. Tellingly, in Secret of the Silver Blades and most of the other later Gold Box games that scheme is absent. The monsters just keep on coming, ad infinitum.

Despite lukewarm reviews that were now starting to voice some real irritation with the Gold Box line’s failure to advance, Secret of the Silver Blades was another huge hit, selling 167,214 copies. But, in an indication that some of those who purchased it were perhaps disappointed enough by the experience not to continue buying Gold Box games, it would be the last of the line to break the 100,000-copy barrier. The final game in the Pool of Radiance series, Pools of Darkness, sold just 52,793 copies upon its release in 1991.

In addition to the four-game Pool series, SSI also released an alternate trilogy of Dungeons & Dragons Gold Box games set in Krynn, the world of the Dragonlance setting. Champions of Krynn was actually released before Secret of the Silver Blades, in January of 1990, and sold 116,693 copies; Death Knights of Krynn was released in 1991 and sold 61,958 copies; and The Dark Queen of Krynn, the very last Gold Box game, was released in 1992 and sold 40,640 copies. Another modest series of two games was developed out-of-house by Beyond Software (later to be renamed Stormfront Studios): Gateway to the Savage Frontier (1991, 62,581 copies sold) and Treasures of the Savage Frontier (1992, 31,995 copies sold). In all, then, counting the two Buck Rogers games but not counting the oddball Hillsfar, SSI released eleven Gold Box games over a period of four years.

While Secret of the Silver Blades still stands as arguably the line’s absolute nadir in design terms, the sheer pace at which SSI pumped out Gold Box games during the latter two years of this period in particular couldn’t help but give all of them a certain generic, interchangeable quality. It all began to feel a bit rote — a bit cheap, in stark contrast to the rarefied atmosphere of a Big Event that had surrounded Pool of Radiance, a game which had been designed and marketed to be a landmark premium product and had in turn been widely perceived as exactly that. Not helping the line’s image was the ludicrous knockoff-Boris Vallejo cover art sported by so many of the boxes, complete with lots of tawny female skin and heaving bosoms. Susan Manley has described the odd and somewhat uncomfortable experience of being a female artist asked to draw this sort of stuff.

They pretty much wanted everybody [female] to be the chainmail-bikini babes, as we called them. I said, “Look, not everybody wants to be a chainmail-bikini babe.” They said, “All the guys want that, and we don’t have very many female players.” I said, “You’re never going to have female players if you continue like this. Functional armor that would actually protect people would play a little bit better.”

Tom [Wahl, SSI’s lead artist] and I actually argued over whether my chest size was average or not, which was an embarrassing conversation to have. He absolutely thought that everybody needed to look like they were stepping out of a Victoria’s Secret catalog if they were female. I said, “Gee, how come all the guys don’t have to be super-attractive?” They don’t look like they’re off of romance-novel covers, let’s put it that way. They get to be rugged, they get to be individual, they get to all have different costumes. They get to all have different hairstyles, but the women all had to have long, flowing locks and lots of cleavage.

By 1991, the Gold Box engine was beginning to seem rather like a relic from technology’s distant past. In a sense, the impression was literally correct. When SSI had begun to build the Gold Box engine back in 1987, the Commodore 64 had still ruled the roost of computer gaming, prompting SSI to make the fateful decision not only to make sure the Gold Box games could run on that sharply limited platform, but also to build most of their development tools on it. Pool of Radiance then appeared about five minutes before the Commodore 64’s popularity imploded in the face of Nintendo. The Gold Box engine did of course run on other platforms, but it remained throughout its life subject to limitations born of its 8-bit origins — things like the aforementioned maps of exactly 16 by 16 squares and the strict bounds on the amount of custom scripting that could be included on a single one of those maps. Even as the rest of the industry left the 8-bit machines behind in 1989 and 1990, SSI was reluctant to do so in that the Commodore 64 still made up a major chunk of Gold Box sales: Curse of the Azure Bonds sold 68,622 copies on the Commodore 64, representing more than a third of its total sales, while Secret of the Silver Blades still managed a relatively healthy 40,425 Commodore 64 versions sold. Such numbers likely came courtesy of diehard Commodore 64 owners who had very few other games to buy in an industry that was moving more and more to MS-DOS as its standard platform. SSI was thus trapped for some time in something of a Catch-22, wanting to continue to reap the rewards of being just about the last major American publisher to support the Commodore 64 but having to compromise the experience of users with more powerful machines in order to do so.

SSI had managed to improve the Gold Box graphics considerably by the time of The Dark Queen of Krynn, the last game in the line.

When SSI finally decided to abandon the Commodore 64 in 1991, they did what they could to enhance the Gold Box engine to take advantage of the capabilities of the newer machines, introducing more decorative displays and pictures drawn in 256-color VGA along with some mouse support. Yet the most fundamental limitations changed not at all; the engine was now aged enough that SSI wasn’t enthused about investing in a more comprehensive overhaul. And thus the Gold Box games seemed more anachronistic than ever. As SSI’s competitors worked on a new generation of CRPGs that took advantage of 32-bit processors and multi-megabyte memories, the Gold Box games remained the last surviving relics of the old days of 8 bits and 64 K. Looking at The Dark Queen of Krynn and the technical tour de force that was Origin’s Ultima VII side by side, it’s difficult to believe that the two games were released in the same year, much less that they were, theoretically at least, direct competitors.

It’s of course easy for us to look back today and say what SSI should have done. Instead of flooding the market with so many generic Gold Box games, they should have released just one game every year or eighteen months, each release reflecting a much more serious investment in writing and design as well as real, immediately noticeable technical improvements. They should, in other words, have strained to make every new Gold Box game an event like Pool of Radiance had been in its day. But this had never been SSI’s business model; they had always released lots of games, very few of which sold terribly well by the standard of the industry at large, but whose sales in the aggregate were enough to sustain them. When, beginning with Pool of Radiance, they suddenly were making hits by anybody’s standards, they had trouble adjusting their thinking to their post-Pool situation, had trouble recognizing that they could sell more units and make more money by making fewer but better games. Such is human nature; making such a paradigm shift would doubtless challenge any of us.

Luckily, just as the Gold Box sales began to tail off SSI found an alternative approach to Dungeons & Dragons on the computer from an unlikely source. Westwood Associates was a small Las Vegas-based development company, active since 1985, who had initially made their name doing ports of 8-bit titles to more advanced machines like the Commodore Amiga and Atari ST (among these projects had been ports of Epyx’s Winter Games, World Games, and California Games). What made Westwood unique and highly sought after among porters was their talent for improving their 8-bit source material enough, in terms of both audiovisuals and game play, that the end results would be accepted almost as native sons by the notoriously snobbish owners of machines like the Amiga. Their ambition was such that many publishers came to see the biggest liability of employing them as a tendency to go too far, to such an extent that their ports could verge on becoming new games entirely; for example, their conversion of Epyx’s Temple of Apshai on the Macintosh from turn-based to real-time play was rejected as being far too much of a departure.

Westwood first came to the attention of Gold Box fans when they were given the job of implementing Hillsfar, the stopgap “character training grounds” which SSI released between Pool of Radiance and Curse of the Azure Bonds. Far more auspicious were Westwood’s stellar ports of the mainline Gold Box games to the Amiga, which added mouse support and improved the graphics well before SSI’s own MS-DOS versions made the leap to VGA. But Brett Sperry and Louis Castle, Westwood’s founders, had always seen ports merely as a way of getting their foot in the door of the industry. Already by the time they began working with SSI, they were starting to do completely original games of their own for Electronic Arts and Mediagenic/Activision. (Their two games for the latter, both based on a board-game line called BattleTech, were released under the Infocom imprint, although the “real” Cambridge-based Infocom had nothing to do with them.) Westwood soon convinced SSI as well to let them make an original title alongside the implementation assignments: what must be the strangest of all the SSI Dungeons & Dragons computer games, a dragon flight simulator (!) called Dragon Strike. Released in 1990, it wasn’t quite an abject flop but neither was it a hit, selling 34,296 copies. With their next original game for SSI, however, Westwood would hit pay dirt.

Eye of the Beholder was conceived as Dungeons & Dragons meets Dungeon Master, bringing the real-time first-person game play of FTL’s seminal 1987 dungeon crawl to SSI’s product line. In a measure of just how ahead-of-its-time Dungeon Master had been in terms not only of technology but also of fundamental design, nothing had yet really managed to equal it over the three years since its release. Eye of the Beholder arguably didn’t fully manage that feat either, but it did at the very least come closer than most other efforts — and of course it had the huge advantage of the Dungeons & Dragons license. When a somewhat skeptical SSI sent an initial shipment of 20,000 copies into the distribution pipeline in February of 1991, “they all disappeared” in the words of Joel Billings: “We put them out and boom!, they were gone.” Eye of the Beholder went on to sell 129,234 copies, nicely removing some of the sting from the slow commercial decline of the Gold Box line and, indeed, finally giving SSI a major Dungeons & Dragons hit that wasn’t a Gold Box game. The inevitable sequel, released already in December of 1991, sold a more modest but still substantial 73,109 copies, and a third Eye of the Beholder, developed in-house this time at SSI, sold 50,664 copies in 1993. The end of the line for this branch of the computerized Dungeons & Dragons family came with the pointless Dungeon Hack, a game that, as its name implies, presented its player with an infinite number of generic randomly generated dungeons to hack her way through; it sold 27,110 copies following its release at the end of 1993.

This chart from the April 1991 Software Publishers Association newsletter shows just how quickly Eye of the Beholder took off. Unfortunately, this would mark the last time an SSI Dungeons & Dragons game would be in this position.

Despite their popularity in their heyday, the Eye of the Beholder games in my view have aged less gracefully than their great progenitor Dungeon Master, or for that matter even the early Gold Box games. If what you wished for more than anything when playing Dungeon Master was lots more — okay, any — story and lore to go along with the mapping, the combat, and the puzzles, these may be just the games for you. For the rest of us, though, the Dungeons & Dragons rules make for an awkward fit to real-time play, especially in contrast to Dungeon Master‘s designed-from-scratch-for-real-time systems of combat, magic, and character development. The dungeon designs and even the graphics similarly underwhelm; Eye of the Beholder looks a bit garish today in contrast to the clean minimalism of Dungeon Master. The world would have to wait more than another year, until the release of Ultima Underworld, to see a game that truly and comprehensively improved on the model of Dungeon Master. In the meantime, though, the Eye of the Beholder games would do as runners-up for folks who had played Dungeon Master and its sequel and still wanted more, or for those heavily invested in the Dungeons & Dragons rules and/or the Forgotten Realms setting.

For SSI, the sales of the Eye of the Beholder games in comparison to those of the latest Gold Box titles provided all too clear a picture of where the industry was trending. Players were growing tired of the Gold Box games; they hungered after faster-paced CRPGs that were prettier to look at and easier to control. While Eye of the Beholder was still high on the charts, TSR and SSI agreed to extend their original five-year contract, which was due to expire on January 1, 1993, by eighteen months, to mid-1994. The short length of the extension may indicate growing doubts on TSR’s part about SSI’s ability to keep up with the competition in the CRPG market; one might see it as a way of putting SSI on notice that the TSR/SSI partnership was by no means set in stone for all time. At any rate, a key provision of the extension was that SSI must move beyond the fading Gold Box engine, must develop new technology to suit the changing times and to try to recapture those halcyon early days when Pool of Radiance ruled the charts and the world of gaming was abuzz with talk of Dungeons & Dragons on the computer. Accordingly, SSI put a bow on the Gold Box era in March of 1993 with the release of Unlimited Adventures, a re-packaging of their in-house development tools that would let diehard Gold Box fans make their own games to replace the ones SSI would no longer be releasing. It sold just 32,362 copies, but would go on to spawn a loyal community of adventure-makers that to some extent still persists to this day.

By way of wrapping up today’s story, I should note that my take on the Gold Box games, while I believe it dovetails relatively well with the consensus of the marketplace at the time, is by no means the only one in existence. A small but committed group of fans still loves these games — yes, all of them — for their approach to tactical combat, which must surely be the most faithful implementation of the tabletop game’s combat rules ever to make it to the computer. “It’s hard to imagine a truly bad game being made with it,” says blogger Chester Bolingbroke — better known as the CRPG Addict — of the Gold Box engine. (Personally, I’d happily nominate Secret of the Silver Blades for that designation.)

Still, even the Gold Box line’s biggest fans will generally acknowledge that the catalog is very front-loaded in terms of innovation and design ambition. For those of you who, like me, aren’t CRPG addicts, I highly recommend Pool of Radiance and Curse of the Azure Bonds, which together let you advance the same party of characters just about as far as remains fun under the Dungeons & Dragons rules, showing off the engine at its best in the process. If the Gold Box games that came afterward wind up a bit of an anticlimactic muddle, we can at least still treasure those two genuine classics. And if you really do want more Gold Box after playing those two, Lord knows there’s plenty of it out there, enough to last most sane people a lifetime. Just don’t expect any of it to quite rise to the heights of those first two games, and you’ll be fine.

(Sources: This article is largely drawn from the collection of documents that Joel Billings donated to the Strong Museum of Play, which includes lots of internal SSI documents and some press clippings. Also, the book Designers & Dragons Volume 1 by Shannon Appelcline; Computer Gaming World of September 1989; Retro Gamer 52 and 89; Matt Barton’s video interviews with Joel Billings, Susan Manley, and Dave Shelley and Laura Bowen.

Many of the Gold Box games and the Eye of the Beholder trilogy are available for purchase from GOG.com. You may also wish to investigate The Gold Box Companion, which adds many modern conveniences to the original games.)

 
 
