
Realms of the Haunting

I like to imagine that Realms of the Haunting, Gremlin Interactive’s bizarre 1996 mélange of first-person shooter, point-and-click adventure, and interactive movie, was the brainchild of Ian and Nigel, students at Thomas Hughes Secondary School in Stow-on-the-Water.

Ian was sitting in a sheltered nook after school one day, sketching notes for a Call of Cthulhu scenario, when Nigel, the local football star, came barreling down the lane on his bicycle. Spotting his nerdy classmate, he dismounted in his customary way, by leaping off the still-moving bike, whose forward progress would have been arrested by Ian’s legs if he hadn’t swung them hastily aside. Nigel then joined his mode of transportation in marching up to see what Ian was up to, not without provoking some consternation from the latter, who wasn’t at all sure how this exchange was going to go. “What’cha up to, dude?”

Ian pushed his spectacles further up his nose. Even under the best of circumstances, he never quite knew how to respond to Nigel’s mix of Americanisms and Cockney, which he found almost incomprehensible at times. But he rallied. “Well, it’s sort of a scenario for a game…”

“A game? What kind of game?”

“Call of Cthulhu. It’s sort of a horror game. You get together with some mates, and one of you is sort of the referee. The others are all investigators who learn about this creepy old mystery. First they find this old bunch of letters — letters between some sort of cultist and his girlfriend…” Ian trailed off. He could sense that this was already going horribly wrong.

But it appeared that Nigel had heard only the words “horror” and “girlfriend.” “Dude, I know what you mean!” he cried. “Like Duke Nukem! You know that one? You can shoot blokes in the head with a shotgun, and the blood goes everywhere! And you meet strippers! You can pay’em to get out their titties for you.”

Ian did not know Duke Nukem. His father, a line engineer for British Telecom, was possessed of a streak of patriotism that precluded the purchase of any American personal computer. He insisted the family could get by perfectly well with their homegrown Acorn Archimedes, which advertised its stolid Britishness by means of its incompatibility with everything else. Still, Ian dearly wanted this conversation not to get onto the wrong track, so he nodded, whilst pushing his glasses up once again. “Yes… sort of…”

Any sign of hesitation was lost on Nigel. “Dude, we could work together on it!”

“Um, okay…”

“So, maybe you start in a haunted house — you’ve got to start somewhere, right? — but then it just keeps getting bigger and bigger! You find, like, one of those snakehole thingies and go to other dimensions and stuff, with wickeder and wickeder monsters to fight. And there’ll be a bunch of bosses to fight too. A bloke who’s dressed up like a doctor, like he wants to do human experimentation on you and all, and a creepy evil priest in sunglasses –”

“Wait… why sunglasses?”

“Why not?” shrugged Nigel. He was in the habit of dwelling on the abstract cool in life, not on the details.

Ian felt his blood stirring despite himself. Maybe he should think bigger for once. “Maybe you could sort of go to ancient Egypt,” he piped up cautiously. He’d always loved those ancient-archaeology picture books his dad collected, so much so that he’d been begging for a trip to Egypt for years. “Just imagine the sort of interesting puzzles we could have around the pyramids…”

“Sure, sure, dude,” said Nigel. All things were possible in his world. “But it’s got to be bigger than that even, you know? I know… maybe in the end you’ve got to go right down into Hell and kill Satan! That would be awesome! I’m sure no one’s ever made a game where you’ve got to kill Satan.”

Again, his words set Ian thinking. They were reading Paradise Lost in his honors literature course this semester, and The Divine Comedy was waiting in the wings. And then he’d read this less respectable thing called The Holy Blood and the Holy Grail last summer. “We could sort of pull a lot of the story right out of the Bible and Milton and Dante,” he said. “Sort of build on them. Even have the Garden of Eden in there. And Knights Templar. You’ve always got to have Knights Templar.”

Nigel looked unusually pensive for a moment, so much so that Ian almost thought he was giving serious consideration to the advantages and pitfalls of building a new mythology on the scaffolding of older ones. But he was soon disabused of that notion. “We’re going to need a chick,” said Nigel contemplatively.

“Sure, alright,” said Ian. “She can be sort of your companion, who’ll know a lot more than you do about all the mythology, and can sort of help to explain…”

But again, Nigel had only caught a word or two. “Companion, what?” he leered. “I like the sound of that.”

“Say,” said Ian suddenly, “have you been in the computer lab lately?” When Nigel unsurprisingly shook his head no, he rushed on. “They’ve got this computer version of Connections there. Have you seen that one on the telly?” Another shake of the head. “It’s a show about how everything in history is interrelated –”

“Inter- what? Dude, what are you on about?”

Ian decided to elide further details. “Anyway, the point is that they put the host of the show right in the computer game. I mean, the real host. We could do the same — have real actors in the game.”

Nigel looked dubious. “How will that make it cooler?”

“Well, it would make it seem more real… sort of more believable with real people.” Nigel still looked doubtful. So Ian tried another tack. “Just think how cool all those bosses would look as real people, sort of like they’re in a real horror movie. And then to play your companion we can get a real chick,” he said, pronouncing the word as naturally as he could, like an earnest foreigner trying out his French on the natives.

Nigel was sold. “Right, then, we gotta have that too,” he said. “We’ll look like wankers if we don’t.”

And so it went. Through the sylvan afternoon, Ian and Nigel raised their castle in the air to ever loftier heights, adding battlements and wings and dungeons until the whole edifice teetered in the merest hint of a breeze. When they were finally winding down, Nigel got unexpectedly thoughtful again. “Maybe it’s a bit much,” he mused, expressing a sense of moderation which neither Ian nor any of the adults in his life had ever suspected might lurk within him. “Maybe you shouldn’t actually kill Satan after all…”

Ian nodded. They had gotten rather carried away, hadn’t they? Obviously they would have to rein things in a little. Or a lot.

“We’re gonna need some material for the sequel, right, mate?” Nigel grinned. “And anyway, we got to leave Marilyn Manson a Satan to sing about. Am I right, what?” he queried with a friendly fist jab.

Ian wasn’t sure whether he was right or not, but he could always see the wisdom in restraint; his whole life to date had been a study in it. “Okay, we have enough already without Satan. Maybe too much. But if it does get to be too much, we could always just sort of say it was all in someone’s imagination at the end, like in the last episode of St. Elsewhere.” That last reference spoke to how comfortable he was beginning to feel with his rambunctious design partner. His mother had an eccentric fondness for soapy American television which she’d imparted to her son, but normally he would die before sharing this passion with any of his peers.

Nigel was unfazed, if also uninterested. “Sure, dude. Truth.” He held out his hand for a fist bump, which Ian navigated with only a little awkwardness. It seemed they were now fast friends, but it was also time for Ian to get home for dinner; his mother did scold so when he was late. Nigel nodded his acquiescence, digging his bicycle out of the bushes where it had landed a couple of hours before. “Check you later, mate,” he said as he swung one leg over the saddle.

“Yes… mate,” said Ian. “Our game’s going to be… awesome.” He smiled to himself as Nigel rode off into the sunset. He rather liked the feel of the word on his tongue.

One week later, the boys sent their design document to Gremlin Interactive.


Fair warning: this article spoils the “shocking” denouement of Realms of the Haunting as well as some other plot details.

Alas, the real origin story of Realms of the Haunting is somewhat more prosaic. The game’s individual pieces mark it as a thoroughgoing product of its time; it’s only their amalgamation in one place that makes it so bat-guano insane.

The project began when Gremlin Interactive joined three-quarters of the other games studios on the planet in beating the bushes for a DOOM-like 2.5D engine in the wake of that game’s extraordinary success. They wound up sourcing the “True3D” engine — which was actually no more true 3D than DOOM had been — from Tony Crowther, a legendary British games programmer whose career’s beginning predated Gremlin’s own 1984 founding. Crowther also offered Gremlin two ideas for a game to make with his engine. One was a “generic monster game,” as he puts it, while the other had a “devil theme.” Gremlin chose the latter. So far, so DOOM-like, in theme as well as technology.

But here’s where it starts to get weird. Gremlin, virtually alone among the many studios working with engines like this one, thought that theirs could be twisted to suit the needs of a puzzle-based adventure game instead of being strictly a vehicle for first-person carnage. And the odd thing is, they were kind of right. After production had already started on Realms of the Haunting, another team at Gremlin used the True3D engine to create a non-violent comedy adventure in the LucasArts tradition, to surprisingly good effect. Normality’s ramshackle 2.5D visual aesthetic proved a good fit with its cock-eyed protagonist’s stoner-dude perspective on the world, while its puzzle design was as buttoned down as the rest of the affair was comfortably casual. Despite being started after Realms of the Haunting, it came out months before it in 1996. Unfortunately, it garnered few sales. One senses that, in addition to being confused by the look of the thing, gamers just didn’t quite get its jokes. Nor did it help that the market was flooded with bigger, more expensive adventures from American studios that year, the last in which the adventure genre was still widely perceived as one of the industry’s AAA standard bearers.

Through it all, the Realms project trundled on, determined to be both a kick-ass first-person shooter and a brain-tickling adventure game. Writer Paul Green wanted the story to be “epic.” And indeed, it just kept growing and growing. Then someone got the bright idea to jump on yet another indelibly mid-1990s trend: the “full-motion-video” game, incorporating clips of real actors filmed in front of green screens, whose backgrounds were filled in after the fact with conventional computer graphics. Gremlin hired Bright Light Studios, an outside video-production house, to cast and carry out the shoots, then spent much time and money massaging the end results into their Frankenstein’s monster of a game. By the time it came out in Britain, about a week before the Christmas of 1996, Realms of the Haunting had spent a good two and a half years in development — one year longer than had been intended — and had become by far the most expensive game Gremlin had ever made.

Programmer Greg Staples, engine architect Tony Crowther, and writer and designer Paul Green.

And what did they get for their money? Oh, my… where to begin? With the beginning, I suppose…

The very first impression Realms of the Haunting gives is of reaches exceeding grasps, establishing a leitmotif that will persist throughout. It opens with an epigraph that’s attributed only to “anonymous”: “Goodness reflects the light and evil bears the seed of all darkness. These are mirrors of the soul, reflections of the mind. Choose well.” This reads like pieces of other, better epigraphs cut and pasted together into a meaningless word salad. Yet it actually serves its purpose of being a harbinger of what is to come in two separate ways. Realms of the Haunting itself will play like chunks of other, better games cut and pasted together. And everyone you meet in the game will talk just like that epigraph is written.

Next we have the bravura eight-minute opening movie, in which we meet our protagonist Adam Randall, in the back of a taxi on his way to answer a mysterious summons to a deserted house out in the middle of nowhere — a visit he’s decided to make in the middle of a dark and stormy night, of course, as you do in these situations. When I first heard Adam speak, I thought it strange that Gremlin had opted to cast an American actor in the role, given that the actual text of the script shows every indication he ought to be British; the summons to the haunted house came from a self-purported colleague of his recently deceased father, who we’re told was “the pastor in a Cornish village.” But then I started noticing oddities in Adam’s vowels and in his “Ts” that didn’t fit with an American accent either. In the end, I decided he must be Canadian. (Hey, at least that puts him inside the Commonwealth!) But then, after I was finished with the game, I watched Gremlin’s short “making of” video, and all became clear: the actor was a Brit putting on an American accent. But… why, especially when it doesn’t make any sense in the context of the plot? I can only conclude that Gremlin believed they’d have a better chance of cracking the all-important American market with an American-sounding protagonist — plot fidelity be damned. Anyway, not much else about the plot will wind up making much sense. Why should this?

The actor in question is one David Tuomi, who has managed the neat trick of leaving no digital footprint whatsoever in all the years since. It’s as if he was immaculately created just to play Adam Randall, then returned to the dust from which he had been made as soon as shooting wrapped. To this day, his profile on The Internet Movie Database has exactly one entry: Realms of the Haunting. He doesn’t even enjoy the cult celebrity of someone like Dean Erickson, whose acting résumé is almost as scanty and similarly long-abandoned, but who still pops up to give an interview from time to time about that one time he got to play Gabriel Knight. No, David Tuomi is just… gone. It seems he took his right to be forgotten seriously.

This is made still weirder by the fact that he really isn’t that bad here. He may not be Laurence Olivier, but he’s a good-looking, likable young man who doesn’t palpitate with nervousness when he speaks his lines, which puts him well ahead of Dean Erickson and many of his other peers in the full-motion-video field. The problems with this game’s storytelling aren’t down to him.

In fact, for all of its haunted-house clichés, the opening movie as a whole strikes a pensive note that raises the hopes of a writerly type like me. The Adam we meet in the back of that taxi is haunted by metaphorical rather than literal demons; he’s filled with regrets about all of the things he never said to his father, all of the times he could have picked up the phone to call him but didn’t. This sense of guilt is joined by other, bitterer sentiments: “He was well-liked. Had time for everyone. Except his son.” (Those might just be the most cogent lines in a script with very few of them.) You can almost begin to believe that, like all the best classic horror, this story will really be about the fears and worries and secret shames that are part and parcel of being a human being, those occasional dark nights of the soul that keep even those of us who don’t believe in ghosts wide awake from time to time. But never fear, would-be demon blasters: the game will never strike a note like this one again, and Adam will never again betray any sign of having an inner life that goes beyond the exigencies of the moment.

Adam pays the taxi driver and enters the house. As he does so, the doors shut behind him with a crash, like the sealing of a tomb. He doesn’t so much as twitch in response to this event. On the one hand, this is a typical discordance of these sorts of productions: the David Tuomi acting in front of a green screen had no slamming doors to react to, because both the sight and the sound of them were painted in later. But it also establishes a precedent in another way. Adam will stumble through everything to come comically unfazed by it all. Even now, at the outset, he just shrugs as he wanders the corridors of a house with glowing pentagrams splashed over the walls and doors, portraits that blink at him with livid red eyes, a fly-encrusted suit of armor that appears to contain a human corpse, decapitated animal bodies strewn randomly about the place, and a typewriter that’s typing “We live!” over and over again of its own accord. Unflappable doesn’t begin to describe this guy. “A rat. No head,” he mutters to himself, and moves on. In the case of the typewriter, he confines his observations to, “Ink ribbon’s missing.” Right. Better buy a new one in the morning, once I’m through with all this tedious business of demon blasting. Which Adam will soon be doing with laconic aplomb, mowing through his enemies like Duke Nukem — until it’s time for a cut scene, at which point he reverts to being the slim, harmless-looking guy we met in the taxi.

Could this library look any more Lovecraftian?

The aforementioned demons show up only after you’ve solved a puzzle or two and found your way into the house’s library, that natural repository of secrets. In the best tradition of a Call of Cthulhu scenario, you find a clutch of 70-year-old letters between a cultist and his paramour, talking about some ominous ritual they’re attempting to enact. And, even more disconcertingly, you meet Adam’s father’s ghost, entwined in chains that make him look like a parody of Jacob Marley in A Christmas Carol.

Then you open the inevitable secret door that’s hidden behind the bookcase, and the first demon comes running at you. With head-snapping speed, we’ve gone from pensive psychological horror to Gothic horror to a vaguely Lovecraftian story to a B-grade zombie flick.

And so it will go for the next couple of dozen hours. The designers’ response to any and all suggestions seems to have been, “Sure! Put it in there!” Realms of the Haunting is a study in excess, a game that wants so very badly to be all things to all people, evincing all of the sweaty desperation in pursuit of that goal that Adam so noticeably fails to display. It just goes on and on and on and on. Most games that tell pre-scripted, set-piece stories have around four or five chapters; this one has twenty.

Sometimes, however, less is more. Perhaps more so than any other game I’ve played, Realms of the Haunting descends linearly in quality — in all measures of quality, from writing to production values to gameplay — as it unspools from beginning to end.

That said, the descent doesn’t happen at the same pace along these different vectors. It’s the story that goes off the rails first. The central problem here is all too typical in videogames: stakes inflation. It’s not enough for this to be a tale of a father and son’s sins and redemption, not even enough for the fate of a family or a town or even a nation to hinge on Adam’s actions. No, the fate of the entire world — no, make that the fate of the entire universe! — has to be borne on the fashionably padded shoulders of Adam Randall.

The flames of Hell are so bright, Florentine’s got to wear shades.

It soon becomes clear that Paul Green has no idea how storytelling works at the most fundamental level. Instead of giving us one central villain to hate, he dilutes the impact with a whole rogues’ gallery of weirdos whose relationships to one another are almost impossible to keep track of: the creepy priest who seems to have been modeled on Rasputin, the guy who runs around in a doctor’s outfit, the guy dressed up like he’s auditioning for a Sam Spade flick who’s always flipping through a deck of cards and giggling like a low-rent Joker. If this were a Nintendo-style level-based videogame, these folks might work as a series of bosses. But as an interactive story told primarily through about 90 minutes of live-action video, the game never gives anybody enough screen time to make you care. It’s not a problem of the acting; like David Tuomi, all of the actors perform what’s being asked of them serviceably enough, intoning their lines like the Shakespearean creatures of the British theater scene they probably were. It’s a problem of the writing.

This fellow looks like an evil Tex Murphy.

Your allies are no better. Again, there are just too many angels and archangels and God knows what else running around, all talking in symbolic gibberish that brings to mind the game’s horrid opening “quotation” and never telling you what you actually want to know. At one point, one of them apologizes for “speaking in metaphors” — which is hilarious because absolutely everybody in this game speaks in nothing but metaphors, and terrible mixed ones at that, until you want to pull out your big old shotgun, point it at their foreheads, and demand a straight fricking answer, for once. The would-be drama has a way of shooting its gravitas in the foot at every turn. Your guardian angel Hawk, for example, runs around in an artfully crumpled tee-shirt that makes him look like a model in a Gap circular. Another character, one who has something or other to do with the Knights Templar, is named “Aelf” — pronounced, as far as I can tell, just like “Alf,” which always sets me giggling to myself about cat-loving anthropomorphic aliens.

The character you spend the most time with is Rebecca, a fetching lass in a chic pantsuit more appropriate for a day behind a desk in the City than a night in a haunted house. (She’s played by one Emma Powell, who unlike David Tuomi went on to a long and fruitful career as a supporting and voice actress in movies, television, and videogames.) Our avatar of indifference Adam comes across her in the library at the end of Chapter 2, inexplicably just sitting there, and, true to form, never bothers to ask her how she ended up there. For the bulk of the game thereafter, she serves as his sounding board and advisor as he wanders about, offering hints and commentary on the environment and adding a little spark of life to what would otherwise be a decidedly lonely experience. In that sense, she’s not a bad addition at all.

Emma Powell and David Tuomi

Indeed, the gameplay generally declines less precipitously than the writing — with one caveat. That comes in the form of the interface, which will first flabbergast you with its inscrutability and then annoy you like a dull foot ache for all the hours to come. Some of this game’s confusions for the modern player exist in many first-person shooters that came out between 1993 and 1998, including to some extent even (un-modded) DOOM itself. These were the years before the control schemes that have been the standard for the last quarter-century had quite stabilized.

Still, Realms of the Haunting’s problems in this department extend well beyond the lack of mouse-look or the unfamiliar default key mappings. The adventuring interface is fiddly almost beyond belief; everything you try to do is ten times harder than it would be if you were doing it in real life. Using or examining an object entails pressing “I” to bring up the inventory screen, then finding its stamp-sized icon among the four separately sorted categories of junk you’re carrying: “general items,” “weapons,” “mysterious or magical items,” or “documents.” (No, it isn’t always immediately obvious what the game considers to belong in what category.) For some reason, all of this is allowed to fill no more than a quarter of the screen. So, if it’s a document you’re interested in reading, you get to do so by dragging it around inside a small window; it’s like reading a book through a telescope. If you want to try to use an object on something in the world, you first have to place it in Adam’s left hand — his left hand, mind you; the right is reserved for weapons — then exit the inventory system completely before you can click on the target. This is so annoyingly convoluted a process that I’m going to tarnish my cred as a hardcore adventurer by strongly suggesting that you play this game, if you choose to do so, in “easy” adventuring mode, where it automatically uses the correct object in the correct place, as long as you have it in your inventory. What’s truly bizarre about all this is that Normality, the other Gremlin game that was built with the True3D engine, has a fast, elegant popup radial menu for examining and using objects. What on earth were these developers thinking?

You can almost always tell whether the makers of any given game have given it to anyone to actually play before they released it. These ones most definitely did not, as evidenced by the constant unnecessary niggles. When you find a new key, for example, you never know where it will appear among the twenty others you’re already carrying around with you. Why not dispose of the ones that have already served their purpose?

Still, if you can get past the torturous interface, the first chapters, when you’re still exploring the house itself, acquit themselves fairly well. The shooting parts serve their purpose well enough, while the puzzling parts can be surprisingly satisfying, revolving mostly around finding keys that let you open up more and more of the house for exploration. Realms of the Haunting is at its best at this stage, just about making you believe that its chocolate-and-peanut-butter combination of genres has something to recommend it after all. And, while I would by no means call it scary in a “I don’t want to play this alone” kind of way, its aesthetic does qualify as enjoyably creepy.

It’s only after you leave the house to start dimension-hopping about seven chapters in that things begin to fall apart on the gameplay front as well. An engine that can portray shadowy hallways to fairly good effect is less suited to conveying the splendor of ancient Egypt, the beauty of the Garden of Eden (yes, you really do travel there), or the horrors of Hell (yes, you really do travel there as well). All of these environments are much bigger, sprawling places than the house, with unfortunately less inside of them to see and do, a sure sign that constraints of budget and time were catching up with the designers’ ambitions. Even the normal spaces become hard to find your way around in; you are provided with helpful maps of most of them, but consulting them involves scrolling them around in that absurd little inventory window, a process so excruciating that you won’t want to bother until you’re truly at wit’s end. And then there are the deliberate mazes… oh, Lord, save us from the mazes in this game, which are scarier than any of its demons. There are three of them, extended, hair-pulling monstrosities all.

The game gives you maps of most of the larger areas, which is very kind and progressive of it. But then it makes you peer at them through a pointlessly tiny window. And every time you bring a map up again, it resets the view to the top left. The cruel irony here is that the full document seems very close to the size of your monitor screen. This is not rocket science, Gremlin.

As the game wears on, the puzzles become increasingly surreal, to say the least. For example, you run through one of the mazes collecting little brains to shove into a giant brain machine, apropos of nothing that comes before or after. And you’ll spend an inordinate amount of time constructing a bong so one of the angelic beings you meet can toke up properly. (I assume that all of the obvious jokes about what the folks at Gremlin were doing when they made this game have already been told, so I won’t bother.) If you’ve made it this far without ragequitting, you must long since have adopted Adam’s attitude. Just shrug your shoulders and go with it.

It gradually becomes clear that significant chunks of the story are simply missing in action, presumably due to budget shortfalls or space limitations; Realms of the Haunting shipped on four CDs as it was. At one point, Adam and Rebecca stand on the verge of a dramatic showdown with one member of the game’s ceaselessly rotating lazy Susan of villains. After everyone has engaged in the usual speechifying that precedes such things, the screen fades to black, a scream sounds, and you flex your mouse hand and get ready for battle… and suddenly Adam is in a prison cell, while Rebecca has somehow escaped and is trying to rescue him from outside the door. The game never explains how any of this happened. Just go with it.

At long last, you get to the very end and save the universe/multiverse/whatever. And then… you join a straitjacketed Adam in a mental hospital, as he finishes telling his tale to a skeptical doctor and a nurse with a syringe at the ready. Oh, my. You can almost hear the writers slapping one another on the back for being so clever. At least now we know why one of the villains was dressed up like a doctor. And perhaps, come to think of it, why we’ve never heard another peep from David Tuomi…

The dominant impression Realms of the Haunting leaves you with is that of bits that don’t fit together. This applies as much at the most granular levels of detail — as in the way that Adam’s reactions never quite seem to be in line with what is actually happening to him — as it does in the big picture. Rebecca, for instance, shows up in the movie bits, and is on hand to offer commentary in the adventure-game bits, but simply doesn’t exist in the shooter bits. Likewise, we never see the veritable Fort Knox of weaponry Adam is carrying around with him when we see David Tuomi portraying him in the movie bits. Of course, Realms of the Haunting is hardly the first or the last game to hand-wave away inconvenient details like these. If it succeeded on at least one of its three levels — whether as a shooter, a puzzle-driven adventure game, or a well-scripted interactive movie — I’d be more inclined to overlook such inconsistencies. As it is, though, it’s failing everywhere by the time you make it halfway through: the shooter bits have become samey and janky, the adventure bits samey and illogical, the story samey and flat-out incomprehensible. This game is the inversion of the reviewer’s cliché about the creation under review being “more than the sum of its parts.” As bad as these parts are, the whole manages to be less than their sum. Every part of the game actively diminishes every other part.

And yet as hot messes go, this one is as intriguing as they come. Whatever else you can say about Realms of the Haunting, there’s never been another game like it. It’s amazing to think that a purportedly responsible management team ever approved such an outlandish monstrosity as this one. In fact, there’s a melancholy aspect to that: long shot that it was to get made even in 1996, it’s even harder to imagine this game appearing any later in the decade. For as gaming moved into the last third of the 1990s, genres were calcifying into fixed categories with inviolate sets of expectations. Soon absolutely no one would be taking fliers on crazy cross-genre experiments like this one anymore.

To know why, we need only look to Realms of the Haunting’s commercial performance. Its British release date was not ideal, coming too late to reap the proper benefit of the Christmas buying season. And the circumstances of its American release, in the dog days of March of 1997 under the imprint of an unenthusiastic Interplay Entertainment, were no more auspicious. Yet a game with true mass appeal can overcome such factors — as, for example, Diablo did when it was shipped to American stores between the Christmas and New Year’s of 1996. Unfortunately, Realms of the Haunting was the antithesis of Diablo, being as clunky, fiddly, and scattered as Blizzard’s juggernaut was frictionless, polished, and laser-targeted. As Computer Gaming World put it in an (overly) generous 4.5-star review, Gremlin’s game had natural appeal to “action gamers looking for some adventure” and “adventure gamers looking for some action.” But just how many people meeting these descriptions were there? Not very many at all, it would appear. Realms of the Haunting was dead on arrival, selling fewer than 500 units in its first week on the market in Britain. Its high cost combined with its abject commercial failure had much to do with Gremlin’s subsequent collapse, which resulted in the company being bought out by the burgeoning French giant Infogrames in 1999.

Even today, however, some of the delusions of grandeur that allowed this game to be made still persist. Steve McKevitt, Gremlin’s former communications chief, blames its failure on “a backlash against full-motion video.” He claims that the developers “got just about everything right: the script, subject matter, story line, pacing. It was years ahead of its time. More The Last of Us than Quake.” Uh, no, Steve. Just no.

Yet, easy though it is to make fun of, Realms of the Haunting is a hard game to totally hate. It’s just so earnest in pursuit of its lofty ambitions, so fixated on being epic, man. How can you hate something that’s trying this hard to be the best game ever? Chalk it up as a last artifact of an older games industry that frequently had more vision than competence, of a time when budgets were small enough to take a chance on something crazy and just see what happened. In the years to come, both of those equations would be reversed. The result would be tighter, more polished experiences, but very few games that dared to throw out all the rules, whether wisely or unwisely. Instead of logical shorthands to bracket discussions, genres would begin to look like the straitjacket Adam Randall is wearing when we catch our last glimpse of him.

And it’s for that reason really that I’ve chosen to write about this game, even though I’m not at all in the habit of writing about bad games that didn’t sell well and didn’t have much influence on the field. There’s something kind of beautiful about Realms of the Haunting’s passionate incompetence. I’m not usually an adherent of the “so bad it’s good” school of criticism, but I can almost make an exception in the case of this game. Many critics before me have argued that it would have been a far better game if it had been content to stay inside the haunted mansion and leave off with the apocalyptic fever dreams. They’re almost certainly right, but at the same time I’m not sure I would be talking about it today had anyone involved with it understood the virtues of restraint. Realms of the Haunting is a final holdover from a messier, more freewheeling time, when everyone was still making it all up as they went along. And so, having paid our last respects to the old ways, we can now march onward, into a future that could never give us anything as amateurish, ill-considered, excessive, and lovable as this.





(Sources: the book A Gremlin in the Works by Mark James Hardisty; Retro Gamer 24 and 108; Computer Gaming World of January 1997 and May 1997; Edge of July 1996; PC Format of December 1996; PC Power of December 1996. Online sources include Sascha Kimmel’s Realms of the Haunting fan site, Jdanddiet’s interview with the aforementioned Sascha Kimmel, Retro Video Gamer’s interview with Tony Crowther, and a retrospective of the game at The Genesis Temple.

Realms of the Haunting is available as a digital download from GOG.com.)

Diablo

All of us had become disappointed with computer RPGs because they were going in the opposite direction of where we thought they should be going. They were becoming story- and stat-laden, really appealing to a super-small niche of super RPG geeks — which we were in a way, but that wasn’t really our style.

So, when [David] Brevik mentioned these roguelike games, it was kind of a natural. “Yeah, let’s take that cool, addictive structure and modernize it. Let’s strip away the stuff that’s turning off a lot of game fans from RPGs.”

— Max Schaefer of Blizzard North

A palpable sense of ennui dogged the Consumer Electronics Shows of 1994. The venerable semiannual expo where such landmark gaming hardware as the Atari VCS, the Commodore 64 and Amiga, the Nintendo Entertainment System and Super Nintendo Entertainment System, and the Sega Genesis had been seen for the first time seemed somehow past its sell-by date now. Attendance at the Summer CES in particular was down in a big way, so much so that the organizers would move the event out of its long-standing home in Chicago’s McCormick Place the following year and turn it into a traveling exhibition in the hope of drumming up some much-needed excitement. In the meantime, the makers of gaming software had an especially underwhelming time of it in Chicago that year: as usual, they were treated as second-class citizens by the organizers, relegated to the hall’s basement so that the choicer spaces were kept free for cutting-edge toasters, refrigerators, and microwave ovens.

Among the games people who were having the worst time of it of all were the folks behind a tiny San Mateo, California, studio called Condor, Incorporated. David Brevik and his co-founders, the brothers Max and Erich Schaefer, were ostensibly at the show to demonstrate their very first finished original game, a Genesis title called Justice League: Task Force. But they knew the game was no great shakes. They had made exactly what their publisher, the financially troubled Japanese giant Sunsoft, had ordered them to make in rather pedantic detail: a blatant clone of yesteryear’s massive hit Street Fighter II, with DC Comics superheroes inserted in place of its inspiration’s pugilists. They felt it was competently executed, but knew as well as anyone that it was no more than a quickie placeholder product for a five-year-old console that was soon due to be superseded by the next-generation Sega Saturn.

Their ulterior motive for being at CES was something else entirely. Brevik had an idea for a computer game called Diablo, which he had been slowly expanding upon ever since he had lived with his family at the foot of the California mountain of that name back in the mid-1980s. Now, he felt its time had come; he desperately wanted to interest a publisher in it. But every executive he talked to at the show started shaking his head as soon as he saw the first line of the pitch document, stating that it was “a proposal for a role-playing game.” For CRPGs were dead and buried according to the industry’s conventional wisdom, having nothing to offer in an era when multimedia flash and 3D mayhem reigned supreme. They were quaint at best, deadly boring at worst, as their recent sales figures reflected.

Thoroughly disheartened by his proposal’s reception, Brevik duly turned up with the Schaefer brothers at the appointed time to show Justice League to the assembled press. And here they all got a shock. They learned only minutes before taking the stage that Sunsoft had actually arranged to make a second version of the game for the Super Nintendo, sending the same design brief to another little studio, Blizzard Entertainment of Costa Mesa, California. Both development teams could immediately see that the other had done a pretty solid, professional job with a less than inspiring project. Indeed, they were struck by how similar the two end results were to one another.

They soon learned that they had much more in common. Blizzard too had been founded on a shoestring by three games-obsessed kids just out of university, in this case by the names of Allen Adham, Mike Morhaime, and Frank Pearce. And they too had become all too familiar with workaday projects like Justice League, which they too saw as a way for their new, unproven studio to pay its dues on the way to bigger, better things to come. The big difference was that Blizzard was a few years older, and thus that much further along the road to becoming a marquee studio. They had recently been acquired by the educational-software giant Davidson & Associates, whose distributional pipeline they would be able to use to publish their own games under their own imprint. Now, they were hard at work finishing up the project that they hoped would change everything for them: a game for computers only called Warcraft. They took the Condor boys into a cramped back room and showed it to them. “I had no idea at that point that Warcraft would become an historically important game,” says Max Schaefer. “It just looked cool.” A relationship was forged. The Blizzard folks said they were just too busy to think about anything else just then, but they promised to listen to Condor’s pitch for Diablo once Warcraft was out the door.

They were true to their word. In January of 1995, with Warcraft on store shelves and selling well, everyone came together again in Blizzard’s conference room to talk about Diablo. No one in that room was unaware of the concerns that had caused publisher after publisher to walk away from the proposal; in fact, in many ways they shared them. CRPGs had glutted the market just a few years earlier, a bewildering procession of elves and dwarves and dragons. For the hardcore aficionados, all of the different games and series were (and still are) possessed of their own distinctive personalities and intricate subtleties, but it was hard for everybody else to keep Dungeons & Dragons separate from Dungeon Master, Might and Magic separate from The Magic Candle. I have a friend who likes to say that there are only two blues songs: “the fast one and the slow one.” Likewise, one might go so far as to say that for most gamers there were only two CRPGs, the first-person Wizardry style and the overhead Ultima style. As computers had gotten more capable, games of the former type had gotten ever more complex in terms of rules, while those of the latter type had threatened to collapse under the sheer weight of their lore and verbiage, which minuscule computer memories no longer restricted. Those sorts of things were not what the Condor guys were into at all. Sure, they had all played tabletop Dungeons & Dragons as kids, but world-building and storytelling hadn’t been their primary interest. “It was all about killing monsters and finding good stuff,” says Max Schaefer.

And so that was what Diablo was to be about as well. “As games today substitute gameplay with multimedia extravaganzas and strive toward needless scale and complexity,” read the pitch document, “we seek to reinvigorate the hack-and-slash, feel-good gaming audience. Emphasis will be on exploration, conflict, and character development.”

Diablo’s most direct influence by far was the roguelike games, which David Brevik had played for hundreds upon hundreds of hours while a student at university. From roguelikes it inherited its minimalist narrative — amounting to little more than “make it to the last level and kill the boss of bosses Diablo” — as well as randomized dungeons that would be new with every playthrough, along with the randomized “good stuff” they contained. Brevik’s favorite roguelike of all was Angband, which distinguished itself from the likes of the original Rogue and its spiritual successor NetHack by having a town to serve as the player’s base of operations for her expeditions into the nearby dungeon, resulting in a slightly more relaxed pacing and introducing an economic element. Diablo was to duplicate this structure exactly: “Forays into the dungeon will be broken up by trips to the town located above. In the town, a general store will provide standard equipment and repairs, and will also purchase extra equipment from the player. A temple will provide healing for injured and sick characters. Training and other facilities may also be available.”

In Brevik’s initial vision, Diablo was even to have roguelike perma-death: if the player’s character was killed, “that character will be erased completely from the hard drive, and the player must start over from scratch.” Combat would be turn-based like in a roguelike, but heavily influenced by the game’s secondary inspiration, Julian Gollop’s 1994 strategy classic X-COM; Diablo would use a similar interface and action-points system. If it strikes you as strange that a game that would later be so commonly dismissed as nothing more than a mindless, frantic click-fest could have two such cerebral inspirations as these… well, such are the paradoxes of game development.
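
To make the mechanics of that original vision concrete, here is a minimal Python sketch of turn-based combat driven by an X-COM-style budget of action points, combined with roguelike perma-death, under which a dead character’s save file is simply deleted rather than reloaded. Everything below (the names, the costs, the hit chances) is invented for illustration and reflects nothing of Condor’s actual code.

import os
import random

class Fighter:
    def __init__(self, name, hp, action_points):
        self.name = name
        self.hp = hp
        self.max_ap = action_points

    def alive(self):
        return self.hp > 0

def take_turn(actor, target):
    # Spend the actor's per-turn budget of action points on attacks.
    ap = actor.max_ap
    while ap >= 2 and target.alive():      # assume each swing costs 2 AP
        ap -= 2
        if random.random() < 0.6:          # assume a 60 percent chance to hit
            target.hp -= random.randint(1, 6)

def fight(player, monster, save_path):
    while player.alive() and monster.alive():
        take_turn(player, monster)         # the player acts, then the monster answers
        if monster.alive():
            take_turn(monster, player)
    if not player.alive():
        if os.path.exists(save_path):
            os.remove(save_path)           # perma-death: the character is erased for good
        print(player.name + " is dead, and so is the save file.")
    else:
        print(player.name + " prevails over the " + monster.name + ".")

fight(Fighter("Adventurer", 20, 6), Fighter("Skeleton", 12, 4), "adventurer.sav")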

At any rate, Blizzard was suitably impressed, and agreed to fund and publish the game described in the pitch document. But several of the Blizzard folks who were present at the meeting have since claimed that they were already thinking about a major change: to make Diablo run in real time. Not long after work began on the game in earnest down in San Mateo, Blizzard began slowly but relentlessly to apply pressure to Condor — more specifically, to David Brevik — to make the switch.

Brevik was appalled. There was a certain kind of moment, familiar to every roguelike player, that he considered essential to recreate in Diablo. It’s that moment when you’re down to your last few hit points and are staring down the maw of a mind flayer or a wyvern, knowing that it’s about to hit you and kill you on its next turn unless you do something really clever and/or get really lucky on your own last turn before it can do so. Do you pull out that potion that you have no idea what it does and drink it down, hoping against hope that it’s a Potion of Protection? Or do you take one last swipe at the monster with your sword, hoping it’s as close to death as you are? Or do you try to get away by running down that nearby staircase, hoping against hope that it misses with its last lunge against your vulnerable backside? Most of the time, of course, you choose wrong and/or don’t get lucky, and another character goes to the graveyard. But every once in a while, it works out, your character lives to fight another day, and you shout and dance around the room and rush to tell your friends about it. That dopamine release is what keeps people coming back to roguelikes again and again. Brevik was understandably loath to lose it.

But the slow drip, drip, drip from Blizzard continued, seeping even into Condor’s own ranks. Knowing this, Allen Adham made a suggestion to Brevik in or around May of 1995: Why not ask your own people? Why not take a vote on whether just to try real time? If it doesn’t work, you can always go back to turn-based.

It was too reasonable a suggestion to refuse. Brevik asked for a show of hands among his own people of those interested in exploring real time, and was dismayed to see almost every hand in the room go up. Acceding to the will of the majority, he retreated into his office to have a good-faith go at something he was sure would never fit with the game he wanted to make. The quicker it was demonstrated to everyone that real time wasn’t a practical possibility, he thought, the quicker they could all get back to more productive endeavors. What followed instead was the project’s kairos moment.

I can remember the moment like it was yesterday. I was sitting and I was coding the game, and I had a warrior with a sword, and there was a skeleton on the other side of the screen. I’d been working on this code to make characters move smoothly, doing a whole bunch of testing, and we’d talked about how the controls would work.

We wanted it to be visceral. Click and swing, click and swing. We wanted it to automatically happen: if you clicked on the monster, your character would go over there and swing.

I remember very vividly: I clicked on the monster, the guy walked over, and he smashed this skeleton, and it fell apart onto the ground.

The light from heaven shone through the office down onto the keyboard. I said, “Oh, my God, this is so amazing!” I knew it was not only the right decision, but that Diablo was just going to be massive. It was really the most defining moment of my career, as well as for that genre of gaming.

A new genre was born in that moment, and it was really quite incredible to be the person coding it and creating it. I was just there by myself coding it up. It was pretty incredible.

Diablo may have lost that suspended instant of supreme tension that Brevik had always seen as essential, but it had gained something else, something that would make it a different sort of game entirely. Kelly Johnson, an artist who worked on the game:

In a turn-based game, when you win, you say, “Cool, my plan worked. I took time, I deliberated, I made a plan, and it worked out.” But in a real-time [game], it’s, “Wow! I won!” It’s visceral. You’re in the moment.

Everyone at Condor, including Brevik, was soon marveling that they had ever imagined Diablo being anything other than a real-time game. Millions of players would eventually feel the same way, as the game’s real-time nature became the core of its very identity.

The Diablo team with Diablo himself. We must hope that the keytar is intended ironically.

But before that could happen, Diablo had to be finished. In their excitement over not being rejected yet again, Condor had secured less than half a million dollars in funding from Blizzard, to support a team that numbered a dozen or more. By the beginning of 1996, that money was running out. The founders dipped deep into their personal bank accounts just to cover payroll, and their employees started racing one another to the bank on payday, knowing that the last checks deposited had a tendency to bounce. Meanwhile Blizzard was soaring. That Christmas, they had released Warcraft II, a refinement of its predecessor that blew up massively; it would sell 3 million copies before all was said and done.

The Schaefer brothers and David Brevik were stunned when their publisher came to them and asked whether they would be interested in being acquired; Blizzard was suddenly flush with cash, and the brain trust there was very, very excited about Diablo’s prospects, such that they wanted to have it all for themselves. For the people making Diablo, the unexpected offer was a lifeline materializing out of thin air in front of a drowning man. In March of 1996, Condor became Blizzard North.

It was Blizzard that had pushed the erstwhile Condor to make Diablo run in real time. Now, it would be Blizzard South that drove another core feature into being. The initial pitch document had included “two-player and multiplayer game sessions via modem or network.” Since actual work had begun on the game, however, that aspiration had been all but forgotten. Yet Blizzard South knew how important multiplayer could be for a game in this new era of widespread network connectivity. They knew that multiplayer deathmatches had made DOOM what it was, and they knew that, long after players had finished Warcraft II’s single-player campaign, it was multiplayer that kept them going there as well, turning the game into a veritable institution. They wanted all that for Diablo, so much so that they made their only significant technical intervention into its development, sending programmers up to San Mateo to apply their Warcraft II expertise to Diablo’s multiplayer mode.

For Blizzard had huge plans for multiplayer games in general. Everyone could sense that a large percentage of future gaming would take place between real people on the Internet, that the “LAN parties” of the current age were just a temporary stopgap. Yet gaming over long distances was still technically challenging for the user, even as sessions had to be pre-planned with buddies who had bought the same game you had; spontaneous, pick-up-and-play matches were impossible. Various third-party companies were experimenting with ways to change both of these things, but everything was in a nascent, febrile state. Having money to spend as they did, Blizzard decided to introduce a game hosting and matchmaking service for their customers, under the name (and the Internet URL) of Battle.net. And they decided to offer it to buyers of their games for the low, low price of free, on the logic that the boxed-game sales it would generate would easily pay for its upkeep. It was a revolutionary idea, one that would prove as important to Blizzard’s rise into gaming’s stratosphere as any of their individual titles, iconic as they were. Thanks to Battle.net, you would always be able to find someone to play with, then be in a game with them within seconds. Patches would download automatically when you logged onto the service, a first step toward the always-online mentality that has taken over since. And Diablo was the very first Battle.net-enabled game. If it had achieved nothing else, it would be historically notable for this fact alone.

With Diablo being refined into an ever more effortless, frictionless experience, it was inevitable that another legacy of the roguelikes would fall away. The Southerners told the Northerners that perma-death just wouldn’t fly in the modern commercial market. David Brevik kvetched, but there was no way he was going to win this argument. Even if it hadn’t started out that way, Diablo was evolving into a lean-back rather than a lean-forward sort of game, designed to be more fun than it was demanding. Mistakes would happen in a game like that, and nobody wanted to lose a character he had spent eight hours building because he got distracted by the pizza guy ringing the doorbell. By way of compromise, the Southerners did agree to allow only one save slot, which fit in nicely with the game’s ethic of simplicity anyway. And of course, if anyone really wanted to play Diablo like a roguelike, there was nothing but the temptation of that extant last save file preventing it.

Warcraft II had made Blizzard one of the biggest names in mainstream gaming, on a level with id Software of DOOM and Quake fame and Westwood Studios, the makers of Command & Conquer, Blizzard’s great rival in the real-time-strategy space. Everything Blizzard did was now of interest to obsessive gamers. Diablo was to be their first game that ran under Windows 95 rather than MS-DOS; like Battle.net, this was another outcome of the company’s guiding principle of frictionless ease in all things. In the summer of 1996, Blizzard arranged to have a two-level demo of Diablo included on a Microsoft DirectX sampler disc. Interest in the game exploded. It became easily the most anticipated title of the 1996 holiday season.

That fact makes the next bit that much more remarkable. When the last possible instant to send the game out to be burned onto hundreds of thousands of CDs and shipped to stores all over the country in time for the Christmas buying season arrived, Blizzard took a long, hard look at its current state. It wasn’t in terrible shape, but it still had its fair share of minor niggles here and there. The vast majority of publishers would have said it was good enough and shipped it at this point — after all, they could always patch it later, right? (Wasn’t that one of the points of Battle.net?) But Blizzard decided to wait, resigning themselves to letting Christmas slip by without a major new release from them. It was better, they judged, to make sure Diablo was just exactly perfect when it did ship. More than anything else, it would be this thoroughgoing focus on quality — quality at almost any cost — that would make Blizzard one of the most extraordinary success stories in the entire history of gaming. From the beginning, their tender-aged founders understood something that eluded a bizarre number of their more grizzled peers: that one’s reputation is one’s most precious business asset of all, being laborious to build up and disconcertingly easy to lose. In an industry fueled by short-term hype, they took the long view. “If you truly put the game first,” says Allen Adham, “then decisions like holding a product an extra couple of months, even if it means missing Christmas, become fairly clear.” Gamers came to know that Blizzard would never let them down, and this knowledge fueled the company’s rise. The sacrificing of tens of thousands of sales the following month led to millions and millions of sales over the following decade.

So, Diablo missed the Christmas deadline, but not by much: the first copies wended their way onto store shelves between Christmas and New Year’s, when lots of younger gamers had gift checks from uncles and aunts and grandparents burning holes in their pockets. Others trotted down to their local software store and traded some less desirable Christmas present for Diablo. Retailers fended off the return-season blues by turning Diablo’s release into an event, plastering posters all over their walls and filling their display windows with mannequins of the devil from the game’s cover. All told, it’s questionable whether the belated release really hurt Diablo very much at all, even in the shortest of terms. By spring, it was clear both from the sales reports and from the level of activity on Battle.net that Diablo was the hottest computer game in the world. It was blowing up huge, even by comparison with Warcraft II. Diablo’s sales surpassed 1 million units within months.



Diablo‘s eventual impact on the culture and practices of computer gaming was arguably more pronounced than that of any individual title since DOOM. It introduced phrases like “loot drop” into the gamer lexicon; it was the pioneer of a new era of easy online multiplayer gaming, between friends and strangers alike; it single-handedly dragged the entire genre of the CRPG back into public favor. This long shadow can make it oddly difficult to discuss as just a game. When I went back to play it recently for the first time in a quarter of a century — boy, I’m getting old! — I was impressed if not blown away by the experience. And yet, despite my best efforts, I couldn’t quite avoid allowing my opinions to be colored by some of what Diablo has wrought. We’ll get to that in due course. But first, Diablo the game…[1]

When you start a new adventure in the world of Diablo, you first choose your character from three fantasy archetypes: the warrior, who is best at bashing things with his big old sword; the rogue, who fights a little more surgically, preferring the bow and arrow; or the mage, who unlike his counterparts is pretty good with spells from the outset. But you don’t spend any time fussing about with statistics. You’re dropped into the hardscrabble village of Tristram, which has had the misfortune to be built over a demon’s not-so-final resting place, as soon as you’ve given your character a name. In Tristram, you can buy and sell in a few different shops and talk to a handful of villagers, but it’s all kept very short and sweet. Before you know it, you’ll be in the first dungeon, which is found beneath the graveyard of the local church.

You’ll have to fight your way through sixteen dungeon levels in all, divided into four sets of four that open up one after another, presenting ever more powerful monsters for your ever more powerful character to battle. In keeping with the game’s roguelike heritage, each level is procedurally generated. There is a modicum of story, even a cut scene here and there, but nothing you ever need to think too much about. (Although a fairly elaborate backstory does appear in the manual, it too is nothing you need to concern yourself with if you don’t want to. It was tacked on very late in development by Blizzard South, who realized that some gamers at least still liked to see such things.) There are also some pre-scripted quests to carry out, selected randomly from a pool of possibilities each time you start a new game. Most of these are given to you by the townspeople when you talk to them — but, again, all are extremely basic, coming down to “kill this monster” or “collect this object” (which, come to think of it, always involves killing the monster guarding it).

In practice, playing Diablo is a very simple loop. You go into the depths and make as much progress as you can against the hordes of enemies that await you there. Then you return topside to sell off the stuff you’ve collected that you don’t need, heal up and buy any potions or other equipment you think you’re going to need, and go downstairs again. Rinse and repeat, until you meet and hopefully kill Diablo himself. Unlike the typical epic CRPG, Diablo is intended to be a game you play over and over again. Thus the average playthrough takes only ten hours or so, as opposed to the hundred or more of its weightier brethren.

Blizzard North’s stated goal was to make Diablo “so easy your mom could play it.” Setting aside the condescension of their choice of words, they certainly achieved their goal in spirit. Fighting monsters is simply a matter of clicking on them, which causes your character to whack them with his melee weapon or fire off an arrow or spell at them. Tactics in the dungeons come down to common sense: whittling away at the edges of large groups of monsters instead of charging right into the middle of them, using doorways and narrow corridors to your advantage, keeping a healthy distance and using ranged attacks if you’re playing a rogue or a mage. That said, it does pay to learn the monsters’ strengths and weaknesses and tailor your attacks to them: skeletons, for example, are more vulnerable to attacks by blunt weapons such as maces than by edged weapons such as swords.

The biggest source of tension is the question of when you should leave off in the dungeon and return to the town for succor. Usually when you die, it’s because you’ve pressed your luck just a bit too much. On the whole, though — and ironically given its line of descent through one of the most infamously unforgiving sub-genres in all of gaming — Diablo is one of the less intrinsically challenging games I’ve played in the course of writing these histories. If you do find yourself feeling under-powered and over-matched — perhaps because you made poor choices about where to allocate the ability points your character is awarded every time she levels up — you can always restart the game whilst retaining your existing character, complete with her current statistics and all of her current kit. Poor character-building choices or a general lack of skill can, in other words, always be compensated for with patient grinding.

Notice the auto-map overlaid onto the standard display…

In lieu of challenge, Diablo thrives on its polished addictiveness. Vanishingly few of its contemporaries can even begin to touch it in terms of intuitive playability. It’s clear that every last detail — every last window, every last hotkey, every last mouse click — was fussed over for hours and hours, until it was just what it ought to be. The auto-map is a thing of wonder that I have to call out for special praise. In CRPGs of the 1990s, such things are usually found in a separate window on the main display that is always too small for comfort and yet takes up too much precious screen real estate — or the auto-map can only be accessed on a separate screen, leaving you constantly flipping back and forth between the two views as you try to get somewhere. Diablo‘s auto-map, on the other hand, appears as a transparent overlay right on top of the usual display, toggled on and off by pressing the TAB key. Like everything else here, it’s elegant and perfect, a brilliant stroke that could only have come about through dedicated, dogged iteration. You have to be in awe of the craftsmanship of this game. It knows precisely what it wants to be, and it achieves its best self in every respect.

This statement applies equally to the game’s aesthetics, which are nothing short of masterful; whatever Diablo lacks in set-piece storytelling, it makes up for in atmosphere. If I had to describe that atmosphere in one word, it would be “Gothic.” Diablo captures the side of the Middle Ages that all of those Tolkienesque CRPGs cheerfully ignore in the midst of all their elves and halflings romping merrily through the forest: the all-encompassing religion of Christianity, the almost tangible reality of another life that awaits after this one, which is as much a source of fear as comfort in the minds of the people. Diablo taps into something deep and almost primal in the human psyche, having more in common with The Exorcist than The Lord of the Rings, more in common with Hieronymus Bosch than Boris Vallejo. The shocking ending, which I won’t spoil here, is likewise more horror than fantasy. Diablo is lucky it wasn’t released during the Satanic Panic of the 1980s, given that it sports much of what all those concerned parents were looking for in Dungeons & Dragons and not quite finding.

The lair of the Butcher, one of the gorier locations in Diablo. “Fresh meat!”

Matt Uelmen’s amazingly sophisticated soundtrack, recorded partially on real instruments at a time when many games were still relying entirely on tinny MIDI sound fonts, could easily have played behind a big-budget horror movie. The “Town” theme, featuring the best use of a twelve-string guitar since the heyday of the Byrds, is especially unforgettable; it took me back instantly when I heard it again after 25 years away.


All that said, I won’t go so far as to say that Diablo itself is scary. It seems to me that gameplay that revolves around killing hundreds of monsters is incompatible with true horror. Horror depends on a feeling of powerlessness, whereas Diablo is, like almost all CRPGs, a power fantasy at bottom. Nevertheless, it’s as audiovisually focused and accomplished as any game I’ve ever seen. I say this even as I freely acknowledge that its unrelentingly dark atmosphere tends to wear thin with me pretty quickly. (For me, a bit of light and joy brings out the shadows that much more effectively.)

And sadly, that statement pretty much sums up my response to Diablo as a whole, which is the same today as it was 25 years ago. It does what it does brilliantly. I just wish I liked what it does a bit more. Let me tell you how I got on with it when I played it for this article…

Given its titanic importance, my first plan was to play through it three times, once for each of the character classes. I first bashed my way to the finish line as a warrior. As I did so, I admired all of the qualities described above, but I also found the experience a little hollow; I didn’t dread sitting down with the game on the couch after dinner each evening for an hour or two, but neither did I look forward to it all that much — and nor did my wife have to tell me twice that it was time for bed, as she has to when I’m playing some games. I came to regard my Diablo sessions much as I might, say, an old episode of Law & Order: a low-effort something to pass the time, which I could do while chatting intermittently with my wife about completely different things. When I finished the game, I put it on the shelf for several months, intending always to get back to it but never feeling all that excited about doing so. Finally, knowing I had to write this article soon, I forced myself to start a new game as a rogue, hoping that character might be more interesting to play. But this time I found myself actively bored; “been there, done that” was the dominant note. Halfway through, I just couldn’t muster the will to continue. I could admire Diablo for its craftsmanship, but I couldn’t love it.

What am I to make of this? Obviously, I’m in the group of people who just aren’t really in the market for what Diablo is selling — a group who tend to be as vocal in their criticisms as the game’s fans are in their praise. But I’m not eager to join the chest-beating grognards who call Diablo dumbed down, or who shout that it’s not even a real CRPG at all. (Is there anything more tedious than a semantic debate between intractably biased parties?) It’s actually not Diablo‘s simplicity that puts me off; I’m much more likely to scold a game for being too complicated than for being too simple. And then too, over the years I’ve been writing these histories, I’ve found many — perhaps most — games from the 1980s and 1990s to be more rather than less difficult than I really need them to be, so it’s not precisely the lack of challenge that bothers me about Diablo either. Too easy is far, far better in my book than too hard.

On the other hand, I do tend to prefer human-crafted to procedurally-generated content in general, and Diablo doesn’t do anything to disabuse me of that notion. Its randomized nature means that its dungeons can only be a collection of rooms, corridors, and monsters, without the guileful tricks and traps and drama of the best dungeon crawlers of yore. Beyond that, and beyond an aesthetic presentation that isn’t quite to my taste, I think my lack of receptivity to Diablo is to do with the passivity of the experience. I’ve seen it described as a good “hangover game,” what with how little it actually asks of you. Even more tellingly, I’ve seen it called the gaming equivalent of candy: you can eat an awful lot of it without thinking much about it, but it doesn’t leave you feeling all that great afterward.

One nice thing about getting older is that you learn what makes you feel good and bad. I’ve long since learned, for instance, that I’m happiest if I don’t play games for more than a couple of hours per day, even on those rare occasions when I have time for more. But I want those hours to have substance — to yield fun stories to tell, interesting decisions to remember, strategies or puzzle solutions to muse about while I’m cooking dinner or working out or taking a walk, accomplishments to feel good about. For me, Diablo is peculiarly flat; I went, I saw, I clicked on monsters. For me, it feels less like a time waster than a waste of time. I almost find myself wishing the game wasn’t so superbly polished in every particular, just to relieve the monotony.

More substantively, I do see one aspect of Diablo as vaguely ominous in the larger context of gaming history: the way it uses stuff to do the heavy lifting of player motivation. As I mentioned above, “loot drops” became a thing in gaming with this game. Although CRPGs had been tempting and teasing players with the prospect of a new magic sword or armor as long as they had existed, Diablo put that temptation front and center, making it the main driver of its gameplay loop. In doing so, David Brevik and company consciously tapped into something besides the allure of the Gothic that is primal in human psychology. They liked to use the analogy of a slot machine: you clicked endlessly on monsters in the hope that eventually something really good would drop out of one of them. When I hear these anecdotes, I can’t help but think of the glassy-eyed zombies to be found in casinos from Shreveport to Macau, pulling the handles of the one-armed bandits again and again for hours, likewise waiting for something good to drop into their laps. Pat Wyatt, Blizzard’s vice president of research and development at the time of Diablo‘s creation, proffers an even more disturbing metaphor: “Positive reinforcement is one of the hardest types of conditioning to break, which is why pets beg at the table: rewards may not happen very often, but every once in a while you get a scrap, so they keep begging.” In the decades after Diablo, this Pavlovian loop would be exploited mercilessly by cynical game makers, trapping players in unsatisfying cycles of addiction that drained their time and their wallet, leaving them with nothing but a few virtual trinkets to their names in a virtual world that would be gone in a year or two anyway.

In the late 1990s, the dangerous addictiveness of loot drops was most in evidence in multi-player Diablo, as played on Battle.net, which in its early years was a fascinating if ofttimes toxic social laboratory in its own right. I do have more to say about it, but I think I’ll reserve it for a future article which will look at this formative period of online gaming in a more holistic way.

Instead, let me say in conclusion today what I often say when I end a review on a downer note: that no game is for everyone, and no way of having fun is wrong, as long as you aren’t hurting anyone else or yourself. If you love Diablo, you’re in good company. It’s a fine, fine game by any objective measure. Whatever cynicism it might have inspired is on the conscience of the folks who displayed it; this game was made for all the right reasons. It’s a triumph of care and dedication from which many another studio could learn, then and now. Just be sure to remember that there’s a beautiful world out there with plenty of cloudless blue skies to contrast with Diablo‘s perpetually sooty ones, and you’ll be just fine. Click away, my friends, click away!



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.



(Sources: As was the case with my last article, I’m hugely indebted to David L. Craddock for Stay Awhile and Listen Book I and Book II, which I plundered for quotes with all the enthusiasm of a Diablo loot hunter. By all means, check out these books if you’re interested in learning more about the Blizzard story.

Magazine sources include Computer Gaming World of August 1996, December 1996, March 1997, April 1997, and May 1997; Retro Gamer 43 and 103. Online sources include Lee Hutchinson’s interview with David Brevik for Ars Technica, the Dev Game Club interview with Brevik, and Brevik’s Diablo post-mortem at the 2016 Game Developers Conference.

Diablo and its controversial expansion Hellfire are available as a single digital purchase at GOG.com.)

Footnotes

1 The commentary in this article deals only with the original Diablo. An expansion pack to the game called Hellfire, created out-of-house by the Sierra subsidiary Synergistic Software, was released in late 1997. The relationship between Blizzard North and Synergistic was plagued with discord from first to last, and David Brevik and many of his colleagues have since disowned many elements of Hellfire as fatal dilutions of their vision. So, we’ll honor Blizzard North’s original intentions here and stick to the base game.
 


Going Rogue

When a beleaguered Netscape announced in January of 1998 that it would release the source code to its browser for everyone to tinker with and improve upon, the news shook the worlds of technology and business to their foundations. This open-source “revolution,” as even many in the mainstream press took to calling it, had sprung up seemingly out of nowhere to challenge the conventional wisdom and perhaps the very livelihood of traditional tech giants like Microsoft. For the next several years, you couldn’t open a trade journal or a newspaper’s business section without seeing some mention of the open-source movement and its leading exemplar, the robust and yet totally free — in all senses of the word — operating system Linux. Linux and other software like it were, an eye-opening number of people said, destined to destroy Microsoft’s vaunted Windows monopoly any day now.

The movement’s Little Red Book came in the form of Eric S. Raymond’s 1997 essay “The Cathedral and the Bazaar.” Originally presented as a comparison of a top-down versus a bottom-up methodology in the context of open-source projects, the central metaphor quickly got blurred in the minds of the public into a broader comparison of closed source versus open source, with Raymond’s tacit acquiescence. In this telling, the cathedral was Microsoft’s software-development model, in which a closeted priesthood bestowed programs upon a grateful populace on its own terms and on its own schedule. The bazaar was the hacker way, in which the people came together in a spirit of delightfully chaotic egalitarianism to make software for themselves, sharing their source code in the name of the greater good. “No closed-source developer can match the pool of talent the Linux community can bring to bear on a problem,” wrote Raymond. “The closed-source world cannot win an evolutionary arms race with open-source communities that can put orders of magnitude more skilled time into a problem.” Thanks to Linux and the other open-source tools it enabled, he predicted elsewhere, Microsoft’s eagerly anticipated Windows 2000, the latest incarnation of its server-grade NT operating system, would “be either cancelled or dead on arrival. Either way, it will turn into a horrendous train wreck, the worst strategic disaster in Microsoft’s history.”

Alas, Raymond proved a less effective prophet than pundit. Not only was Windows 2000 not a failure upon its eventual release, but it evolved in 2001 into the consumer-grade Windows XP, by many standards the most successful single version of Windows in history.

Like that of all revolutions that have passed their heyday of strident ideology, the most extreme rhetoric of the late 1990s open-source movement can seem overheated if not downright silly today, the blinkered product of a tiny stratum of metaphorical inside cats who have concluded, rather conveniently for themselves, that the most important social-justice campaign of their age is one that can be waged from behind their keyboards and monitors, just the place where they happen to feel most comfortable. As for the ideas they introduced into the public discourse: they were real, valid, and in many ways incredibly valuable, but in the end they would be woven into the fabric of existing corporate-software production practices rather than burning down the old ways wholesale.

For rigid ideology seldom makes a good fit with the real world; pragmatically mixed national economies, for example, succeed vastly better than dogmatically capitalist or communist ones. Similarly, instead of continuing to sort itself into two opposing camps at eternal loggerheads, the modern software ecosystem has learned to take the best from both sides to wind up with a sort of mixed economy of its own. The cleverest actors have learned to combine the cathedral and the bazaar in ways that maximize the strengths of each: Google builds its proprietary Web browser Chrome atop an open-source engine known as Chromium; Apple constructed the OS X desktop on the solid foundation of an open-source operating system known as Darwin; Android mobile phones and tablets have Linux at their core. Even Microsoft now embeds an optional “Linux subsystem” into Windows, as the cats lie down with the dogs.

The reasons for open source’s failure to more comprehensively conquer the world aren’t that hard to divine; they’re actually front and center in some of the movement’s founding principles. The editors of the grandiosely titled 1999 anthology Open Sources: Voices from the Revolution — one of those books whose very name clues you into the window of time in which it was published — wrote that “most open-source projects began with frustration: looking for a tool to do a job and finding none, or finding one that was broken or poorly maintained. Eric Raymond began fetchmail this way; Larry Wall began Perl this way; Linus Torvalds began Linux this way.” The latter two of these projects at least have remained among the most essential of the workhorses that make the Internet function, strong arguments for the superiority of the open-source model for developing some types of software.

But it appears that the same is not true for all types of software. A model in which programmers create only the programs that they most want to have threatens to yield a universe of software which is interesting and attractive only to programmers. Even Eric Raymond had to acknowledge that the production of software with mass appeal is only partially a “technical problem.”

It’s [also] a problem in ergonomic design and interface psychology, and hackers have historically been poor at it. That is, while hackers can be very good at designing interfaces for other hackers, they tend to be poor at modeling the thought processes of the other 95 percent of the population well enough to write interfaces that J. Random End-User and his Aunt Tillie will pay to buy. Computers are tools for human beings. Ultimately, therefore, the challenges of designing hardware and software must come back to designing for human beings — all human beings.

Open source has never entirely made this leap. It’s for this reason that its biggest success stories have come in the realm of back-end software rather than user-facing applications. Witness the long, frustrating history of “Linux on the desktop,” which, in an echo of the old hacker joke about strong artificial intelligence, has been perpetually just a few years away from world domination ever since the late 1990s. There is no theoretical bar to visual designers and experts in ergonomic psychology joining open-source projects, and in some times and places this has even happened. And yet the broad field of open source is still dominated by programmers writing software for themselves and for one another.

Game development joins graphical user interfaces as another notable area where the bazaar model doesn’t quite seem to do the trick. The open-source methodology excels at solving purely technical problems, but the making of a great game is a technical problem only in part — usually, not even the most important part. Consider the case of one of the most critically lauded games of the late 1990s, Valve’s Half-Life. It was a triumph of design and aesthetics, not of technology; its engine was borrowed from id Software’s two-and-a-half-year-old Quake, a technological showstopper in its day which has aged far less gracefully. It would seem that the best way — or perhaps the only way – to create a great game from whole cloth is through a priesthood with a strong and distinctive design and aesthetic vision.

Those open-source games which have become relatively popular have tended to build upon previous game designers’ visions in much the same way that Chrome is built on Chromium: think FreeCiv or Open Transport Tycoon Deluxe, worthy projects that are nevertheless more interested in making workmanlike technical improvements to their inspirations than bold fundamental leaps in design. The open-source movement has had the most pronounced impact on gaming in the form of tools, both for making games and for playing them. I could never have embarked with you on this journey through history that we’ve been on for over a decade now without the likes of DOSBox, ScummVM, UAE, VICE, and many, many other open-source emulators and utilities of all descriptions. I am deeply grateful to the many talented programmers who have given their time to them in order to keep our digital past accessible. Still, they do remain purely technical projects, not creative ones in the sense of the games which they enable to run on modern hardware.

The one ghetto of gaming where open-source projects have been able to forge a strong design and aesthetic sensibility all their own — a sensibility with no obvious antecedents in commercial, closed-source games — turns out upon examination to be not quite the anomaly it might first appear. The “roguelike” sub-genre of the CRPG dates all the way back to 1980, well before the modern open-source movement came to be. But, like that movement, it was a product of an institutional-computing hacker culture that had been around since the 1950s, in which proprietary software was regarded as not so much immoral as simply unheard of. It stands today as a fine example of open source at its best — and equally of what it does less well. Call it the exception that proves the rule.



In Hackers: Heroes of the Computer Revolution, his classic chronicle of the first few decades of institutional hackerdom, Steven Levy writes about the appeal that Adventure, a game that would lend its name to an entire genre, held for the first people to play it on the big multi-user DEC computers of the late 1970s.

In a sense, Adventure was a metaphor for computer programming itself — the deep recesses you explored in the Adventure world were akin to the basic, most obscure levels of the machine that you’d be traveling in when you hacked assembly code. You could get dizzy trying to remember where you were in both activities. Indeed, Adventure proved as addicting as programming…

Rogue, a game which would lend its name to a sub-genre that had even more appeal to the programming mindset, was itself a direct outgrowth of Adventure, with a couple of key elements added to the mix.

Michael Toy and Glenn Wichman were undergraduates at the University of California, Santa Cruz when they first encountered Adventure. Like so many others, they were absolutely entranced. The only drawback was that, once they finally beat the game for the first time, there wasn’t much more to be done; the puzzles were always the same, meaning that beating it again became a rote exercise. And there weren’t yet any other games like it. So, the pair started to talk about creating a game of their own, one that would play a little bit differently. What if, rather than building their game around a collection of pre-crafted set-piece puzzles, they made one that would offer up a new world to the player every single time through the magic of random procedural generation? That way, you could keep playing it forever, even after beating it once or twice or a dozen times. Even Toy and Wichman themselves would be able to have fun with it, given that they too would never know what sort of world they would be entering next.

But what exactly might such a game look like in practice? It wasn’t at all clear; the problem of describing a procedurally generated world in English prose like that used by Adventure was effectively insoluble in the context of the time. Then Toy stumbled upon a new programming library for the Unix operating system (the predecessor to and inspiration of Linux). The brainchild of a University of California, Berkeley student named Ken Arnold, “curses” let you arrange text however you wanted on a terminal screen, changing the contents of any one of the 1920 cells that made up a typical 80-column by 24-line display whenever you wanted to; this made it possible to reserve different regions of the display for different sorts of information. Earlier games which hadn’t had access to curses, such as Adventure, had had to content themselves with teletype-like interactions: a continuous scrolling stream of text which, once fired at the screen, could only be forgotten. But curses changed all that at a stroke. You could use it to put up menus, maps, charts, and just about anything else you could write or draw using the ASCII character set, updating them all independently of one another.
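
For the technically curious, here’s a tiny sketch of the sort of thing curses made possible. It isn’t Ken Arnold’s original library code or anything from Rogue itself, just a minimal program written against the modern ncurses descendant of the library, with invented flavor text of my own; it places text at arbitrary cells and then updates a couple of them in isolation, which was precisely the capability Toy was looking for.

```c
/* A minimal sketch (not Ken Arnold's original code) of what curses offered:
 * put text at any cell of the 80x24 grid, then change individual cells
 * without scrolling or redrawing anything else. Build with: cc demo.c -lncurses */
#include <curses.h>

int main(void) {
    initscr();                                  /* take over the terminal      */
    noecho();
    curs_set(0);
    mvaddstr(0, 0, "Welcome to the Dungeons of Doom!");  /* message line       */
    mvaddstr(23, 0, "HP: 12/12   Gold: 0   Dlvl: 1");    /* status line        */
    mvaddch(12, 40, '@');                       /* the adventurer, mid-screen  */
    refresh();
    getch();                                    /* wait for a key press...     */
    mvaddch(12, 40, '.');                       /* ...then touch just two      */
    mvaddch(12, 41, '@');                       /* cells to "move" the @       */
    refresh();
    getch();
    endwin();                                   /* hand the terminal back      */
    return 0;
}
```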

It gave Toy and Wichman a viable path forward with their fondly imagined infinitely replayable game. For, while textual descriptions of a procedurally generated world were a nonstarter, showing a symbolic, visual representation of one using curses was another matter.

Avid players of tabletop Dungeons & Dragons, Toy and Wichman tried to recreate on the computer the dungeon-delving expeditions they enjoyed with their friends, exploring a network of rooms and tunnels filled with monsters to fight, traps and other hindrances to defuse, and treasures to collect. Whereas the main dish of Adventure had been set-piece puzzles, with only a side dish of dynamic logistical challenges — an expiring light source, an inventory limit, a pesky wandering thief with a sharp sword — the nature of their game meant that it would have to be all logistics. In making this switch, they half-accidentally invented not just the first roguelike but one of the first CRPGs, full stop. We cannot give them complete credit for that genre, mind you: other proto-CRPGs were being created at the same time on the PLATO system at the University of Illinois and on the earliest home microcomputers as well, as other Dungeons & Dragons fanatics also tried to bring the tabletop experience to the computer. Still, by all indications Toy and Wichman made the leap without knowing what anyone else was up to.

It was Wichman who came up with the name of Rogue:

I think the name just came to me. Names needed to be short because you invoked a program by typing its name in a command line. I liked the idea of a rogue. We were coming from a Dungeons & Dragons background, but we were creating a single-player game. You weren’t going down into the dungeon with a party. The idea was that this is a person going off on his or her own. It captured the theme very succinctly.

To depict their world, Toy and Wichman invented the iconography (textography?) that has remained the standard for roguelikes to this day. The walls of rooms were made from horizontal and vertical dashes (“-” and “|”), the tunnels between them from hash marks (“#”), doors from plus signs (“+”), treasure from dollar signs (“$”), monsters of various types from any and all letters and symbols that weren’t already being used for something else. The focus of it all was your titular rogue, depicted as a forlorn little at-sign (“@”) adrift in this sea of promise and danger. The textual austerity of it all could become weirdly atmospheric. “You’d see a letter ‘T’ on the screen and it would startle you, because you knew it was a troll,” says Wichman.
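
To give a sense of what that looked like, here’s a mocked-up fragment of a level, as a throwaway C program might print it. The layout is invented for illustration; only the symbols follow the conventions just described.

```c
/* Prints an invented fragment of a Rogue-style level: room walls (- and |),
 * a corridor (#), doors (+), gold ($), a troll (T), and the player's @.
 * Purely illustrative; not taken from the real game's code or maps. */
#include <stdio.h>

int main(void) {
    const char *level[] = {
        "   ----------",
        "   |........|     ------",
        "   |...$....+#####+..T.|",
        "   |....@...|     ------",
        "   ----------",
    };
    for (size_t i = 0; i < sizeof level / sizeof level[0]; i++)
        puts(level[i]);
    return 0;
}
```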

Rogue

The goal of the game was to find a MacGuffin called the Amulet of Yendor, hidden 25 dungeon levels or so deep, and return it to the surface. Doing so would require fighting ever more dangerous monsters, building up your character as you did so in classic RPG fashion, both through the experience points you gained from killing them and the equipment you collected. From the first, Rogue was intended to be hard — hard enough to challenge the very people who had made it. This is another quality that has remained a core value of the sub-genre which Rogue invented.

You didn’t know what the stuff you found actually did. Would that yellow potion restore your health, or would it kill you instantly? The safest way to know for sure was to use an “identification” scroll on your new finds, but such things were rare and precious, and ironically had to be themselves identified first. In a pinch, you might just have to try on that new ring or armor and see what happened, praying as you did so that it wasn’t cursed.

Food was the most essential resource of all; while you could eat the corpses of many monsters, some of them would make you sick and some of them would get their posthumous revenge by outright killing you. (Roguelikes are a bit like the old saw about the Australian Outback: everything in them seems to be able to kill you.) The only way to have a chance of winning was to play the game over and over again, slowly ferreting out its secrets and devising optimal strategies in the course of dying again and again and again. Even once you got really good, the difference between success and failure could still come down to sheer dumb luck, as “CRPG Addict” Chet Bolingbroke noted in his articles about the game: “Sometimes you might find a two-handed sword +1 on the first level; other times, you’ll find three poison potions and a cursed dagger.” Rogue‘s own co-creator Glenn Wichman admits that he has never legitimately won it.

Rogue, in other words, flagrantly violated almost all of the modern rules of progressive game design: it was unfair in countless ways and about as unwelcoming to newcomers as a game can be. It was a comedian telling jokes at the poor player’s expense, its later levels stocked with rust monsters that instantly destroyed her hard-won magical armor (until she learned to take it off before fighting them) and rattlesnakes that poisoned her (until she learned that the only practical way to combat them was to chuck whatever junk was to hand at them from a distance). And death was an irrevocable state. Although you could save a game of Rogue and come back to it later, this was intended only for the purpose of resuming an interrupted session: the save file was deleted as soon as you restored it. There were no second chances in Rogue; a single ill-considered move, or a single errant key press, or just a simple stroke of random bad luck, could and usually did erase hours of careful, steady progress.

And yet people found it strangely compelling. This was doubtless partially down to the times; there weren’t a lot of games available to play, which meant that the amount of time and energy required to get good at this one could seem more like an advantage than a disadvantage. But there was also more to it than that, as is indicated by the survival of the roguelike sub-genre right down to the present day, with all of its legendary difficulty intact. Rogue seemed to scratch a different itch than most games, a rash from which hackers seemed particularly prone to suffer. Very few successfully retrieved the Amulet of Yendor, but that only made the prospect of doing so that much more tempting. In the hyper-competitive culture of hackerdom, beating Rogue became a badge of honor almost on a par with writing some super-useful, super-elegant program that made everyone else jealous.

All of this didn’t happen instantly. Like most games on the big institutional computers, Rogue was a work in progress for years after the first version of it went up at UC Santa Cruz, probably in 1980. In 1982, Michael Toy got kicked out of the university for spending too much time tinkering with Rogue and not enough keeping up with his classwork. He took a job in UC Berkeley’s computer lab instead, splintering the partnership that had taken Rogue this far. Wichman now dropped off the scene, to be replaced at Berkeley by, of all people, Ken Arnold, the very hacker whose curses library had inspired the initial creation of Rogue. Toy and Arnold continued to expand and refine the game until they left Berkeley in 1984.

It was during this period that Rogue got really popular, spreading far and wide with the Unix operating system on which it ran, by now the overwhelming hacker favorite. Rogue became an almost equivalent touchstone of hacker culture, being played obsessively everywhere from Bell Labs to the Nevada Test Site. The game’s creators were thrilled when they learned that both Ken Thompson and Dennis Ritchie — living gods among hackers, the creators of Unix itself — were major fans of the game; Ritchie jokingly called it the biggest single waster of CPU cycles in computing history. When Toy attempted to commercialize Rogue in 1984 by releasing an MS-DOS port through the publisher Epyx, he felt justified in advertising it as “the most popular game running on Unix” and “the most popular game on college campuses.”

By the time Rogue hit microcomputers, its partial inspiration Adventure had spawned its own thriving corner of the home-computer-games market, where companies like Infocom sold hundreds of thousands of slickly packaged parser-driven text adventures. But home users proved markedly less receptive to Rogue after its belated arrival. Even after Wichman came back on the scene to help Toy make prettier, semi-graphical versions of the game for the Apple Macintosh, Atari ST, and Commodore Amiga, Rogue didn’t sell all that many copies. Wichman could only conclude that the audience that had made it such a hit on the big computers “wasn’t the audience that was looking for games in software stores.” It was a fair assessment: roguelikes would remain staples of hacker culture, but would never make inroads into the flashier commercial-games market.

Epyx’s Rogue was one of the last artifacts of that company’s original, cerebral “Automated Simulations” identity, appearing the same year that Summer Games and Impossible Mission cemented its new image as a purveyor of slick, audiovisually polished, action-oriented titles. Small wonder that Rogue seemed to get lost in the marketing shuffle.

The Amiga Rogue was a graphical affair, but that didn’t do much for its sales.

In this as in so many other respects, Rogue laid down the template for all of the roguelikes to come as thoroughly as Adventure did for its progeny. But there was one important exception, albeit one external to the game itself: Toy, Wichman, and Arnold didn’t release their source code to the public, clinging to the role of the high priests of a cathedral rather than embracing the bazaar model of software development. “In retrospect, it would have been better to share,” admits Arnold. Yet it isn’t that surprising that they didn’t. Open source had yet to become an ideological movement, even among the hardcore hacker contingent to which Rogue‘s fathers belonged. And they did, after all, have hopes of commercializing the game, even if those hopes ultimately failed to come to complete fruition.

As it was, the lack of source code meant that those who dreamed of building a better Rogue had no choice but to start from scratch. Among the first to do so was a group of boys who hung out together in the computer lab at Lincoln-Sudbury High School in Sudbury, Massachusetts, at the dawn of the 1980s. The school’s single modest DEC PDP-11 minicomputer wasn’t wired to the Internet, but the gang nevertheless encountered Rogue early in its history: in the summer of 1981, when their mentor, a young teacher named Brian Harvey, finagled an invitation for them to go out to Berkeley for a few weeks, to see what life was like in the big leagues of institutional computing. One of the kids who went was named Jay Fenlason. He fell in love with Rogue at first sight, managing to play it for about eight hours by his own estimate during the visit. He returned to Massachusetts determined to make a game just like it. He corralled his buddies into an unlikely game-development team, and over the course of the next year they made Hack, working strictly from their memories of the game they had seen at Berkeley.

That initial version of Hack has been lost, leaving behind only scattered anecdotes. However, all indications are that it wasn’t any remarkable advance over Rogue in itself. What made it important — indeed, what changed everything for the nascent roguelike sub-genre — was the decision Fenlason and his friends made to give away not just their executable but their source code as well.

To celebrate their graduation in 1982, the computer-lab gang packaged up the source code to all of the programs they had written, Hack among them, and sent it to an organization called USENIX, a computing-research nonprofit that maintained a file archive for its members. The source bore a simple notice at the top, saying that anyone who wished to was free to make improvements to the software and distribute them, as long as due credit was given to the original creators as well and as long as they shared the updated source. Having done that, the youngsters who had made Hack went their separate ways, having no idea what the game they had loosed upon the world would someday grow into.

At first, their lack of expectations seemed more than justified; while Rogue went everywhere in hackerdom, Hack went nowhere. Then, in early 1984, a thirty-something Dutch mathematician and programmer named Andries Brouwer, who worked at the Amsterdam research center Mathematisch Centrum, chanced to trawl through USENIX’s file archive, looking for interesting software. Just as Don Woods had rescued Will Crowther’s incomplete game of Adventure from oblivion back in 1977, Brouwer now stumbled across Hack and did it the same service. He tightened up the code and the gameplay, and then started adding new features, which he tested on his colleagues at Mathematisch Centrum, most of whom became certifiable Hack addicts. Beginning on December 17, 1984, he uploaded each new version to the Internet as well.

Brouwer added the concept of character classes to the game, introducing six of them; no rogue was to be found among them, but they did include the likes of a tourist and an archeologist, evidence of a quirky sense of humor that would continue to mark the game forevermore. He added shops in the dungeon for buying and selling equipment, and made the dungeon deeper; it now went down 40 levels, the last ten a special region called Hell that demanded magical protection from fire and a teleport spell to even enter. No longer did you find the Amulet of Yendor just lying around somewhere down there in the depths; now you had to defeat a Wizard of Yendor to get your mitts on it. To these big enhancements he added a wealth of smaller details that were likewise destined to remain indelible parts of the game, such as a dog or cat companion to accompany you on your expedition and the ability to write messages on the floor for various purposes.

For years, players of Rogue had been sharing their tips and travails on the Usenet group net.games.rogue. It was here that Brouwer now announced his new roguelike. The community there pounced upon Hack, which, if not clearly better than Rogue, did have the virtue of being subtly different from a game which most of them had already played to death. The volume of Hack-related traffic grew so extreme that, just one month after Brouwer had uploaded his game for the first time, the group net.games.hack came into being to accommodate it. “Please stop posting articles about Hack to net.games.rogue and use this new group instead,” wrote a Usenet administrator pointedly.

Brouwer kept his fire hose of additions and improvements spurting until July of 1985, when he pronounced himself satisfied with the game and moved on to other things. But, thanks to the fact that he had honored the wishes of Jay Fenlason and company and publicly released his source code, Hack could continue to morph and grow after his departure in a way that Rogue had not been able to after Michael Toy and Ken Arnold left Berkeley. Ports and modified versions were soon popping up everywhere. It was exciting in a way, but it became a bit too much like the babble of a bazaar. Three hackers, by the names of Mike Stephenson, Izchak Miller, and Janet Walz, decided that a little bureaucracy wouldn’t be amiss. They set out to create a sort of curated version of the game, incorporating changes from anyone who wished to contribute to the project, as long as they were well-coded, worthwhile, and not game-breaking. Because their home base was net.games.hack, they named their version of the game NetHack. Its first official release came in July of 1987; its most recent one as of this writing came out in February of 2023. I suspect that there will be many, many more before NetHack‘s full history can be written.

NetHack is an answer for every player of traditional adventure games who has ever asked why she can’t just bash a door open instead of searching hither and yon for the key.

The semi-anonymous wizards behind the NetHack curtain are known simply as the DevTeam. For 36 years, this rotating cast of characters has maintained and added to the game, making it one of the most systemically complex games ever created, if not the most, even as it retains in its canonical version an entirely textual display focused around a little wandering at-sign. Experienced players delight in ferreting out the emergent possibilities provided by the sheer depth of NetHack‘s systems. “The DevTeam thinks of everything,” goes a saying among players.

To wit: use a pair of gloves to pick up a dead cockatrice, a creature which turns any living thing it touches to stone, then bash your enemies with it to turn them to stone. (This technique is known among the NetHack cognoscenti as “wielding the rubber chicken.”) Of course, you’ll need to use a pick axe afterward to separate the statues of your enemies that are left behind from the loot they were carrying…

Or combine a Wand of Polymorph with a Ring of Polymorph Control to eliminate the middleman, as it were, turning yourself into a cockatrice. You can lay eggs in this form, which you can pick up and carry around once you revert to your natural form, throwing them at your enemies like grenades while you gleefully sing “Rainy Day Women #12 and 35.”

The possibilities are endless. NetHack even keeps track of the phases of the moon in the real world and uses them to influence your luck; this leads to devotees clearing their calendars once per month in order to maximize their chances when the moon is full.
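
The moon-phase trick is less exotic than it sounds. The sketch below is not NetHack’s actual routine, just a back-of-the-envelope illustration of the idea: read the system clock, count the days since a known new moon, and see whether you land near the middle of the roughly 29.5-day lunar cycle. The reference new moon and the three-day window are simply convenient choices for this example.

```c
/* A back-of-the-envelope moon-phase check -- not NetHack's real code, just an
 * illustration of keying in-game luck to the real-world calendar. We measure
 * the days elapsed since a known new moon and reduce them modulo the mean
 * length of a lunar month. Build with: cc moon.c -lm */
#include <stdio.h>
#include <time.h>
#include <math.h>

int main(void) {
    const double synodic_month = 29.530588853;   /* mean lunar month, in days    */
    const time_t known_new_moon = 947182440;     /* 2000-01-06 18:14 UTC         */
    double days = difftime(time(NULL), known_new_moon) / 86400.0;
    double age = fmod(days, synodic_month);      /* days since the last new moon */
    int full_moon = fabs(age - synodic_month / 2.0) < 1.5;   /* ~3-day window    */
    printf("Lunar age: %.1f days -- %s\n",
           age, full_moon ? "full moon tonight: luck bonus!" : "no bonus today");
    return 0;
}
```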

NetHack has become an institution of old-school hacker culture, and with it an icon of the open-source movement. None other than Eric Raymond was the first to create an optional graphical skin for the game (a move that prompted considerable controversy). And well before he wrote The Cathedral and the Bazaar, he wrote the first manual for NetHack. Small wonder that it joined Rogue and Adventure as one of the very few games memorialized in 1996’s New Hacker’s Dictionary — edited by, you guessed it, Eric S. Raymond. DevTeam founding member Mike Stephenson has no doubts about NetHack‘s importance, not only as a standalone game but as a model for software development: “We predated open source [as a movement], but I do think we helped to promote the idea of making software available for public use without cost. I think the other thing that really contributed to the concept of open source is that NetHack has, and still does, accept bug reports and feature ideas from anyone.”

NetHack became the standard bearer of the roguelike sub-genre almost from the moment of its first release, and has never had its status in this regard seriously challenged. That said, hundreds of other roguelikes were made after it, and some even before it. The most important among them are arguably Moria and Angband. The former arrived at a complete form already in 1983, when it became the first game of this type to offer an above-ground town to serve as a base for your dungeon expeditions; this gave it a significantly different feel, more like, to put things in the terms of Dungeons & Dragons, an ongoing campaign than a single adventure module. Moria directly inspired 1990’s Angband, a much more complex implementation of the same approach, which, like NetHack, is still in active development today. Some players prefer NetHack‘s relentlessly escalating challenge, others Angband‘s somewhat more relaxed pacing and more free-form structure — but make no mistake, Angband too will kill you in a heartbeat if you let your guard down. And in it as well, dead is dead, permanently.

This roguelike “family tree” shows how the most historically and currently popular games in the sub-genre relate to one another.

This brings us back around to a statement I made at the outset: that roguelikes are the exception that proves the rule of open-source game development — and just possibly of open-source software development in general. The cast of thousands who contribute to them do so in order to make exactly the games that they want to play, which in the abstract is the best of all possible reasons to make a game. The experience they end up with is, unsurprisingly, much like high-wire programming at its most advanced, presenting players with an immense, multi-faceted system to be explored and mastered. And there is absolutely nothing wrong with this.

Still, it does seem to me that roguelikes tend to bring out some of the worst as well as the best of the hacker ethic, what with their insistence that they’re only for the “hardcore” and their lack of empathy for the newcomer. Few things in this world are less attractive than a nerd beating his chest. Robert Koeneke, the creator of Moria, admits that while he was working on it, “if anyone managed to win, I immediately found out how, and ‘enhanced’ the game to make it harder.” Likewise, for every cool interaction to be discovered in NetHack, there’s a cheap, heartless death in store, like stumbling down a staircase whilst carrying a cockatrice and turning yourself to stone, or missing a stirrup whilst trying to mount a horse and breaking your neck, or incinerating yourself by firing off your Wand of Lightning too close to a wall, or getting killed by your own pet dog when you attempt to use your Ring of Conflict to get that nearby band of orcs fighting one another. NetHack is the sort of game that likes to give you a fake Amulet of Yendor, then laugh at you when you scurry all the way back to the surface with it and think you’re about to win.

As with so much in life, one’s relationship to roguelikes comes down to questions of priorities. As someone who likes to play a variety of games, I’ve never done more than dabble in these ones. For the time required to get even minimally competent at them is more than I’m willing to invest in any single game — or that I can invest, if I want to keep doing what I do on this site.

Meanwhile, the amount of time and effort required to get good at a game like NetHack is staggering, even if you’re far smarter and more diligent than I am. It took Chet Bolingbroke 262 hours to win NetHack for the first time — and that was playing in a fashion that many purists would consider illegitimate, by looking up spoilers on the game’s many interconnected components rather than learning strictly through experience, not to mention playing an old version that is much less complex than the current ones. Was it worth the time investment? He has his doubts. “Permadeath just sucks,” he concludes. Even Eric Raymond feels today that NetHack may have gone too far: “There was a natural tendency for the devs to see the game from the point of view of someone who played it constantly and obsessively. Thus, over time, their notion of not making it ‘too easy’ gradually ratcheted up the difficulty level to the point where you really couldn’t enjoy it casually anymore.” NetHack displays, in other words, open-source software’s usual Achilles heel, its developers’ inability to put themselves in the shoes of people who aren’t just like them.

Then again, it isn’t as if this represents some deep moral failing; there’s nothing wrong with being niche. Many or most lovers of NetHack and other roguelikes have never won them and quite probably never will, finding satisfaction merely in the trying, in hoping to get a little further than last time and walk away with some entertaining stories to share. Far be it from me to begrudge them their pleasures. Although I doubt that I will ever become a big fan of roguelikes, I do derive a quiet sort of satisfaction from knowing that things so implacably committed to being their own idiosyncratic selves exist in this world.

And if roguelikes will never go mainstream, that doesn’t mean they haven’t influenced the mainstream. Next time, we’ll learn how one of the most popular of all the slick commercial games of the late 1990s grew out of this odd little corner of hackerdom…



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


(Sources: I highly recommend David L. Craddock’s book Dungeon Hacks: How NetHack, Angband, and Other Roguelikes Changed the Course of Video Games, a treasure trove of information that I have only touched upon here. The CRPG Addict blog is full of stories about what it’s like to actually play Rogue, Hack, 1987-vintage NetHack, 1989-vintage NetHack, Moria, and Angband, among other roguelikes, along with some more historical notes. I’m immensely indebted to David for all of his original research and to Chet for spending the hundreds of hours on these games that I couldn’t spare.

Other print sources include the books Hackers: Heroes of the Computer Revolution by Steven Levy, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary by Eric S. Raymond, and Open Sources: Voices from the Revolution edited by Chris DiBona, Sam Ockman, and Mark Stone; Byte of March 1984 and February 1987; Acorn User of February 1997; Computer Power User of March 2008. Other online sources include Glenn Wichman’s “Brief History of Rogue,” “The Best Game Ever” by Wagner James Au at Salon, “Playing the Open Source Game” by Shawn Hargreaves, “Freeing an Old Game” by Ben Asselstine at Free Software Magazine, and a retrospective on NetHack by Dave “Fargo” Kosak of GameSpy.

Much more information about all of the games mentioned in this article, and roguelikes in general, can be found at RogueBasin, as can download links for all of them.)

 


A Dialog in Real Time (Strategy)

At the end of the 1990s, the two most popular genres in computer gaming were the first-person shooter and the real-time strategy game. They were so dominant that most of the industry’s executives seemed to want to publish little else. And yet at the beginning of the decade neither genre even existed.

The stories of how the two rose to such heady heights are a fascinating study in contrasts, of how influences in media can either go off like an explosion in a TNT factory or creep along like the slow burn of a long fuse. Sometimes something appears and everyone knows instantly that it’s just changed everything; when the Beatles dropped Sgt. Pepper’s Lonely Hearts Club Band in 1967, there was no doubt that the proverbial goalposts in rock music had just been shifted. Other times, though, influence can take years to make itself felt, as was the case for another album of 1967, The Velvet Underground & Nico, about which Brian Eno would later famously say that it “only sold 10,000 copies, but everyone who bought it formed a band.”

Games are the same. Gaming’s Sgt. Pepper was DOOM, which came roaring up out of the shareware underground at the tail end of 1993 to sweep everything from its path, blowing away all of the industry’s extant conventional wisdom about what games would become and what role they would play in the broader culture. Gaming’s Velvet Underground, on the other hand, was the avatar of real-time strategy, which came to the world in the deceptive guise of a sequel in the fall of 1992. Dune II: The Building of a Dynasty sported its Roman numeral because its transnational publisher had gotten its transatlantic cables crossed and accidentally wound up with two separate games based on Frank Herbert’s epic 1965 science-fiction novel: one made in Paris, the other in Las Vegas. The former turned out to be a surprisingly evocative and playable fusion of adventure and strategy game, but it was the latter that would quietly — oh, so quietly in the beginning! — shift the tectonic plates of gaming.

For Dune II, which was developed by Westwood Studios and published by Virgin Games, really was the first recognizable implementation of the genre of real-time strategy as we have come to know it since. You chose one of three warring trading houses to play, then moved through a campaign made up of a series of set-piece scenarios, in which your first goal was always to make yourself an army by gathering resources and using them to build structures that could churn out soldiers, tanks, aircraft, and missiles, all of which you controlled by issuing them fairly high-level orders: “go here,” “harvest there,” “defend this building,” “attack that enemy unit.” Once you thought you were strong enough, you could launch your full-on assault on the enemy — or, if you weren’t quick enough, you might find yourself trying to fend off his attack. What made it so different from most of the strategy games of yore was right there in the name: in the fact that it all played out in real time, at a pace that ranged from the brisk to the frantic, making it a test of your rapid-fire mousemanship and your ability to think on your feet. Bits and pieces of all this had been seen before — perhaps most notably in Peter Molyneux and Bullfrog’s Populous and the Sega Genesis game Herzog Zwei — but Dune II was where it all came together to create a gaming paradigm for the ages.

That said, Dune II was very much a diamond in the rough, a game whose groundbreaking aspirations frequently ran up against the brick wall of its limitations. It’s likely to leave anyone who has ever played almost any other real-time-strategy game seething with frustration. It runs at a resolution of just 320 X 200, giving only the tiniest window into the battlefield; it only lets you select and control one unit at a time, making coordinated attacks and defenses hard to pull off; its scenarios are somewhat rote exercises, differing mainly in the number of enemy hordes they throw against you as you advance through the campaign rather than the nature of the terrain or your objectives. Even its fog of war is wonky: the whole battlefield is blank blackness until one of your units gets within visual range, after which you can see everything that goes on there forevermore, whether any of your units can still lay eyes on it or not. And it has no support whatsoever for the multiplayer free-for-alls that are for many or most players the biggest draw of the genre.

Certainly Virgin had no inkling that they had a nascent ludic revolution on their hands. They released Dune II with more of a disinterested shrug than a fulsome fanfare, having expended most of their promotional energies on the other Dune, which had come out just a few months earlier. It’s a testimony to the novelty of the gameplay experience that it did as well as it did. It didn’t become a massive hit, but it sold well enough to earn its budget back and then some on the strength of reasonably positive reviews — although, again, no reviewer had the slightest notion that he was witnessing the birth of what would be one of the two hottest genres in gaming six years in the future. Even Westwood seemed initially to regard Dune II as a one-and-done. They wouldn’t release another game in the genre they had just invented for almost three years.

But the gaming equivalent of all those budding bedroom musicians who listened to that Velvet Underground record was also out there in the case of Dune II. One hungry, up-and-coming studio in particular decided there was much more to be done with the approach it had pioneered. And then Westwood themselves belatedly jumped back into the fray. Thanks to the snowball that these two studios got rolling in earnest during the mid-1990s, the field of real-time strategy would be well and truly saturated by the end of the decade, the yin to DOOM‘s yang. This, then, is the tale of those first few years of these two studios’ competitive dialog, over the course of which they turned the real-time strategy genre from a promising archetype into one of gaming’s two biggest, slickest crowd pleasers.


Blizzard is one of the most successful studios in the history of gaming, so much so that it now lends its name to the Activision Blizzard conglomerate, with annual revenues in the range of $7.5 billion. In 1993, however, it was Westwood, flying high off the hit dungeon crawlers Eye of the Beholder and Lands of Lore, that was by far the more recognizable name. In fact, Blizzard wasn’t even known yet as Blizzard.

The company had been founded in late 1990 by Allen Adham and Mike Morhaime, a couple of kids fresh out of university, on the back of a $15,000 loan from Morhaime’s grandmother. They called their venture Silicon & Synapse, setting it up in a hole-in-the-wall office in Costa Mesa, California. They kept the lights on initially by porting existing games from one platform to another for publishers like Interplay — the same way, as it happened, that Westwood had gotten off the ground almost a decade before. And just as had happened for Westwood, Silicon & Synapse gradually won opportunities to make their own games once they had proven themselves by porting those of others. First there was a little auto-racing game for the Super Nintendo called RPM Racing, then a pseudo-sequel to it called Rock ‘n’ Roll Racing, and then a puzzle platformer called The Lost Vikings, which appeared for the Sega Genesis, MS-DOS, and the Commodore Amiga in addition to the Super Nintendo. None of these titles took the world by storm, but they taught Silicon & Synapse what it took to create refined, playable, mass-market videogames from scratch. All three of those adjectives have continued to define the studio’s output for the past 30 years.

It was now mid-1993; Silicon & Synapse had been in business for more than two and a half years already. Adham and Morhaime wanted to do something different — something bigger, something that would be suitable for computers only rather than the less capable consoles, a real event game that would get their studio’s name out there alongside the Westwoods of the world. And here there emerged another of their company’s future trademarks: rather than invent something new from whole or even partial cloth, they decided to start with something that already existed, but make it better than ever before, polishing it until it gleamed. The source material they chose was none other than Westwood’s Dune II, now relegated to the bargain bins of last year’s releases, but a perennial after-hours favorite at the Silicon & Synapse offices. They all agreed as to the feature they most missed in Dune II: a way to play it against other people, like you could its ancestor Populous. The bane of most multiplayer strategy games was their turn-based nature, which left you waiting around half the time while your buddy was playing. Real-time strategy wouldn’t have this problem of downtime.

That became the design brief for Warcraft: Orcs & Humans: remake Dune II but make it even better, and then add a multiplayer feature. And then, of course, actually try to sell the thing in all the ways Virgin had not really tried to sell its inspiration.

To say that Warcraft was heavily influenced by Dune II hardly captures the reality. Most of the units and buildings to hand have a direct correspondent in Westwood’s game. Even the menu of icons on the side of the screen is a virtual carbon copy — or at least a mirror image. “I defensively joked that, while Warcraft was certainly inspired by Dune II, [our] game was radically different,” laughs Patrick Wyatt, the lead programmer and producer on the project. “Our radar mini-map was in the upper left corner of the screen, whereas theirs was in the bottom right corner.”

In the same spirit of change, Silicon & Synapse replaced the desert planet of Arrakis with a fantasy milieu pitting, as the subtitle would suggest, orcs against humans. The setting and the overall look of Warcraft owe almost as much to the tabletop miniatures game Warhammer as the gameplay does to Dune II; a Warhammer license was seriously considered, but ultimately rejected as too costly and potentially too restrictive. Years later, Wyatt’s father would give him a set of Warhammer miniatures he’d noticed in a shop: “I found these cool toys and they reminded me a lot of your game. You might want to have your legal department contact them because I think they’re ripping you off.”

Suffice to say, then, that Warcraft was even more derivative than most computer games. The saving grace was the same that it would ever be for this studio: that they executed their mishmash of influences so well. The squishy, squint-eyed art is stylized like a cartoon, a wise choice given that the game is still limited to a resolution of just 320 X 200, so that photo-realism is simply not on the cards. The overall look of Warcraft has more in common with contemporary console games than the dark, gritty aesthetic that was becoming so popular on computers. The guttural exclamations of the orcs and the exaggerated Monty Python and the Holy Grail-esque accents of the humans, all courtesy of regular studio staffers rather than outside voice actors, become a chorus line as you order them hither and yon, making Dune II seem rather stodgy and dull by comparison. “We felt too many games took themselves too seriously,” says Patrick Wyatt. “We just wanted to entertain people.”

Slavishly indebted though it is to Dune II in all the broad strokes, Warcraft doesn’t neglect to improve on its inspiration in those nitty-gritty details that can make the difference between satisfaction and frustration for the player. It lets you select up to four units and give them orders at the same time by simply dragging a box around them, a quality-of-life addition whose importance is difficult to overstate, one so fundamental that no real-time-strategy game from this point forward would dare not to include it. Many more keyboard shortcuts are added, a less technically impressive addition but one no less vital to the cause of playability when the action starts to heat up. There are now two resources you need to harvest, lumber and gold, in place of Dune II‘s all-purpose spice. Units are now a little more intelligent about interpreting your orders, such that they no longer blithely ignore targets of opportunity, or let themselves get mauled to death without counterattacking just because you haven’t explicitly told them to. Scenario design is another area of marked improvement: whereas every Dune II scenario is basically the same drill, just with ever more formidable enemies to defeat, Warcraft‘s are more varied and arise more logically out of the story of the campaign, including a couple of special scenarios with no building or gathering at all, where you must return a runaway princess to the fold (as the orcs) or rescue a stranded explorer (as the humans).
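For the programming-curious, the box-selection mechanic is simple enough to sketch in a few lines: gather every friendly unit whose position falls inside the dragged rectangle, stopping at whatever cap the engine imposes. The Python fragment below is purely illustrative, with names of my own invention; it is not drawn from Blizzard’s actual code.

```python
# Illustrative sketch of drag-box unit selection (invented names throughout).

def select_units(friendly_units, box_left, box_top, box_right, box_bottom, limit=4):
    """Return up to `limit` friendly units whose map positions fall inside the box."""
    selected = []
    for unit in friendly_units:
        if box_left <= unit.x <= box_right and box_top <= unit.y <= box_bottom:
            selected.append(unit)
            if len(selected) == limit:  # the first Warcraft capped selection at four units
                break
    return selected
```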

The orc on the right who’s stroking his “sword” looks so very, very wrong — and this screenshot doesn’t even show the animation…

And, as the cherry on top, there was multiplayer support. Patrick Wyatt finished his first, experimental implementation of it in June of 1994, then rounded up a colleague in the next cubicle over so that they could become the first two people ever to play a full-fledged real-time-strategy game online. “As we started the game, I felt a greater sense of excitement than I’d ever known playing any other game,” he says.

It was just this magic moment, because it was so invigorating to play against a human and know that it wasn’t some stupid AI. It was a player who was smart and doing his absolute best to crush you. I knew we were making a game that would be fun, but at that moment I knew the game would absolutely kick ass.

While work continued on Warcraft, the company behind it was going through a whirlwind of changes. Recognizing at long last that “Silicon & Synapse” was actually a pretty terrible name, Adham and Morhaime changed it to Chaos Studios, which admittedly wasn’t all that much better, in December of 1993. Two months later, they got an offer they couldn’t refuse: Davidson & Associates, a well-capitalized publisher of educational software that was looking to break into the gaming market, offered to buy the freshly christened Chaos for the princely sum of $6.75 million. It was a massive over-payment for what was in all truth a middling studio at best, such that Adham and Morhaime felt they had no choice but to accept, especially after Davidson vowed to give them complete creative freedom. Three months after the acquisition, the founders decided they simply had to find a decent name for their studio before releasing Warcraft, their hoped-for ticket to the big leagues. Adham picked up a dictionary and started leafing through it. He hit pay dirt when his eyes flitted over the word “blizzard.” “It’s a cool name! Get it?” he asked excitedly. And that was that.

So, Warcraft hit stores in time for the Christmas of 1994, with the name of “Blizzard Entertainment” on the box as both its developer and its publisher — the wheels of the latter role being greased by the distributional muscle of Davidson & Associates. It was not immediately heralded as a game that would change everything, any more than Dune II had been; real-time strategy continued to be more of a slowly growing snowball than the ton of bricks to the side of the head that the first-person shooter had been. Computer Gaming World magazine gave Warcraft a cautious four stars out of five, saying that “if you enjoy frantic real-time games and if you don’t mind a linear structure in your strategic challenges, Warcraft is a good buy.” At the same time, the extent of the game’s debt to Dune II was hardly lost on the reviewer: “It’s a good thing for Blizzard that there’s no precedent for ‘look and feel’ lawsuits in computer entertainment.”[1]

Warcraft would eventually sell 400,000 units, bettering Dune II‘s numbers by a factor of four or more. As soon as it became clear that it was doing reasonably well, Blizzard started on a sequel.


Out of everyone who looked at Warcraft, no one did so with more interest — or with more consternation at its close kinship with Dune II — than the folks at Westwood. “When I played Warcraft, the similarities between it and Dune II were pretty… blatant, so I didn’t know what to think,” says the Westwood designer Adam Isgreen. Patrick Wyatt of Blizzard got the impression that his counterparts “weren’t exactly happy” at the slavish copying when they met up at trade shows, though he “reckoned they should have been pleased that we’d taken their game as a base for ours.” Only gradually did it become clear why Warcraft‘s existence was a matter of such concern for Westwood: because they themselves had finally decided to make another game in the style of Dune II.

The game that Westwood was making could easily have wound up looking even more like the one that Blizzard had just released. The original plan was to call it Command & Conquer: Fortress of Stone and to set it in a fantasy world. (Westwood had been calling their real-time-strategy engine “Command & Conquer” since the days of promoting Dune II.) “It was going to have goldmines and wood for building things. Sound familiar?” chuckles Westwood’s co-founder Louis Castle. “There were going to be two factions, humans and faerie folk… pretty fricking close to orcs versus humans.”

Some months into development, however, Westwood decided to change directions, to return to a science-fictional setting closer to that of Dune II. For they wanted their game to be a hit, and it seemed to them that fantasy wasn’t the best guarantee of such a thing: CRPGs were in the doldrums, and the most recent big strategy release with a fantasy theme, MicroProse’s cult-classic-to-be Master of Magic, hadn’t done all that well either. Foreboding near-future stories, however, were all the rage; witness the stellar sales of X-COM, another MicroProse strategy game of 1994. “We felt that if we were going to make something that was massive,” says Castle, “it had to be something that anybody and everybody could relate to. Everybody understands a tank; everybody understands a guy with a machine gun. I don’t have to explain to them what this spell is.” Westwood concluded that they had made the right decision as soon as they began making the switch in software: “Tanks and vehicles just felt better.” The game lost its subtitle to become simply Command & Conquer.

While the folks at Blizzard were plundering Warhammer for their units and buildings, those at Westwood were trolling the Jane’s catalogs of current military hardware and Soldier of Fortune magazine. “We assumed that anything that was talked about as possibly coming was already here,” says Castle, “and that was what inspired the units.” The analogue of Dune II‘s spice — the resource around which everything else revolved — became an awesomely powerful space-borne element come to earth known as Tiberium.

Westwood included most of the shortcuts and conveniences that Blizzard had built into Warcraft, but went one or two steps further more often than not. For example, they also made it possible to select multiple units by dragging a box around them, but in their game there was no limit to the number of units that could be selected in this way. The keyboard shortcuts they added not only let you quickly issue commands to units and buildings, but also jump around the map instantly to custom viewpoints you could define. And up to four players rather than just two could now play together at once over a local network or the Internet, for some true mayhem. Then, too, scenario design was not only more varied than in Dune II but was even more so than in Warcraft, with a number of “guerilla” missions in the campaigns that involved no resource gathering or construction. It’s difficult to say to what extent these were cases of parallel innovation and to what extent they were deliberate attempts to one-up what Warcraft had done. It was probably a bit of both, given that Warcraft was released a good nine months before Command & Conquer, giving Westwood plenty of time to study it.

But other innovations in Command & Conquer were without any precedent. The onscreen menus could now be toggled on and off, for instance, a brilliant stroke that gave you a better view of the battlefield when you really needed it. Likewise, Westwood differentiated the factions in the game in a way that had never been done before. Whereas the different houses in Dune II and the orcs and humans in Warcraft corresponded almost unit for unit, the factions in Command & Conquer reflected sharply opposing military philosophies, demanding markedly different styles of play: the establishment Global Defense Initiative had slow, strong, and expensive units, encouraging a methodical approach to building up and husbanding your forces, while the terroristic Brotherhood of Nod had weaker but faster and cheaper minions better suited to madcap kamikaze rushes than carefully orchestrated combined-arms operations.

Yet the most immediately obvious difference between Command & Conquer and Warcraft was all the stuff around the game. Warcraft had been made on a relatively small budget with floppy disks in mind. It sported only a brief opening cinematic, after which scenario briefings consisted of nothing but scrolling text and a single voice over a static image. Command & Conquer, by contrast, was made for CD-ROM from the outset, by a studio with deeper pockets that had invested a great deal of time and energy into both 3D animation and full-motion video, that trendy art of incorporating real-world actors and imagery into games. The much more developed story line of Command & Conquer is forwarded by little between-mission movies that, if not likely to make Steven Spielberg nervous, are quite well-done for what they are, featuring as they do mostly professional performers — such as a local Las Vegas weatherman playing a television-news anchorman — who were shot by a real film crew in Westwood’s custom-built blue-screen studio. Westwood’s secret weapon here was Joseph Kucan, a veteran theater director and actor who oversaw the film shoots and personally played the charismatic Nod leader Kane so well that he became the very face of Command & Conquer in the eyes of most gamers, arguably the most memorable actual character ever associated with a genre better known for its hordes of generic little automatons. Louis Castle reckons that at least half of Command & Conquer‘s considerable budget went into the cut scenes.

The game was released with high hopes in August of 1995. Computer Gaming World gave it a pretty good review, four stars out of five: “The entertainment factor is high enough and the action fast enough to please all but the most jaded wargamers.”

The gaming public would take to it even more than that review might imply. But in the meantime…


As I noted in an earlier article, numbered sequels weren’t really commonplace for strategy games prior to the mid-1990s. Blizzard had originally imagined Warcraft as a strategy franchise of a different stripe: each game bearing the name would take the same real-time approach into a completely different milieu, as SSI was doing at the time with their “5-Star General” series of turn-based strategy games that had begun with Panzer General and continued with the likes of Fantasy General and Star General. But Blizzard soon decided to make their sequel a straight continuation of the first game, an approach to which real-time strategy lent itself much more naturally than more traditional styles of strategy game; the set-piece story of a campaign could, after all, always be continued using all the ways that Hollywood had long since discovered for keeping a good thing going. The only snafu was that either the orcs or the humans could presumably have won the war in the first game, depending on which side the player chose. No matter: Blizzard decided the sequel would be more interesting if the orcs had been the victors and ran with that.

Which isn’t to say that building upon its predecessor’s deathless fiction was ever the real point of Warcraft II: Tides of Darkness. Blizzard knew now that they had a competitor in Westwood, and were in any case eager to add to the sequel all of the features and ideas that time had not allowed them to include in the first game. There would be waterways and boats to sail on them, along with oil, a third resource, one that could only be harvested at sea. Both sides would get new units to play with, while elves, dwarves, trolls, ogres, and goblins would join the fray as allies of one of the two main racial factions. The interface would be tweaked with another welcome shortcut: selecting a unit and right-clicking somewhere would cause it to carry out the most logical action there without having to waste time choosing from a menu. (After all, if you selected a worker unit and sent him to a goldmine, you almost certainly wanted him to start collecting gold. Why should you have to tell the game the obvious in some more convoluted fashion? The sketch below spells the idea out in code.)
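In programming terms, the right-click convenience amounts to a small dispatch function that inspects what was clicked on and picks a sensible default. The sketch below is my own illustration of the idea, not Blizzard’s code; the unit and target categories are assumptions invented for the example.

```python
# Illustrative sketch of a Warcraft II-style right-click "do the obvious thing".
# Unit and target categories are invented for the example.

def default_order(unit, target):
    """Choose the most sensible action for a right-click on `target`."""
    if unit.is_worker and target.kind in ("goldmine", "forest", "oil_patch"):
        return ("harvest", target)    # workers gather resources by default
    if target.kind in ("enemy_unit", "enemy_building"):
        return ("attack", target)     # combat units attack enemies
    return ("move", target.position)  # otherwise, just travel there
```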

But perhaps the most vital improvement was in the fog of war. The simplistic implementations of same seen in the first Warcraft and Command & Conquer were inherited from Dune II: areas of the map that had been seen once by any of your units were revealed permanently, even if said units went away or were destroyed. Blizzard now made it so that you would see only a back-dated snapshot of areas currently out of your units’ line of sight, reflecting what was there the last time one of your units had eyes on them. This innovation, no mean feat of programming on the part of Patrick Wyatt, brought a whole new strategic layer to the game. Reconnaissance suddenly became something you had to think about all the time, not just once.
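For readers who like to see such things spelled out, the difference between the two fog-of-war models can be sketched in a few dozen lines of Python. This is only a toy illustration under assumptions of my own (per-tile snapshots, invented names); Wyatt’s real implementation had to do the same job within mid-1990s memory and CPU budgets.

```python
# Toy sketch of the two fog-of-war models discussed above (invented names).

UNSEEN = None  # a tile the player has never scouted: drawn as blackness

class FogOfWar:
    def __init__(self, width, height):
        self.width, self.height = width, height
        # What the player is allowed to see for each tile.
        self.snapshot = [[UNSEEN] * width for _ in range(height)]

    def update(self, world, units, back_dated=True):
        """Refresh the player's view; world[y][x] is the true map state."""
        visible = set()
        for unit in units:
            visible.update(unit.tiles_in_sight())

        for y in range(self.height):
            for x in range(self.width):
                if (x, y) in visible:
                    # A unit currently sees this tile: record what is really there.
                    self.snapshot[y][x] = world[y][x]
                elif not back_dated and self.snapshot[y][x] is not UNSEEN:
                    # Dune II / Warcraft / C&C model: once scouted, always current.
                    self.snapshot[y][x] = world[y][x]
                # Warcraft II model (back_dated=True): out-of-sight tiles keep
                # their last snapshot, so enemy movements there go unnoticed
                # until the area is scouted again.

    def tile_to_draw(self, x, y):
        return self.snapshot[y][x]  # None means "draw the black veil"
```

The point of the sketch is only that the newer model stores a stale copy of the world rather than a window onto the live one, which is exactly what makes ongoing reconnaissance matter.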

Other improvements were not so conceptually groundbreaking, but no less essential for keeping ahead of the Joneses (or rather the Westwoods). For example, Blizzard raised the screen-resolution stakes, from 320 X 200 to 640 X 480, even as they raised the number of people who could play together online from Command & Conquer‘s four to eight. And, while there was still a limit on the number of units you could select at one time using Blizzard’s engine, that limit at least got raised from the first Warcraft‘s four to nine.

The story and its presentation, however, didn’t get much more elaborate than last time out. While Westwood was hedging its bets by keeping one foot in the “interactive movie” space of games like Wing Commander III, Blizzard was happy to “just” make Warcraft a game. The two series were coming to evince very distinct personalities and philosophies, just as gamers were sorting themselves into opposing groups of fans — with a large overlap of less partisan souls in between them, of course.

Released in December of 1995, Warcraft II managed to shake Computer Gaming World free of some of its last reservations about the burgeoning genre of real-time strategy, garnering four and a half stars out of five: “If you enjoy fantasy gaming, then this is a sure bet for you.” It joined Command & Conquer near the top of the bestseller lists, becoming the game that well and truly made Blizzard a name to be reckoned with, a peer in every sense with Westwood.

Meanwhile, and despite the sometimes bitter rivalry between the two studios and their fans, Command & Conquer and Warcraft II together made real-time strategy into a commercial juggernaut. Both games became sensations, with no need to shrink from comparison to even DOOM in terms of their sales and impact on the culture of gaming. Each eventually sold more than 3 million copies, numbers that even the established Westwood, much less the upstart Blizzard, had never dreamed of reaching before, enough to enshrine both games among the dozen or so most popular computer games of the entire 1990s. More than three years after real-time strategy’s first trial run in Dune II, the genre had arrived for good and all. Both Westwood and Blizzard rushed to get expansion packs of additional scenarios for their latest entries in the genre to market, even as dozens of other developers dropped whatever else they were doing in order to make real-time-strategy games of their own. Within a couple of years, store shelves would be positively buckling under the weight of their creations — some good, some bad, some more imaginative, some less so, but all rendered just a bit anonymous by the sheer scale of the deluge. And yet even the most also-ran of the also-rans sold surprisingly well, which explained why they just kept right on coming. Not until well into the new millennium would the tide begin to slacken.


With Command & Conquer and Warcraft II, Westwood and Blizzard had arrived at an implementation of real-time strategy that even the modern player can probably get on with. Yet there is one more game that I just have to mention here because it’s so loaded with a quality that the genre is known for even less than its characters: that of humor. Command & Conquer: Red Alert is as hilarious as it is unexpected, the only game of this style that’s ever made me laugh out loud.

Red Alert was first envisioned as a scenario pack that would move the action of its parent game to World War II. But two things happened as work progressed on it: Westwood decided it was different enough from the first game that it really ought to stand alone, and, as designer Adam Isgreen says, “we found straight-up history really boring for a game.” What they gave us instead of straight-up history is bat-guano insane, even by the standards of videogame fictions.

We’re in World War II, but in a parallel timeline, because Albert Einstein — why him? I have no idea! — chose to travel back in time on the day of the Trinity test of the atomic bomb and kill Adolf Hitler. Unfortunately, all that’s accomplished is to make world conquest easier for Joseph Stalin. Now Einstein is trying to save the democratic world order by building ever more powerful gadgets for its military. Meanwhile the Soviet Union is experimenting with the more fantastical ideas of Nikola Tesla, which in this timeline actually work. So, the battles just keep getting crazier and crazier as the game wears on, with teleporters sending units jumping instantly from one end of the map to the other, Tesla coils zapping them with lightning, and a fetching commando named Tanya taking out entire cities all by herself when she isn’t chewing the scenery in the cut scenes. Those actually display even better production values than the ones in the first game, but the script has become pure, unadulterated camp worthy of Mel Brooks, complete with a Stalin who ought to be up there singing and dancing alongside Der Führer in Springtime for Hitler. Even our old friend Kane shows up for a cameo. It’s one of the most excessive spectacles of stupidity I’ve ever seen in a game… and one of the funniest.

Joseph Stalin gets rough with an underling. When you don’t have the Darth Vader force grip, you have to do things the old-fashioned way…

Up there at the top is the killer commando Tanya, who struts across the battlefield with no regard for proportion.

Released in the dying days of 1996, Red Alert didn’t add that much that was new to the real-time-strategy template, technically speaking; in some areas such as fog of war, it still lagged behind the year-old Warcraft II. Nonetheless, it exudes so much joy that it’s by far my favorite of the games I’ve written about today. If you ask me, it would have been a better gaming world had the makers of at least a few of the po-faced real-time-strategy games that followed looked here for inspiration. Why not? Red Alert too sold in the multiple millions.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.



(Sources: the book Stay Awhile and Listen, Book I by David L. Craddock; Computer Gaming World of January 1995, March 1995, December 1995, March 1996, June 1996, September 1996, December 1996, March 1997, June 1997, and July 1997; Retro Gamer 48, 111, 128, and 148; The One of January 1993; the short film included with the Command & Conquer: The First Decade game collection. Online sources include Patrick Wyatt’s recollections at his blog Code of Honor, Dan Griliopoulos’s collection of interviews with Westwood alumni at Funambulism, Soren Johnson’s interview with Louis Castle for his Designer’s Notes podcast, and Richard Moss’s real-time-strategy retrospective for Ars Technica.

Warcraft: Orcs & Humans and Warcraft II: Tides of Darkness are available as digital purchases at GOG.com. The first Command & Conquer and Red Alert are available in remastered versions as a bundle from Steam.)

Footnotes

1 This statement was actually not correct; makers of standup arcade games of the classic era and the makers of Tetris had successfully cowed the cloning competition in the courts.
 


Tomb Raider

If you have to stare at someone’s bum, it’s far better to look at a nice female bum than a bloke’s bum!

— Adrian Smith of Core Design

There was something refreshing about looking at the screen and seeing myself as a woman. Even if I was performing tasks that were a bit unrealistic… I still felt like, hey, this is a representation of me, as myself, as a woman. In a game. How long have we waited for that?

— gamer Nikki Douglas

Sure, she’s powerful and assertive. She takes care of herself, and she knows how to handle a gun. She’s a great role model for girls. But how many copies of Tomb Raider do you think they’d have sold if they’d made Lara Croft flat-chested?

— Charles Ardai, Computer Gaming World

It strikes me that Lara Croft must be the most famous videogame character in history if you take the word “character” literally. Her only obvious competition comes from the Nintendo stable — from Super Mario and Pac-Man and all the rest. But they aren’t so much characters as eternal mascots, archetypes out of time in the way of Mickey Mouse or Bugs Bunny. Lara, on the other hand, has a home, a reasonably coherent personal chronology, a reasonably fleshed-out personality — heck, she even has a last name!

Of course, Lara is by no means alone in any of these things among videogame stars. Nevertheless, for all the cultural inroads that gaming has made in recent decades, most people who don’t play games will still give you a blank stare if you try to talk to them about any of our similarly well-rounded videogame characters. Mention Solid Snake, Cloud, or Gordon Freeman to them and you’ll get nothing. But Lara is another story. After twenty games that have sold almost 100 million copies combined and three feature films whose box-office receipts approach $1 billion, everybody not living under a proverbial rock has heard of Lara Croft. Love her or hate her, she has become one of us in a way that none of her peers can match.



Lara’s roots reach back to the first wave of computer gaming in Britain, to the era when Sinclair Spectrums and Commodore 64s were the hottest machines on the market. In 1984, in the midst of this boom, Ian Stewart and Kevin Norburn founded the publisher Gremlin Graphics — later Gremlin Interactive — in the back room of a Sheffield software shop. Gremlin went on to become the Kevin Bacon of British game development: seemingly everybody who was anybody over the ensuing decades was associated with them at one time or another, or at the very least worked with someone who had been. This applies not least to Lara Croft, that most iconic woman in the history of British gaming.

Core Design, the studio that made her, was formed in 1986 as Gremlin Derby, around the talents of four young men from the same town who had just created the hit game Bounder using the Commodore 64s in their bedrooms. But not long after giving the four a real office to work in, the folks at Gremlin’s Sheffield headquarters began to realize that they should have looked before they leaped — that they couldn’t actually afford to be funding outside studios with their current revenue stream. (Such was the way of things in the topsy-turvy world of early British game development, when sober business expertise was not an overly plentiful commodity.) Rather than close the Derby branch they had barely had time to open, three Gremlin insiders — a sales executive named Jeremy Heath-Smith, the current manager of the Derby studio Greg Holmes, and the original Gremlin co-founder Kevin Norburn — cooked up a deal to take it over and run it themselves as an independent entity. They set up shop under the name of Core Design in 1988.

Over the years that followed, Core had its ups and downs: Heath-Smith bought out Holmes in 1990 and Norburn in 1992, both under circumstances that weren’t entirely amicable. But the little studio had a knack for squeezing out a solid seller whenever one was really needed, such as Rick Dangerous and Chuck Rock. Although most of these games were made available for MS-DOS among other platforms, few of them had much in common with the high-concept adventure games, CRPGs, and strategy games that dominated among American developers at the time. They were rather direct descendants of 8-bit games like Bounder: fast-paced, colorful, modest in size and ambition, and shot through with laddish humor. By 1991, Core had begun porting their games to consoles like the Sega Genesis and Super Nintendo, with whose sensibilities they were perhaps a more natural fit. And indeed, the consoles soon accounted for the majority of their sales.

In late 1994, Jeremy Heath-Smith was invited to fly out to Japan to check out the two latest and greatest consoles from that country, both of which were due for a domestic Japanese release before the end of that year and an international rollout during the following one. The Sega Saturn and the Sony PlayStation were groundbreaking in a number of ways: not only did they use capacious CDs instead of cramped cartridges as their standard storage media, but they each included a graphics processing unit (GPU) for doing 3D graphics. At the time, id Software’s DOOM was in the vanguard of a 3D insurgency on personal computers, one that was sweeping away older, slower games like so much chaff in the breeze. The current generation of consoles, however, just didn’t have the horsepower to do a credible job of running games like that; they had been designed for another paradigm, that of 2D sprites moving across pixel-graphic backgrounds. The Saturn and the PlayStation would change all that, allowing the console games that constituted 80 to 90 percent of the total sales of digital games to join the 3D revolution as well. Needless to say, the potential payoff was huge.

Back at Core Design in Derby, Heath-Smith told everyone what he had seen in Japan, then asked for ideas for making maximum use of the new consoles’ capabilities. A quiet 22-year-old artist and designer named Toby Gard raised his hand: “I’ve got this idea of pyramids.” You would play a dashing archaeologist, he explained, dodging traps and enemies on the trail of ancient relics in a glorious 3D-rendered environment.

It must be said that it wasn’t an especially fresh or unexpected idea in the broad strokes. Raiders of the Lost Ark had been a constant gaming touchstone almost from the moment it had first reached cinemas in 1981. Core’s own Rick Dangerous had been essentially the same game as the one that Gard was now proposing, albeit implemented using 2D sprites rather than 3D graphics. (Its titular hero was a veritable clone of Indiana Jones himself, right down to the trademark whip and fedora; if you didn’t read the box copy, you would assume it was a licensed game.)

Still, Gard was enthusiastic, and possessed of “immense talent” in the opinion of Heath-Smith. His idea certainly had the potential to yield an exciting 3D experience, and Heath-Smith had been around long enough to know that originality in the abstract was often overrated when it came to making games that sold. He gave Tomb Raider the green light to become Core’s cutting-edge showcase for the next-generation consoles, Core’s biggest, most expensive game to date. Which isn’t to say that he could afford to make it all that big or expensive by the standards of the American and Japanese studios: a team of just half a dozen people created Tomb Raider.

The Tomb Raider team. Toby Gard is third from left, Jeremy Heath-Smith second from right. Heather Gibson was the sole woman to work on the game — which, to be fair, was one more woman than worked on most games from this period.

The game would depart in a significant way from the many run-and-gun DOOM clones on personal computers by being a bit less bloody-minded, emphasizing puzzle-solving and platforming as much as combat. The developers quickly decided that the style of gameplay they had in mind demanded that they show the player’s avatar onscreen from a behind-the-back view rather than going with the first-person viewpoint of DOOM — an innovative choice at the time, albeit one that several other studios were making simultaneously, with such diverse eventual results as Fade to Black, Die Hard Trilogy, Super Mario 64, and MDK. In the beginning, though, they had no inkling that it would be Lara Croft’s bum the player would be staring at for hours. The star was to be Rick Dangerous or another of his ilk — i.e., just another blatant clone of Indiana Jones.

But Heath-Smith was seasoned enough to know that that sort of thing wouldn’t fly anymore in a world in which games were becoming an ever bigger and more visible mass-media phenomenon. “You must be insane,” he said to Toby Gard as soon as he heard about his intended Indiana clone. “We’ll get sued from here to kingdom come!” He told him to go back to the drawing board — literally; he was an artist, after all — and create a more clearly differentiated character.

So, Gard sat down at his desk to see what he could do. He soon produced the first sketches of Lara — Lara Cruz, as he called her in the beginning. Gard:

Lara was based on Indiana Jones, Tank Girl, and, people always say, my sister. Maybe subconsciously she was my sister. Anyway, she was supposed to be this strong woman, this upper-class adventurer. The rules at the time were, if you’re going to make a game, make sure the main character is male and make sure he’s American; otherwise it won’t sell in America. Those were the rules coming down from the marketing men. So I thought, “Ah, I know how to fix this. I’ll make the bad guys all American and the lead character female and as British as I can make her.”

She wasn’t a tits-out-for-the-lads type of character in any way. Quite the opposite, in fact. I thought that what was interesting about her was, she was this unattainable, austere, dangerous sort of person.

Sex appeal aside, Lara was in tune with the larger zeitgeist around her in a way that few videogame characters before her could match. Gard first sketched her during the fall of 1995, when Cool Britannia and Britpop were the rages of the age in his homeland, when Oasis and Blur were trash-talking one another and vying for the top position on the charts. It was suddenly hip to be British in a way it hadn’t been since the Swinging Sixties. Bands like the aforementioned made a great point of singing in their natural accents — or, some would say, an exaggerated version of same — and addressing distinctly British concerns rather than lapsing into the typical Americanisms of rock and pop music. Lara was cut from the same cloth. Gard changed her last name to “Croft” when he decided “Cruz” just wasn’t British enough, and created a defiantly blue-blooded lineage for her, making her the daughter of a Lord Henshingly Croft, complete with a posh public-school accent.

Jeremy Heath-Smith was not initially impressed. “Are you insane?” he asked Gard for the second time in a month. “We don’t do girls in videogames!” But Gard could be deceptively stubborn when he felt strongly about something, and this was one of those occasions. Heath-Smith remembers Gard telling him that “she’d be bendy. She’d do things that blokes couldn’t do.” Finally, he relented. “There was this whole movement of, females can really be cool, particularly from Japan,” he says.

And indeed, Lara was first drawn with a distinctly manga sensibility. Only gradually, as Gard worked her into the actual game, did she take on a more realistic style. Comparatively speaking, of course. We’ll come back to that…

An early concept sketch of Lara Croft.

Tomb Raider was becoming ever more important for Core. In the wake of the Sega Saturn and the Sony PlayStation, the videogames industry was changing quickly, in tandem with its customers’ expectations of what a new game ought to look like; there was a lot of space on one of those shiny new CDs, and games were expected to fill it. The pressures prompted a wave of consolidations in Britain, a pooling of a previously diffuse industry’s resources in the service of fewer but bigger, slicker, more expensive games. Core actually merged twice in just a couple of years: first with the US Gold publishing label (its name came from its original business model, that of importing American games into Britain) and then with Domark, another veteran of the 1980s 8-bit scene. Domark began trading under the name of Eidos shortly after making the deal, with Core in the role of its premier studio.

Eidos had as chairman of its board Ian Livingstone, a legend of British gaming in analog spaces, a co-founder of Games Workshop (the company behind Warhammer) and the mastermind of the Fighting Fantasy line of paperback gamebooks that enthralled millions of youths during the 1980s. He went out to have a look at what Core had in the works. “I remember it was snowing,” he says. “I almost didn’t go over to Derby.” But he did, and “I guess you could say it was love at first sight when I stepped through the door. Seeing Lara on screen.”

With such a powerful advocate, Tomb Raider was elevated to the status of Eidos’s showcase game for the Christmas of 1996, with a commensurate marketing budget. But that meant that it simply had to be a hit, a bigger one by far than anything Core had ever done before. And Core was getting some worrisome push-back from Eidos’s American arm, expressing all the same conventional wisdom that Toby Gard had so carefully created Lara to defy: that she was too British, that the pronunciation of her first name didn’t come naturally to American lips, that she was a girl, for Pete’s sake. Cool Britannia wasn’t really a thing in the United States; despite widespread predictions of a second musical British Invasion in the States to supersede the clapped-out Seattle grunge scene, Oasis had only partially broken through, Blur not at all, and the Spice Girls — the latest Britpop sensation — had yet to see their music even released Stateside. Eidos needed another way to sell Lara Croft to Americans.

It may have been around this time that there occurred an incident which Toby Gard would tell of frequently in the years immediately after Tomb Raider‘s release. He was, so the story goes, sitting at his computer tweaking his latest model of Lara when his mouse hand slipped, and her chest suddenly doubled or tripled in size. When a laughing Gard showed it to his co-workers in a “look what a silly thing I did!” sort of way, their eyes lit up and they told him to leave it that way. “The technology didn’t allow us to make her [look] visually as we wanted, so it was more of a way of heightening certain things so it would give her some shape,” claims Core’s Adrian Smith.

Be that as it may, Eidos’s marketing team, eying that all-important American market that would make or break this game that would make or break their company, saw an obvious angle to take. They plastered Lara, complete with improbably huge breasts and an almost equally bulbous rear end, all over their advertising. “Sometimes, having a killer body just isn’t enough,” ran a typical tagline. “Hey, what’s a little temptation? Especially when everything looks this good. In the game, we mean.” As for the enemies Lara would have to kill, “Not everyone sees a bright light just before dying. Lucky stiffs.” (The innuendo around Lara was never subtle…)

This, then, was the way that Lara Croft greeted the public when her game dropped in September of 1996. And Toby Gard hated it. Giving every indication of having half fallen in love with his creation, he took the tarting up she was receiving under the hands of Eidos’s marketers badly. He saw them rather as a young man might the underworld impresario who had convinced his girlfriend — or his sister? — to become a stripper. A suggestion that reached Core’s offices to include a cheat code to remove Lara’s clothing entirely was, needless to say, not well-received by Gard. “It’s really weird when you see a character of yours doing these things,” he says. “I’ve spent my life drawing pictures of things — and they’re mine, you know?”

But of course they weren’t his. As is par for the course in the games industry, Gard automatically signed over all of the rights to everything he made at Core just as soon as he made it. He was not the final arbiter of what Lara did — or what was done to her — from here on out. So, he protested the only way he knew how: he quit.

Jeremy Heath-Smith, whose hardheaded businessman’s view of the world was the polar opposite of Gard’s artistic temperament, was gobsmacked by the decision.

I just couldn’t believe it. I remember saying, “Listen, Toby, this game’s going to be huge. You’re on a commission for this, you’re on a bonus scheme, you’re going to make a fortune. Don’t leave. Just sit here for the next two years. Don’t do anything. You’ll make more money than you’ve ever seen in your life.” I’m not arty, I’m commercial. I couldn’t understand his rationale for giving up millions of pounds for some artistic bloody stand. I just thought it was insanity.

Heath-Smith’s predictions of Tomb Raider‘s success — and with them the amount of money Gard was leaving on the table — came true in spades.

Suspecting every bit as strongly as Heath-Smith that they had a winner on their hands, Eidos had already flown a lucky flock of reporters all the way to Egypt in August of 1996 to see Tomb Raider in action for the first time, with the real Pyramids of Giza as a backdrop. By now, the Sega Saturn and the Sony PlayStation had been out for a year in North America and Europe, with the PlayStation turning into by far the bigger success, thanks both to Sony’s superior marketing and a series of horrific unforced errors on Sega’s part. Nevertheless, Tomb Raider appeared first on the Saturn, thanks to a deal Eidos had inked which promised Sega one precious month of exclusivity in return for a substantial cash payment. Rather than reviving the fortunes of Sega’s moribund console, Tomb Raider on the Saturn wound up serving mostly as a teaser for the PlayStation and MS-DOS versions that everyone knew were waiting in the wings.

The game still has qualities to recommend it today, although it certainly does show its age in some senses as well. The plot is barely comprehensible, a sort of Mad Libs of Raiders of the Lost Ark, conveyed in fifteen minutes’ worth of cut scenes full of pseudo-mystical claptrap. The environments themselves, however, are possessed of a windy grandeur that requires no exposition, with vistas that can still cause you to pull up short from time to time. If nothing else, Tomb Raider makes a nice change of pace from the blood-splattered killing fields of the DOOM clones. In the first half of the game, combat is mostly with wildlife, and is relatively infrequent. You’ll spend more of your time working out the straightforward but satisfying puzzles — locked doors and hidden keys, movable boulders waiting to be turned into staircases, that sort of thing — and navigating vertigo-inducing jumps. In this sense and many others, Tomb Raider is more of an heir to the fine old British tradition of 8-bit action-adventures than it is to the likes of DOOM. Lara is quite an acrobat, able to crouch and spring, flip forward and backward and sideways, swim, climb walls, grab ledges, and when necessary shoot an arsenal of weapons that expands in time to include shotguns and Uzis alongside her iconic twin thigh-holstered pistols.

Amidst all the discussion of Lara Croft’s appearance, a lot of people failed to notice the swath she cuts through some of the world’s most endangered species of wildlife. “The problem is that any animal that’s dangerous to humans we’ve already hunted to near extinction,” said Toby Gard. “Maybe we should have used non-endangered, harmless animals. Then you’d be asking me, ‘Why was Lara shooting all those nice bunnies and squirrels?’ You can’t win, can you?”

Unfortunately, Tomb Raider increasingly falls prey to its designers’ less worthy instincts in its second half. As the story ups the stakes from just a treasure-hunting romp to yet another world-threatening videogame conspiracy, the environments grow less coherent and more nonsensical in rhythm, until Lara is battling hordes of mutant zombies inside what appears for all the world to be a pyramid made out of flesh and blood. And the difficulty increases to match, until gameplay becomes a matter of die-and-die-again until you figure out how to get that one step further, then rinse and repeat. This is particularly excruciating on the console versions, which strictly ration their save points. (The MS-DOS version, on the other hand, lets you save any time you like, which eases the pain considerably.) The final gauntlet you must run to escape from the last of the fifteen levels is absolutely brutal, a long series of tricky, non-intuitive moves that you have to time exactly right to avoid instant death, an exercise in rote yet split-second button mashing to rival the old Dragon’s Lair game. It’s no mystery why Tomb Raider ended up like this: its amount of content is limited, and it needed to stretch its playing time to justify a price tag of $50 or more. Still, it’s hard not to think wistfully about what a wonderful little six or seven hour game it might have become under other circumstances, if it hadn’t needed to fill fifteen or twenty hours instead.

Tomb Raider‘s other weaknesses are also in the predictable places for a game of this vintage, a time when designers were still trying to figure out how to make this style of game playable. (“Everyone is sitting down and realizing that it’s bloody hard to design games for 3D,” said Peter Molyneux in a contemporaneous interview.) The controls can be a little awkward, what with the way they keep changing depending on what Lara’s actually up to. Ditto the distractingly flighty camera through which you view Lara and her environs, which can be uncannily good at finding exactly the angle you don’t want it to at times. Then, too, in the absence of a good auto-map or clear line of progression through each level, you might sometimes find orientation to be at least as much a challenge as any of the other, more deliberately placed obstacles to progress.

Games would slowly get better at this sort of thing, but it would take time, and it’s not really fair to scold Tomb Raider overmuch for failings shared by virtually all of the 3D action games of 1996. Tomb Raider is never less than a solidly executed game, and occasionally it becomes an inspired one; your first encounter with a Tyrannosaurus Rex (!) in a lost Peruvian valley straight out of Arthur Conan Doyle remains as shocking and terrifying today as it ever was.

As a purely technical feat, meanwhile, Tomb Raider was amazing in its day from first to last. The levels were bigger than any that had yet been seen outside the 2.5D Star Wars shooter Dark Forces. In contrast to DOOM and its many clones, in contrast even to id’s latest 3D extravaganza Quake, Tomb Raider stood out as its own unique thing, and not just because of its third-person behind-the-back perspective. It just had a bit more finesse about it all the way around. Those other games all relied on big bazooka-toting lunks with physiques that put Arnold Schwarzenegger to shame. Even with those overgrown balloons on her chest, Lara managed to be lithe, nimble, potentially deadly in a completely different way. DOOM and Quake were a carpet-bombing attack; she was a precision-guided missile.

Sex appeal and genuinely innovative gameplay and technology all combined to make Lara Croft famous. Shelley Blond, who voiced Lara’s sharply limited amount of dialog in the game, tells of wandering into a department store on a visit to Los Angeles, and seeing “an enormous cutout of Lara Croft. Larger than live-size.” She made the mistake of telling one of the staff who she was, whereupon she was mobbed like a Beatle in 1964: “I was bright red and shaking. They all wanted pictures, and that was when I thought, ‘Shit, this is huge!'”

In a landmark moment for the coming out of videogames as a force in mainstream pop culture, id Software had recently convinced the hugely popular industrial-rock band Nine Inch Nails to score Quake. But that was nothing compared to the journey that Lara Croft now made in the opposite direction, from the gaming ghetto into the mainstream. She appeared on the cover of the fashion magazine The Face: “Occasionally the camera angle allows you a glimpse of her slanted brown eyes and luscious lips, but otherwise Lara’s always out ahead, out of reach, like the perfect girl who passes in the street.” She was the subject of feature articles in Time, Newsweek, and Rolling Stone. Her name got dropped in the most unlikely places. David James, the star goalkeeper for the Liverpool football club, said he was having trouble practicing because he’d rather be playing Tomb Raider. Rave-scene sensations The Prodigy used their addiction to the game as an excuse for delaying their new album. U2 commissioned huge images of her to show on the Jumbotron during their $120 million Popmart tour. She became a spokeswoman for the soft drink Lucozade and for Fiat cars, was plastered across mouse pads, CD-wallets, and lunch boxes. She became a kids’ action figure and the star of her own comic book. It really was as if people thought she was an actual person; journalists clamored to “interview” her, and Eidos was buried in fan mail addressed to her. “This was like the golden goose,” says Heath-Smith. “You don’t think it’s ever going to stop laying. Everything we touched turned gold. It was just a phenomenon.” Already in 1997, negotiations began for an eventual Tomb Raider feature film.

Most of all, Lara was the perfect mascot for the PlayStation. Sony’s most brilliant marketing stroke of all had been to pitch their console toward folks in their late teens and early twenties rather than children and adolescents, thereby legitimizing gaming as an adult pursuit, something for urban hipsters to do before and/or after an evening out at the clubs. (It certainly wasn’t lost on Sony that this older demographic tended to have a lot more disposable income than the younger ones…) Lara may have come along a year too late for the PlayStation launch, but better late than never. What hipster videogaming had been missing was its very own It Girl. And now it had her. Tomb Raider sold seven and a half million copies, at least 80 percent of them on the PlayStation.

That said, it did very well for itself on computers as well, especially after Core posted on their website a patch to make the game work with the new 3Dfx Voodoo chipset for hardware-accelerated 3D graphics on that platform. Tomb Raider drove the first wave of Voodoo adoption; countless folks woke up to find a copy of the game alongside a shiny new graphics card under the tree that Christmas morning. Eidos turned a £2.6 million loss in 1996 into a £14.5 million profit in 1997, thanks entirely to Lara. “Eidos is now the house that Lara built,” wrote Newsweek magazine.

There followed the inevitable sequels, which kept Lara front and center through the balance of the 1990s and beyond: Tomb Raider II in 1997, Tomb Raider III in 1998, Tomb Raider: The Last Revelation in 1999, Tomb Raider: Chronicles in 2000. These games were competently done for the most part, but didn’t stretch overmuch the template laid down by the first one; even the forthrightly non-arty Jeremy Heath-Smith admits that “we sold our soul” to keep the gravy train running, to make sure a new Tomb Raider game was waiting in stores each Christmas. Just as the franchise was starting to look a bit tired, with each successive game posting slowly but steadily declining sales numbers, the long-in-the-works feature film Lara Croft: Tomb Raider arrived in 2001 to bring her to a whole new audience and ensure that she became one of those rare pop-culture perennials.

By this time, a strong negative counter-melody had long been detectable underneath the symphony of commercial success. A lot of people — particularly those who weren’t quite ready to admit videogames into the same halls of culture occupied by music, movies, and books — had an all too clear image of who played Tomb Raider and why. They pictured a pimply teenage boy or a socially stunted adult man sitting on the couch in his parents’ basement with one hand on a controller and another in his pants, gazing in slack-jawed fascination at Lara’s gyrating backside, perhaps with just a trace of drool running down his spotty chin. And it must be admitted that some of Lara’s biggest fans didn’t do much to combat this image: the site called Nude Raider, which did what Toby Gard had refused to do by patching a naked version of Lara into the game, may just have been the most pathetic thing on the Internet circa 1997.

But other fans leaped to Lara’s defense as something more than just the world’s saddest masturbation aid. She was smart, she was strong, she was empowered, they said, everything feminist critics had been complaining for years that most women in games were not.

The problem, answered Lara’s detractors, was that she was still all too obviously crafted for the male gaze. She was, in other words, still a male fantasy at bottom, and not a terribly mature one at that, looking as she did like something a horny teenager who had yet to lay hands on a real girl might draw in his notebook. Her proportions — proudly announced by Eidos as 34D-24-35 — were obtainable by virtually no real woman, at least absent the services of a plastic surgeon. “If you genetically engineered a Lara-shaped woman,” noted PC Gaming World‘s (female) reviews editor Cal Jones, “she would die within around fifteen seconds, since there’s no way her tiny abdomen could house all her vital organs.” Violet Berlin, a popular technology commentator on British television, called Lara “a ’70s throwback from the days when pouting lovelies were always to be found propped up against any consumer icon advertised for men.”

Everyone was right in her or his own way, of course. Lara Croft truly was different from the videogame bimbos of the past, and the fact that millions of boys were lining up to become her — or at least to control her — was progress of some sort. But still… as soon as you looked at her, you knew which gender had drawn her. Even Toby Gard, who had given up millions in a purely symbolic protest against the way his managers wished to exploit her, talked about her in ways that were far from free of male gazing — that could start to sound, if we’re being honest, just a little bit creepy.

Lara was designed to be a tough, self-reliant, intelligent woman. She confounds all the sexist clichés apart from the fact that she’s got an unbelievable figure. Strong, independent women are the perfect fantasy girls — the untouchable is always the most desirable.

Some feminist linguists would doubtless make much of the unconscious slip from “women” to “girls” in this comment…

The Lara in the games was rather a cipher in terms of personality, which worked to her benefit in the mass media. She could easily be re-purposed to serve as anything from a feminist hero to a sex kitten, depending on what was needed at any given moment.

For every point there was a counterpoint. Some girls and women saw Lara as a sign of progress, even as an aspirational figure. Others saw her only as one more stereotype of female perfection created by and for males, one to which they could never hope to measure up. “It’s a well-known fact that most [male] youngsters get their first good look at the female anatomy through porn mags, and come away thinking women have jutting bosoms, airbrushed skin, and neatly trimmed body hair,” said Cal Jones. “Now, thanks to Lara, they also think women are super fit, agile gymnasts with enough stamina to run several marathons back to back. Cheers.”

On the other hand, the same male gamers had for years been seeing images of almost equally unattainable masculine perfection on their screens, all bulging biceps and chiseled abs. How was this different? Many sensed that it was different, somehow, but few could articulate why. Michelle Goulet of the website Game Girlz perhaps said it best: Lara was “the man’s ideal image of a girl, not a girl’s ideal image of a girl.” The inverse was not true of all those warrior hunks: they were “based on the body image that is ideal to a lot of guys, not girls. They are nowhere near my ideal man.” The male gaze, that is to say, was the arbiter in both cases. What to do about it? Goulet had some interesting suggestions:

My thoughts on this matter are pretty straightforward. Include females in making female characters. Find out what the ideal female would be for both a man and a woman and work with that. Respect the females the same as you would the males.

Respecting the female characters is hard when they look like strippers with guns and seem to be nothing more than an erection waiting to happen. Believing that the industry in general respects females is hard when you see ads with women tied up on beds. In my opinion, respect is what most girls are after, and I feel that if the gaming community had more respect for their female characters they would attract the heretofore elusive female market. This doesn’t mean that girls in games have to be some kind of new butch race. Femininity is a big part of being female. This means that girls should be girls. Ideal body images and character aspects that are ideal for females, from a female point of view. I would be willing to bet that guys would find these females more attractive than the souped-up bimbos we are used to seeing. If sexuality is a major selling point, and a major attraction for the male gamer, then, fine, throw in all the sexuality you want, but doing so should not preclude respect for females.

To sum up, I have to say I think the gaming industry should give guys a little more credit, and girls a lot more respect, and I hope this will move the tide in that direction.

I’m happy to say that the tide has indeed moved in that direction for Lara Croft at least since Michelle Goulet wrote those words in the late 1990s. It began in a modest way with that first Tomb Raider movie in 2001. Although Angelina Jolie wore prosthetic breasts when she played Lara, it was impossible to recreate the videogame character’s outlandish proportions in their entirety. In order to maintain continuity with that film and a second one that came out in 2003, the Tomb Raider games of the aughts modeled their Laras on Jolie, resulting in a slightly more realistic figure. Then, too, Toby Gard returned to the franchise to work on 2007’s Tomb Raider: Anniversary and 2008’s Tomb Raider: Underworld, bringing some of his original vision of Lara with him.

But the real shift came when the franchise, which was once again fading in popularity by the end of the aughts, was rebooted in 2013, with a game that called itself simply Tomb Raider. Instead of pendulous breasts and booty mounted on spaghetti-thin legs and torso, it gave us a fit, toned, proportional Lara, a woman who looked like she had spent a lot of time and money at the local fitness center instead of the plastic surgeon’s office. If you ask this dirty old male gazer, she’s a thousand times more attractive than the old Lara, even as she’s a healthy, theoretically attainable ideal for a young woman who’s willing to put in some hard hours at the gym. This was proved by Alicia Vikander, the star of a 2018 Tomb Raider movie, the third and last to date; she looked uncannily like the latest videogame Lara up there on the big screen, with no prosthetics required.

Bravo, I say. If the original Lara Croft was a sign of progress in her way, the latest Lara is a sign that progress continued. If you were to say the new Lara is the one we should have had all along — within the limits of what the technology of the time would allow, of course — I wouldn’t argue with you. But still… better late than never.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.



(Sources: The books Grand Thieves and Tomb Raiders: How British Video Games Conquered the World by Magnus Anderson and Rebecca Levene; From Barbie to Mortal Kombat: Gender and Computer Games, edited by Justine Cassell and Henry Jenkins; Beyond Barbie and Mortal Kombat: New Perspectives on Gender and Gaming, edited by Yasmin B. Kafai, Carrie Heeter, Jill Denner, and Jennifer Y. Sun; Gender Inclusive Game Design: Expanding the Market by Sheri Graner Ray; The Making of Tomb Raider by Daryl Baxter; 20 Years of Tomb Raider: Digging Up the Past, Defining the Future by Meagan Marie; and A Gremlin in the Works by Mark James Hardisty. Computer Gaming World of August 1996, October 1996, January 1997, March 1997, and November 1997; PC Powerplay of July 1997; Next Generation of May 1996, October 1996, and June 1998; The Independent of April 18, 2004; Retro Gamer 20, 147, 163, and 245. Online sources include three pieces for the Game Studies journal, by Helen W. Kennedy, Janine Engelbrecht, and Esther MacCallum-Stewart. Plus two interviews with Toby Gard, by The Guardian‘s Greg Howson and Game Developer‘s David Jenkins.

The first three Tomb Raider games are available as digital purchases at GOG.com, as are the many games that followed those three.)

 
