
Cracking Open the Mac

The Macintosh II

The biggest problem with the Macintosh hardware was pretty obvious, which was its limited expandability. But the problem wasn’t really technical as much as philosophical, which was that we wanted to eliminate the inevitable complexity that was a consequence of hardware expandability, both for the user and the developer, by having every Macintosh be identical. It was a valid point of view, even somewhat courageous, but not very practical, because things were still changing too fast in the computer industry for it to work, driven by the relentless tides of Moore’s Law.

— original Macintosh team-member Andy Hertzfeld

Jef Raskin and Steve Jobs didn’t agree on much, but they did agree on their loathing for expansion slots. The absence of slots was one of the bedrock attributes of Raskin’s original vision for the Macintosh, the most immediately obvious difference between it and Apple’s then-current flagship product, the Apple II. In contrast to Steve Wozniak’s beloved hacker plaything, Raskin’s computer for the people would be as effortless to set up and use as a stereo, a television, or a toaster.

When Jobs took over the Macintosh project — some, including Raskin himself, would say stole it — he changed just about every detail except this one. Yet some members of the tiny team he put together, fiercely loyal to their leader and his vision of a “computer for the rest of us” though they were, were beginning to question the wisdom of this aspect of the machine by the time the Macintosh came together in its final form. It was a little hard in January of 1984 not to question the wisdom of shipping an essentially unexpandable appliance with just 128 K of memory and a single floppy-disk drive for a price of $2495. At some level, it seemed, this just wasn’t how the computer market worked.

Jobs would reply that the whole point of the Macintosh was to change how computers worked, and with them the workings of the computer market. He wasn’t entirely without concrete arguments to back up his position. One had only to glance over at the IBM clone market — always Jobs’s first choice as the antonym to the Mac — to see how chaotic a totally open platform could be. Clone users were getting all too familiar with the IRQ and memory-address conflicts that could result from plugging two cards that were determined not to play nice together into the same machine, and software developers were getting used to chasing down obscure bugs that only popped up when their programs ran on certain combinations of hardware.

Viewed in the big picture, we could actually say that Jobs was prescient in his determination to stamp out that chaos, to make every Macintosh the same as every other, to make the platform in general a thoroughly known quantity for software developers. The norm in personal computing as most people know it — whether we’re talking phones, tablets, laptops, or increasingly even desktop computers — has long since become sealed boxes of one stripe or another. But there are some important factors that make said sealed boxes a better idea now than they were back then. For one thing, the pace of hardware and software development alike has slowed enough that a new computer can remain viable, just as it was when purchased, for ten years or more. For another, prices have come down enough that throwing a device away and starting over with a new one isn’t so cost-prohibitive as it once was. With personal computers still exotic, expensive machines in a constant state of flux at the time of the Mac’s introduction, the computer as a sealed appliance was a vastly more problematic proposition.

Determined to do everything possible to keep users out of the Mac’s innards, Apple used Torx screws for which screwdrivers weren’t commonly available to seal it, and even threatened users with electrocution should they persist in trying to open it. The contrast with the Apple II, whose top could be popped in seconds using nothing more than a pair of hands to reveal seven tempting expansion slots, could hardly have been more striking.

It was the early adopters who spotted the potential in that first slow, under-powered Macintosh, the people who believed Jobs’s promise that the machine’s success or failure would be determined by the number who bought it in its first hundred days on the market, who bore the brunt of Apple’s decision to seal it as tightly as Fort Knox. When Apple in September of 1984 released the so-called “Fat Mac” with 512 K of memory, the quantity that in the opinion of just about everyone — including most of those at Apple not named Steve Jobs — the machine should have shipped with in the first place, owners of the original model were offered the opportunity to bring their machines to their dealers and have them retro-fitted to the new specifications for $995. This “deal” sparked considerable outrage and even a letter-writing campaign that tried to shame Apple into bettering the terms of the upgrade. Disgruntled existing owners pointed out that their total costs for a 512 K Macintosh amounted to $3490, while a Fat Mac could be bought outright by a prospective new member of the Macintosh fold for $2795. “Apple should have bent over backward for the people who supported it in the beginning,” said one of the protest’s ringleaders. “I’m never going to feel the same about Apple again.” Apple, for better or for worse never a company that was terribly susceptible to such public shaming, sent their disgruntled customers a couple of free software packages and told them to suck it up.

The Macintosh Plus

Barely fifteen months later, when Apple released the Macintosh Plus with 1 MB of memory among other advancements, the merry-go-round spun again. This time the upgrade would cost owners of the earlier models over $1000, along with lots of downtime while their machines sat in queues at their dealers. With software developers rushing to take advantage of the increased memory of each successive model, dedicated users could hardly afford to regard each upgrade as optional. As things stood, then, they were effectively paying a service charge of about $1000 per year just to remain a part of the Macintosh community. Owning a Mac was like owning a car that had to go into the shop for a week for a complete engine overhaul once every year. Apple, then as now, was famous for the loyalty of their users, but this was stretching even that legendary goodwill to the breaking point.

For some time voices within Apple had been murmuring that this approach simply couldn’t continue if the Macintosh was to become a serious, long-lived computing platform; Apple had to open the Mac up, even if that entailed making it a little more like all those hated beige IBM clones. During the first months after the launch, Steve Jobs was able to stamp out these deviations from his dogma, but as sales stalled and his relationship with John Sculley, the CEO he’d hand-picked to run the company he’d co-founded, deteriorated, the grumblers grew steadily more persistent and empowered.

The architect of one of the more startling about-faces in Apple’s corporate history would be Jean-Louis Gassée, a high-strung marketing executive newly arrived in Silicon Valley from Apple’s French subsidiary. Gassée privately — very privately in the first months after his arrival, when Jobs’s word still was law — agreed with many on Apple’s staff that the only way to achieve the dream of making the Macintosh into a standard to rival or beat the Intel/IBM/Microsoft trifecta was to open the platform. Thus he quietly encouraged a number of engineers to submit proposals on what direction they would take the platform in if given free rein. He came to favor the ideas of Mike Dhuey and Brian Berkeley, two young engineers who envisioned a machine with slots as plentiful and easily accessible as those of the Apple II or an IBM clone. Their “Little Big Mac” would be based around the 32-bit Motorola 68020 chip rather than the 16-bit 68000 of the current models, and would also sport color — another Jobsian heresy.

In May of 1985, Jobs made the mistake of trying to recruit Gassée into a rather clumsy conspiracy he was formulating to oust Sculley, with whom he was now in almost constant conflict. Rather than jump aboard the coup train, Gassée promptly blew the whistle to Sculley, precipitating an open showdown between Jobs and Sculley in which, much to Jobs’s surprise, the entirety of Apple’s board backed Sculley. Stripped of his power and exiled to a small office in a remote corner of Apple’s Cupertino campus, Jobs would soon depart amid recriminations and lawsuits to found a new venture called NeXT.

Gassée’s betrayal of Jobs’s confidence may have had a semi-altruistic motivation. Convinced that the Mac needed to open up to survive, perhaps he concluded that that would only happen if Jobs was out of the picture. Then again, perhaps it came down to a motivation as base as personal jealousy. With a penchant for leather and a love of inscrutable phraseology — “the Apple II smelled like infinity” is a typical phrase from his manifesto The Third Apple, “an invitation to voyage into a region of the mind where technology and poetry exist side by side, feeding each other” — Gassée seemed to self-consciously adopt the persona of a Gallic version of Jobs himself. But regardless, with Jobs now out of the picture Gassée was able to consolidate his own power base, taking over Jobs’s old role as leader of the Macintosh division. He went out and bought a personalized license plate for his sports car: “OPEN MAC.”

Coming some four months after Jobs’s final departure, the Mac Plus already included such signs of the changing times as a keyboard with arrow keys and a numeric keypad, anathema to Jobs’s old mouse-only orthodoxy. But much, much bigger changes were also well underway. Apple’s 1985 annual report, released in the spring of 1986, dropped a bombshell: a Mac with slots was on the way. Dhuey and Berkeley’s open Macintosh was now proceeding… well, openly.

The Macintosh II

When it debuted five months behind schedule in March of 1987, the Macintosh II was greeted as a stunning but welcome repudiation of much of what the Mac had supposedly stood for. In place of the compact all-in-one-case designs of the past, the new Mac was a big, chunky box full of empty space and empty slots — six of them altogether — with the monitor an item to be purchased separately and perched on top. Indeed, one could easily mistake the Mac II at a glance for a high-end IBM clone; its big, un-stylish case even included a cooling fan, an item that placed even higher than expansion slots and arrow keys on Steve Jobs’s old list of forbidden attributes.

Apple’s commitment to their new vision of a modular, open Macintosh was so complete that the Mac II didn’t include any on-board video at all; the buyer of the $6500 machine would still have to buy the video card of her choice separately. Apple’s own high-end video card offered display capabilities unprecedented in a personal computer: a palette of over 16 million colors, 256 of them displayable onscreen at any one time at resolutions as high as 640 × 480. And, in keeping with the philosophy behind the Mac II as a whole, the machine was ready and willing to accept a still more impressive graphics card just as soon as someone managed to make one. The Mac II actually represented colors internally using 48 bits, allowing some 281 trillion different shades. These idealized colors were then translated automatically into the closest approximations the actual display hardware could manage. This fidelity to the subtlest vagaries of color would make the Mac II the favorite of people working in many artistic and image-processing fields, especially when those aforementioned even better video cards began to hit the market in earnest. Even today no other platform can match the Mac in its persnickety attention to the details of accurate color reproduction.
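To make that “closest approximation” step a bit more concrete, here is a minimal sketch of the general idea in Python. It is purely illustrative — the names and the simple nearest-match search are assumptions for the sake of the example, not Apple’s actual Color Manager code — but it shows how an idealized 48-bit color, with 16 bits each for red, green, and blue, might be reduced to the nearest entry in a 256-color hardware palette.

```python
# Illustrative sketch only: how an idealized 48-bit color (16 bits each for
# red, green, and blue) might be matched to the closest entry in a 256-color
# hardware palette. Not Apple's actual Color Manager algorithm, just the
# general idea of "closest approximation" described above.

from typing import List, Tuple

Color48 = Tuple[int, int, int]  # red, green, blue, each in the range 0..65535

def nearest_palette_index(ideal: Color48, palette: List[Color48]) -> int:
    """Return the index of the palette entry closest to the ideal color."""
    best_index, best_distance = 0, float("inf")
    for index, (r, g, b) in enumerate(palette):
        # Squared Euclidean distance in RGB space; crude, but fine for a sketch.
        distance = (r - ideal[0]) ** 2 + (g - ideal[1]) ** 2 + (b - ideal[2]) ** 2
        if distance < best_distance:
            best_index, best_distance = index, distance
    return best_index

# A 48-bit representation allows 2**48 distinct shades -- the "281 trillion"
# of the text -- of which the standard Mac II video card could show 256 at once.
print(2 ** 48)   # 281474976710656
print(2 ** 24)   # 16777216, the "over 16 million" colors of the palette
```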

Some of the Mac II’s capabilities truly were ahead of their time. Here we see a desktop extended across two monitors, each powered by its own video card.

The irony wasn’t lost on journalists or users when, just weeks after the Mac II’s debut, IBM introduced their new PS/2 line, marked by sleeker, slimmer cases, with many features that would once have been placed on add-on cards now integrated into the motherboards. While Apple was suddenly encouraging the sort of no-strings-attached hardware hacking on the Macintosh that had made their earlier Apple II so successful, IBM was trying to stamp that sort of thing out on their own heretofore open platform via their new Micro Channel Architecture, which demanded that anyone other than IBM who wanted to expand a PS/2 machine negotiate a license and pay for the privilege. “The original Mac’s lack of slots stunted its growth and forced Apple to expand the machine by offering new models,” wrote Byte. “With the Mac II, Apple — and, more importantly, third-party developers — can expand the machine radically without forcing you to buy a new computer. This is the design on which Apple plans to build its Macintosh empire.” It seemed like the whole world of personal computing was turning upside down, Apple turning into IBM and IBM turning into Apple.

The Macintosh SE

If so, however, Apple’s empire would be a very exclusive place. By the time you’d bought a monitor, video card, hard drive, keyboard — yes, even the keyboard was a separate item — and other needful accessories, the price of a Mac II system could climb uncomfortably close to the $10,000 mark. Those who weren’t quite flush enough to splash out that much money could still enjoy a taste of the Mac’s new spirit of openness via the simultaneously released Mac SE, which cost $3699 for a hard-drive-equipped model. The SE was a 68000-based machine that looked much like its forefathers — built-in black-and-white monitor included — but did have a single expansion slot inside its case. The single slot was a little underwhelming in comparison to the Mac II, but it was better than nothing, even if Apple did still recommend that customers take their machines to their dealers if they wanted to actually install something in that slot. Apple’s not-terribly-helpful advice for those needing to employ more than one expansion card was to buy an “integrated” card that combined multiple functions. If you couldn’t find a card that happened to combine exactly the functions you needed, you were presumably just out of luck.

During the final years of the 1980s, Apple would continue to release new models of the Mac II and the Mac SE, now established as the two separate Macintosh flavors. These updates enhanced the machines with such welcome goodies as 68030 processors and more memory, but, thanks to the wonders of open architecture, didn’t immediately invalidate the models that had come before. The original Mac II, for instance, could be easily upgraded from the 68020 to the 68030 just by dropping a card into one of its slots.

The Steve Jobs-less Apple, now thoroughly under the control of the more sober and pragmatic John Sculley, toned down the old visionary rhetoric in favor of a more businesslike focus. Even the engineers dutifully toed the new corporate line, at least publicly, and didn’t hesitate to denigrate Apple’s erstwhile visionary-in-chief in the process. “Steve Jobs thought that he was right and didn’t care what the market wanted,” Mike Dhuey said in an interview to accompany the Mac II’s release. “It’s like he thought everyone wanted to buy a size-nine shoe. The Mac II is specifically a market-driven machine, rather than what we wanted for ourselves. My job is to take all the market needs and make the best computer. It’s sort of like musicians — if they make music only to satisfy their own needs, they lose their audience.” Apple, everyone was trying to convey, had grown up and left all that changing-the-world business behind along with Steve Jobs. They were now as sober and serious as IBM, their machines ready to take their places as direct competitors to those of Big Blue and the clonesters.

To a rather surprising degree, the world of business computing accepted Apple and the Mac’s new persona. Through 1986, the machines to which the Macintosh was most frequently compared were the Commodore Amiga and Atari ST. In the wake of the Mac II and Mac SE, however, the Macintosh was elevated to a different plane. Now the omnipresent point of comparison was high-end IBM compatibles; the Amiga and ST, despite their architectural similarities, seldom even saw their existence acknowledged in relation to the Mac. There were some good reasons for this neglect beyond the obvious ones of pricing and parent-company rhetoric. For one, the Macintosh was always a far more polished experience for the end user than either of the other 68000-based machines. For another, Apple had enjoyed a far more positive reputation with corporate America than Commodore or Atari had even well before any of the three platforms in question had existed. Still, the nature of the latest magazine comparisons was a clear sign that Apple’s bid to move the Mac upscale was succeeding.

Whatever one thought of Apple’s new, more buttoned-down image, there was no denying that the market welcomed the open Macintosh with a matching set of open arms. Byte went so far as to call the Mac II “the most important product that Apple has released since the original Apple II,” thus elevating it to a landmark status greater even than that of the first Mac model. While history hasn’t been overly kind to that judgment, the fact remains that third-party software and hardware developers, who had heretofore been stymied by the frustrating limitations of the closed Macintosh architecture, burst out now in myriad glorious ways. “We can’t think of everything,” said an ebullient Jean-Louis Gassée. “The charm of a flexible, open product is that people who know something you don’t know will take care of it. That’s what they’re doing in the marketplace.” The biannual Macworld shows gained a reputation as the most exciting events on the industry’s calendar, the beat to which every journalist lobbied to be assigned. The January 1988 show in San Francisco, the first to reflect the full impact of Apple’s philosophical about-face, had 20,000 attendees on its first day, and could have had a lot more than that had there been a way to pack them into the exhibit hall. Annual Macintosh sales more than tripled between 1986 and 1988, with cumulative sales hitting 2 million machines in the latter year. And already fully 200,000 of the Macs out there by that point were Mac IIs, an extraordinary number really given that machine’s high price. Granted, the Macintosh had hit the 2-million mark fully three years behind the pace Steve Jobs had foreseen shortly after the original machine’s introduction. But nevertheless, it did look like at least some of the more modest of his predictions were starting to come true at last.

An Apple Watch 27 years before its time? Just one example of the extraordinary innovation of the Macintosh market was the WristMac from Ex Machina, a “personal information manager” that could be synchronized with a Mac to take the place of your appointment calendar, to-do list, and Rolodex.

While the Macintosh was never going to seriously challenge the IBM standard on the desks of corporate America when it came to commonplace business tasks like word processing and accounting, it was becoming a fixture in design departments of many stripes, and the staple platform of entire niche industries — most notably, the publishing industry, thanks to the revolutionary combination of Aldus PageMaker (or one of the many other desktop-publishing packages that followed it) and an Apple LaserWriter printer (or one of the many other laser printers that followed it). By 1989, Apple could claim about 10 percent of the business-computing market, making them the third biggest player there after IBM and Compaq — and of course the only significant player there not running a Microsoft operating system. What with Apple’s premium prices and high profit margins, third place really wasn’t so bad, especially in comparison with the moribund state of the Macintosh of just a few years before.

Steve Jobs and John Sculley in happier times.

So, the Macintosh was flying pretty high as the curtain began to come down on the 1980s. It’s instructive and more than a little ironic to contrast the conventional wisdom that accompanied that success with the conventional wisdom of today. Despite the strong counterexample of Nintendo’s exploding walled garden over in the videogame-console space, the success the Macintosh had enjoyed since Apple’s decision to open up the platform was taken as incontrovertible proof that openness in terms of software and hardware alike was the only viable model for computing’s future. In today’s world of closed iOS and Android ecosystems and computing via disposable black boxes, such an assertion sounds highly naive.

But even more striking is the shift in the perception of Steve Jobs. In the late 1980s, he was loathed even by many strident Mac fans, whilst being regarded in the business and computer-industry press and, indeed, much of the popular press in general as a dilettante, a spoiled enfant terrible whose ill-informed meddling had very nearly sunk a billion-dollar corporation. John Sculley, by contrast, was lauded as exactly the responsible grown-up Apple had needed to scrub the company of Jobs’s starry-eyed hippie meanderings and lead them into their bright businesslike present. Today popular opinion on the two men has neatly reversed itself: Sculley is seen as the unimaginative corporate wonk who mismanaged Jobs’s brilliant vision, Jobs as the greatest — or at least the coolest — computing visionary of all time. In the end, of course, the truth must lie somewhere in the middle. Sculley’s strengths tended to be Jobs’s weaknesses, and vice versa. Apple would have been far better off had the two been able to find a way to continue to work together. But, in Jobs’s case especially, that would have required a fundamental shift in who these men were.

The loss among Apple’s management of that old Jobsian spirit of zealotry, overblown and impractical though it could sometimes be, was felt keenly by the Macintosh even during these years of considerable success. Only Jean-Louis Gassée was around to try to provide a splash of the old spirit of iconoclastic idealism, and everyone had to agree in the end that he made a rather second-rate Steve Jobs. When Sculley tried on the mantle of visionary — as when he named his fluffy corporate autobiography Odyssey and subtitled it “a journey of adventure, ideas, and the future” — it never quite seemed to fit him right. The diction was always off somehow, like he was playing a Silicon Valley version of Mad Libs. “This is an adventure of passion and romance, not just progress and profit,” he told the January 1988 Macworld attendees, apparently feeling able to wax a little more poetic than usual before this audience of true believers. “Together we set a course for the world which promises to elevate the self-esteem of the individual rather than a future of subservience to impersonal institutions.” (Apple detractors might note that elevating their notoriously smug users’ self-esteem did indeed sometimes seem to be what the company was best at.)

It was hard not to feel that the Mac had lost something. Jobs had lured Sculley from Pepsi because the latter was widely regarded as a genius of consumer marketing; the Pepsi Challenge, one of the most iconic campaigns in the long history of the cola wars, had been his brainchild. And yet, even before Jobs’s acrimonious departure, Sculley, bowing to pressure from Apple’s stockholders, had oriented the Macintosh almost entirely toward taking on the faceless legions of IBM and Compaq that dominated business computing. Consumer computing was largely left to take care of itself in the form of the aging Apple II line, whose most advanced model, the technically impressive but hugely overpriced IIGS, languished with virtually no promotion. Sculley, a little out of his depth in Silicon Valley, was just following the conventional wisdom that business computing was where the real money was. Businesspeople tended to be turned off by wild-eyed talk of changing the world; thus Apple’s new, more sober facade. And they were equally turned off by any whiff of fun or, God forbid, games; thus the old sense of whimsy that had been one of the original Mac’s most charming attributes seemed to leach away a little more with each successive model.

Those who pointed out that the business-computing market was worth many times more than the home-computing market weren’t wrong, but they were missing something important and at least in retrospect fairly obvious: namely, the fact that most of the companies who could make good use of computers had already bought them by now. The business-computing industry would doubtless continue to be profitable for many and even to grow steadily alongside the economy, but its days of untapped potential and explosive growth were behind it. Consumer computing, on the other hand, was still largely virgin territory. Millions of people were out there who had been frustrated by the limitations of the machines at the heart of the short-lived first home-computer boom, but who were still willing to be intrigued by the next generation of computing technology, still willing to be sold on computers as an everyday lifestyle accessory. Give them a truly elegant, easy-to-use computer — like, say, the Macintosh — and who knew what might happen. This was the vision Jef Raskin had had in starting the ball rolling on the Mac back in 1979, the one that had still been present, if somewhat obscured even then by a high price, in the first released version of the machine with its “the computer for the rest of us” tagline. And this was the vision that Sculley betrayed after Jobs’s departure by keeping prices sky-high and ignoring the consumer market.

“We don’t want to castrate our computers to make them inexpensive,” said Jean-Louis Gassée. “We make Hondas, we don’t make Yugos.” Fair enough, but the Mac was priced closer to Mercedes than Honda territory. And it was common knowledge that Apple’s profit margins remained just about the fattest in the industry, thus raising the question of how much “castration” would really be necessary to make a more reasonably priced Mac. The situation reached almost surrealistic levels with the release of the Mac IIfx in March of 1990, an admittedly “wicked fast” addition to the product line but one that cost $9870 sans monitor or video card, thus replacing the metaphorical with the literal in Gassée’s favored comparison: a complete Mac IIfx system cost more than most actual brand-new Hondas. By now, the idea of the Mac as “the computer for the rest of us” seemed a bitter joke.

Apple was choosing to fight over scraps of the business market when an untapped land of milk and honey — the land of consumer computing — lay just over the horizon. Instead of the Macintosh, it was the IBM-compatible machines that lurched, in fits and starts, to fill that space, adopting in the process most of the Mac’s best ideas, even if they seldom managed to implement those ideas quite as elegantly. By the time Apple woke up to what was happening in the 1990s and rushed to fill the gap with a welter of more reasonably priced consumer-grade Macs, it was too late. Computing as most Americans knew it was exclusively a Wintel world, with Macs dismissed as incompatible, artsy-fartsy oddballs. All but locked out of the fastest-growing sectors of personal computing, the very sectors the Macintosh had been so perfectly poised to absolutely own, Apple was destined to have a very difficult 1990s. So difficult, in fact, that they would survive the decade’s many lows only by the skin of their teeth.

This cartoon by Tom Meyer, published in the San Francisco Chronicle, shows the emerging new popular consensus about Apple by the early 1990s: increasingly overpriced, bloated designs and increasingly clueless management.

Now that the 68000 Wars have faded into history and passions have cooled, we can see that the Macintosh was in some ways almost as ill-served by its parent company as was the Commodore Amiga by its. Apple’s management in the post-Jobs era, like Commodore’s, seemed in some fundamental way not to get the very creation they’d unleashed on the world. And so, as with the Amiga, it was left to the users of the Macintosh to take up the slack, to keep the vision thing in the equation. Thankfully, they did a heck of a job with that. Something in the Mac’s DNA, something which Apple’s new sobriety could mask but never destroy, led it to remain a hotbed of inspiring innovations that had little to do with the nuts and bolts of running a day-to-day business. Sometimes seemingly in spite of Apple’s best efforts, the most committed Mac loyalists never forgot the Jobsian rhetoric that had greeted the platform’s introduction, continuing to see it as something far more compelling and beautiful than a tool for business. A 1988 survey by Macworld magazine revealed that 85 percent of their readers, the true Mac hardcore, kept their Macs at home, where they used them at least some of the time for pleasure rather than business.

So, the Mac world remained the first place to look if you wanted to see what the artists and the dreamers were getting up to with computers. We’ve already seen some examples of their work in earlier articles. In the course of the next few, we’ll see some more.

(Sources: Amazing Computing of February 1988, April 1988, May 1988, and August 1988; Info of July/August 1988; Byte of May 1986, June 1986, November 1986, April 1987, October 1987, and June 1990; InfoWorld of November 26 1984; Computer Chronicles television episodes entitled “The New Macs,” “Macintosh Business Software,” “Macworld Special 1988,” “Business Graphics Part 1,” “Macworld Boston 1988,” “Macworld San Francisco 1989,” and “Desktop Presentation Software Part 1”; the books West of Eden: The End of Innocence at Apple Computer by Frank Rose, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Computer Company by Owen W. Linzmayer, and Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything by Steven Levy; Andy Hertzfeld’s website Folklore.)

 

Posted on September 16, 2016 in Digital Antiquaria, Interactive Fiction

 


So You Want to Be a Hero?

Lori Ann and Corey Cole

Rule #1 is “The Player Must have Fun.” It’s trivially easy for a game designer to “defeat” players. We have all the tools and all the power. The trick is to play on the same side as the players, to tell the story together, and to make them the stars.

That rule is probably the biggest differentiator that made our games special. We didn’t strive to make the toughest, hardest-to-solve puzzles. We focused on the characters, the stories, and making the player the star.

— Corey Cole

It feels thoroughly appropriate that Corey and Lori Ann Cole first met over a game of Dungeons & Dragons. The meeting in question took place at Westercon — the West Coast Science Fantasy Conference — in San Francisco in the summer of 1979. Corey was the Dungeon Master, leading a group of players through an original scenario of his own devising that he would later succeed in getting published as The Tower of Indomitable Circumstance. But on this occasion he found the pretty young woman who was sitting at his table even more interesting than Dungeons & Dragons. Undaunted by mere geography — Corey was programming computers for a living in Southern California while Lori taught school in Arizona — the two struck up a romantic relationship. Within a few years, they were married, settling eventually in San Jose.

They had much in common. As their mutual presence at a convention like Westercon will attest, both the current and the future Cole were lovers of science-fiction and fantasy literature and its frequent corollary, gaming, from well before their first meeting. Their earliest joint endeavor — besides, that is, the joint endeavor of romance — was The Spellbook, a newsletter for tabletop-RPG aficionados which they edited and self-published.

Corey also nurtured an abiding passion for computers that had long since turned into a career. After first learning to program in Fortran and COBOL while still in high school during the early 1970s, his subsequent experiences had constituted a veritable grand tour of some of the most significant developments of this formative decade of creative computing. He logged onto the ARPANET (predecessor of the modern Internet) from a terminal at his chosen alma mater, the University of California, Santa Barbara; played the original Adventure in the classic way, via a paper teletype machine; played games on the PLATO system, including the legendary proto-CRPGs Oubliette and DND that were hosted there. After graduating, he took a job with his father’s company, a manufacturer of computer terminals, traveling the country writing software for customers’ installations. By 1981, he had moved on to become a specialist in word-processing and typesetting software, all the while hacking code and playing games at home on his home-built CP/M computer and his Commodore PET.

When the Atari ST was introduced in 1985, offering an unprecedented amount of power for the price, Corey saw in it the potential to become the everyday computer of the future. He threw himself into this latest passion with abandon, becoming an active member of the influential Bay Area Atari Users Group, a contributor to the new ST magazine STart, and even the founder of a new company, Visionary Systems; the particular vision in question was that of translating his professional programming experience into desktop-publishing software for the ST.

Interestingly, Corey’s passion for computers and computer games was largely not shared by Lori. Like many dedicated players of tabletop RPGs, she always felt the computerized variety to be lacking in texture, story, and most of all freedom. She could enjoy games like Wizardry in bursts with friends, but ultimately found them far too constraining to sustain her interest. And she felt equally frustrated by the limitations of both the parser-driven text adventures of Infocom and the graphical adventures of Sierra. Her disinterest in the status quo of computer gaming would soon prove an ironic asset, prompting her to push her own and Corey’s games in a different and very worthwhile direction.

By early 1988, it was becoming clear that the Atari ST was doomed to remain a niche platform in North America, and thus that Corey’s plan to get rich selling desktop-publishing software for it wasn’t likely to pan out. Meanwhile his chronic asthma was making it increasingly difficult to live in the crowded urban environs of San Jose. The Coles were at one of life’s proverbial crossroads, unsure what to do next.

Then one day they got a call from Carolly Hauksdottir, an old friend from science-fiction fandom — she wrote and sang filk songs with them — who happened to be working now as an artist for Sierra On-Line down in the rural paradise of Oakhurst, California. It seemed she had just come out of a meeting with Sierra’s management in which Ken Williams had stated emphatically that they needed to get back into the CRPG market. Since their brief association with Richard Garriott, which had led to their releasing Ultima II and re-releasing Ultima I, Sierra’s presence in CRPGs had amounted to a single game called Wrath of Denethenor, a middling effort for the Apple II and Commodore 64 sold to them by an outside developer. As that meager record will attest, Ken had never heretofore made the genre much of a priority. But of late the market for CRPGs showed signs of exploding, as evidenced by the huge success of other publishers’ releases like The Bard’s Tale and Ultima IV. To get themselves a piece of that action, Ken stated in his typical grandiose style that Sierra would need to hire “a published, award-winning, tournament-level Dungeon Master” and set him loose with their latest technology. Corey and Lori quickly reasoned that The Tower of Indomitable Circumstance had been published by the small tabletop publisher Judges Guild, as had their newsletter by themselves; that Corey had once won a tournament at Gen Con as a player; and that together they had once created and run a tournament scenario for a Doctor Who convention. Between the two of them, then, they were indeed “published, award-winning, tournament-level Dungeon Masters.” Right?

Well, perhaps they weren’t quite what Ken had in mind after all. When Corey called to offer their services, at any rate, he sounded decidedly skeptical. He was much more intrigued by another skill Corey mentioned in passing: his talent for programming the Atari ST. Sierra had exactly one programmer who knew anything about the ST, and was under the gun to get their new SCI engine ported over to that platform as quickly as possible. Ken wound up hiring Corey for this task, waving aside the initial reason for Corey’s call with the vague statement that “we’ll talk about game design later.”

What with Corey filling such an urgent need on another front, one can easily imagine Ken’s “later” never arriving at all. Corey, however, never stopped bringing up game design, and with it the talents of his wife that he thought would make her perfect for the role. While he thought that the SCI engine, despite its alleged universal applicability, could never be used to power a convincing hardcore CRPG of the Bard’s Tale/Ultima stripe, he did believe it could serve very well as the base for a hybrid game — part CRPG, part traditional Sierra adventure game. Such a hybrid would suit Lori just fine; her whole interest in the idea of designing computer games was “to bring storytelling and [the] interesting plot lines of books and tabletop role-playing into the hack-and-slash thrill of a computer game.” Given the technological constraints of the time, a hybrid actually seemed a far better vehicle for accomplishing that than a hardcore CRPG.

So, while Corey programmed in Sierra’s offices, Lori sat at home with their young son, sketching out a game. In fact, knowing that Sierra’s entire business model revolved around game series rather than one-offs, she sketched out a plan for four games, each taking place in a different milieu corresponding to one of the four points of the compass, one of the four seasons, and one of the four classical elements of Earth, Fire, Air, and Water. As was typical of CRPGs of this period, the player would be able to transfer the same character, evolving and growing in power all the while, into each successive game in the series.

With his established tabletop-RPG designer still not having turned up, Ken finally relented and brought Lori on to make her hybrid game. But the programmer with whom she was initially teamed was very religious, and refused to continue when he learned that the player would have the option of choosing a “thief” class. And so, after finishing up some of his porting projects, Corey joined her on what they were now calling Hero’s Quest I: So You Want to Be a Hero. Painted in the broadest strokes, he became what he describes as the “left brain” to Lori’s “right brain” on the project, focusing on the details of systems and rules while Lori handled the broader aspects of plot and setting. Still, these generalized roles were by no means absolute. It was Corey, for instance, an incorrigible punster — so don’t incorrige him! — who contributed most of the horrid puns that abound throughout the finished game.

Less than hardcore though they envisioned their hybrid to be, Lori and Corey nevertheless wanted to do far more than simply graft a few statistics and a combat engine onto a typical Sierra adventure game. They would offer their player the choice of three classes, each with its own approach to solving problems: through combat and brute force in the case of the fighter, through spells in the case of the magic user, through finesse and trickery in the case of the thief. This meant that the Coles would in effect have to design Hero’s Quest three times, twining together an intricate tapestry of differing solutions to its problems. Considering this reality, one inevitably thinks of what Ron Gilbert said immediately after finishing Maniac Mansion, a game in which the player could select her own team of protagonists but one notably free of the additional complications engendered by Hero’s Quest‘s emergent CRPG mechanics: “I’m never doing that again!” The Coles, however, would not only do it again — in fact, four times more — but they would consistently do it well, succeeding at the tricky task of genre blending where designers as talented as Brian Moriarty had stumbled.

Instead of thinking in terms of “puzzles,” the Coles preferred to think in terms of “problems.” In Hero’s Quest, many of these problems can be treated like a traditional adventure-game puzzle and overcome using your own logic. But it’s often possible to power through the same problem using your character’s skills and abilities. This quality makes it blessedly difficult to get yourself well-and-truly, permanently stuck. Let’s say you need to get a fish from a fisherman in order to get past the bear who’s blocking your passage across a river. You might, in traditional adventure-game style, use another item you found somewhere to repair his leaky boat, thus causing him to give you a fish as a small token of his appreciation. But you might also, if your character’s intelligence score is high enough, be able to convince him to give you a fish through logical persuasion alone. Or you might bypass the whole question of the fish entirely if your character is strong and skilled enough to defeat the bear in combat. Moriarty’s Beyond Zork tries to accomplish a superficially similar blending of the hard-coded adventure game and the emergent CRPG, but does so far less flexibly, dividing its problems rather arbitrarily into those soluble by adventure-game means and those soluble by CRPG means. The result for the player is often confusion, as things that ought to work fail to do so simply because a problem fell into the wrong category. Hero’s Quest was the first to get the blending right.

Based on incremental skill and attribute improvements rather than employing the more monolithic level-based structure of Dungeons & Dragons, the core of the Hero’s Quest game system reached back to a system of tabletop rules the Coles had begun formulating years before setting to work on their first computer game. It has the advantage of offering nearly constant feedback, a nearly constant sense of progress even if you happen to be stuck on one of the more concrete problems in the game. Spend some time bashing monsters, and your character’s “weapons use” score along with his strength and agility will go up; practice throwing daggers on the game’s handy target range, and his “throwing” skill will increase a little with almost every toss. Although you choose a class for your character at the outset, there’s nothing preventing you from building up a magic user who’s also pretty handy with a sword, or a fighter who knows how to throw a spell or two. You’ll just have to sacrifice some points in the beginning to get a start in the non-core discipline, then keep on practicing.
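To give a concrete flavor of how such a system fits together, here is a small hypothetical sketch in Python of a skill-by-use mechanic in the same spirit. The class names, numbers, and diminishing-returns formula are illustrative assumptions, not the Coles’ actual rules; the point is only that every practiced action nudges the relevant score upward, with gains tapering off as the skill rises.

```python
# Hypothetical sketch of a "learn by doing" skill system in the spirit of
# Hero's Quest. The names, numbers, and formula are illustrative assumptions,
# not the Coles' actual rules.

import random

class Character:
    def __init__(self, character_class: str):
        # Starting scores loosely favor the chosen class.
        self.skills = {
            "weapons use": 30 if character_class == "fighter" else 10,
            "throwing": 15,
            "magic": 30 if character_class == "magic user" else 0,
        }

    def practice(self, skill: str) -> None:
        """Each use gives a chance of a small gain, shrinking as the skill grows."""
        score = self.skills[skill]
        if random.random() < (100 - score) / 100:   # diminishing returns near 100
            self.skills[skill] = min(100, score + 1)

hero = Character("fighter")
for _ in range(50):               # an afternoon at the target range
    hero.practice("throwing")
print(hero.skills["throwing"])    # somewhat higher than 15, varying run to run
```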


If forced to choose one adjective to describe Hero’s Quest and the series it spawned as a whole, I would have to go with “generous” — not, as the regular readers among you can doubtless attest, an adjective I commonly apply to Sierra games in general. Hero’s Quest‘s generosity extends far beyond its lack of the sudden deaths, incomprehensible puzzles, hidden dead ends, and generalized player abuse that were so typical of Sierra designs. Departing from Sierra’s other series with their King Grahams, Rosellas, Roger Wilcos, and Larry Laffers, the Coles elected not to name or characterize their hero, preferring to let their players imagine and sculpt the character each of them wanted to play. Even within the framework of a given character class, alternate approaches and puzzle — er, problem — solutions abound, while the environment is fleshed-out to a degree unprecedented in a Sierra adventure game. Virtually every reasonable action, not to mention plenty of unreasonable ones, will give some sort of response, some acknowledgement of your cleverness, curiosity, or sense of humor. Almost any way you prefer to approach your role is viable. For instance, while it’s possible to leave behind a trail of monstrous carnage worthy of a Bard’s Tale game, it’s also possible to complete the entire game without taking a single life. The game is so responsive to your will that the few places where it does fall down a bit, such as in not allowing you to choose the sex of your character — resource constraints led the Coles to make the hero male-only — stand out more in contrast to how flexible the rest of this particular game is than they do in contrast to most other games of the period.

Hero’s Quest‘s message of positive empowerment feels particularly bracing in our current age of antiheroes and murky morality.

Indeed, Hero’s Quest is such a design outlier from the other Sierra games of its era that I contacted the Coles with the explicit goal of coming to understand just how it could have come out so differently. Corey took me back all the way to the mid-1970s, to one of his formative experiences as a computer programmer and game designer alike, when he wrote a simple player-versus-computer tic-tac-toe game for a time-shared system to which he had access. “Originally,” he says, “it played perfectly, always winning or drawing, and nobody played it for long. After I introduced random play errors by the computer, so that a lucky player could occasionally win, people got hooked on the game.” From this “ah-ha!” moment and a few others like it was born the Coles’ Rule #1 for game design: “The player must always have fun.” “We try to remember that rule,” says Corey, “every time we create a potentially frustrating puzzle.” The trick, as he describes it, is to make “the puzzles and challenges feel difficult, yet give the player a chance to succeed after reasonable effort.” Which leads us to Rule #2: “The player wants to win.” “We aren’t here to antagonize the players,” he says. “We work with them in a cooperative storytelling effort. If the player fails, everybody loses; we want to see everyone win.”
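That tic-tac-toe lesson translates into code very directly. Below is a brief, hedged sketch of the idea — not Corey’s original program, whose details aren’t recorded here — in which a computer player that knows the optimal move occasionally plays a random legal move instead, so that a lucky human can sometimes win.

```python
# Illustrative sketch of the tic-tac-toe lesson: a computer player that knows
# the optimal move but deliberately makes occasional random errors so a lucky
# human can sometimes win. (The error rate and move representation here are
# assumptions for the sake of the example.)

import random
from typing import List

def choose_move(board: List[str], best_move: int, error_rate: float = 0.15) -> int:
    """Play the optimal move most of the time, a random other legal move otherwise."""
    legal_moves = [i for i, square in enumerate(board) if square == " "]
    if random.random() < error_rate and len(legal_moves) > 1:
        # Deliberate "play error": any legal square except the best one.
        return random.choice([m for m in legal_moves if m != best_move])
    return best_move

# Example: squares are numbered 0..8; assume a perfect-play engine has already
# determined that the center (square 4) is the best reply on this board.
board = ["X", "O", " ",
         " ", " ", " ",
         " ", " ", "O"]
print(choose_move(board, best_move=4))
```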

Although their professional credits in the realm of game design were all but nonexistent at the time they came to Sierra, the Coles were nevertheless used to thinking about games far more deeply than was the norm in Oakhurst. They were, for one thing, dedicated players of games, very much in tune with the experience of being a player, whether sitting behind a table or a computer. Ken Williams, by contrast, had no interest in tabletop games, and had sat down and played for himself exactly one computerized adventure game in his life (that game being, characteristically for Ken, the ribald original Softporn). While Roberta Williams had been inspired to create Mystery House by the original Adventure and some of the early Scott Adams games, her experience as a player never extended much beyond those primitive early text adventures; she was soon telling interviewers that she “didn’t have the time” to actually play games. Most of Sierra’s other design staff came to the role through the back door of being working artists, writers, or programmers, not through the obvious front door of loving games as a pursuit unto themselves. Corey states bluntly that “almost nobody there played [emphasis mine] games.” The isolation from the ordinary player’s experience that led to so many bad designs was actually encouraged by Ken Williams; he suggested that his staffers not look too much at what the competition was doing out of some misguided desire to preserve the “originality” of Sierra’s own games.

But the Coles were a little different than the majority of said staffers. Corey points out that they were both over thirty by the time they started at Sierra. They had, he notes, also “traveled a fair amount,” and “both the real-life experience and extensive tabletop-gaming experience gave [us] a more ‘mature’ attitude toward game development, especially that the designer is a partner to the player, not an antagonist to be overcome.” Given the wealth of experience with games and ideas about how games ought to be that they brought with them to Sierra, the Coles probably benefited as much from the laissez-faire approach to game-making engendered by Ken Williams as some of the other designers perhaps suffered from the same lack of direction. Certainly Ken’s personal guidance was only sporadic, and often inconsistent. Corey:

Once in a while, Ken Williams would wander through the development area — it might happen two or three times in a day, or more likely the visits might be three weeks apart. Everyone learned that it was essential to show Ken some really cool sequence or feature that he hadn’t seen before. You only showed him one such sequence because you needed to reserve two more in case he came back the same day.

Our first (and Sierra’s first) Producer, Guruka Singh Khalsa, taught us the “Ken Williams Rule” based on something Robert Heinlein wrote: “That which he tells you three times is true.” Ken constantly came up with half-baked ideas, some of them amazing, some terrible, and some impractical. If he said something once, you nodded in agreement. Twice, you sat up and listened. Anything he said three times was law and had to be done. While Ken mostly played a management role at Sierra, he also had some great creative ideas that really improved our games. Of course, it takes fifteen seconds to express an idea, and sometimes days or weeks to make it work. That’s why we ignored the half-baked, non-repeated suggestions.

The Coles had no affinity for any of Sierra’s extant games; they considered them “unfair and not much fun.” Yet the process of game development at Sierra was so unstructured that they had little sense of really reacting against them either.  As I mentioned earlier, Lori didn’t much care for any of the adventure games she had seen, from any company. She wouldn’t change her position until she played Lucasfilm Games’s The Secret of Monkey Island in 1990. After that experience, she became a great fan of the Lucasfilm adventures, enjoying them far more than the works of her fellow designers at Sierra. For now, though, rather than emulating existing computerized adventure or RPG games, the Coles strove to re-create the experience of sitting at a table with a great storytelling game master at its head.

Looking beyond issues of pure design, another sign of the Coles’ relatively broad experience in life and games can be seen in their choice of settings. Rather than settling for the generic “lazy Medieval” settings so typical of Dungeons & Dragons-derived computer games then and now, they planned their series as a hall of windows into some of the richest myths and legends of our own planet. The first game, which corresponded in Lori’s grand scheme for the series as a whole to the direction North, the season Spring, and the element Earth, is at first glance the most traditional of the series’s settings. This choice was very much a conscious one, planned to help the series attract an initial group of players and get some commercial traction; bullish as he was on series in general, Ken Williams wasn’t particularly noted for his patience with new ones that didn’t start pulling their own weight within a game or two. Look a little closer, though, and even the first game’s lush fantasy landscape, full of creatures that seem to have been lifted straight out of a Dungeons & Dragons Monster Manual, stands out as fairly unique. Inspired by an interest in German culture that had its roots in the year Corey had spent as a high-school exchange student in West Berlin back in 1971 and 1972, the Coles made their Medieval setting distinctly Germanic, as is highlighted by the name of the town around which the action centers: Spielburg. (Needless to say, the same name is also an example of the Coles’ love of puns and pop-culture in-joking.) Later games would roam still much further afield from the lazy-Medieval norm. The second, for instance, moves into an Arabian Nights milieu, while still later ones explore the myths and legends of Africa, Eastern Europe, and Greece. The Coles’ determination to inject a little world music into the cultural soundtracks of their mostly young players stands out as downright noble. Their series doubtless prompted more than a few blinkered teenage boys to begin to realize what a big old interesting world there really is out there.

Hero's Quest

Of course, neither the first Hero’s Quest nor any of the later ones in the series would be entirely faultless. Sierra suffered from the persistent delusion that their SCI engine was a truly universal hammer suitable for every sort of nail, leading them to incorporate action sequences into almost every one of their adventure games. These are almost invariably excruciating, afflicted with slow response times and imprecise, clumsy controls. Hero’s Quest, alas, isn’t an exception from this dubious norm. It has an action-oriented combat engine so unresponsive that no one I’ve ever talked to tries to do anything with it but just pound on the “attack” key until the monster is dead or it’s obviously time to run away. And then there are some move-exactly-right-or-you’re-dead sequences in the end game that are almost as frustrating as some of the ones found in Sierra’s other series. But still, far more important in the end are all the things Hero’s Quest does right, and more often than not in marked contrast to just about every other Sierra game of its era.

Hero’s Quest was slated into Sierra’s release schedule for Christmas 1989, part of a diverse lineup of holiday releases that also included the third Leisure Suit Larry game from the ever-prolific Al Lowe and something called The Colonel’s Bequest, a bit of a departure for Roberta Williams in the form of an Agatha Christie-style cozy murder mystery. With no new King’s Quest game on offer that year, Hero’s Quest, the only fantasy release among Sierra’s 1989 lineup, rather unexpectedly took up much of its slack. As pre-orders piled up to such an extent that Sierra projected needing to press 100,000 copies right off the bat just to meet the holiday demand, Corey struggled desperately with a sequence — the kobold cave, for those of you who have played the game already — that just wouldn’t come together. At last he brokered a deal to withhold only the disk that would contain that sequence from the duplicators. In a single feverish week, he rewrote it from scratch. The withheld disk was then duplicated in time to join the rest, and the game as a whole shipped on schedule, largely if not entirely bug-free. Even more impressively, it was, despite receiving absolutely no outside beta-testing — Sierra still had no way of seriously evaluating ordinary players’ reactions to a game before release — every bit as friendly, flexible, and soluble as the Coles had always envisioned it to be.


The game became the hit its pre-orders had indicated it would, its sales outpacing even the new Leisure Suit Larry and Roberta Williams’s new game by a comfortable margin that holiday season. The reviews were superlative; Questbusters’s reviewer pronounced it “honestly the most fun I’ve had with any game in years,” and Computer Gaming World made it their “Adventure Game of the Year.” While it would be nice to attribute this success entirely to the public embracing its fine design sensibilities, which they had learned of via all the glowing reviews, its sales numbers undoubtedly had much to do with its good fortune in being released during this year without a King’s Quest. Hero’s Quest was for many a harried parent and greedy child alike the closest analogue to Roberta Williams’s blockbuster series among the new releases on store shelves. The game sold over 130,000 copies in its first year on the market, about 200,000 copies in all in its original version, then another 100,000 copies when it was remade in 1992 using Sierra’s latest technology. Such numbers were, if not quite in the same tier as a King’s Quest, certainly nothing to sneeze at. In creative and commercial terms alike, the Coles’ series was off to a fine start.

At the instant of Hero’s Quest’s release, Sierra was just embarking on a major transition in their approach to game-making. Ken Williams had decided it was time at last to make the huge leap from the EGA graphics standard, which could display just 16 colors onscreen from a palette of 64, to VGA, which could display 256 colors from a palette of 262,144. To help accomplish this transition, he had hired Bill Davis, a seasoned Hollywood animator, in the new role of Sierra’s “Creative Director” in July of 1989. Davis systematized Sierra’s heretofore laissez-faire approach to game development into a rigidly formulated Hollywood-style production pipeline. Under his scheme, the artists would now be isolated from the programmers and designers; inter-team communication would happen only through a bureaucratic paper trail.

The changes inevitably disrupted Sierra’s game-making operation, which of late had been churning out new adventure games at a rate of about half a dozen per year. Many of the company’s resources for 1990 were being poured into King’s Quest V, which was intended, as had been the norm for that series since the beginning, to be the big showpiece game demonstrating all the company’s latest goodies, including not only 256-color VGA graphics but also a new Lucasfilm Games-style point-and-click interface in lieu of the old text parser. King’s Quest V would of course be Sierra’s big title for Christmas. They had only two other adventure games on their schedule for 1990, both begun using the older technology and development methodology well before the end of 1989 and both planned for release in the first half of the new year. One, an Arthurian game by an established writer for television named Christy Marx, was called Conquests of Camelot: The Search for the Grail (thus winning the prize of being the most strained application of Sierra’s cherished “Quest” moniker ever). The other, a foray into Tom Clancy-style techno-thriller territory by Police Quest designer Jim Walls, was called Code-Name: ICEMAN. Though they had every reason to believe that King’s Quest V would become another major hit, Sierra was decidedly uncertain about the prospects of these other two games. They felt they needed at least one more game in an established series if they hoped to maintain the commercial momentum they’d been building up in recent years. Yet it wasn’t clear where that game was to come from; one side effect of the transition to VGA graphics was that art took much longer to create, and games thus took longer to make. Lori and Corey were called into a meeting and given two options. One, which Lori at least remembers management being strongly in favor of, was to make the second game in their series using Sierra’s older EGA- and parser-driven technology, getting it out in time to become King’s Quest V‘s running mate for the Christmas of 1990. The other was to be moved in some capacity to the King’s Quest V project, with the opportunity to return to Hero’s Quest only at some uncertain future date. They chose — or were pushed into — the former.

Despite using the older technology, their second game was, at Davis’s insistence, created using the newer production methodology. This meant among other things that the artists, now isolated from the rest of the developers, had to create the background scenes on paper; their pictures were then scanned in for use in the game. I’d like to reserve the full details of Sierra’s dramatically changed production methodology for another article, where I can give them their proper due. Suffice for today to say that, while necessary in many respects for a VGA game, the new processes struck everyone as a strange way to create a game using the sharply limited EGA color palette. By far the most obvious difference they made was that everything seemed to take much longer. Lori Ann Cole:

We got the worst of both worlds. We got a new [development] system that had never been tried before, and all the bugs that went with that. And we were doing it under the old-school technology where the colors weren’t as good and all that. We were under a new administration with a different way of treating people. We got time clocks; we had to punch in a number to get into the office so that we would work the set number of hours. We had all of a sudden gone from this free-form company to an authoritarian one: “This is the hours you have to work. Programmers will work over here and artists will work over there, and only their bosses can talk to one another; you can’t talk to the artist that’s doing the art.”

Some of the Coles’ frustrations with the new regime came out in the game they were making. Have a close look at the name of Raseir, an oppressed city — sort of an Arabian Nights version of Nineteen Eighty-Four — where the climax of the game occurs.

Scheduled for a late September release, exactly one year after the release of the first Hero’s Quest, the second game shipped two months behind schedule, coming out far too close to Christmas to have a prayer of fully capitalizing on the holiday rush. And then when it did finally ship, it didn’t even ship as Hero’s Quest II.

Quest for Glory II

In 1989, the same year that Sierra had released the first Hero’s Quest, the British division of the multi-national toys and games giant Milton Bradley had released HeroQuest, a sort of board-gameified version of Dungeons & Dragons. They managed to register their trademark on the name for Europe shortly before Sierra registered theirs for Europe and North America. After the board game turned into a big European success, Milton Bradley elected to bring it to North America the following year, whilst also entering into talks with some British developers about turning it into a computer game. Clearly something had to be done about the name conflict, and thanks to their having registered the trademark first, Milton Bradley believed they had the upper hand. When the bigger company’s lawyers came calling, Sierra, unwilling to get entangled in an expensive lawsuit they probably couldn’t win anyway, signed a settlement that not only required them to change the name of their series but also stipulated that they couldn’t even continue using the old name long enough to properly advertise that “Hero’s Quest has a new name!” Thus when Hero’s Quest and its nearly finished sequel were hastily rechristened Quest for Glory, the event was announced only via a single four-sentence press release.

So, a veritable perfect storm of circumstances had conspired to undermine the commercial prospects of the newly rechristened Quest for Glory II: Trial by Fire. Sierra’s last parser-driven 16-color game, it was going head to head with the technological wonder that was King’s Quest V — another fantasy game to boot. Due to its late release, it lost the chance to pick up even many or most of the scraps King’s Quest V might have left it. And finally, the name change meant that the very idea of a Quest for Glory II struck most Sierra fans as a complete non sequitur; they had no idea what game it was allegedly a sequel to. Under the circumstances, it’s remarkable that Quest for Glory II performed as well as it did. It sold an estimated 110,000 to 120,000 copies — not quite the hit its predecessor had been, but not quite the flop one could so easily imagine the newer game becoming under the circumstances either. Sales were still strong enough that this eminently worthy series was allowed to continue.


As a finished game, Quest for Glory II betrays relatively little sign of its difficult gestation, even if there are perhaps just a few more rough edges in comparison to its predecessor. The most common complaint is that the much more intricate and linear plot this time out can lead to a fair amount of time spent waiting for the next plot event to fire, with few concrete goals to achieve in the meantime. This syndrome can especially afflict those players who’ve elected to transfer in an established character from the first game, and thus have little need for the grinding with which newbies are likely to occupy themselves. At the same time, though, the new emphasis on plot isn’t entirely unwelcome in light of the almost complete lack of same in the first game, while the setting this time out of a desert land drawn from the Arabian Nights is even more interesting than was that of the previous game. The leisurely pace can make Quest for Glory II feel like a sort of vacation simulator, a relaxing computerized holiday spent chatting with the locals, sampling the cuisine, enjoying belly dances and poetry readings, and shopping in the bazaars. (Indeed, your first challenge in the game is one all too familiar to every tourist in a new land: converting the money you brought with you from Spielburg into the local currency.) I’ve actually heard Quest for Glory II described by a fair number of players as their favorite in the entire series. If push comes to shove, I’d probably have to say that I slightly prefer the first game, but I wouldn’t want to be without either of them. Certainly Quest for Glory II is about as fine a swan song for the era of parser-driven Sierra graphical adventures as one could possibly hope for.

The combat system used in the Quest for Glory games would change constantly from game to game. The one found in the second game is a little more responsive and playable than its predecessor.

Had more adventure-game designers at Sierra and elsewhere followed the Coles’ lead, the history of the genre might have been played out quite differently. As it is, though, we’ll have to be content with the games we do have. I’d hugely encourage any of you who haven’t played the Quest for Glory games to give them a shot — preferably in order, transferring the same character from game to game, just as the Coles ideally intended it. Thanks to our friends at GOG.com, they’re available for purchase today in a painless one-click install for modern systems. They remain as funny, likable, and, yes, generous as ever.

We’ll be returning to the Coles in due course to tell the rest of their series’s story. Next time, though, we’ll turn our attention to the Apple Macintosh, a platform we’ve been neglecting a bit of late, to see how it was faring as the 1990s were fast approaching.


(Sources: Questbusters of December 1989; Computer Gaming World of September 1990 and April 1991; Sierra’s newsletters dated Autumn 1989, Spring 1990, Summer 1991, Spring 1992, and Autumn 1992; Antic of August 1986; STart of Summer 1986; Dragon of October 1991; press releases and annual reports found in the Sierra archive at the Strong Museum of Play. Online sources include Matt Chats 173 and 174; Lori Ann Cole’s interview with Adventure Classic Gaming; Lori and Corey’s appearance on the Space Venture podcast; and various entries on the Coles’ own blog. But my biggest resource of all has been the Coles themselves, who took the time to patiently answer my many nit-picky questions at length. Thank you, Corey and Lori! And finally, courtesy of Corey and Lori, a little bonus for the truly dedicated among you who have read this far: some pages from an issue of their newsletter The Spell Book, including Corey’s take on “Fantasy Gaming Via Computer” circa summer 1982.)

 

Posted on September 9, 2016 in Digital Antiquaria, Interactive Fiction

 


Sierra Gets Creative

Coming out of Sierra On-Line’s 1984 near-death experience, Ken Williams made a prognostication from which he would never waver: that the real future of home as well as business computing lay with the open, widely cloned hardware architecture of IBM’s computers, running Microsoft’s operating systems. He therefore established and nurtured a close relationship with Radio Shack, whose Tandy 1000 was by far the most consumer-friendly of the mid-1980s clones, and settled down to wait for the winds of the industry as a whole to start to blow his way. But that wait turned into a much longer one than he had ever anticipated. As each new Christmas approached, Ken predicted that this one must be the one where the winds would change, only to witness another holiday season dominated by the cheap and colorful Commodore 64, leaving the MS-DOS machines as relative afterthoughts.

MS-DOS was, mind you, a slowly growing afterthought, one on which Sierra was able to feed surprisingly well. Their unique relationship with Radio Shack in particular made them the envy of other publishers, allowing them as it did to sell their games almost without competition in thousands of stores nationwide. That strategic advantage among others helped Sierra to grow from $4.7 million in gross sales in the fiscal year ending on March 31, 1986, to almost $7 million the following fiscal year.

This sales history from a Sierra prospectus illustrates just how dramatically the company’s customer base changed when Ken Williams made the decision to abandon what he dismissed as the “toy computers” to concentrate on the Apple II and especially the MS-DOS markets.

Still, such incrementalism was hardly a natural fit for Ken Williams’s personality; he was always an entrepreneur after the big gains. It was excruciating waiting for the 8-bit generation of machines to just die already. When IBM debuted their PS/2 line in 1987, Ken, seeing the new machines’ lovely MCGA and VGA graphics and user-friendly mouse support, felt a bit like Noah must have when the first drops of rain finally began to fall. Yes, the machines were ridiculously expensive as propositions for the home, but prior experience said that, given time, their technology would trickle down into more affordable price brackets. If nothing else, the PS/2 line was at long last a start.

Indeed, Ken was so encouraged by the PS/2 line that he decided to pull the trigger on a fraught decision faced by every growing young company: that of whether and when to go public. October of 1987, he judged, would be the right moment, just as Sierra’s lineup of new software for Christmas began to hit the streets. After a frenzy of preparation, all was ready — but then the very week the IPO was to take place opened with Black Monday, the largest single-day stock-market collapse since the mother of all stock-market collapses back in 1929. Sierra quietly abandoned their plans, to little notice from prospective investors who suddenly had much bigger fish to fry.

Sierra had gotten very lucky, and in more ways than one. Had Black Monday been, say, a Black Friday instead, their newly issued shares would inevitably have gotten caught in its undertow, with potentially disastrous results. But even absent those concerns, going public in 1987 was probably jumping the gun just a little, banking on an MS-DOS market that wasn’t quite there yet. This reality was abundantly demonstrated by that Christmas of 1987, the latest to defy Ken’s predictions by voting for the Commodore 64 over MS-DOS — although by this time Commodore’s evergreen was in turn being overshadowed by a new quantity from Japan called the Nintendo Entertainment System.

In fact, the Christmas of 1987 would prove the last of the 64’s strong American holiday seasons. The stars were aligning to make 1988 through 1990 the breakthrough years for both Sierra and the MS-DOS platform to which Ken was so obstinately determined to keep hitching his wagon. In the meantime, the fiscal year ending on March 31, 1988 was nothing to sneeze at in its own right: thanks largely to the new hit Leisure Suit Larry in the Land of the Lounge Lizards and the perennially strong sales of all three extant King’s Quest games, gross sales topped $12 million, enough to satisfy even a greedy entrepreneur like Ken.

That year Sierra broke ground on a new office complex close to their old one in picturesque Oakhurst, California, “at the southern gate of Yosemite National Park,” as their press put it. The new building was made cheaply in comparison to the old one: 40,000 square feet of pre-fab metal that has been variously described as resembling either a warehouse or an aircraft hangar, both inside and out. It would prove a far less pleasant place to work than the lovely redwood building Sierra now abandoned, but that, it seemed, was the price of progress. (Ken claimed to have learned from a survey that his employees actually preferred a cheap building in the name of saving money in order to grow the company in more important ways, but there was considerable skepticism about the veracity of that claim among those selfsame employees.)

To accompany an IPO do-over they had tentatively planned for late in the year, Sierra would have some impressive new gaming technology as well as their impressive — or at least much bigger — new building to put on display. Back in 1986, Ken had made his first trip to Japan, where he’d been entranced by a domestic line of computers from NEC called the PC-9801 series. Although these machines were built around Intel processors and were capable of running MS-DOS, they weren’t hardware-compatible with the IBM standard, a situation that left them much more room for hardware innovation than that allowed to the American clonesters. In particular, the need to display the Japanese Kanji script had pushed their display technology far beyond that of their American counterparts. The top-of-the-line PC-9801VX, with a palette of 4096 colors, 1 MB of memory, and a 10 MHz 80286 processor, could rival the Commodore Amiga as a gaming computer. And, best of all, the Japanese accepted the NEC machines in this application; there was a thriving market in games for them. Ken saw in these Japanese machines a window on the future of the American MS-DOS machines, tangible proof of what he’d been saying already for so long about the potential of the IBM/Intel/Microsoft standard to become the dominant architecture in homes as well as businesses. Ken returned from Japan determined that Sierra must push their software forward to meet this coming hardware. Out of this epiphany was born the project to make the Sierra Creative Interpreter (SCI), the successor to the Adventure Game Interpreter (AGI) that had been used to build all of Sierra’s current lineup of adventure games.

On the surface, the specifications of the first version of SCI hardly overwhelm. The standard display resolution of the engine was doubled, from a rather horrid 160 X 200 to a more reasonable (for the era) 320 X 200, with better support being added for mice and more complex animation possibilities being baked in. Notably, the first version of SCI did not support the impressive but expensive new MCGA and VGA graphics standards; even the technically aggressive Ken Williams had to agree that it was just too soon to be worth the investment.

Under the hood, however, the changes were far more extensive than they might appear on the surface. Jeff Stephenson, Sierra’s longtime technology guru, had created AGI on IBM’s dime and IBM’s timetable, in order to implement the original King’s Quest on the ill-fated PCjr. It was a closed and thus a limited system, albeit one that had proved far more flexible and served Sierra far better and longer than anyone had anticipated at the time of its creation. Still, Stephenson envisioned SCI as something very different from its predecessor: a more open-ended, modular system that could grow alongside the hardware it targeted, supporting ever denser and more colorful displays, ever better sound, eventually entirely new technologies like CD-ROM. As indicated by its name, which dropped any specific mention of adventure games, SCI was intended to be a universal engine potentially applicable to many gaming genres. To facilitate such ambitions, Stephenson  completely rewrote the language used for programming the engine, going from a simplistically cryptic scripting language to a full-fledged modern programming language reminiscent of C++, incorporating all the latest thinking about object-oriented coding.
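
To make the contrast with AGI’s flat scripting a little more concrete, here is a minimal illustrative sketch of the object-oriented idea described above, written in modern C++ as a stand-in. It is emphatically not actual SCI code: the class names, the room name, and the whole structure are hypothetical, invented only to show how an engine can drive generic objects while each individual game script overrides just the behavior it needs.

#include <iostream>
#include <memory>
#include <string>
#include <vector>

// A toy illustration of the object-oriented idea: shared behavior lives in a
// base class, and each game "script" only describes how it differs.
struct Actor {
    std::string name;
    int x = 0, y = 0;
    explicit Actor(std::string n) : name(std::move(n)) {}
    virtual ~Actor() = default;
    // Called once per game cycle; the default is to do nothing.
    virtual void doit() {}
};

// A wandering monster overrides only the one method it cares about.
struct Wanderer : Actor {
    using Actor::Actor;
    void doit() override {
        x += 1;  // drift east one pixel per cycle
        std::cout << name << " wanders to x=" << x << "\n";
    }
};

struct Room {
    std::vector<std::unique_ptr<Actor>> cast;
    void cycle() {
        // The engine drives every object the same way, never needing to know
        // the concrete types a particular game has defined.
        for (auto& a : cast) a->doit();
    }
};

int main() {
    Room spielburg;  // hypothetical example name, not taken from any real SCI script
    spielburg.cast.push_back(std::make_unique<Actor>("signpost"));
    spielburg.cast.push_back(std::make_unique<Wanderer>("goblin"));
    for (int i = 0; i < 3; ++i) spielburg.cycle();
}

Even in this toy version the appeal for a studio like Sierra is easy to see: new kinds of objects can be layered onto the same core loop without touching it, which is broadly the sort of open-endedness described above.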

Forward-thinking though it was, SCI proved a hard sell to Sierra’s little cadre of game-makers, most of whom lacked the grounding in computer science enjoyed by Jeff Stephenson; they would have been perfectly happy to stick with their simple AGI scripts, thank you very much. But time would show Stephenson to have been correct in designing SCI for the future. The SCI engine, steadily evolving all the while, would last for the remainder of Sierra’s life as an independent company, the technological bedrock for dozens of games to come.

Sierra planned to release their first three SCI-based adventure games in time for Christmas 1988 and that planned-for second-chance IPO: King’s Quest IV, Leisure Suit Larry II, and Police Quest II, with Space Quest III to follow early in 1989. (This lineup says much about Ken Williams’s sequel-obsessed marketing strategy. As an annual report from the period puts it, “Sierra attempts to exploit and extend the effective market life of a successful product by creating sequels to that product and introducing them at planned intervals, thereby stimulating interest in both the sequels and the original product.”) Of this group, King’s Quest IV was always planned as the real showcase for Sierra’s evolving technology, the game for which they would really pull out all the stops — understandably so given that, despite some recent challenges from one (Leisure Suit) Larry Laffer, Roberta Williams’s series of family-friendly fairy-tale adventures remained the most popular games in the Sierra catalog. Indeed, King’s Quest IV marked the beginning of a new, more proactive stance on Sierra’s part when it came to turning the still largely bland beige world of the MS-DOS machines into the new standard for computer gaming. Simply put, with MS-DOS’s consumer uptake threatening to stall again in the wake of the high prices and poor reception of the PS/2 line, Sierra decided to get out and push.

King’s Quest IV’s most notable shove to the industry’s backside began almost accidentally, with one of Ken’s crazy ideas. He’d decided he’d like to have a real, Hollywood-style soundtrack in this latest King’s Quest, something to emphasize Sierra’s increasingly cinematic approach to adventure gaming in general. Further, he’d love it if said soundtrack could be written by a real Hollywood composer. Never reluctant to liaise with Tinseltown — Sierra had eagerly jumped into relationships with the likes of Jim Henson and Disney during their first heyday, years before — he pulled out his old Rolodex and started dialing agents. Most never bothered to return his calls, but at last one of them arranged a meeting with William Goldstein. A former Motown producer, a Grammy-nominated composer for a number of films, and an Emmy-nominated former musical director for the television series Fame, Goldstein also nurtured an interest in electronic music, having worked on several albums of same. He found the idea of writing music for a computer game immediately intriguing. He and Ken agreed that what they wanted for King’s Quest IV was not merely a few themes to loop in the background but a full-fledged musical score, arguably the first such ever to be written for a computer game. As Goldstein explained it to Ken, “the purpose of a score is to evoke emotion, not to be hummed. Sometimes the score consists only of some chord being held and slowly becoming louder in order to create a feeling of tenseness. In creating a score, the instrument(s) it is composed for can be as important as the score itself.”

And therein lay the rub. When Ken demonstrated for him the primitive bleeps and bloops an IBM clone’s speaker was capable of playing, Goldstein pronounced writing a score for that blunt instrument to be equivalent to trying to shoot flies with a shotgun. But then he had an idea. Thanks to his work in other forms of electronic music, Goldstein enjoyed a relationship with the Roland Corporation, a longstanding Japanese maker of synthesizers. Just recently, Roland had released a gadget called the MT-32, a nine-channel synthesizer that plugged into an ordinary IBM-compatible computer. Maybe, Goldstein mused, he could write his score for the MT-32.

At first blush, it seemed a very problematic proposal. The MT-32, which typically went for $550 or more, was hardly an everyday piece of kit; it was aimed at the professional or at least the very serious amateur musician, not at gamers. Yet Ken decided that, faced with a classic chicken-and-egg situation, he needed to do something to move the needle on the deplorable state of IBM-compatible sound hardware. A showpiece game, like King’s Quest IV might become, could show the market what it had been missing and generate demand that might lead to more affordable audio solutions. And so Ken set Goldstein to work on the MT-32.

At the Summer Consumer Electronics Show in June of 1988, Sierra gave a series of invitation-only audiences a sneak preview of King’s Quest IV in the form of a nearly ten-minute opening “movie” — people would soon be saying “cut scene” — enhanced by Goldstein’s score. Sierra legend has it that it moved at least one woman to tears. “I feel bad even saying it,” remarks Sierra’s marketing director (and Ken Williams’s little brother) John Williams, “but it was then that we knew we had a winner.”


Such an extreme reaction may be difficult to fathom today; even in King’s Quest IV’s own time, it’s hard to imagine Amiga owners used to, say, Cinemaware games being quite so awed as this one lady apparently was. But nevertheless, King’s Quest IV and its first real soundtrack score stand as a landmark moment in the evolution of computer games. The game did indeed do much to break the chicken-and-egg conundrum afflicting MS-DOS audio. Shortly after Roland had released the MT-32, a Canadian company called Ad Lib had released a “Personal Computer Music System” of their own at a price of just $245. It left much to be desired in comparison to the MT-32, but it was certainly worlds better than a simple beeper; Sierra duly added Ad Lib support to King’s Quest IV and all the other SCI games before they shipped. And for Space Quest III, they enlisted the services of another sort of star composer: Bob Siebenberg, drummer of the rock band Supertramp. Thanks in large degree to Sierra’s own determined intervention, in this area at least their chosen platform was becoming steadily more desirable as a game machine.

But King’s Quest IV also advanced the state of the art of adventure gaming in other, less tech-centric ways. As evidenced by its prominent subtitle The Perils of Rosella, its protagonist is female. Hard as it may seem to believe today, when more adventure games than not seem to star women, this fact made King’s Quest IV almost unique in its day; Infocom’s commercially unsuccessful but artistically brilliant interactive romance novel Plundered Hearts is just about the only point of comparison that leaps to mind. Roberta confessed to no small trepidation over the choice at the time of King’s Quest IV’s release: “I know it will be just fine with the women and girls who play the game, but how it will go over with some of the men, I don’t know.” She also admitted to some ambivalence about her choice in purely practical terms, stemming from differing expectations that are embedded so deeply in our culture that they’re often hard to spot at all until we’re confronted with them.

I have a lot of deaths in my games. My characters always die from falling or being thrown into a cauldron or something. And I always like to have them die in a funny way. It didn’t seem right; I don’t know why. I guess it’s because she’s a girl, and you don’t think a girl should be treated that way. But I got used to that too, until there was one death I had to deal with last week that I was real uncomfortable with. Was it throwing her in the cauldron? I’m not sure, but it was some death that seemed particularly unfeminine, not right.

And girls die differently. I discovered lots of these things, like the way she falls, which has to be different from the way a guy falls. It’s been an experience. And I think that men will find it fun and different because it’s from a different point of view.

One could wish that Roberta’s ambivalence about killing her new female heroine at every possible juncture had led her to consider the wisdom of indulging in all that indiscriminate player-killing at all, but such was not to be. In the end, the most surprising thing about King’s Quest IV’s female protagonist would be how little remarked upon it was by players. Sounding almost disappointed, Roberta noted a few months after the game’s release that “I personally have not heard much about it.” “I thought it would get a lot of attention,” she went on. “It has gotten some, but nothing really dramatic”; “very few” of the letters she received about the game had anything at all to say about the female heroine.

But then, that non-reaction could of course be taken as a sign of progress in itself. One of the worthiest aspects of Sierra’s determination to turn computer gaming into a truly mainstream form of entertainment was their conviction that doing so must entail reaching far beyond the typical teenage-boy videogame demographic. Doubtless thanks to the relative paucity of hardcore action games and military simulations in their catalog as well as to their having a woman as their star designer, Sierra was always well ahead of most of the rest of their industry when it came to the diversity of their customer base. At a time when female players of other publishers’ games seldom got out of the single digits in percentage terms, Sierra could boast that fully one in four of their players was a woman or a girl; of other 1980s computer-game publishers, only Infocom could boast remotely comparable numbers. In the case of Roberta’s King’s Quest games, the number of female players rose as high as 40 percent, while women and girls wrote more than half of Roberta’s voluminous fan mail.

Sierra’s strides seem all the more remarkable in comparison to the benighted attitudes held by many other publishers. Mediagenic’s Bruce Davis, for instance, busy as usual formulating the modern caricature of the soulless videogame executive, declared vehemently that women and girls were “not a viable market” for games because of “profound” psychological differences that would always lead them to “shun” games. (One wonders what he makes of the modern gaming scene, vast swathes of which are positively dominated by female players.) The role model that Roberta Williams in particular became for many girls interested in games and/or computers should never be overlooked or minimized. Even as of this writing, eighteen years after Roberta published her last adventure game, John Williams tells me how people of a certain age “go crazy” upon learning he’s her brother-in-law, how he still gets at least two requests per week to put people in touch with her for an autograph, how there was an odd surge for a while there of newborn girls named Rosella and Roberta.

All of this only makes it tougher to reckon with the fact that Roberta’s actual games were so consistently poor in terms of fundamental design. King’s Quest IV is a particular lowlight in her checkered career, boasting some unfair howlers as bad as anything found in her legendarily insoluble Time Zone. At one point, you have to work your way through a horrendous sequence of random-seeming actions to wind up visiting an island, something you can only do one time. On this island is a certain magic bridle you’re going to need later in the game. But, incomprehensibly, the game not only doesn’t ever hint that the bridle may be present on the island, it literally refuses to show it to you even once you arrive there. The only way to find it is to walk around the island step by step, typing “look” again and again while facing in different directions, until you discover those pixels that should by all rights have depicted the bridle but for some reason don’t. Throw in climbing sequences that send you plummeting to your death if you move one pixel too far in the wrong direction, a brutal time limit, and plenty of other potential dead ends almost as heartless as the one just described, and King’s Quest IV becomes as unfair, unfun, frustrating, and downright torturous as any adventure game I’ve ever seen. It’s so bad that, rather than being dismissible as merely a disappointing game, it seems like a fundamentally broken game, thereby raising a question of ethics. Did a player who had just paid $40 for the game not deserve a product that was in fact a soluble adventure game? Even the trade press of King’s Quest IV’s day, when not glorying over the higher-resolution graphics and especially that incredible soundtrack, had to acknowledge that the actual game underneath it all had some problems. Scorpia, the respected voice of adventure gaming for Computer Gaming World, filled her article on the game with adjectives like “exasperating,” “irritating,” “tedious,” and “boring,” before concluding that “it’s a matter of personal taste” — about as close to an outright pan as most magazine reviewers dared get in those days.

Roberta Williams, an example of that rare species of adventure-game designers who don’t actually play adventure games, likely had little idea just how torturous an experience her games actually were. Taken as a whole, Roberta’s consistent failings as a designer seem to stem from that inability to place herself in her player’s shoes, and from her own apparent lack of interest in improving upon her previous works in any terms but those of their surface bells and whistles. That said, however, King’s Quest IV’s unusually extreme failings, even in terms of a Roberta Williams design, quite obviously stemmed from the frenzied circumstances of its creation as well.

I should note before detailing those circumstances that, by the time of King’s Quest IV, Sierra was finally beginning to change some of the processes that had spawned so many bad adventure games during the company’s earlier years. By 1988, they at last had the beginnings of a real quality-assurance process, dedicating three employees full-time to thrashing away at their games and other software. But, welcome as it was to see testing happening in any form, Sierra’s conception of same focused on the trees rather than the forest. The testers spent their time chasing outright bugs, glitches, and typos, but feedback on more holistic aspects of design wasn’t really part of their brief. In other words, they might spend a great deal of time ensuring that a given sudden death worked correctly without it ever even occurring to them to think about whether that sudden death really needed to be there at all.

In the case of King’s Quest IV, however, even that circumscribed testing process broke down due to the pressure of external events. By the spring of 1988, Roberta had given her design for the game to the team of two artists and two programmers — all recent hires, more fruit of Sierra’s steady expansion — for implementation. Then, with IPO Attempt 2.0 now planned for October of that year and lots of other projects on the boil as well, nobody in management paid King’s Quest IV a whole lot more attention for quite some time, simply assuming that no news from its development team was good news and that it was coming along as expected. Al Lowe, who by the end of that summer had already finished designing and coding his Leisure Suit Larry sequel that was scheduled to ship shortly after King’s Quest IV, picks up the story from here:

King’s Quest IV was going to be the flagship product for the company when we went public. So, Ken and the money guys are busy going around the country, doing their dog-and-pony shows to Wall Street investors, saying, “This is a great company, you’re going to want to buy in, buy lots of stock. We’ve got this great product coming out that’s going to be the hit of the Christmas season.”

Finally, about the end of August, somebody said, “Has anybody looked at that game that’s supposed to be done in a month, that we’re supposed to ship in October? How’s it doing?” They went and looked at it, and the two programmers were lost. They had no clue. They had written a lot of code, but a lot of it was buggy, a lot of it didn’t take proper precautions. One of the big rules of programming is to never allow input at a time you don’t want it, but they had none of that. Everything was wide open. You could break it with a sneeze.

So, they called me and asked if I could come up that weekend — it was Labor Day weekend, Saturday — to look at the game. I did, and said, “Oh, my God, we’re in trouble.” I had a lot of stock options, and was hoping for a successful IPO myself. When I saw this, I said, “We’re in terrible shape. This isn’t going to make it.”

So, we devised a strategy over the weekend to bring every programmer in the company together on Labor Day Monday for a meeting. We said, “All hands are going to work on this title for the next month, and we’re going to finish this game in one month’s time because we’ve got to have it done by the end of September.” Do you remember the phrase from The Godfather, “We’ll go to the mattresses?” That’s what we did; we went to the mattresses. We all moved into the Sierra building. Everybody worked. They brought us food; they did our laundry; they got us hotel rooms. We basically just lived and ate and worked there, and when we needed to sleep we’d go to this hotel nearby. Then we’d get back up and do it again.

I took the lead on the project. I broke the game up into areas, and we assigned a programmer to each. As they finished their code, we had the whole company testing it. We’d distribute bug reports and talk about progress each morning. And by God, by the end of the month we had a game. It wasn’t perfect — it was a little buggy — but at least we had a game we could send out. And when we went public, it was a successful IPO.

Entertaining as this war story is, especially when told by a natural raconteur like Al Lowe, it could hardly result in anything but a bad adventure game. In a desperate flurry like this one, the first thing to fall by the wayside must be any real thoughtfulness about a game’s design or the player’s experience therein.

But despite its many design failings, King’s Quest IV did indeed deliver in spades as the discussion piece and IPO kick-starter it was intended to be. Sierra’s own promotional copy wasn’t shy about slathering on the purple prose in making the game’s case as a technical and aesthetic breakthrough. (In a first and only for Sierra, an AGI version of the game was also made for older systems, but it garnered little press interest and few sales in comparison to the “real” SCI version.)

King’s Quest IV sets a landmark in computer gaming with a new development system that transcends existing standards of computer graphics, sound, and animation. Powerfully dramatic, King’s Quest IV evokes emotion like no other computer game with unique combinations of lifelike animated personalities, beautiful landscapes, and soul-stirring music. Sierra has recreated the universe of King’s Quest to build a world that one moment will pull at your heartstrings, the next moment place terror in your heart.

Leveraging their best promotional asset, Sierra sent Roberta Williams, looking pretty, wholesome, and personable as ever, on a sort of “book tour” to software stores and media outlets across the country, signing autographs for long lines of fans everywhere she went. No one had attempted anything quite like this since the heyday of Trip Hawkins’s electronic artists/rock stars, and never as successfully as this. The proof was in the pudding: King’s Quest IV sold 100,000 copies in its first two weeks and received heaps of press coverage at a time when coverage of computer games in general was all but nonexistent in the Nintendo-obsessed mainstream media. Sales of the game may ultimately have reached as high as 500,000 copies. The IPO went off without a hitch this time on October 6, 1988: 1.4 million shares of common stock were issued at an opening price of $9 per share. Within a year, the stock would be flirting with a price of $20 per share.

In all their promotional efforts for King’s Quest IV and the rest of that first batch of SCI games, Sierra placed special emphasis on sound, the area where Ken Williams had chosen to try most aggressively to push the hardware forward. The relationship between Sierra and Roland grew so close that Thomas Beckmen, president of the latter company’s American division, joined Sierra’s board. But anyone from any of Roland’s rivals who feared that this relationship would lock them out needn’t have worried. Recognizing that even most purchasers of what they loved to describe as their “premium” products weren’t likely to splash out more than $500 on a high-end Roland synthesizer, Sierra pushed the cheaper Ad Lib alternative equally hard. In 1989, when a Singapore company called Creative Technology entered the fray with a cheaper knock-off of the Ad Lib design which they called the Game Blaster, Sierra took it to their bosom as well. (In the end, it was Creative who would be the big winners in the sound-card wars. Their Sound Blaster line, the successor to the Game Blaster, would become the ubiquitous standard for PC gaming through much of the 1990s.) Ken Williams went so far as to compare the latest Sierra games with the first “talkies” to invade the world of silent cinema. Given the sound that users of computers like the Amiga had been enjoying well before Sierra jumped on the bandwagon, this was perhaps a stretch, but it certainly made for good copy.

Ad Lib and Roland advertisements

As part of their aggressive push to get sound cards into the machines of their customers, Sierra started selling the products of all three of the biggest rival makers of same directly through their own product catalogs.

Thanks to his own company’s efforts as much as those of anyone, Ken Williams was able to declare at the beginning of 1990 that during the previous year MS-DOS had become “the standard for entertainment software”; the cloudburst this latter-day Noah had been anticipating for so long had come at last. In a down year for the computer-game industry as a whole, which was suffering greatly under the Nintendo onslaught, MS-DOS and the Amiga had been the only platforms not to suffer a decline, with the former’s market share growing from 44 percent to 55 percent. Ken’s prediction that MS-DOS would go from being the majority platform to the absolutely dominant one as 1990 wore on would prove correct.

My guess is that as software publishers plan out their new year’s product schedules, versions of newer titles for machines which are in decline will either be shelved or delayed. Don’t be surprised if companies who traditionally have been strong Apple or Commodore publishers suddenly ship first on MS-DOS. Don’t be surprised if many new titles come out ONLY for MS-DOS next Christmas.

Ken’s emerging vision for Sierra saw his company as “part of the entertainment industry, not the computer industry.” An inevitable corollary to that vision, at least to Ken’s way of seeing things, was a focus on the “media” part of interactive media. In that spirit, he had hired in July of 1989 one Bill Davis, a director of more than 150 animated television commercials, for the newly created position of Sierra’s Creative Director. Davis introduced story-boarding and other new processes redolent of Hollywood, adding another largely welcome layer of systemization to Sierra’s traditionally laissez-faire approach to game development. But, tellingly, he had no experience working with games as games, and nothing much to say about the designs that lay underneath the surface of Sierra’s creations; these remained as hit-and-miss as ever.

The period between 1988 and 1998 or so — the heyday of MS-DOS gaming, before Windows 95/98 and its DirectX gaming layer changed the environment yet again — was one of enormous ferment in computer graphics and sound, when games could commercially thrive on surface sizzle alone. Ken Williams proved more adept at riding this wave than just about anyone else, hewing stolidly as ever to the ten-foot rule he’d formulated during his company’s earliest days: “If someone says WOW when they see the screen from ten feet away, you have them sold.” Sierra, like much of the rest of the industry, took all the wrong lessons from the many bad but pretty games that were so successful during this period, concluding that design could largely be left to take care of itself as long as a game looked exciting.

That Sierra games like King’s Quest IV did manage to be so successful despite their obvious underlying problems of design had much to do with the heady, unjaded times in which they were made — times in which a new piece of “bragware” for showing off one’s new hardware to best effect was worth a substantial price of admission quite apart from its value as a playable game. It also had something to do with Sierra’s masterful fan relations. The company projected an image as friendly and welcoming as their actual games were often unfriendly and obtuse. For instance, in another idea Ken nicked from Hollywood, by 1990 Sierra was offering free daily “studio tours” of their offices, complete with a slick pre-recorded “video welcome” from Roberta Williams herself, to any fan who happened to show up; for many a young fan, a visit to Sierra became the highlight of a family vacation to Yosemite. And of course the success of the King’s Quest games in particular had more than a little to do with the image of Roberta Williams, and the fact that the games were marketed almost as edutainment wares, drawing in a young, patient, and forgiving fan base who may not have fully comprehended that a King’s Quest was, at least theoretically, a game that could be won.

Still, these factors wouldn’t be enough to counter-balance fundamental issues of design forever. Well before the end of the 1990s, both Sierra and the adventure-gaming genre with which they would always be most identified would pay a steep price for too often making design an afterthought. Players, tired of being abused, bored with the lack of innovation in adventure-game design, and no longer quite so easy to wow with audiovisual flash alone, would begin to drift away; this trickle would become a flood which left the adventure genre commercially high and dry.

But all of that was still far in the future as of 1990. For now, Sierra was at the forefront of what they believed to be an emerging new form of mass entertainment, not quite a game, not quite a movie. Gross sales had risen to $21.1 million for the fiscal year ending March 31, 1989, then $29.1 million the following fiscal year. In 1990, they expanded their reach through the acquisition of Dynamix, a six-year-old Oregon-based development house with a rather odd mix of military simulations — after all, Sierra did want men as well as women to continue buying their products — and audio-visually rich if interactively problematic “interactive movies” in their portfolio. Sierra’s years in the MS-DOS wilderness were over; now that same MS-DOS represented the mainstream, soon virtually the only stream of American computer gaming. Some very, very good years lay ahead in commercial terms. And, it must be said, by no means would all of Sierra’s games be failures in terms of design; some talented and motivated designers would soon be using the company’s SCI technology to make interactive magic. So, having given poor King’s Quest IV such a hard time today, next time I’ll be kinder to a couple of other Sierra games that I really don’t like.

Nope… I love them.

(Sources: Computer Gaming World of December 1988; Byte of September 1987; Sierra’s newsletters dated Spring 1988, Winter 1988, Spring 1989, Autumn 1989, Spring 1990, Summer 1990; Sierra’s 10th Anniversary promotional brochure; press releases and annual reports found in the Sierra archive at the Strong Museum of Play. Much of this article is also drawn from personal email correspondence with John Williams and Corey Cole. And, last but far from least, Ken Gagne also shared with me the full audio of an interview he conducted with Al Lowe for Juiced.GS magazine. My huge thanks to John, Corey, and Ken!)

 
 

Tags: , ,

IBM’s New Flavor

The PS/2 lineup

IBM’s greatest triumph was inextricably linked with what by 1986 was turning into their biggest problem. Following its introduction five years before, the IBM PC had remade the face of corporate computing in its image, legitimizing personal computing in the eyes of the Fortune 500 and all those smaller companies who dreamed of someday joining their ranks. The ecosystem that surrounded the IBM PC and its successors was now worth countless billions, the greatest story of American business success of them all to play out during Ronald Reagan’s storied Morning in America.

The problem, at least as IBM and many of their worried stockholders perceived it, was that they now seemed on the verge of losing control of the very standard they had created. A combination of the decisions that had allowed the original IBM PC to become a standard in the first place — its simple, workmanlike design that utilized only off-the-shelf components; the scrupulously thorough documentation of said design; the decision to outsource the machine’s operating system to Microsoft, a third party all too willing to license the same operating system to other parties as well — had led to a thriving market in so-called “clone” machines whose combined revenues now far exceeded IBM’s personal-computer sales. IBM believed that the clonesters were lifting billions out of their pockets every year, even as they saw their own sales, which had broken record after record in the first few years following the IBM PC’s launch, beginning to show signs of stagnation.

Compaq of Houston, Texas, the most aggressive and innovative of the clonesters, had first begun to collect for themselves a reputation to rival IBM’s own with their very first product back in 1983, a portable — or, perhaps better said, “luggable” — all-in-one IBM-compatible. The Compaq Portable had forced IBM for the first time to play catch-up with a personal-computing rival, rushing to market a luggable of their own. To make matters worse, the IBM version of portable computing had proved far less practical than the Compaq, as many a reviewer wasn’t shy about pointing out.

Now, in 1986, Compaq threatened to wrest away from IBM the mantle of technological leadership via a machine that represented a more fundamental advance than a new form factor. After hearing that IBM didn’t have any immediate plans to release a machine built around the Intel 80386, a new 32-bit processor that was sending waves of excitement rippling through the industry, Compaq decided to push ahead with a 386-based machine of their own — right now, this very year. The public launch of the Compaq Deskpro 386 on September 9, 1986 — almost exactly five years after the debut of the original IBM PC — was another watershed moment, the first time one of the clonesters had released a machine more powerful than anything in IBM’s stable. Compaq’s CEO Rod Canion, never a shrinking violet under any circumstances, outdid himself at the launch, declaring the Deskpro 386 “the third generation of the personal-computer revolution” after the Apple II and the IBM PC, thus implicitly placing his own Compaq on a par with those two storied companies.

The clone market was getting so big that there seemed a danger that the clones wouldn’t be dismissed under that selfsame moniker much longer. People in the business world were beginning to replace the phrase “IBM clone” with phrases like “the MS-DOS standard” or “the Intel standard,” giving no credit to the company that had really created that standard. As was well attested by their checkered history of antitrust investigations and allegations of unfair competitive practices, IBM had never been known as a bastion of corporate generosity. It may not be exaggerating the case to say that they felt themselves to have a moral right to the PC standard they’d created, a right that encompassed not just an acknowledgement that said standard was still the IBM standard but also the ability to continue to steer every aspect of the further development of that standard. And that right, as they saw it, should also encompass — and this was the sticking point that really irked — their fair share of all those billions that all those other companies were making from IBM’s standard.

In addition to furnishing what they saw as ample evidence of a need for them to reassert control of their industry, this period found IBM at another, more purely technical crossroads. The imminent move from 16-bit to 32-bit computing represented by the new 80386 would have to bring with it some elaborations on IBM’s tried-and-true architecture — elaborations that would undoubtedly define the face of mainstream business computing into the 1990s. IBM saw in those elaborations a way to remedy the ongoing problem of the clonesters as well. Unknown to everyone outside the company, they were about to initiate the so-called “bus wars,” a premeditated strike aimed directly at what they saw as parasites like Compaq.

The bus in this context referred not to a mode of public transportation but rather to the system of expansion slots that allowed the innermost core of an IBM-compatible computer — little more than the processor and memory — to communicate with just about everything else that made up a full-fledged PC: floppy and hard disk drives, monitors, modems, printers, ad infinitum, from the most generalized components found in just about every office to the most specialized for the most esoteric of tasks. The original IBM PC, built around a hybrid 8-bit and 16-bit chip called the Intel 8088, had used an 8-bit bus, meaning the electronic “channel” it used to talk to all these myriad devices was just 8 bits wide. In 1984, IBM had released the PC/AT, built around the newer fully 16-bit Intel 80286, and in that machine had expanded the original bus to support 16-bit devices while remaining backward compatible with the older 8-bit standard. The result retroactively came to be known as the Industry Standard Architecture, or ISA.

Now, with the 32-bit 80386 a reality, it was time to think about revisiting the bus, to make it support 32-bit communications. To fail to do so would be to cripple the 386, forcing it to act like a 16-bit chip every time it wanted to communicate with a peripheral; impressive as they were in many ways, the Compaq Deskpro 386 and other early 386 clones saw their performance limited by exactly this problem. Most people expected IBM to do for the 386 what they had previously done for the 286, delivering a new bus which would support 32-bit peripherals but remain compatible with older 16-bit and even 8-bit devices. Instead they delivered something they called the Micro Channel Architecture, or MCA, a complete break with the past which abandoned compatibility with existing 8-bit and 16-bit expansion cards entirely.

So much controversy over something barely noticeable. The four Micro Channel slots sit at the left rear of this PS/2 Model 50. Many of the components that would have been housed in expansion cards in earlier IBM systems, such as the video card and hard-drive controller, were moved onto the motherboard with the PS/2 line.

MCA debuted as a key component in a new line of personal computers in April of 1987, the most ambitious such line IBM had ever introduced or ever would. The Personal System/2 lineup — better known as the PS/2 — was envisioned as exactly the next generation in personal computing that an ebullient Rod Canion had perhaps overenthusiastically declared the Compaq Deskpro 386 to represent barely six months before. IBM was determined to once again remake the computer industry in their image — and to get it right this time, avoiding the perceived mistakes that had led to the rise of the clonesters. The PS/2 lineup did encompass lower-end machines using the old 16-bit PC/AT bus, but the real point of the effort lay with the higher-end models, IBM’s first to use the 80386 and their first to use the new MCA bus architecture to take advantage of the full 32 bits of throughput offered by that chip. IBM offered various technical justifications for the failure of MCA to support their older bus standards, but they always rang false. As the more astute industry observers quickly realized, MCA had more to do with business and marketing than it did with technology in the abstract.

IBM was attempting a delicate trick with MCA. They wanted to be able to continue to reap the enormous benefits of the business-computing standard they had birthed, with its huge constellation of compatible software that by now, even more so than IBM’s reputation, made an MS-DOS machine the only kind to be seriously considered by the vast majority of corporate purchasing departments. At the same time, though, they wanted to cut off the oxygen to the clonesters who were also benefiting so conspicuously from that same universal acceptance, and to reassert their role as the ultimate authorities on the direction business computing would take in the future. They believed they could accomplish all of that, in the long term at least, by threading the needle of compatibility — keeping the 386-based PS/2 lineup software-compatible with the older machines while deliberately breaking the hardware compatibility so relied on by the clonesters. In doing so, they would take the hardware to a place the clonesters couldn’t follow, thus securing for themselves all those billions the clonesters had heretofore been stealing out of their pockets.

Unlike the original IBM bus architecture, MCA was locked up inside an ironclad cage of patents, making it legally uncloneable unless one could somehow negotiate a license to do so through IBM. The patents even extended to add-on cards and other peripherals that might be compatible with MCA, meaning that absolutely anyone who wanted to make a hardware add-on for an MCA machine would have to negotiate a license and pay for the privilege. The result should be not only a lucrative new revenue stream but also complete control of business computing’s further evolution. Yes, the clonesters would be able to survive for a few more years making machines using the older 16-bit bus architecture. In the longer term, however, as personal computing inevitably transitioned into a realm of 32 bits, they would survive purely at IBM’s whim, their fate predicated on IBM’s willingness to grant them a patent license for MCA and their own willingness to pay dearly for it.

The clonesters rightly and immediately saw MCA as nothing less than an existential threat, and were thrown into a tizzy trying to figure out how to respond to it. It was the ever-quotable Rod Canion who came up with the best line of attack, drawing an analogy between MCA and the recent soft-drink marketing disaster of New Coke. (What with Pepsi alumnus John Sculley in charge over at Apple, computers and soft drinks seemed to be running oddly in parallel during this era.) Clever, pithy, and blessedly non-technical, Canion’s comparison spread like wildfire through the business press, regurgitated ad nauseam by journalists who often had little to no idea what this MCA thing that it referenced actually was. IBM never quite managed to formulate a response that didn’t sound nefariously evasive.

With the “New Coke” meme setting the tone, just about everything about the PS/2 line turned into an unexpected uphill struggle for IBM. While plenty of early reviewers dutifully toed the line, doubtless mindful that if no one ever got fired for buying IBM no one was likely to get fired for giving them a positive review either, a surprising number of the reviews were distinctly lukewarm. The complaints started and often ended with the prices. Even the low-end 16-bit PS/2 models started at a suggested list price of $2295 without monitor, while the high-end models topped out at almost $7000. Insider reports had it that IBM was enjoying profit margins of 40 percent or more, leading to rampant speculation on what the cost of entry into business-friendly personal computing might become if they really should manage to stamp out the clonesters.

The high-end models in particular struck many as a pointless waste of money given that IBM didn’t have an operating system ready to take advantage of their capabilities. The machines were all still saddled with MS-DOS, clunky and archaic and barely worthy of the name “operating system” even by the standards of 1987. In one of the more striking examples of hardware running away from software in computing history, the higher-end models shipped with 1 MB of memory, but couldn’t actually use more than 640 K of it thanks to MS-DOS’s built-in limitations. IBM promised a new, next-generation operating system called OS/2 to unlock the real potential of these next-generation machines. But OS/2, a project they had once again chosen to turn over to Microsoft, was still an unknown number of months away, with the so-called “Presentation Manager” that would add to it a Macintosh-style GUI due yet further months after that.[1] And, as a final little bit of buyer discouragement, IBM planned to charge the people who had already spent many thousands on their PS/2 hardware another $800 or so for the privilege of using the eventual OS/2 to take advantage of it.

The PS/2 launch prompted constant comparisons with the original IBM PC launch of five and a half years before, and constantly came up wanting. IBM’s publicity campaign was lavish — as it ought to have been, given those profit margins — but unfocused and uninspired. Its centerpiece was a series of commercials involving much of the cast from M*A*S*H, playing their old sitcom characters inexplicably transported from the Korean War to a modern office. With M*A*S*H still a beloved cultural touchstone only a few years removed from its record-shattering final episode, the spots had plenty of sheer star power, but lacked even a modicum of the charm or creativity that had characterized the award-winning “Charlie Chaplin” advertisements for the original IBM PC.

Likewise, it was hard not to compare the unexpected spirit of openness that had suffused the 1981 IBM PC with the domination and control IBM so plainly intended to assert with the 1987 PS/2 launch. Apple’s iconic old “Big Brother” Macintosh advertisement, a soaring triumph of rhetoric over substance back in its day, would have fit the PS/2 line much better than it had the state of business computing back in 1984. Many chose to blame the change in tone on the loss of Don Estridge, the leader of the small team that had built the original IBM PC. An unusually charismatic personality and independent thinker for the famously conservative and bureaucratic IBM — enough so that he had been courted by Steve Jobs to fill the CEO role John Sculley ended up taking at Apple — Estridge had been killed in a plane crash in 1985. Stewardship of IBM’s microcomputer division had since passed to William Lowe, a much more traditional rank-and-file, buttoned-down IBM man. Whether for this reason or some other, the shift in tone and direction from 1981 to 1987 was striking.

In the months following the PS/2 line’s release, the media narrative drifted from one of uncertain excitement to reports of the new machines’ disappointing reception in many quarters. IBM sold around 200,000 MCA-equipped PS/2s in the first six months, mostly to the biggest of big business; United Airlines alone, for example, bought 40,000 of them as part of a complete revamping of their reservations system. But far too many even within the Fortune 500 proved stubbornly, unexpectedly resistant to IBM’s unsubtle prodding to jump onto the PS/2 train. Many chose to invest in the clonesters’ cheaper 80386 offerings instead; the 16-bit bus used by those machines, while far from ideal from a purely technical standpoint, did at least have the advantage of compatibility with existing peripherals. Seventeen months after MCA’s debut, 66 percent of all business computers being sold each month were still using the old bus architecture, versus just 20 percent that used MCA. (The remainder was largely accounted for by the Macintosh.) Survey after survey reported IBM to be losing market share rather than gaining it since the arrival of the PS/2. By this point OS/2 and its “Presentation Manager” GUI were finally available, but, hampered by that high price tag, the new operating system’s uptake had also been limited at best.

And then, just when it seemed the news couldn’t get much worse for IBM, much of the industry went into unthinkable open revolt against their ongoing hegemony. On September 13, 1988, a group of the clonesters, driven as usual by Compaq and with the tacit support of Intel and Microsoft, announced the creation of a new 32-bit bus standard, to be called the Extended Industry Standard Architecture, or EISA. Unlike MCA, EISA would be compatible with older 16-bit and 8-bit peripherals. And it would manage to be so without performing notably worse than MCA, thus giving the lie to IBM’s claims that their decision to abandon bus compatibility had been motivated by technical rather than business concerns. The press promptly dubbed the budding consortium, which included virtually every manufacturer of IBM-compatible computers not named IBM, the “Gang of Nine” after the allegedly traitorous Gang of Four of the Chinese Cultural Revolution. Machines using the new EISA bus entered production within a year.

This shot of an EISA card illustrates the unique two-layer connection devised by the Gang of Nine, which allowed the same sockets to accept both older ISA cards and newer EISA cards without requiring ridiculously long, unwieldy cards and sockets. The shorter pins correspond to the older 16-bit standard; the longer extend it to 32 bits.

In the end, EISA would prove of limited technical importance in the evolution of the Intel architecture. The new standard didn’t have much more luck than had MCA in establishing itself as the market’s default. Instead, by the time a 32-bit bus became a truly commonplace need among ordinary computer users, EISA and MCA alike had been replaced by a still newer standard, better than either of them, called the Peripheral Component Interconnect, or PCI. The bus wars of the late 1980s and very early 1990s can thus all too easily be seen as just another of the industry’s tempests in a teapot, an obscure squabble over technical esoterica of interest only to hardcore hackers.

Look a little harder at EISA, however, and we see a watershed moment in the history of the personal computer that dwarfs even the arrival of the Compaq Portable or the Deskpro 386. The Gang of Nine’s announcement brought with it a torrent of press coverage that for the first time openly questioned IBM’s continuing dominance of business-oriented computing. CNN’s Moneyline, the most-watched business report on cable television, dredged up Canion’s evergreen New Coke analogy yet again, going so far as to open its reports on the Gang of Nine’s announcement with a shot of soda bottles moving down a production line. IBM was “faced with overwhelming resistance to the flavor of ‘New Compute,'” declared the breathless report that followed; September 13, 1988, “was a day that left Big Blue looking black and blue.” An only slightly more sober Wall Street Journal article had it that the Gang of Nine “was joining forces in an audacious attempt to wrest away from IBM the power of setting the standard for how personal computers are designed, and they seem to have a chance of succeeding.” The article threw all its metaphors in a blender for the big conclusion: “For IBM, the Gang’s announcement yesterday is at best a dust storm of confusion, and, at worst, a dagger to the heart of its PC strategy.” When the Wall Street Journal threatens to turn against your big business, you know you have problems.

And, indeed, September 13, 1988, wound up representing everything the pundits and journalists said it might and more. Simply put, this was the instant that IBM finally and definitively lost control of the business-computing industry, the moment when the architecture they had created back in 1981 left the nest to go its own way. After this instant, no one would ever defer to IBM again. In January of 1989, Arlan Levitan, a columnist for the big consumer-computing magazine Compute! — like most such magazines, not particularly known for the boldness of its editorial stances — signaled the shifting conventional wisdom. His editors empowered him to launch a satirical broadside at IBM, the PS/2, MCA, and even all those who had bought into the hype, a group that very much included their own magazine.

During a Monday morning press breakfast hosted by IBM, over a thousand representatives of the computing press were shocked to hear newly hired Entry Systems Division president P.W. Herman declare that the firm’s PS/2 computer systems and its associated products were part of an elaborate psychological study undertaken at the behest of the National Institute of Mental Health. “I sure am glad the American people haven’t lost their sense of humor. It’s good to know that in these times everybody still appreciates a good joke.” According to Herman, the study was intended to quantify the limits of the operational parameters associated with Abraham Lincoln’s most famous aphorism. Said Herman, “I guess you really can’t fool all of the people all of the time. I’ll tell ya, though — the Micro Channel Architecture even had me going for a while.” All PS/2 owners will receive a letter signed by Herman and thanking them for their personal contribution toward furthering the present-day understanding of aberrant behavior. Corporate executives who committed their firms to IBM’s $800 OS/2 operating system will receive free remedial therapy in DOS reeducation centers. Those who took advantage of IBM’s trade-in policy, whereby users gave up their XTs or ATs for a PS/2, will receive their weight in PCjr computers. According to internal IBM sources, all costs associated with manufacturing and promoting PS/2s will cumulatively qualify as a tax-deductible research grant.

In terms of hardware if not software — Microsoft’s long, often damaging domination was just beginning in the latter realm — the industry was now a meritocracy, bound together only by a set of mutually if often only tacitly agreed-upon standards. That could only mean hard times for IBM, who were hardly used to competing on such a level playing field. In 1993, they posted a loss of a staggering $8 billion, the largest to that point in American business history, prompting a long, painful process of reinvention as a smaller, nimbler, dare I say it even humbler company. In 2004, in another watershed moment symbolic of many things, IBM stopped making PCs altogether, selling what was left of their personal-computer division to the Chinese computer manufacturer Lenovo in order to focus on consulting services.

The PS/2 story has rightfully gone down in business history as a classic tale of overweening arrogance that received its justified comeuppance. In attempting so aggressively to seize complete control of business computing — all of it — IBM pissed away the enviable dominance they already enjoyed. In attempting to build an empire that stood utterly alone and unchallenged, they burned the one they already had.

Yet there is another side to the PS/2 story that also deserves its due. Existing in those seemingly misbegotten machines alongside MCA and the cynicism it represented was a more positive, one might even say technically idealistic, determination to advance the state of the art of an architecture that had long since become the mainstream face of computing, one which dwarfed every other platform in terms of the sheer money it generated.

And make no mistake: the world of the IBM compatibles was in sore need of advancement on multiple fronts. While machines like the Apple Macintosh and Commodore Amiga had opened whole new paradigms of computing — the former with its friendly GUI and crisp, almost print-quality display, the latter with its multitasking operating system and implementation of the ideal of multimedia computing long before “multimedia” became a buzzword — the world of the clones had remained as bland as ever, a land of green or amber text-only displays, unpleasant beeps and squawks, and inscrutable command lines. For all the apparently proud users and sellers who took all this ugliness as a sign of serious businesslike intent, there were others who recognized that IBM and the clonesters had long since ceded the high ground of real, fundamental innovation in computing to rival platforms. Thankfully, some inside IBM belonged to the latter group, and the results could be seen in the PS/2 machines.

Given how far the IBM-compatible world had fallen behind, it’s not surprising that many or most of the alleged innovations of the PS/2 were really a case of playing catch-up. For example, IBM finally produced their first-ever mouse for the line. They also switched over from the old, fragile 5.25-inch floppy-disk format to the newer, more robust and higher-capacity 3.5-inch format already being used by machines like the Macintosh and Amiga.

But undoubtedly the most welcome and significant of all the PS/2’s new technical developments were some desperately needed display improvements. The Video Graphics Array, or VGA, was included with the higher-end PS/2 models; lower-end models shipped with something called the Multi-Color Graphics Array (MCGA), with many but not quite all of the capabilities of VGA. After allowing their machines’ graphics capabilities to languish for years, IBM through VGA and to some extent MCGA finally brought them up to a level that compared very favorably with the Amiga. VGA and MCGA defined a palette of fully 262,144 colors, a huge leap over the 64 offered by the Enhanced Graphics Adapter (EGA), IBM’s previous best display option for their mainstream machines. The Amiga, by contrast, offered just 4096 colors, although its blitter and other custom hardware still gave it some notable advantages in the realm of fast animation.
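
Those palette figures are nothing more than powers of two, determined by how many bits each standard devoted to the red, green, and blue components of a color: two bits per channel for EGA, four for the Amiga, six for VGA and MCGA. A quick sketch of the arithmetic, again in Python and purely for illustration:

def palette_size(bits_per_channel):
    # An RGB palette offers 2^(3 * bits per channel) distinct colors.
    return 2 ** (3 * bits_per_channel)

print(palette_size(2))  # EGA: 64 colors
print(palette_size(4))  # Amiga: 4096 colors
print(palette_size(6))  # VGA and MCGA: 262,144 colors

How many of those colors could appear onscreen at once was another matter; VGA’s most colorful mode displayed 256 of them at a time, chosen from the full palette.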

All of these new developments marked IBM’s last great gifts to the standard they had birthed — gifts destined to long outlive the PS/2 line itself. The mouse connection IBM developed, for instance, remained a standard well beyond the millennium, the term “PS/2 connector” living on as common jargon among younger tech-heads and system builders who likely had only the vaguest idea from whence the usage derived. The VGA standard proved even longer-lived. It still survives today as the lowest-common-denominator baseline for computer displays, while ports matching the specification defined by IBM all those years ago remain on the back of many a monitor and television set.

Ironically given IBM’s laser focus on using the PS/2 line to secure their dominance of business computing, its technical innovations ultimately proved most important in making the architecture viable as a proposition for the home, paving the way for the Microsoft-dominated second home-computer revolution of the 1990s. With good graphics falling into place at last thanks to VGA and the raw power of the 32-bit 80386, only two barriers remained to making PC-compatible machines realistic rivals to the likes of the Amiga as compelling home computers: decent sound to replace those atrocious beeps and squawks, and a decent price.

The first problem wouldn’t be a problem at all for very much longer. The first gaming-focused sound cards began to reach the market within a year of the PS/2 line’s debut, and by 1989 Creative Music Systems and Ad Lib both offered popular cards at street prices of $200 or less.

But the prices of home-oriented systems incorporating all of the PS/2 line’s innovations — MCA excepted — would, alas, take a little longer to fall. As late as July of 1989, when the VGA standard was already more than two years old, Computer Gaming World ran an article titled “Is VGA Worth It?” that seriously questioned whether it was indeed worth the still very considerable expense — VGA boards still cost $500 or more — to so equip a machine, especially given how few games supported VGA at that point. Nor did the 80386 find an immediate place in homes. As the 1980s turned into the 1990s, the newer chip was still a piece of pricey exotica in terms of the average consumer’s budget; the vast majority of the Intel-based PCs that were in consumers’ homes were still built around the 80286 or even the venerable old 8088.

Still, in the long run prices could only fall in such a hyper-competitive market. Given Commodore’s lackadaisical attitude toward improving the Amiga and Apple’s almost complete neglect of the consumer market in their eagerness to force the Macintosh into the offices of corporate America, the emerging standard of a 32-bit Intel-based PC with VGA graphics and a sound card came to the fore effectively unopposed. With the Internet having yet to emerge as home computing’s killer app to end all killer apps, it was games that drove this shift. In 1989, an Amiga was still the ultimate gaming computer. By 1991, it was an afterthought for American game publishers, the market being absolutely dominated by what was now starting to be called the “Wintel” standard. While game consoles and mobile devices have come and gone by the handful over the years since, in the realm of desktop- and laptop-based personal computing the heirs of the original IBM PC remain the overwhelming standard to this day. How ironic that this decades-long dominance was ensured by the PS/2, simultaneously the downfall of IBM and the savior of the inadvertently standard architecture IBM created.

(Sources: the books Big Blues: The Unmaking of IBM by Paul Carroll, Open: How Compaq Ended IBM’s PC Domination and Helped Invent Modern Computing by Rod Canion, and Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace and Jim Erickson; Byte of June 1987, July 1987, August 1987, and December 1987; Compute! of June 1988, January 1989, and March 1989; Computer Gaming World of July 1989 and September 1989; Wall Street Journal of September 14, 1988; the episodes of The Computer Chronicles titled “Intel 386 — The Fast Lane,” “IBM Personal System/2,” and “Bus Wars.”)

Footnotes

1 The full story of OS/2 and the Presentation Manager and their relationship to Microsoft Windows and even Apple’s MacOS is a complex yet fascinating one, but also one best reserved for a future article where I can give it its proper due.
 
 


A Conversation with Lane Barrow

Although I seem to find myself talking to more and more people in researching this history I’m in the middle of, I don’t often publish the results as straight-up interviews. In fact, I’ve published just one interview in the entire history of this blog, and a very short one at that, done during the early days when I was still finding my way to some extent. I have a number of reasons for avoiding interviews, starting with the fallibility of all human memory and ending with the fact that I consider myself a writer, not a transcriber.

Still, almost any policy ought to have its reasoned exceptions, and this anti-interview policy of mine is itself no exception to that rule. Having just introduced you in my last article to AGT and the era of more personal text adventures it ushered in, it seems appropriate today to let one AGT author tell his own very personal story. So, I’d like to introduce you to Lane Barrow, author of A Dudley Dilemma, the winner of the first of David Malmberg’s eventual six annual AGT competitions. Unique (and uniquely interesting) though it is in so many ways, I trust that some of the more generalized overtones of Lane’s story apply to many of the others who found through AGT a way to make the switch from being text-adventure consumers to text-adventure creators.

If what follows should tempt you to give A Dudley Dilemma a play — something I highly recommend! — do be sure to go with the “remastered” version Lane has provided, which cleans up the design here and there and works properly with modern interpreters like AGiliTy and Gargoyle. You can download this definitive version from this very site or from the Interactive Fiction Archive.


Lane Barrow, 1988

Thank you so much for talking with me today! Maybe we could start with a bit of your personal background. I believe I read somewhere that you spent some time in the Air Force?

Yes, I was in the Air Force from 1966 to 1970 – two tours in Vietnam. When I rejoined civilian life, I lived in California for all of the 70’s, which was a perfect time to be there (a decade of great music and horrible clothing). I even had a brief encounter with members of the Manson family. Interesting story, but probably not relevant to what you’re looking for.

Sorry, but I can’t just let that one fly by. Please, tell!

It’s not as sinister as it sounds. When I first moved to LA after the Air Force, I hung out with a nascent rock band. We liked to party a lot, and one of the places we frequented belonged to a guy named T.J. and his on-again, off-again girlfriend Jo. I got along really well with T.J. For one thing, he was also a Vietnam vet turned hippy, plus he was creative and outgoing. Turns out he was also an ex-Manson family member. In fact, he was with Manson when Charlie shot some North Hollywood drug dealer. This didn’t sit well with T.J. so he basically left the family soon after.

Anyway, we went over to T.J.’s one night (this was sometime in the summer of 1970) and there were these four girls sitting around the living room with shaved heads and “X”s cut into their foreheads. Apparently these girls were still faithful to Manson and kept a vigil outside the county courthouse while his trial was in session. They had come to see if T.J. could put them up for the night. After a few minutes, T.J. whisked us into the kitchen and suggested that it wasn’t a good idea to party that night, so we left. I still remember the cold stares those girls gave us the whole time we were there. And they never said a single word. So that was my Manson family experience. As I said, living in LA in those days was never boring.

Okay, thanks! So, how did you go from being a Southern California hippie to a Harvard PhD candidate?

I knocked around LA for several years, and then settled in Santa Barbara, where I worked as a baker at Sunrise Bakery (a small co-op enterprise). At the same time, I attended Santa Barbara City College and then UCSB on the GI Bill. I majored in English Lit, and did well enough to get accepted to graduate school at Harvard, also in English.

I was 33 years old when I entered Harvard, so I was a little older than most of my classmates, although there were several other Vietnam vets in the English Dept at the time. I was single then, but I met my future wife there (we’re still together by the way), and her long luxurious hair was the reason I included the sentient hairball in the first part of A Dudley Dilemma.

Bear in mind that my life as a grad student was pretty uneventful compared to Vietnam and California, but that was OK with me. Of course, uneventful isn’t the same thing as stress-free. Grad school can be pretty intense. I actually had more anxiety dreams about the classroom than I ever did about combat. Go figure. Working on the Dudley game was a real stress-reliever for me. It introduced me to programming, which I still enjoy, mostly in Excel these days.

Long before you started to write A Dudley Dilemma, I understand that you discovered text adventures at Harvard?

Yes. In the early ’80s I discovered a couple of fun games on the mainframe while I was learning how to work with computers. These were, of course, Colossal Cave and Zork. If I remember correctly, Zork had just been released commercially, but I didn’t get my first PC until Leading Edge came on the scene in 1985, so the mainframe was my only access. At first, I played both games pretty much equally but Zork slowly took over as my favorite, largely because of its sense of humor.

Why did you come to buy that first PC? Were you intending to use it to play more games like Zork from the beginning?

I’m afraid I had a fairly utilitarian motive for buying my first PC. I was beginning my dissertation at the time, and using the mainframe was a nightmare. If you’ve ever worked with printer “dot commands”, you understand. So I bought a Leading Edge Model D for purely academic work. The computer games were just icing on the cake.

Since I never finished Zork on the mainframe, that was the first game I purchased. I still have the receipt for Zork I tucked into the box ($29.95 purchased on March 31, 1986). Zork II and Zork III were next.

After that, I went on an Infocom binge. I think I bought every title they had at the time, and would wait expectantly for their new releases. I still have many of those boxed sets, complete with tchotchkes. Needless to say, this slowed down my progress on my dissertation…

Did you have any particular favorites among the Infocom catalog?

I liked them all. I gravitated toward the sci-fi / fantasy titles, but I got a big kick out of Bureaucracy also.

Did you play any games from other publishers — whether text adventures or games in other genres — or were you strictly an Infocom guy?

Infocom was pretty much my only focus at first, but eventually I tried other games. However, I don’t remember any specific titles, so obviously they didn’t have the same impact on me as the Infocom offerings. For me, the biggest attraction of the AGT toolkit was its ability to create an Infocom-type game. I had plans to write a second AGT game, but never got around to it. By that time, I was wrapping up grad school and engaged in job-hunting.

I continue to enjoy computer games, post-Infocom, and prefer adventure games, with an emphasis on puzzle-solving. I don’t care much for platform games, or timed puzzles. As you know, that somewhat limits my choices these days, although the Portal games are fun.

How exactly did you become an early AGT adopter? Do you recall how you first learned about the system?

I don’t remember how I learned about AGT, but I was pretty active in various bulletin board chat rooms in those days, so it was probably via one of those. At any rate, I decided to try my hand at creating an Infocom-type game for Dudley House, where I was a resident tutor. I wanted to cram in as many recognizable people, events, places as possible, since the game was going to be on the computer in Dudley House Library. So, I ordered the AGT toolkit, and got to it. I found the language pretty easy to pick up, since it’s very logical. Plus, whenever I had a problem or question, I would email Dave Malmberg, and he would get back to me quickly. I believe I even spoke with him on the phone once or twice, but I might be mis-remembering that (growing old has its advantages, but memory isn’t one of them).

It took me several months to finish the original Dudley Dilemma, and when I put it on the library computer, it caused a bit of a conflict between students who wanted to play the game, and students who wanted to use the on-line card catalog. We even had a competition to see who could finish the game the fastest. I don’t recall the winner’s name, but she was a Junior English major.

I had a ball writing the game, and tried to capture the quirky feel that Infocom was so good at. I ripped off their ideas shamelessly. As you probably noticed, the WHISTLE-CLAP hedge maze sequence is straight out of Leather Goddesses of Phobos (Clap-Hop-Kweepa).

To what extent did you feel yourself to be a part of an AGT community?

If there was an AGT community in those days, I wasn’t aware of it. I did play a couple of other AGT games from time to time (I remember one that had a carnival setting). If I recall, they were in the overall package that came with the toolkit, or maybe they came later, when Dave mailed out a compilation of AGT contest winners. I don’t remember the chronology all that distinctly.

So, we might even say that you felt yourself to be developing your game largely in a vacuum?

Yes. I really developed Dudley by the seat of my pants, through trial and error. There were times when I was trying to work out a tricky bit of coding that I found myself dreaming about flags and variables. As I mentioned earlier, I wanted to incorporate a lot of actual detail that Dudley students would recognize, so I would jot down notes on a particular incident or individual and then figure out how to code that into the game. Of course I added an exaggerated quality to everything to give it a more whimsical feel, but the vast majority of A Dudley Dilemma is based on reality.

Going back over those days has helped me remember how much fun I had creating the game in the first place. Or maybe nostalgia is a selective process that filters out the “bad.” I’m sure there were probably times when I wondered why I had gotten myself into this project, but obviously I stuck with it.

In general, Dudley is a quite fair game for its day, with few instances of guess-the-verb or read-the-author’s-mind puzzles. There are adventure games that seem designed to frustrate and defeat the player and those that prioritize fun, fair play, and solubility. A Dudley Dilemma is, within the limitations of its era and its technology, very much in the latter category for me. Do you have any comments to make on your general design approach or methodology?

I’m not sure I had a coherent design methodology beyond what I’ve already mentioned: making it accessible to the students of Dudley House. Pretty much all the people and places in the game have their counterparts in the Harvard of the day, and these would have been evident to my core audience. Of course, this dates the game in that respect, but I also tried to make the situations broad enough to have some shelf life, and to be enjoyable even if you didn’t get the “in jokes.” Beyond that, there was a certain random quality to my choices. One thing seemed to flow out of another, maybe just by association of ideas.

You refer to adventure games that frustrate or defeat the player. In the years since I wrote Dudley, I’ve encountered a few of those, and I felt like a bit of the enjoyment was leached out. For example, some of the puzzles in Schizm or The Witness (the recent Jonathan Blow game, not the Infocom title) would challenge Einstein. Infocom games never took that road, which is one of the reasons I like them to this day. They are infused with a focus on fun and entertainment, and that’s what I tried to do in Dudley. However, there IS one overall design element that I’d change if I were re-writing the game today: I would make it impossible to render the game un-winnable.

A few puzzles that might raise some eyebrows today are those relying on outside knowledge. I’m thinking particularly here of the Arabian Nights, Waste Land, and Kingston Trio puzzles. These sorts of “outside research” puzzles were not commonly found in Infocom games (other than puzzles that required information included in the feelies, of course). Any comments on these?

I think I must have been a little ambivalent about those even when I included them. In one of the Dudley re-writes, I added a couple of books in the opening room that, if read, gave the solutions to the Arabian Nights puzzle and to the Waste Land puzzle. I also gave a more detailed hint about the Kingston Trio puzzle, but I don’t recall where that is in the game. Maybe when you first encounter the Kingston Trio album in the giant cockroach maze.

Just a side note: Obviously, the MBTA references have a Boston connection, and since Dudley House was the administrative center for commuting students, a lot of them rode the “T” on a daily basis, so that’s why I added that component. As for the Waste Land bit, this is more obscure. The game opens in Apley Court, which is where T.S. Eliot lived when he was a graduate student at Harvard. Some scholars believe that he began early drafts of The Waste Land at that time, so I couldn’t resist slipping that in.

What audience did you envision playing the game? You said that it was often played on a computer in a library at Harvard. Were you therefore writing primarily for fellow Harvard students? In short, what did you envision doing with the game, as far as distribution, after it was completed, given that you didn’t really feel yourself to be a member of any broader AGT community?

My main audience for the game was always the students of Dudley House, which helped me keep a certain focus to the action. I wanted them to undergo the “shock of recognition” while playing. I didn’t really envision a wider audience, and entering the AGT contest was an afterthought. I was thrilled to win it, which inspired me to “improve” the game over several versions, with pictures, sounds, etc. In retrospect, the original plain vanilla version is still my favorite.

I believe I even thought about applying for a job at Infocom, which was just down the road in Cambridge. That fantasy lasted for about 5 minutes. My only excursions into game design since Dudley are creating some Community Test Chambers in Portal 2. Also fun, but a whole different experience than AGT.

I thought it might be fun — for me and hopefully for you as well as for our readers (especially those who have begun to play the game) — if we could really dig into some of those aspects of daily life at Harvard that inspired so much of Dudley. This is the sort of thing that can make interactive fiction so uniquely personal in contrast to other sorts of games, and that can make amateur efforts like many of the AGT games more interesting in some ways than the slicker, more impersonal games of Infocom. So, I thought we could perhaps play a little game of free association. I’m going to try to jog your memory with various elements of Dudley, and maybe you could respond with their real-life antecedents (if any). Perhaps together we can create a sort of Annotated Dudley Dilemma to go with the Annotated Lurking Horror — the latter was an unusually personal game by Infocom standards — that Janice Eisen and I created earlier. Indeed, it feels particularly appropriate given that The Lurking Horror took place at (a thinly fictionalized) MIT, while A Dudley Dilemma plays out at MIT’s cross-town counterpart Harvard. So…

The scruffy pigeon?

Every adventurer needs a sidekick, right? Of course if I were entirely faithful to that idea, I would have kept the bird nearby for the entire game. Actually, in a later rewrite, I had the pigeon come to the rescue when you face the punk in the mean streets of Cambridge.

The genesis of this character involves an incident in the English Department around Christmas of 1987. One of the senior professors, Barbara Lewalski, was in her office with an advisee, when a soot-covered bird fell into the (unlit) fireplace and started fluttering around the room. Professor Lewalski opened a window and tried to shoo it out to no avail. After a few minutes, the bird fluttered back up the chimney. To make sure the bird was gone, the professor (who was an ample woman) got down on hands and knees to look up the chimney. Right then another senior professor, William Alfred, walked by the office door and did a double-take. According to the advisee, he leaned into the office and said “I don’t believe Santa is due for another week”, and strolled off chuckling. Trust me, Mr. Alfred was one of the only people I ever met who actually chuckled. Obviously this story made the rounds pretty quickly. The original bird wasn’t a pigeon, but since pigeons flock all over Harvard Square and Yard, I had to go with what works. All the rooms in Apley Court have fireplaces, which I had already planned to use for roof access. I wanted the player to see early on that the fireplace was also an exit point, so I hoped that the pigeon would help establish that. Once the bird was in the room, I couldn’t resist expanding its role a bit.

The silverfish?

In order to get from the opening site (Apley Court) to the next location (Lehman Hall), you enter the silverfish maze. The maze is actually based on a system of steam tunnels that connect a number of Harvard buildings. Historical note: back in 1968, Harvard security used the steam tunnels to whisk Alabama Governor George Wallace out of Sanders Theater past a large crowd of protesters. That incident was still pretty infamous when I wrote Dudley, so I had to use the steam tunnels somehow. The silverfish guardian evolved out of the large number of those disgusting insects that swarmed around the basement storage area of Apley Court. I just converted the thousands of little ones into one huge one.

The nude tutors on the roof?

Apley Court was originally a residence hall for students (remember T.S. Eliot), but by the time I was there, it only housed the resident tutors for Dudley House. It had a flat roof that was perfect for sunbathing, so we would occasionally sneak up there for that purpose. I say sneak, because technically the roof was off-limits for safety’s sake. To my knowledge, no nude sunbathing ever took place up there, since the building across the street was much taller and afforded an unobstructed view, but I took some poetic license just for comic effect.

The statue in the dining hall?

Ah, Delmar Leighton. He was the first Master of Dudley House and around the time I was writing the game, a large wooden statue of the man was placed in one corner of the dining hall, where it gazed out on the students. I don’t know if the statue was moved from some other location or whether it was commissioned at that time, but it was quite a presence when you were trying to eat. Here’s a picture so you can see what I mean. I concocted the “touch and be touched by all” quote as a gameplay hint, since there’s no such thing on the original.

Delmar Leighton

Mike the guard?

Mike was a real security guard, and I’m really pissed at myself for forgetting his last name. It was something like Moretti or Frascetti. Sigh. Anyway, the real Mike was, if anything, even more diligent and proprietary about his building than my depiction of him. He was the mother hen of Lehman Hall, and I don’t mean that in a negative way. He was chatty and helpful and ever-vigilant. When I was designing the Lehman Hall section, it would have been sacrilege to omit Mike. It took me a while to figure out how to code in Mike’s eventual acceptance of you as a legit student, but using two different ID cards did the trick.

The crazy woman in Harvard Yard?

We called her “The Flapper.” She was rail-thin, about 60 years old or so, and dressed all in black head-to-toe (even in the summer). She mostly wandered around Harvard Square and just inside the gate beside Lehman Hall. She usually had a bag full of scavenged cans and other cast-off stuff, and she was always armed with a little square of folded newspaper that she would “flap” at you if you came too close. I don’t recall if she actually cursed at anyone, so obviously I took some liberties with that. This sequence was my first attempt at creating a random response to player interaction, so I had fun coming up with various curses. As for getting rid of her, I was concerned that the solution might be a bit obscure (mild spoiler ahead), but then I reasoned that most of us ignore strange street people anyway, so that part of the game really wrote itself.

Brother Blue in Harvard Yard?

Another real person. He was actually Dr. Hugh Morgan Hill, but his street moniker was Brother Blue, and he was a Boston institution (you can look him up in Wikipedia if you want more detail on his amazing life). When I was there, he would cruise around Harvard Square on roller skates and gather a crowd together so he could tell stories. He referred to himself as a “griot,” a kind of African poet and storyteller. His stories always had an inspirational point to them, but I didn’t think I could do justice to that aspect of his persona, so I made up my own little snippets. I wanted to create the impression of a complete story just by giving the ending. This is another random interaction, so the stories vary depending on the probabilities. I think there are maybe three or four different endings.

The hordes of lawyers?

Not much to say about this. I was looking for a way to “trap” the player with no obvious way out, so I could have done that in any number of ways. Since personal-injury lawyers are always a convenient target, I went for the obvious over-the-top joke. Harvard Law School is just down the walkway from the Science Center, so the internal geography worked out as well.

The professor explaining Hellenic warrior culture to a “class of large young men with no necks?”

Every university, even Harvard (gasp!) has its Easy A or “gut” classes. The class I’m referring to here was officially called Literature & Arts C-14: “The Concept of the Hero in Greek Civilization,” but was universally referred to as “Heroes for Zeros” because of the above-average concentration of jocks. It was taught by Professor Gregory Nagy, who is actually a world-renowned classical scholar. I think it must have come as a shock to many of the students that the class wasn’t as easy as reputation had it. But again, I was going for humor, and I needed a way to introduce a “zero” for later use in the game.

The GreenHouse Grill?

In reality, the Greenhouse Cafe in the Science Center. The Science Center is a massive building with computer rooms (in the 80’s anyway), offices, and classrooms, so having an in-house cafe was a real luxury. It gets the name from a glassed-in atrium section, and it’s a real resting-place, hang-out, meeting spot for students. I don’t recall that it plays a significant role in the game, so I probably included it just for local color and because I used to frequent it myself.

The aging, irate alumnus in the food line?

Well, I think I was channeling my future self when I came up with this guy. Scary! Anyway, the cafe in Dudley House was a tiny little area that served a lot of people every day. It was open to the public, so the students were only a part of the customer base. On any given day, the line at the cash register was clogged at lunch time and tempers would occasionally get frayed. The aging alum was based on a Dudley student’s parents who were visiting him. Things weren’t moving efficiently enough for the father, and he kept muttering about how much better it was when he was a student there. I was behind him in line, so I had to listen to him for many long minutes. That memory stuck with me, so I used it in the game. Trust me, I made the fictional alum a lot more pleasant than the real thing. Helen the cashier is also a real person, and dealt very patiently with the daily chaos.

Paul and Cynthia Hanson?

They were the Co-Masters of Dudley House. Maybe a little explanation is needed here. After their freshman year, the vast majority of Harvard students move into a residential “House” that creates a smaller space within the larger university. These houses have distinct characters, and students tend to form long-lasting loyalties to them. At the time of the game, Dudley House was the center for non-residential or commuter students. Like the residential houses, Dudley had a tutorial staff, dining facilities, lounges, a game room, a library, etc. The houses are overseen by Harvard faculty, often a married couple, called Masters who act “in loco parentis” for the students. House Masters are kind of omnipresent, so I coded them in a way similar to Mike. In other words, they pop up all the time until you figure out how to get rid of them. Talking to them provides a major hint which should be evident after you discover the conundrum dispenser. This machine is obviously based on a different kind of dispenser commonly found in men’s bathrooms of the day. Couldn’t resist the pun!

The Center for High-Energy Metaphysics and their potluck dinner?

Okay, I know I said that Dudley was a non-residential house, but there were a couple of exceptions. About a half-mile or so off campus, near Porter Square, were two old Cambridge Victorians that housed about 15-20 Dudley students between them. These were begun back in the 60’s as commune-type alternatives for students who weren’t attracted to the typical Harvard House experience. One of these houses had a sign at the entrance proclaiming that you were about to enter “The Center for High-Energy Metaphysics,” an obvious pun on experimental physics labs. As a Dudley tutor, I would visit from time to time for potluck dinners, which were largely vegetarian. Seems that the character of those houses hadn’t changed much from the 60’s. Of course, I added the “militant vegetarian” quality just for laughs.

An interesting bit of film trivia here: the Joe Pesci character in the 1994 film With Honors was based on a homeless man who crashed off and on for years at the High-Energy Center. One of the students who lived there at the time wrote the basis of the screenplay. But of course by the time it made it to theaters, the true story was completely unrecognizable.

The party animal?

This character was based on one of my fellow tutors, a mathematician named Yang Wang. Actually, there’s almost no resemblance between them except for the nickname. We used to call Yang a party animal because he so clearly wasn’t. But the location is correct, Yang’s apartment in Peabody Terrace near the Charles River.

The History of Boston Harbor by George Bush?

In the 1988 presidential election between George Bush Sr. and Michael Dukakis, the Bush team hammered Dukakis on how Boston Harbor had turned into a toxic sewage dump under his watch. Since another part of the game involves how polluted the Charles River had become, I threw this in both as a contemporary reference and as an echo of another part of the game. Bostonians used to revel in the bad reputation of the Charles. Maybe you remember the Standells’ song “Love That Dirty Water.” It was a staple between innings at Fenway Park.

The two secretaries, Mrs. J and Mrs. Handy?

These were two of the sweetest people on earth – Louise Janowicz and Margaret Handy. They ran Dudley House on a day-to-day basis and were truly loved by generations of students. Various Masters came and went, but Mrs. J and Mrs. Handy kept the place from falling apart. They were the institutional memory and the beating heart of Dudley. There’s no way I could have written the game without including them. The bit of business involving the key to the bathroom is fact-based. Since Dudley House (Lehman Hall) abutted Harvard Square, there were occasions when our men’s room attracted a less than savory element. So in order to gain access, you had to get the key from a hook beside Mrs. J’s desk. And woe is you if you forgot to return it! As I once did.

The queer old dean?

That’s a reference to William Archibald Spooner, Dean of New College, Oxford, and famous for his unintentionally humorous mangling of the English language. As you probably know, the term “spoonerism” refers to him, and “queer old dean” was apparently a reference he once made about “dear old Queen” Victoria. I’ve been a closet fan of puns and spoonerisms my whole life, so I had to figure out a way to include him in Dudley. It seemed to me that having his little problem extend beyond the verbal and into the “real” world would be a great way to play around with morphing some of the objects in the game. I confess that I was influenced by Infocom again here (Nord and Bert is full of spoonerisms).

John Marquand?

John Marquand was Senior Tutor at Dudley House during my time there. He was an institution at Dudley and really was a kind of Father Confessor to the undergrads. He was also a bottomless reservoir of knowledge about food and wine, so if you needed advice on a great restaurant, he was your guy. In the game, I actually have him give you a tip about Bartley’s Burgers (another Harvard institution). He is NOT John P. Marquand, the creator of the Mr. Moto detective novels, but they were related. I originally planned to work the Mr. Moto connection in somehow, but that one slipped through the cracks.

Thanks for all that! It really deepens and enriches the game’s “time capsule” quality all these years later.

It was mentioned at the time A Dudley Dilemma won the competition that you planned to make another game, this one to be based on Charles Dickens, the subject of your dissertation. Whatever became of that idea?

It never really made it out of the concept stage, but my hope was to mingle characters from various novels together in a sort of “through the looking glass” romp. It seemed to me that having, for example, David Copperfield knock some sense into Pip would be satisfying. Or having Scrooge hire Uriah Heep instead of Bob Cratchit would act as a form of karmic justice. I made some notes at the time, but I have no idea where they are today.

Interesting. I’ve often toyed with an idea similar to this one. There’s a long tradition of time-travel text adventures that have you visiting different time periods, using things collected in one time in another to solve puzzles, etc. I’ve often thought to do something similar, but to have you visiting worlds out of literature — an idea partly inspired by Jasper Fforde’s Thursday Next books. Like you, though, I’ve never gotten around to it. The blog sucks up too much time and energy, I’m afraid.

I haven’t read the Fforde books, but I’ll check them out. By the way, if you’re not already familiar with them, you might look for a couple of stories from the 40’s by L. Sprague de Camp and Fletcher Pratt, called The Incomplete Enchanter. These have been in and out of print for years, so I expect they’re available somewhere. The protagonist, Harold Shea, is able to enter parallel worlds based on literary works: Norse Edda in one story and Spenser’s Faerie Queene in another. Side note here: when I was studying for my PhD orals, I had to read The Faerie Queene, and I kept looking around the corners of that text for Harold. Sadly, he was nowhere to be found.

Ah, The Faerie Queene… “A gentle knight was pricking on the plain…”

I have a beautiful old Victorian edition that I love to take out and look at. I must confess that I’ve never gotten through the whole thing, though. There’s only so much allegory one man can take I reckon.

I didn’t mind Spenser, but Pilgrim’s Progress did me in. What is it Mrs. Malaprop says – “As headstrong as an allegory on the banks of the Nile.”

Before we wrap up, maybe you could tell just briefly where life took you after the days of Harvard and A Dudley Dilemma.

After I completed Dudley, I dove back into teaching and working on my dissertation, which I never did complete (can’t blame Dudley for this, however). A year or so later, I moved to Connecticut and took a job in the UConn School of Business. My wife was in the English Department at UConn, so this actually allowed us to live under the same roof. In the world of academic marriages, having jobs at the same institution is pretty rare, so we jumped at the chance. I also reasoned that having one English professor in the family was enough, so the transition to business was fairly smooth. Besides, I used to sneak across the Charles to the cafe at the Harvard B-School (the food was really good there), so I must have had a premonition.

My work at the UConn B-School involved corporate consulting and teaching business writing to undergrads and MBA students. Just so we’re clear on this, I taught my students how to write clean English prose, without business jargon. Eventually, I served as MBA Director for 10 years. And yes, there was a certain Dickensian quality to the business school. I’ll leave the interpretation of that remark up to you! I retired from my full-time job in 2012, but I currently work part-time with a UConn program called the EBV (Entrepreneurial Bootcamp for Disabled Veterans). We hold workshops for vets who want to start their own businesses. My contribution is helping them create a business plan.

Thank you! And congratulations on making it to retirement after such an interesting and varied working life. I hope that this article and the “remastered” version of A Dudley Dilemma which we released last week will lead more people to play this very clever game and inadvertent time capsule of life at Harvard in the late 1980s.

Thanks, Jimmy. For my part, this entire exchange has been a real pleasure and has allowed me to relive an enjoyable past experience. Thanks again for putting the final version of the game out there. I thought about doing that myself over the years, but didn’t think there’d be an audience for it.

I continue to read and enjoy your blog, and I’ll probably go back and do it in chronological order to see how it develops over time. I’m sure you’ll be expanding it for many years to come. I hope we can keep in touch, and if I ever decide to follow up with the Dickens game (unlikely), I’ll let you know.

I hope so too! Take care!

Lane Barrow, 2016. He’s a man who likes to sleep with his hat on, which I suppose is better than dying with his boots on.

 
 
