
A Slow-Motion Revolution

CD-ROM

A quick note on terminology before we get started: “CD-ROM” can be used to refer either to the use of CDs as a data-storage format for computers in general or to the Microsoft-sponsored specification for same. I’ll be using the term largely in the former sense in the introduction to this article, in the latter after something called “CD-I” enters the picture. I hope the point of transition won’t be too hard to identify, but my apologies if this leads to any confusion. Sometimes this language of ours is a very inexact thing.



In the first week of March 1986, much of the computer industry converged on Seattle for the first annual Microsoft CD-ROM Conference. Microsoft had anticipated about 500 to 600 attendees to the four-day event. Instead more than 1000 showed up, forcing the organizers to reject many of them at the door of a conference center that by law could only accommodate 800 people. Between the presentations on CD-ROM’s bright future, the attendees wandered through an exhibit hall showcasing the format’s capabilities. The hit of the hall was what was about to become the first CD-ROM product ever to be made available for sale to the public, consisting of the text of all 21 volumes of Grolier’s Academic American Encyclopedia, some 200 MB in all, on a single disc. It was to be published by KnowledgeSet, a spinoff of Digital Research. Digital’s founder Gary Kildall, apparently forgiving Bill Gates his earlier trespasses in snookering a vital IBM contract out from under his nose, gave the conference’s keynote address.

Kildall’s willingness to forgive and forget in light of the bright optical-storage future that stood before the computer industry seemed very much in harmony with the mood of the conference as a whole. Sentiments often verged on the utopian, with talk of a new “paperless society” abounding, a revolution to rival that of Gutenberg. “The compact disc represents a major discontinuity in the cost of producing and distributing information,” said one Ed Schmid of DEC. “You have to go back to the invention of movable type and the printing press to find something equivalent.” The enthusiasm was so intense and the good vibes among the participants — many of them, like Gates and Kildall, normally the bitterest of enemies — so marked that some came to call the conference “the computer industry’s Woodstock.” If the attendees couldn’t quite smell peace and love in the air, they certainly could smell potential and profit.

All the excitement came down to a single almost unbelievable number: the 650 MB of storage offered by every tiny, inexpensive-to-manufacture compact disc. It’s very, very difficult to fully convey in our current world of gigabytes and terabytes just how inconceivably huge a figure 650 MB actually was in 1986, a time when a 40 MB hard drive was a cavernous, how-can-I-ever-possibly-fill-this-thing luxury found on only the most high-end computers. For developers who had been used to making their projects fit onto floppy disks boasting less than 1 MB of space, the idea of CD-ROM sounded like winning the lottery several times over. You could put an entire 21-volume encyclopedia on one of the things, for Pete’s sake, and still have more than two-thirds of the space left over! Suddenly one of the most nail-biting constraints against which they had always labored would be… well, not so much eased as simply erased. After all, how could anything possibly fill 650 MB?

And just in case that wasn’t enough great news, there was also the fact that the CD was a read-only format. If the industry as a whole moved to CD-ROM as its format of choice, the whole piracy problem, which organizations like the Software Publishers Association ardently believed was costing it billions every year, would dry up and blow away like a dandelion in the fall. Small wonder that the mood at the conference sometimes approached evangelistic fervor. Microsoft, as swept away with it all as anyone, published a collection of the papers that were presented there under the very non-businesslike, non-Microsoft-like title of CD-ROM: The New Papyrus. The format just seemed to demand a touch of rhapsodic poetry.

But the rhapsody wasn’t destined to last very long. The promised land of a software industry built around the effectively unlimited storage capacity of the compact disc would prove infuriatingly difficult to reach; the process of doing so would stretch over the better part of a decade, by the end of which time the promised land wouldn’t seem quite so promising anymore. Throughout that stretch, CD-ROM was always coming in a year or two, always the next big thing right there on the horizon that never quite arrived. This situation, so antithetical to the usual propulsive pace of computer technology, was brought about partly by limitations of the format itself which were all too easy to overlook amid the optimism of that first conference, and partly by a unique combination of external factors that sometimes almost seemed to conspire, perfect-storm-like, to keep CD-ROM out of the hands of consumers.



The compact disc was developed as a format for music by a partnership of the Dutch electronics giant Philips and the Japanese Sony during the late 1970s. Unlike the earlier analog laser-disc format for the storage of video, itself a joint project of Philips and the American media conglomerate MCA, the CD stored information digitally, as long strings of ones and zeros to be passed through digital-to-analog converters and thus turned into rich stereo sound. Philips and Sony published the final specifications for the music CD in 1980, opening up to others who wished to license the technology what would become known as the “Red Book” standard after the color of the binder in which it was described. The first consumer-oriented CD players began to appear in Japan in 1982, in the rest of the world the following year. Confined at first to the high-end audiophile market, by the time of that first Microsoft CD-ROM Conference in 1986 the CD was already well on its way to overtaking the record album and, eventually, the cassette tape to become the most common format for music consumption all over the world.

There were good reasons for the CD’s soaring popularity. Not only did CDs sound better than all but the most expensive audiophile turntables, with a complete absence of hiss or surface noise, but, given that nothing actually touched the surface of a disc when it was being played, they could effectively last forever, no matter how many times you listened to them; “Perfect sound forever!” ran the tagline of an early CD advertising campaign. Then there was the way you could find any song you liked on a CD just by tapping a few buttons, as opposed to trying to drop a stylus on a record at just the right point or rewind and fast-forward a cassette to just the right spot. And then there was the way that CDs could be carried around and stored so much more easily than a record album, plus the way they could hold up to 75 minutes’ worth of music, enough to pack many double vinyl albums onto a single CD. Throw in the lack of a need to change sides to listen to a full album, and seldom has a new media format appeared that is so clearly better than the existing formats in almost all respects.

It didn’t take long for the computer industry to come to see the CD format, envisioned originally strictly as a music medium, as a natural one to extend to other types of data storage. Where the rubber met the road — or the laser met the platter — a CD player was just a mechanism for reading bits off the surface of the disc and sending them on to some other circuitry that knew what to do with them. This circuitry could just as easily be part of a computer as a stereo system.

Such a sanguine view was perhaps a bit overly reductionist. When one started really delving into the practicalities of the CD as a format for data storage, one found a number of limitations, almost all of them drawn directly from the technology’s original purpose as a music-delivery solution. For one thing, CD drives were only capable of reading data off a disc at a rate of 153.6 K per second, this figure corresponding not coincidentally to the speed required to stream standard CD sound for real-time playback. (The data on a music CD is actually read at a speed of approximately 172.3 K per second; the first CD-ROM drives had a slightly slower effective reading speed due to the need for additional error-correcting checksums in the raw data.) Such a throughput was considered pretty good but hardly breathtaking by mid-1980s hard-disk standards; an average 10 MB hard drive of the period might have a transfer rate of about 96 K per second, although high-performance drives could triple or even quadruple that figure.
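Those figures follow directly from the disc’s geometry. A quick back-of-the-envelope sketch, assuming the standard audio-CD sample rate of 44.1 kHz in 16-bit stereo and a data sector of 2352 bytes carrying a 2048-byte user payload, read at 75 sectors per second:

```python
# Red Book audio: 44,100 samples/s, 2 bytes per sample, 2 channels
raw_audio_rate = 44_100 * 2 * 2   # bytes/s streaming off a music CD
print(raw_audio_rate)             # 176400 (approximately 172.3 K in 1024-byte K)

# A data CD spins at the same rate but spends part of each 2352-byte
# sector on sync, headers, and extra error-correcting codes, leaving
# 2048 bytes of user data per sector at 75 sectors per second
user_data_rate = 75 * 2048
print(user_data_rate)             # 153600 bytes/s, the "153.6 K per second"
```

The gap between the two rates is precisely the overhead of the sync, header, and error-correction bytes mentioned above.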

More problematic was a CD drive’s atrocious seek speed — i.e., the speed at which files could be located for reading on a disc. An average 10 MB hard disk of 1986 had a typical seek time of about 100 milliseconds, a worst-case-scenario maximum of about 200 — although, again, high-performance models could improve on those figures by a factor of four. A CD drive, by contrast, had a typical seek time of 500 milliseconds, a maximum of 1000 milliseconds — one full second. The designers of the music CD hadn’t been particularly concerned by the issue, for a music-CD player would spend the vast majority of its time reading linear streams of sound data. On those occasions when the user did request a certain track found deeper on the disc, even a full second spent by the drive in seeking her favorite song would hardly be noticed unduly, especially in comparison to the pain of trying to find something on a cassette or a record album. For storage of computer data, however, the slow seek speed gave far more cause for concern.

The Laser Magnetic Storage LaserDrive is typical of the oddball formats that proliferated during the early years of optical data storage. It could hold 1 GB on each side of a double-sided disc. Unfortunately, each disc cost hundreds of dollars, the unit itself thousands.

Given these issues of performance, which promised only to get more marked in comparison to hard drives as the latter continued to get faster, one might well ask why the industry was so determined to adapt the music CD specifically to data storage rather than using Philips and Sony’s work as a springboard to another optical format with affordances more suitable to the role. In fact, any number of companies did choose the latter course, developing optical formats in various configurations and capacities, many even offering the ability to write to as well as read from the disc. (Such units were called “WORM” drives, for “Write Once Read Many”; data, in other words, could be written to their discs, but not erased or rewritten thereafter.) But, being manufactured in minuscule quantities as essentially bespoke items, all such efforts were doomed to be extremely expensive.

The CD, on the other hand, had the advantage of an existing infrastructure dedicated to stamping out the little silver discs and filling them with data. At the moment, that data consisted almost exclusively of encoded music, but the process of making the discs didn’t care a whit what the ones and zeros being burned into them actually represented. CD-ROM would allow the computer industry to piggy-back on an extant, mature technology that was already nearing ubiquity. That was a huge advantage when set against the cost of developing a new format from scratch and setting up a similar infrastructure to turn it out in bulk — not to mention the challenge of getting the chaotic, hyper-competitive computer industry to agree on another format in the first place. For all these reasons, there was surprisingly little debate on whether adapting the music CD to the purpose of data storage was really the best way to go. For better or for worse, the industry hitched its wagon to the CD; its infelicities as a general-purpose data-storage solution would just have to be worked around.

One of the first problems to be confronted was the issue of a logical file format for CD-ROM. The physical layout of the bits on a data CD was largely dictated by the design of the platters themselves and the machinery used to burn data into them. Yet none of that existing infrastructure had anything to say about how a filesystem appropriate for use with a computer should work within that physical layout. Microsoft, understanding that a certain degree of inter-operability was a valuable thing to have even among the otherwise rival platforms that might wind up embracing CD-ROM, pushed early for a standardized logical format. As a preliminary step on the road to that landmark first CD-ROM Conference, they brought together a more intimate group of eleven other industry leaders at the High Sierra Resort and Casino in Lake Tahoe in November of 1985 to hash out a specification. Among those present were Philips, Sony, Apple, and DEC; notably absent was IBM, a clear sign of Microsoft’s growing determination to step out of the shadow of Big Blue and start dictating the direction of the industry in their own right. The so-called “High Sierra” format would be officially published in finalized form in May of 1986.

In the run-up to the first Microsoft CD-ROM Conference, then, everything seemed to be coming together nicely. CD-ROM had its problems, but virtually everyone agreed that it was a tremendously exciting development. For their part, Microsoft had established themselves as the driving force behind the nascent optical revolution, driven by a Bill Gates who was personally passionate about the format and keenly aware that his company, the purveyor of clunky old MS-DOS, needed a cutting-edge project to rival any of Apple’s, if only for reasons of public relations. And then, just five days before the conference was scheduled to convene — timing that struck very few as accidental — Philips injected a seething ball of chaos into the system via something called CD-I.

CD-I was a different, competing file format for CD data storage. But CD-I was also much, much more. Excited by the success the music CD had enjoyed, Philips, with the tacit support of Sony, had decided to adapt the format into the all-singing, all-dancing, all-around future of home entertainment in the abstract. Philips would be making a CD-I box for the home, based on a minimalist operating system called OS-9 running on a Motorola 68000 processor. But this would be no typical home computer; the user would be able to control CD-I entirely using a VCR-style remote control. CD-I was envisioned as the interactive television of the future, a platform for not only conventional videogames but also lifestyle products of every description, from interactive astronomy lessons to the ultimate in exercise tapes. Philips certainly wasn’t short of ideas:

Think of owning an encyclopedia which presents chosen topics in several different ways. Watching a short audio/video sequence to gain a general background to the topic. Then choosing a word or subject for more in-depth study. Jumping to another topic without losing your place — and returning again after studying the related topic to proceed further. Or watching a cartoon film, concert, or opera with the interactive capabilities of CD-I added. Displaying the score, libretto, or text onscreen in a choice of languages. Or removing one singer or instrument to be able to sing along with the music.

Just as they had with the music CD, Philips would license the specifications to whoever else wanted to make gadgets of their own capable of playing the CD-I discs. They declared confidently that there would be as many CD-I players in the world as phonographs within a few years of the format’s debut, that “in the long run” CD-I “could be every bit as big as the CD-audio market.”

Already at the Microsoft CD-ROM Conference, Philips began aggressively courting developers in the existing computer-games industry to embrace CD-I. Plenty of them were more than happy to do so. Despite the optimism that dominated at the conference, it wasn’t clear how much priority Microsoft, who earned the vast majority of their money from business computing, would really give to more consumer-focused applications of CD-ROM like gaming. Philips, on the other hand, was a giant of consumer electronics. While they paid due lip service to applications of CD-I in areas like corporate training, it was always clear that it would be first and foremost a technology for the living room, one that comprehensively addressed what most believed was the biggest factor limiting the market for conventional computer games: that the machines that ran them were just too fiddly to operate. At the time that CD-I was first announced, the videogame console was almost universally regarded as a dead fad; the machine that would so dramatically reverse that conventional wisdom, the Nintendo Entertainment System, was still an oddball upstart being sold in selected markets only. Thus many game makers saw CD-I as their only viable route out of the back bedroom and into the living room — into the mainstream of home entertainment.

So, when Philips spoke, the game developers listened. Many publishers, including big powerhouses like Activision as well as smaller boutique houses like the 68000 specialists Aegis Development, committed to CD-I projects during 1986, receiving in return a copy of the closely guarded “Green Book” that detailed the inner workings of the system. There was no small pressure to get in on the action quickly, for Philips was promising to ship the first finished CD-I units in time for the Christmas of 1987. Trip Hawkins of Electronic Arts made CD-I a particular priority, forming a whole new in-house development division for the platform. He’d been waiting for a true next-generation mainstream game machine for years. At first, he’d thought the Commodore Amiga would be that machine, but Commodore’s clueless marketing and the Amiga’s high price were making such an outcome look less and less likely. So now he was looking to CD-I, which promised graphics and sound as good as those of the Amiga, along with the all but infinite storage of the unpirateable CD format, and all in a tidy, inexpensive package designed for the living room. What wasn’t to like? He imagined Silicon Valley becoming “the New Hollywood,” imagined a game like Electronic Arts’s hit Starflight remade as a CD-I experience.

You could actually do it just like a real movie. You could hire a costume designer from the movie business, and create special-effects costumes for the aliens. Then you’d videotape scenes with the aliens, and have somebody do a soundtrack for the voices and for the text that they speak in the game.

Then you’d digitize all of that. You could fill up all the space on the disc with animated aliens and interesting sounds. You would also have a universe that’s a lot more interesting to look at. You might have an out-of-the-cockpit view, like Star Trek, with planets that look like planets — rotating, with detailed zooms and that sort of thing.

Such a futuristic vision seemed thoroughly justifiable based on Philips’s CD-I hype, which promised a rich multimedia environment combining CD-quality stereo sound with full-motion video, all at a time when just displaying a photo-realistic still image captured from life on a computer screen was considered an amazing feat. (Among extant personal computers, only the Amiga could manage it.) When developers began to dive into the Green Book, however, they found the reality of CD-I often sharply at odds with the hype. For instance, if you decided to take advantage of the CD-quality audio, you had to tie up the CD drive entirely to stream it, meaning you couldn’t use it to fetch pictures or video or anything else for this supposed rich multimedia environment.

Video playback became an even bigger sore spot that echoed back to those fundamental limitations that had been baked into the CD when it was regarded only as a medium for music delivery. A transfer rate of barely 150 K per second just wasn’t much to work with in terms of streaming video. Developers found themselves stymied by an infuriating Catch-22. If you tried to work with an uncompressed or only modestly compressed video format, you simply couldn’t read it off the disc fast enough to display it in real-time. Yet if you tried to use more advanced compression techniques, it became so expensive in terms of computation to decompress the data that the CD-I unit’s 68000 CPU couldn’t keep up. The best you could manage was to play video snippets that only filled a quarter of the screen — not a limitation that felt overly compatible with the idea of CD-I as the future of home entertainment in the abstract. It meant that a game like the old laser-disc-driven arcade favorite Dragon’s Lair, the very sort of thing people tended to think of first when you mentioned optical storage in the context of entertainment, would be impossible with CD-I. The developers who had signed contracts with Philips and committed major resources to CD-I could only soldier on and hope the technology would continue to evolve.

By 1987, then, the CD as a computer format had been split into two camps. While the games industry had embraced CD-I, the powers that were in business computing had jumped aboard the less ambitious, Microsoft-sponsored standard of CD-ROM, which solved issues like the problematic video playback of CD-I by the simple expedient of not having anything at all to say about them. Perhaps the most impressive of the very early CD-ROM products was the Microsoft Bookshelf, which combined Roget’s Thesaurus, The American Heritage Dictionary, The Chicago Manual of Style, The World Almanac and Book of Facts, and Bartlett’s Familiar Quotations alongside spelling and grammar checkers, a ZIP Code directory, and a collection of forms and form letters, all on a single disc — as fine a demonstration of the potential of the new format as could be imagined short of all that rich multimedia that Philips had promised. Microsoft proudly noted that Bookshelf was their largest single product ever in terms of the number of bits it contained and their smallest ever in physical size. Nevertheless, with most drives costing north of $1000 and products to use with them like Microsoft Bookshelf hundreds more, CD-ROM remained a pricey proposition found in vanishingly few homes — and for that matter not in all that many businesses either.

But at least actual products were available in CD-ROM format, which was more than could be said for CD-I. As 1986 turned into 1987, developers still hadn’t received any CD-I hardware at all, being forced to content themselves with printed specifications and examples of the system in action distributed on videotape by Philips. Particularly for a small company like Aegis, which had committed heavily to a game based on Jules Verne’s 20,000 Leagues Under the Sea, for which they had recruited Jim Sachs of Defender of the Crown fame as illustrator, it was turning into a potentially dangerous situation.

The computer industry — even those parts of it now more committed to CD-I than CD-ROM — dutifully came together once again for the second Microsoft CD-ROM Conference in March of 1987. In contrast to the unusual Pacific Northwest sunshine of the previous conference, the weather this year seemed to match the more unsettled mood: three days of torrential downpour. It was a more skeptical and decidedly less Woodstock-like audience who filed into the auditorium one day for a presentation by no less unlikely a party than the venerable old American conglomerate General Electric. But in the course of that presentation, the old rapture came back in a hurry, culminating in a spontaneous standing ovation. What had so shocked and amazed the audience was the impossible made real: full-screen video running in real-time off a CD drive connected to what to all appearances was an ordinary IBM PC/AT computer. Digital Video Interactive, or DVI, had just made its dramatic debut.

DVI’s origins dated back to 1983, when engineer Larry Ryan of another old-school American company, RCA, had been working on ways to make the old analog laser-disc technology more interactive. Growing frustrated with the limitations he kept bumping against, he proposed to his bosses that RCA dump the laser disc from the equation entirely and embrace digital optical storage. They agreed, and a new project on those lines was begun in 1984. It was still ongoing two years later — just reaching the prototype stage, in fact — when General Electric acquired RCA.

DVI worked by throwing specialized hardware at the problem which Philips had been fruitlessly trying to solve via software alone. By using ultra-intensive compression techniques, it was possible to crunch video playing at a resolution of 256 × 240 — not an overwhelming resolution even by the standards of the day, but not that far below the practical resolution of a typical television set either — down to a size below 153.6 K per second of footage without losing too much quality. This fact was fairly well-known, not least to Philips. The bottleneck had always been the cost of decompressing the footage fast enough to get it onto the screen in real time. DVI attacked this problem via a hardware add-on that consisted principally of a pair of semi-autonomous custom chips designed just for the task of decompressing the video stream as quickly as possible. DVI effectively transformed the potential 75 minutes of sound that could be stored on a CD into 75 minutes of video.
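Some rough arithmetic shows why dedicated silicon was needed at all. This is a sketch only: the 30-frames-per-second rate and one-byte-per-pixel color depth are illustrative assumptions (the real DVI pipeline used its own color encodings), chosen merely to show the order of magnitude involved:

```python
# Hypothetical round numbers for a 256 x 240 video stream
width, height, fps, bytes_per_pixel = 256, 240, 30, 1

uncompressed_rate = width * height * fps * bytes_per_pixel
print(uncompressed_rate)            # 1843200 bytes/s of raw video

cd_rate = 153_600                   # bytes/s readable from the disc
print(uncompressed_rate / cd_rate)  # 12.0 -- roughly 12:1 compression required
```

Sustaining a decompression ratio on that order in real time, frame after frame, was exactly the computational load that a general-purpose CPU of the era could not keep up with in software.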

Philosophically, the design bore similarities to the Amiga’s custom chips — similarities which became even more striking when you considered some of the other capabilities that came almost as accidental byproducts of the design. You could, for instance, overlay conventional graphics onto the streaming video by using the computer’s normal display circuitry in conjunction with DVI, just as you could use an Amiga to overlay titles and other graphics onto a “genlocked” feed from a VCR or other video source. But the difference with DVI was that it required no complicated external video source at all, just a CD in the computer’s CD drive. The potential for games was obvious.

In this demonstration of DVI’s potential, the user can explore an ancient Mayan archeological site that’s depicted using real-world video footage, while the icons used as controls are traditional computer graphics.

Still, DVI’s dramatic debut barely ended before the industry’s doubts began. It seemed clear enough that DVI was technically better than CD-I, at least in the hugely important area of video playback, but General Electric — hardly anyone’s idea of a nimble innovator — offered as yet no clear road map for the technology, no hint of what they really planned to do with it. Should game developers place their CD-I projects on hold to see if something better really was coming in the form of DVI, or should they charge full speed ahead and damn the torpedoes? Some did one, some did the other; some made halfhearted commitments to both technologies, some vacillated between them.

But worst of all was the effect that DVI had on Philips. They were thrown into a spin by that presentation from which they never really recovered. Fearful of getting their clock cleaned in the marketplace by a General Electric product based on DVI, Philips stopped CD-I in its tracks, demanding that a way be found to make it do full-screen video as well. From an original plan to ship the first finished CD-I units in time for Christmas 1987, the timetable slipped to promise the first prototypes for developers by January of 1988. Then that deadline also came and went, and all that developers had received were software emulators. Now the development prototypes were promised by summer 1988, finished units expected to ship in 1989. The delay notwithstanding, Philips still confidently predicted sales in “the tens of millions.” But then world domination was delayed again until 1990, then 1991.

Prototype CD-I units finally began reaching developers in early 1989, years behind schedule.

Wanting CD-I to offer the best of everything, the project chased its own tail for years, trying to address every actual or potential innovation from every actual or potential rival. The game publishers who had jumped aboard with such enthusiasm in the early days were wracked with doubt upon the announcement of each successive delay. Should they jump off the merry-go-round now and cut their losses, or should they stay the course in the hope that CD-I finally would turn into the revolutionary product Philips had been promising for so long? To this day, you merely have to mention CD-I to even the most mild-mannered old games-industry insider to be greeted with a torrent of invective. Philips’s merry-go-round cost the industry dearly. Some smaller developers who had trusted Philips enough to bet their very survival on CD-I paid the ultimate price. Aegis, for example, went out of business in 1990 with CD-I still vaporware.

While CD-I chased its tail, General Electric, the unwitting instigators of all this chaos, tried to decide in their slow, bureaucratic way what to do with this DVI thing they’d inherited. Thus things were as unsettled as ever on the CD-I and DVI fronts when the third Microsoft CD-ROM Conference convened in March of 1988. The old plain-Jane CD-ROM format, however, seemed still to be advancing slowly but steadily. Certainly Microsoft appeared to be in fine fettle; harking back to the downpour that had greeted the previous year’s conference, they passed out oversized gold umbrellas to everyone — emblazoned, naturally, with the Microsoft logo in huge type. They could announce at their conference that the High Sierra logical format for CD-ROM had been accepted, with some modest modifications to support languages other than English, by the International Organization for Standardization as something that would henceforward be known as “ISO 9660.” (It remains the standard logical format for CD-ROM to this day.) Meanwhile Philips and Sony were about to begrudgingly codify the physical format for CD-ROM, extant already as a de facto standard for several years now, as the Yellow Book, latest addition to a library of binders that was turning into quite the rainbow. Apple, who had previously been resistant to CD-ROM, driven as it was by their arch-rival Microsoft, showed up with an official CD-ROM drive for a Macintosh or even an Apple II, albeit at a typically luxurious Apple price of $1200. Even IBM showed up for the conference this time, albeit with a single computer attached to a non-IBM CD-ROM drive and a carefully noncommittal official stance on all this optical evangelism.

As CD-ROM gathered momentum, the stories of DVI and CD-I alike were already beginning to peter out in anticlimax. After doing little with DVI for eighteen long months, General Electric finally sold it to Intel at the end of 1988, explaining that DVI just “didn’t mesh with [their] strategic plans.” Intel began shipping DVI setups to early adopters in 1989, but they cost a staggering $20,000 — a long, long way from a reasonable consumer price point. DVI continued to lurch along into the 1990s, but the price remained too high. Intel, possessed of no corporate tradition of marketing directly to consumers, often seemed little more motivated to turn DVI into a practical product than had been General Electric. Thus did the technology that had caused such a sensation and such disruption in 1987 gradually become yesterday’s news.

Ironically, we can lay the blame for the creeping irrelevancy of DVI directly at the feet of the work for which Intel was best known. As Gordon Moore — himself an Intel man — had predicted decades before, the transistor counts, and with them the overall throughput, of Intel’s most powerful microprocessors continued to double every two years or so. This situation meant that the problem DVI addressed through all that specialized hardware — that of conventional general-purpose CPUs not having enough horsepower to decompress an ultra-compressed video stream fast enough — wasn’t long for this world. And meanwhile other engineers were attacking the problem from the other side, addressing the standard CD’s reading speed of just 153.6 K per second. They realized that by applying an integral multiplier to the timing of a CD drive’s circuitry, its reading (and seeking) speed could be increased correspondingly. Soon so-called “2X” drives began to appear, capable of reading data at over 300 K per second, followed in time by “4X” drives, “8X” drives, and whatever unholy figure they’ve reached by today. These developments rendered all of the baroque circuitry of DVI pointless, a solution in search of a problem. Who needed all that complicated stuff?
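The arithmetic behind those multipliers is straightforward: a drive reads a fixed number of sectors per second, and speeding up the drive’s timing by an integral factor multiplies the sector rate accordingly. A brief Python sketch, using the standard CD sector sizes (which the article itself doesn’t spell out):

```python
# Illustrative arithmetic only -- the sector sizes below are the standard
# CD figures, not something given in the article itself.
SECTORS_PER_SECOND = 75   # a CD read at standard ("1X") speed
DATA_BYTES = 2048         # user data per CD-ROM sector; the rest of each
                          # sector goes to extra error-correcting codes
AUDIO_BYTES = 2352        # raw bytes per sector on a music CD

def rate(multiplier: int, bytes_per_sector: int = DATA_BYTES) -> int:
    """Bytes per second delivered by an NX drive."""
    return multiplier * SECTORS_PER_SECOND * bytes_per_sector

print(rate(1))               # 153600 -- the "153.6 K" cited above
print(rate(2))               # 307200 -- a 2X drive
print(rate(1, AUDIO_BYTES))  # 176400 -- about 172.3 K (of 1024 bytes),
                             # matching the footnote's audio figure
```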

CD-I’s end was even more protracted and ignominious. The absurd wait eventually got to be too much for even the most loyal CD-I developers. One by one, they dropped their projects. It marked a major tipping point when in 1989 Electronic Arts, the most enthusiastic of all the software publishers in the early days of CD-I, closed down the department they had formed to develop for the platform, writing off millions of dollars on the aborted venture. In another telling sign of the times, Greg Riker, the manager of that department, left Electronic Arts to work for Microsoft on CD-ROM.

When CD-I finally trickled onto store shelves just a few weeks shy of Christmas 1991, it was able to display full-screen video of a sort but only in 128 colors, and was accompanied by an underwhelming selection of slapdash games and lifestyle products, most funded by Philips themselves, that were a far cry from those halcyon expectations of 1986. CD-I sales disappointed — immediately, consistently, and comprehensively. Philips, nothing if not persistent, beat the dead horse for some seven years before giving up at last, having sold only 1 million units in total, many of them at fire-sale discounts.

In the end, the big beneficiary of the endless CD-I/DVI standoff was CD-ROM, the simple, commonsense format that had made its public debut well before either of them. By 1993 or so, you didn’t need anything special to play video off a CD at equivalent or better quality to that which had been so amazing in 1987; an up-to-date CPU combined with a 2X CD-ROM drive would do the job just fine. The Microsoft standard had won out. Funny how often that happened in the 1980s and 1990s, isn’t it?

Bill Gates’s reputation as a master Machiavellian being what it is, I’ve heard it suggested that the chaos and indecision which followed the public debut of DVI had been consciously engineered by him — that he had convinced a clueless General Electric to give that 1987 demonstration and later convinced Intel to keep DVI at least ostensibly alive, thus paralyzing Philips long enough for everyday PC hardware and vanilla CD-ROM to win the day, all the while knowing full well that DVI would never amount to anything. That sounds a little far-fetched to this writer, but who knows? Philips’s decision to announce CD-I five days before Microsoft’s CD-ROM Conference had clearly been a direct shot across Bill Gates’s bow, and such challenges did tend not to end well for the challenger. Anything else is, and must likely always remain, mere speculation.

(Sources: Amazing Computing of May 1986; Byte of May 1986, October 1986, April 1987, January 1989, May 1989, and December 1990; Commodore Magazine of November 1988; 68 Micro Journal of August/September 1989; Compute! of February 1987 and June 1988; Macworld of April 1988; ACE of September 1989, March 1990, and April 1990; The One of October 1988 and November 1988; Sierra On-Line’s newsletter of Autumn 1989; PC Magazine of April 29 1986; the premiere issue of AmigaWorld; episodes of the Computer Chronicles television series entitled “Optical Storage Devices,” “CD-ROMs,” and “Optical Storage”; the book CD-ROM: The New Papyrus from the Microsoft Press. Finally, my huge thanks to William Volk, late of Aegis and Mediagenic, for sharing his memories and impressions of the CD wars with me in an interview.)

Footnotes

1 The data on a music CD is actually read at a speed of approximately 172.3 K per second. The first CD-ROM drives had an effective reading speed that was slightly slower due to the need for additional error-correcting checksums in the raw data.
 

Posted by on September 30, 2016 in Digital Antiquaria, Interactive Fiction

 


The Freedom to Associate

In 1854, an Austrian priest and physics teacher named Gregor Mendel sought and received permission from his abbot to plant a two-acre garden of pea plants on the grounds of the monastery at which he lived. Over the course of the next seven years, he bred together thousands upon thousands of the plants under carefully controlled circumstances, recording in a journal the appearance of every single offspring that resulted, as defined by seven characteristics: plant height, pod shape and color, seed shape and color, and flower position and color. In the end, he collected enough data to formulate the basis of the modern science of genetics, in the form of a theory of dominant and recessive traits passed down in pairs from generation to generation. He presented his paper on the subject, “Experiments on Plant Hybridization,” before the Natural History Society of Austria in 1865, and saw it published in a poorly circulated scientific journal the following year.

And then came… nothing. For various reasons — perhaps due partly to the paper’s unassuming title, perhaps due partly to the fact that Mendel was hardly a known figure in the world of biology, undoubtedly due largely to the poor circulation of the journal in which it was published — few noticed it at all, and those who did dismissed it seemingly without grasping its import. Most notably, Charles Darwin, whose On the Origin of Species had been published while Mendel was in the midst of his own experiments, seems never to have been aware of the paper at all, thereby missing this key gear in the mechanism of evolution. Mendel was promoted to abbot of his monastery shortly after the publication of his paper, and the increased responsibilities of his new post ended his career as a scientist. He died in 1884, remembered as a quiet man of religion who had for a time been a gentleman dabbler in the science of botany.

But then, at the turn of the century, the German botanist Carl Correns stumbled upon Mendel’s work while conducting his own investigations into floral genetics, becoming in the process the first to grasp its true significance. To his huge credit, he advanced Mendel’s name as the real originator of the set of theories which he, along with one or two other scientists working independently, was beginning to rediscover. Correns effectively shamed those other scientists as well into acknowledging that Mendel had figured it all out decades before any of them even came close. It was truly a selfless act; today the name of Carl Correns is unknown except in esoteric scientific circles, while Gregor Mendel’s has been done the ultimate honor of becoming an adjective (“Mendelian”) and a noun (“Mendelism”) locatable in any good dictionary.

Vannevar Bush


So, all’s well that ends well, right? Well, maybe, but maybe not. Some 30 years after the rediscovery of Mendel’s work, an American named Vannevar Bush, dean of MIT’s School of Engineering, came to see the 35 years that had passed between the publication of Mendel’s theory and the affirmation of its importance as a troubling symptom of the modern condition. Once upon a time, all knowledge had been regarded as of a piece, and it had been possible for a great mind to hold within itself huge swathes of this collective knowledge of humanity, everything informing everything else. Think of that classic example of a Renaissance man, Leonardo da Vinci, who was simultaneously a musician, a physicist, a mathematician, an anatomist, a botanist, a geologist, a cartographer, an alchemist, an astronomer, an engineer, and an inventor. Most of all, of course, he was a great visual artist, but he used everything else he was carrying around in that giant brain of his to create paintings and drawings as technically meticulous as they were artistically sublime.

By Bush’s time, however, the world had long since entered the Age of the Specialist. As the sheer quantity of information in every field exploded, those who wished to do worthwhile work in any given field — even those people gifted with giant brains — were increasingly being forced to dedicate their intellectual lives entirely to that field and only that field, just to keep up. The intellectual elite were in danger of becoming a race of mole people, closeted one-dimensionals fixated always on the details of their ever more specialized trades, never on the bigger picture. And even then, the amount of information surrounding them was so vast, and existing systems for indexing and keeping track of it all so feeble, that they could miss really important stuff within their own specialties; witness the way the biologists of the late nineteenth century had missed Gregor Mendel’s work, and the 35-year head start it had cost the new science of genetics. “Mendel’s work was lost,” Bush would later write, “because of the crudity with which information is transmitted between men.” How many other major scientific advances were lying lost in the flood of articles being published every year, a flood that had increased by an order of magnitude just since Mendel’s time? “In this are thoughts,” wrote Bush, “certainly not often as great as Mendel’s, but important to our progress. Many of them become lost; many others are repeated over and over.” “This sort of catastrophe is undoubtedly being repeated all around us,” he believed, “as truly significant attainments become lost in the sea of the inconsequential.”

Bush’s musings were swept aside for a time by the rush of historical events. As the prospect of another world war loomed, he became President Franklin Delano Roosevelt’s foremost advisor on matters involving science and engineering. During the war, he shepherded through countless major advances in the technologies of attack and defense, culminating in the most fearsome weapon the world had ever known: the atomic bomb. It was actually this last that caused Bush to return to the seemingly unrelated topic of information management, a problem he now saw in a more urgent light than ever. Clearly the world was entering a new era, one with far less tolerance for the human folly, born of so much context-less mole-person ideology, that had spawned the current war.

Practical man that he was, Bush decided there was nothing for it but to roll up his sleeves and make a concrete proposal describing how humanity could solve the needle-in-a-haystack problem of the modern information explosion. Doing so must entail grappling with something as fundamental as “how creative men think, and what can be done to help them think. It is a problem of how the great mass of material shall be handled so that the individual can draw from it what he needs — instantly, correctly, and with utter freedom.”

As revolutionary manifestos go, Vannevar Bush’s “As We May Think” is very unusual in terms of both the man who wrote it and the audience that read it. Bush was no Karl Marx, toiling away in discontented obscurity and poverty. On the contrary, he was a wealthy upper-class patrician who was, as a member of the White House inner circle, about as fabulously well-connected as it was possible for a man to be. His article appeared first in the July 1945 edition of the Atlantic Monthly, hardly a bastion of radical thought. Soon after, it was republished in somewhat abridged form by Life, the most popular magazine on the planet. Thereby did this visionary document reach literally millions of readers.

With the atomic bomb still a state secret, Bush couldn’t refer directly to his real reasons for wanting so urgently to write down his ideas now. Yet the dawning of the atomic age nevertheless haunts his article.

It is the physicists who have been thrown most violently off stride, who have left academic pursuits for the making of strange destructive gadgets, who have had to devise new methods for their unanticipated assignments. They have done their part on the devices that made it possible to turn back the enemy, have worked in combined effort with the physicists of our allies. They have felt within themselves the stir of achievement. They have been part of a great team. Now, as peace approaches, one asks where they will find objectives worthy of their best.

Seen in one light, Bush’s essay is similar to many of those that would follow from other Manhattan Project alumni during the uncertain interstitial period between the end of World War II and the onset of the Cold War. Bush was like many of his colleagues in feeling the need to advance a utopian agenda to counter the apocalyptic potential of the weapon they had wrought, in needing to see the ultimate evil that was the atomic bomb in almost paradoxical terms as a potential force for good that would finally shake the world awake.

Bush was true to his engineer’s heart, however, in basing his utopian vision on technology rather than politics. The world was drowning in information, making the act of information synthesis — intradisciplinary and interdisciplinary alike — ever more difficult.

The difficulty seems to be, not so much that we publish unduly in view of the extent and variety of present-day interests, but rather that publication has been extended far beyond our present ability to make real use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships.

Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and reenter on a new path.

The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve it, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

Bush was not among the vanishingly small number of people who were working in the nascent field of digital computing in 1945. His “memex,” the invention he proposed to let an individual free-associate all of the information in her personal library, was more steampunk than cyberpunk, all whirring gears, snickering levers, and whooshing microfilm strips. But really, those things are just details; he got all of the important stuff right. I want to quote some more from “As We May Think,” and somewhat at length at that, because… well, because its vision of the future is just that important. This is how the memex should work:

When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. In each code space appears the code word. Out of view, but also in the code space, is inserted a set of dots for photocell viewing; and on each item these dots by their positions designate the index number of the other item.

Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together from widely separated sources and bound together to form a new book. It is more than this, for any item can be joined into numerous trails.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client’s interest. The physician, puzzled by a patient’s reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.

The historian, with a vast chronological account of a people, parallels it with a skip trail which stops only on the salient items, and can follow at any time contemporary trails which lead him all over civilization at a particular epoch. There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record. The inheritance from the master becomes, not only his additions to the world’s record, but for his disciples the entire scaffolding by which they were erected.

Ted Nelson


There is no record of what all those millions of Atlantic Monthly and Life readers made of Bush’s ideas in 1945 — or for that matter if they made anything of them at all. In the decades that followed, however, the article became a touchstone of the burgeoning semi-underground world of creative computing. Among its discoverers was Ted Nelson, who is depending on whom you talk to either one of the greatest visionaries in the history of computing or one of the greatest crackpots — or, quite possibly, both. Born in 1937 to a Hollywood director and his actress wife, then raised by his wealthy and indulgent grandparents following the inevitable Hollywood divorce, Nelson would live a life largely defined by, as Gary Wolf put it in his classic profile for Wired magazine, his “aversion to finishing.” As in, finishing anything at all, or just the concept of finishing in the abstract. Well into middle age, he would be diagnosed with attention-deficit disorder, an alleged malady he came to celebrate as his “hummingbird mind.” This condition perhaps explains why he was so eager to find a way of forging permanent, retraceable associations among all the information floating around inside and outside his brain.

Nelson coined the terms “hypertext” and “hypermedia” at some point during the early 1960s, when he was a graduate student at Harvard. (Typically, he got a score of Incomplete in the course for which he invented them, not to mention an Incomplete on his PhD as a whole.) While they’re widely used all but interchangeably today, in Nelson’s original formulation the former term was reserved for purely textual works, the latter for those incorporating other forms of media, like images and sound. But today we’ll just go with the modern flow, call them all hypertexts, and leave it at that. In his scheme, then, hypertexts were texts capable of being “zipped” together with other hypertexts, memex-like, wherever the reader or writer wished to preserve associations between them. He presented his new buzzwords to the world at a conference of the Association for Computing Machinery in 1965, to little impact. Nelson, possessed of a loudly declamatory style of discourse and all the rabble-rousing fervor of a street-corner anarchist, would never be taken all that seriously by the academic establishment.

Instead, it being the 1960s and all, he went underground, embracing computing’s burgeoning counterculture. His eventual testament, one of the few things he ever did manage to complete — after a fashion, at any rate — was a massive 1200-page tome called Computer Lib/Dream Machines, self-published in 1974, just in time for the heyday of the Altair and the Homebrew Computer Club, whose members embraced Nelson as something of a patron saint. As the name would indicate, Computer Lib/Dream Machines was actually two separate books, bound back to back. Theoretically, Computer Lib was the more grounded volume, full of practical advice about gaining access to and using computers, while Dream Machines was full of the really out-there ideas. In practice, though, they were often hard to distinguish. Indeed, it was hard to even find anything in the books, which were published as mimeographed facsimile copies filled with jotted marginalia and cartoons drafted in Nelson’s shaky hand, with no table of contents or page numbers and no discernible organizing principle beyond the stream of consciousness of Nelson’s hummingbird mind. (I trust that the irony of a book concerned with finding new organizing principles for information itself being such an impenetrable morass is too obvious to be worth belaboring further.) Nelson followed Computer Lib/Dream Machines with 1981’s Literary Machines, a text written in a similar style that dwelt, when it could be bothered, at even greater length on the idea of hypertext.

The most consistently central theme of Nelson’s books, to whatever extent one could be discerned, was an elaboration of the hypertext concept he called Xanadu, after the pleasure palace in Samuel Taylor Coleridge’s poem “Kubla Khan.” The product of an opium-fueled hallucination, the 54-line poem is a mere fragment of a much longer work Coleridge had intended to write. Problem was, in the course of writing down the first part of his waking dream he was interrupted; by the time he returned to his desk he had simply forgotten the rest.

So, Nelson’s Xanadu was intended to preserve information that would otherwise be lost, which goal it would achieve through associative linking on a global scale. Beyond that, it was almost impossible to say precisely what Xanadu was or wasn’t. Certainly it sounds much like the World Wide Web to modern ears, but Nelson insists adamantly that the web is a mere bad implementation of the merest shadow of his full idea. Xanadu has been under allegedly active development since the late 1960s, making it the most long-lived single project in the history of computer programming, and by far history’s most legendary piece of vaporware. As of this writing, the sum total of all those years of work is a set of web pages written in Nelson’s inimitable declamatory style, littered with angry screeds against the World Wide Web, along with some online samples that either don’t work quite right or are simply too paradigm-shattering for my poor mind to grasp.

In my own years on this planet, I’ve come to reserve my greatest respect for people who finish things, a judgment which perhaps makes me less than the ideal critic of Ted Nelson’s work. Nevertheless, even I can recognize that Nelson deserves huge credit for transporting Bush’s ideas to their natural habitat of digital computers, for inventing the term “hypertext,” for defining an approach to links (or “zips”) in a digital space, and, last but far from least, for making the crucial leap from Vannevar Bush’s concept of the single-user memex machine to an interconnected global network of hyperlinks.

But of course ideas, of which both Bush and Nelson had so many, are not finished implementations. During the 1960s, 1970s, and early 1980s, there were various efforts — in addition, that is, to the quixotic effort that was Xanadu — to wrestle at least some of the concepts put forward by these two visionaries into concrete existence. Yet it wouldn’t be until 1987 that a corporation with real financial resources and real commercial savvy would at last place a reasonably complete implementation of hypertext before the public. And it all started with a frustrated programmer looking for a project.

Steve Jobs and Bill Atkinson


Had he never had anything to do with hypertext, Bill Atkinson’s place in the history of computing would still be assured. Coming to Apple Computer in 1978, when the company was only about eighteen months removed from that famous Cupertino garage, Atkinson was instrumental in convincing Steve Jobs to visit the Xerox Palo Alto Research Center, thereby setting in motion the chain of events that would lead to the Macintosh. A brilliant programmer by anybody’s measure, he eventually wound up on the Lisa team. He wrote the routines to draw pixels onto the Lisa’s screen — routines on which, what with the Lisa being a fundamentally graphical machine whose every display was bitmapped, every other program depended. Jobs was so impressed by these routines, which Atkinson had named LisaGraf, that he recruited him to port them over to the nascent Macintosh. Atkinson’s routines, now dubbed QuickDraw, would remain at the core of MacOS for the next fifteen years. But Atkinson’s contribution to the Mac went yet further: after QuickDraw, he proceeded to design and program MacPaint, one of the two applications included with the finished machine, and one that’s still justifiably regarded as a little marvel of intuitive user-interface design.

Atkinson’s work on the Mac was so essential to the machine’s success that shortly after its release he became just the fourth person to be named an Apple Fellow — an honor that carried with it, implicitly if not explicitly, a degree of autonomy for the recipient in the choosing of future projects. The first project that Atkinson chose for himself was something he called the Magic Slate, based on a gadget called the Dynabook that had been proposed years before by Xerox PARC alum (and Atkinson’s fellow Apple Fellow) Alan Kay: a small, thin, inexpensive handheld computer controlled via a touch screen. It was, as anyone who has ever seen an iPhone or iPad will attest, a prescient project indeed, but also one that simply wasn’t realizable using mid-1980s computer technology. Having been convinced of this at last by his skeptical managers after some months of flailing, Atkinson wondered if he might not be able to create the next best thing in the form of a sort of software version of the Magic Slate, running on the Macintosh desktop.

In a way, the Magic Slate had always had as much to do with the ideas of Bush and Nelson as it did with those of Kay. Atkinson had envisioned its interface as a network of “pages” which the user navigated among by tapping links therein — a hypertext in its own right. Now he transported the same concept to the Macintosh desktop, whilst making his metaphorical pages into metaphorical stacks of index cards. He called the end result, the product of many months of design and programming, “Wildcard.” Later, when the trademark “Wildcard” proved to be tied up by another company, it turned into “HyperCard” — a much better name anyway in my book.

By the time he had HyperCard in some sort of reasonably usable shape, Atkinson was all but convinced that he would have to either sell the thing to some outside software publisher or start his own company to market it. With Steve Jobs now long gone and with him much of the old Jobsian spirit of changing the world through better computing, Apple was heavily focused on turning the Macintosh into a practical business machine. The new, more sober mood in Cupertino — not to mention Apple’s more buttoned-down public image — would seem to indicate that they were hardly up for another wide-eyed “revolutionary” product. It was Alan Kay, still kicking around Cupertino puttering with this and that, who convinced Atkinson to give CEO John Sculley a chance before he took HyperCard elsewhere. Kay brokered a meeting between Sculley and Atkinson, in which the latter was able to personally demonstrate to the former what he’d been working on all these months. Much to Atkinson’s surprise, Sculley loved HyperCard. Apparently at least some of the old Jobsian fervor was still alive and well after all inside Apple’s executive suite.

At its most basic, a HyperCard stack to modern eyes resembles nothing so much as a PowerPoint presentation, albeit one which can be navigated non-linearly by tapping links on the slides themselves. Just as in PowerPoint, the HyperCard designer could drag and drop various forms of media onto a card. Taken even at this fairly superficial level, HyperCard was already a full-fledged hypertext-authoring (and hypertext-reading) tool — by no means the first specimen of its kind, but the first with the requisite combination of friendliness, practicality, and attractiveness to make it an appealing environment for the everyday computer user. One of Atkinson’s favorite early demo stacks had many cards with pictures of people wearing hats. If you clicked on a hat, you were sent to another card showing someone else wearing a hat. Ditto for other articles of fashion. It may sound banal, but this really was revolutionary, organization by association in action. Indeed, one might say that HyperCard was Vannevar Bush’s memex, fully realized at last.

But the system showed itself to have much, much more to offer when the author started to dig into HyperTalk, the included scripting language. All sorts of logic, simple or complex, could be accomplished by linking scripts to clicks on the surface of the cards. At this level, HyperCard became an almost magical tool for some types of game development, as we’ll see in future articles. It was also a natural fit for many other applications: information kiosks, interactive tutorials, educational software, expert systems, reference libraries, etc.
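To give a flavor of what this looked like in practice, here’s a minimal HyperTalk handler of the sort an author might attach to a button on a card; the card name is my own invention, but the handler structure and commands are genuine HyperTalk.

```
on mouseUp
  -- play a screen transition, then jump to the linked card
  visual effect dissolve
  go to card "Boater Hats"
end mouseUp
```

Even a few lines like these were enough to turn a static stack of cards into an interactive program, which is exactly what made HyperTalk so seductive to people who would never have called themselves programmers.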

HyperCard in action

John Sculley himself premiered HyperCard at the August 1987 MacWorld show. Showing unusual largess in his determination to get HyperCard into the hands of as many people as possible as quickly as possible, he announced that henceforward all new Macs would ship with a free copy of the system, while existing owners could buy copies for their machines for just $49. He called HyperCard the most important product Apple had released during his tenure there. Considering that Sculley had also been present for the launch of the original Macintosh, this was certainly saying something. And yet he wasn’t clearly in the wrong either. As important as the Macintosh, the realization in practical commercial form of the computer-interface paradigms pioneered at Xerox PARC during the 1970s, has been to our digital lives of today, the concept of associative indexing — hyperlinking — has proved at least as significant. But then, the two do go together like strawberries and cream, the point-and-click paradigm providing the perfect way to intuitively navigate through a labyrinth of hyperlinks. It was no coincidence that an enjoyable implementation of hypertext appeared first on the Macintosh; the latter almost seemed a prerequisite for the former.

The full import of the concept of hypertext was far from easy to get across in advertising copy, but Apple gave it a surprisingly serious go, paying due homage to Vannevar Bush in the process.

In the wake of that MacWorld presentation, a towering tide of HyperCard hype rolled from one side of the computer industry to the other, out into the mainstream media, and then back again, over and over. Hypertext’s time had finally come. In 1985, it was an esoteric fringe concept known only to academics and a handful of hackers, being treated at real length and depth in print only in Ted Nelson’s own sprawling, well-nigh impenetrable tomes. Four years later, every bookstore in the land sported a shelf positively groaning with trendy paperbacks advertising hypertext this and hypertext that. By then the curmudgeons had also begun to come out in force, always a sure sign that an idea has truly reached critical mass. Presentations showed up in conference catalogs with snarky titles like “Hypertext: Will It Cook Me Breakfast Too?”

The curmudgeons had plenty of rabid enthusiasm to push back against. HyperCard, even more so than the Macintosh itself, had a way of turning the most sober-minded computing veterans into starry-eyed fanatics. Jan Lewis, a longtime business-computing analyst, declared that “HyperCard is going to revolutionize the way computing is done, and possibly the way human thought is done.” Throwing caution to the wind, she abandoned her post at InfoWorld to found HyperAge, the first magazine dedicated to the revolution. “There’s a tremendous demand,” she said. “If you look at the online services, the bulletin boards, the various ad hoc meetings, user groups — there is literally a HyperCulture developing, almost a cult.” To judge from her own impassioned statements, she should know. She recruited Ted Nelson himself — one of the HyperCard holy trinity of Bush, Nelson, and Atkinson — to write a monthly column.

HyperCard effectively amounted to an entirely new computing platform that just happened to run atop the older platform that was the Macintosh. As Lewis noted, user-created HyperCard stacks — this new platform’s word for “programs” or “software” — were soon being traded all over the telecommunications networks. The first commercial publisher to jump into the HyperCard game was, somewhat surprisingly, Mediagenic.[1] Bruce Davis, Mediagenic’s CEO, has hardly gone down in history as a paragon of progressive thought in the realms of computer games and software in general, but he defied his modern reputation in this one area at least by pushing quickly and aggressively into “stackware.” One of the first examples of same that Mediagenic published was Focal Point, a collection of business and personal-productivity tools written by one Danny Goodman, who was soon to publish a massive bible called The Complete HyperCard Handbook, thus securing for himself the mantle of the new ecosystem’s go-to programming guru. Focal Point was a fine demonstration that just about any sort of software could be created by the sufficiently motivated HyperCard programmer. But it was another early Mediagenic release, City to City, that was more indicative of the system’s real potential. It was a travel guide to most major American cities — an effortlessly browsable and searchable guide to “the best food, lodgings, and other necessities” to be found in each of the metropolises in its database.

City to City

Other publishers — large, small, and just starting out — followed Mediagenic’s lead, releasing a bevy of fascinating products. The people behind The Whole Earth Catalog — themselves the inspiration for Ted Nelson’s efforts in self-publication — converted their current edition into a HyperCard stack filling a staggering 80 floppy disks. A tiny company called Voyager combined HyperCard with a laser-disc player — a very common combination among ambitious early HyperCard developers — to offer an interactive version of the National Gallery of Art which could be explored using such associative search terms as “Impressionist landscapes with boats.” Culture 1.0 let you explore its namesake through “3700 years of Western history — over 200 graphics, 2000 hypertext links, and 90 essays covering topics from the Black Plague to Impressionism,” all on just 7 floppy disks. Mission: The Moon, from the newly launched interactive arm of ABC News, gathered together details of every single Mercury, Gemini, and Apollo mission, including videos of each mission hosted on a companion laser disc. A professor of music converted his entire Music Appreciation 101 course into a stack. The American Heritage Dictionary appeared as stackware. And lots of what we might call “middlestackware” appeared to help budding programmers with their own creations: HyperComposer for writing music in HyperCard, Take One for adding animations to cards.

Just two factors were missing from HyperCard to allow hypertext to reach its full potential. One was a storage medium capable of holding lots of data, to allow for truly rich multimedia experiences, combining the lavish amounts of video, still pictures, music, sound, and of course text that the system clearly cried out for. Thankfully, that problem was about to be remedied via a new technology which we’ll be examining in my very next article.

The other problem was a little thornier, and would take a little longer to solve. For all its wonders, a HyperCard stack was still confined to the single Macintosh on which it ran; there was no provision for linking between stacks running on entirely separate computers. In other words, one might think of a HyperCard stack as equivalent to a single web site running locally off a single computer’s hard drive, without the ability to field external links alongside its internal links. Thus the really key component of Ted Nelson’s Xanadu dream, that of a networked hypertext environment potentially spanning the entire globe, remained unrealized. In 1990, Bill Nisen, the developer of a hypertext system called Guide that slightly predated HyperCard but wasn’t as practical or usable, stated the problem thus:

The one thing that is precluding the wide acceptance of hypertext and hypermedia is adequate broadcast mechanisms. We need to find ways in which we can broadcast the results of hypermedia authoring. We’re looking to in the future the ubiquitous availability of local-area networks and low-cost digital-transmission facilities. Once we can put the results of this authoring into the hands of more users, we’re going to see this industry really explode.

Already at the time Nisen made that statement, a British researcher named Tim Berners-Lee had started to experiment with something he called the Hypertext Transfer Protocol. The first real web site, the beginning of the World Wide Web, would go online in 1991. It would take a few more years even from that point, but a shared hypertextual space of a scope and scale the likes of which few could imagine was on the way. The world already had its memex in the form of HyperCard. Now — and although this equivalency would scandalize Ted Nelson — it was about to get its Xanadu.

Associative indexing permeates our lives so thoroughly today that, as with so many truly fundamental paradigm shifts, the full scope of the change it has wrought can be difficult to fully appreciate. A century ago, education was still largely an exercise in retention: names, dates, Latin verb conjugations. Today’s educational institutions — at least the more enlightened ones — recognize that it’s more important to teach their pupils how to think than it is to fill their heads with facts; facts, after all, are now cheap and easy to acquire when you need them. That such a revolution in the way we think about thought happened in just a couple of decades strikes me as incredible. That I happened to be present to witness it strikes me as amazing.

What I’ve witnessed has been a revolution in humanity’s relationship to information itself that’s every bit as significant as any political revolution in history. Some Singularity proponents will tell you that it marks the first step on the road to a vast worldwide consciousness. But even if you choose not to go that far, the ideas of Vannevar Bush and Ted Nelson are still with you every time you bring up Google. We live in a world in which much of the sum total of human knowledge is available over an electronic connection found in almost every modern home. This is wondrous. Yet what’s still more wondrous is the way that we can find almost any obscure fact, passage, opinion, or idea we like from within that mass, thanks to selection by association. Mama, we’re all cyborgs now.

(Sources: the books Hackers: Heroes of the Computer Revolution and Insanely Great: The Life and Times of the Macintosh, the Computer That Changed Everything by Steven Levy; Computer Lib/Dream Machines and Literary Machines by Ted Nelson; From Memex to Hypertext: Vannevar Bush and the Mind’s Machine, edited by James M. Nyce and Paul Kahn; The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort; Multimedia and Hypertext: The Internet and Beyond by Jakob Nielsen; The Making of the Atomic Bomb by Richard Rhodes. Also the June 1995 Wired magazine profile of Ted Nelson; Andy Hertzfeld’s website Folklore; and the Computer Chronicles television episodes entitled “HyperCard,” “MacWorld Special 1988,” “HyperCard Update,” and “Hypertext.”)

Footnotes

1. Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article.
 

Posted by on September 23, 2016 in Digital Antiquaria, Interactive Fiction

 


Cracking Open the Mac

The Macintosh II

The biggest problem with the Macintosh hardware was pretty obvious, which was its limited expandability. But the problem wasn’t really technical as much as philosophical, which was that we wanted to eliminate the inevitable complexity that was a consequence of hardware expandability, both for the user and the developer, by having every Macintosh be identical. It was a valid point of view, even somewhat courageous, but not very practical, because things were still changing too fast in the computer industry for it to work, driven by the relentless tides of Moore’s Law.

— original Macintosh team-member Andy Hertzfeld

Jef Raskin and Steve Jobs didn’t agree on much, but they did agree on their loathing for expansion slots. The absence of slots was one of the bedrock attributes of Raskin’s original vision for the Macintosh, the most immediately obvious difference between it and Apple’s then-current flagship product, the Apple II. In contrast to Steve Wozniak’s beloved hacker plaything, Raskin’s computer for the people would be as effortless to set up and use as a stereo, a television, or a toaster.

When Jobs took over the Macintosh project — some, including Raskin himself, would say stole it — he changed just about every detail except this one. Yet some members of the tiny team he put together, fiercely loyal to their leader and his vision of a “computer for the rest of us” though they were, were beginning to question the wisdom of this aspect of the machine by the time the Macintosh came together in its final form. It was a little hard in January of 1984 not to question the wisdom of shipping an essentially unexpandable appliance with just 128 K of memory and a single floppy-disk drive for a price of $2495. At some level, it seemed, this just wasn’t how the computer market worked.

Jobs would reply that the whole point of the Macintosh was to change how computers worked, and with them the workings of the computer market. He wasn’t entirely without concrete arguments to back up his position. One had only to glance over at the IBM clone market — always Jobs’s first choice as the antonym to the Mac — to see how chaotic a totally open platform could be. Clone users were getting all too familiar with the IRQ and memory-address conflicts that could result from plugging two cards that were determined not to play nice together into the same machine, and software developers were getting used to chasing down obscure bugs that only popped up when their programs ran on certain combinations of hardware.

Viewed in the big picture, we could actually say that Jobs was prescient in his determination to stamp out that chaos, to make every Macintosh the same as every other, to make the platform in general a thoroughly known quantity for software developers. The norm in personal computing as most people know it — whether we’re talking phones, tablets, laptops, or increasingly even desktop computers — has long since become sealed boxes of one stripe or another. But there are some important factors that make said sealed boxes a better idea now than they were back then. For one thing, the pace of hardware and software development alike has slowed enough that a new computer can remain viable, just as it was purchased, for ten years or more. For another, prices have come down enough that throwing a device away and starting over with a new one isn’t so cost-prohibitive as it once was. With personal computers still exotic, expensive machines in a constant state of flux at the time of the Mac’s introduction, the computer as a sealed appliance was a vastly more problematic proposition.

Determined to do everything possible to keep users out of the Mac’s innards, Apple used Torx screws for which screwdrivers weren’t commonly available to seal it, and even threatened users with electrocution should they persist in trying to open it. The contrast with the Apple II, whose top could be popped in seconds using nothing more than a pair of hands to reveal seven tempting expansion slots, could hardly have been more striking.

It was the early adopters who spotted the potential in that first slow, under-powered Macintosh, the people who believed Jobs’s promise that the machine’s success or failure would be determined by the number who bought it in its first hundred days on the market, who bore the brunt of Apple’s decision to seal it as tightly as Fort Knox. When Apple in September of 1984 released the so-called “Fat Mac” with 512 K of memory, the quantity that in the opinion of just about everyone — including most of those at Apple not named Steve Jobs — the machine should have shipped with in the first place, owners of the original model were offered the opportunity to bring their machines to their dealers and have them retro-fitted to the new specifications for $995. This “deal” sparked considerable outrage and even a letter-writing campaign that tried to shame Apple into bettering the terms of the upgrade. Disgruntled existing owners pointed out that their total costs for a 512 K Macintosh amounted to $3490, while a Fat Mac could be bought outright by a prospective new member of the Macintosh fold for $2795. “Apple should have bent over backward for the people who supported it in the beginning,” said one of the protest’s ringleaders. “I’m never going to feel the same about Apple again.” Apple, for better or for worse never a company that was terribly susceptible to such public shaming, sent their disgruntled customers a couple of free software packages and told them to suck it up.

The Macintosh Plus

Barely fifteen months later, when Apple released the Macintosh Plus with 1 MB of memory among other advancements, the merry-go-round spun again. This time the upgrade would cost owners of the earlier models over $1000, along with lots of downtime while their machines sat in queues at their dealers. With software developers rushing to take advantage of the increased memory of each successive model, dedicated users could hardly stand to regard each successive upgrade as optional. As things stood, then, they were effectively paying a service charge of about $1000 per year just to remain a part of the Macintosh community. Owning a Mac was like owning a car that had to go into the shop for a week for a complete engine overhaul once every year. Apple, then as now, was famous for the loyalty of their users, but this was stretching even that legendary goodwill to the breaking point.

For some time voices within Apple had been mumbling that this approach simply couldn’t continue if the Macintosh was to become a serious, long-lived computing platform; Apple simply had to open the Mac up, even if that entailed making it a little more like all those hated beige IBM clones. During the first months after the launch, Steve Jobs was able to stamp out these deviations from his dogma, but as sales stalled and his relationship with John Sculley, the CEO he’d hand-picked to run the company he’d co-founded, deteriorated, the grumblers grew steadily more persistent and empowered.

The architect of one of the more startling about-faces in Apple’s corporate history would be Jean-Louis Gassée, a high-strung marketing executive newly arrived in Silicon Valley from Apple’s French subsidiary. Gassée privately — very privately in the first months after his arrival, when Jobs’s word still was law — agreed with many on Apple’s staff that the only way to achieve the dream of making the Macintosh into a standard to rival or beat the Intel/IBM/Microsoft trifecta was to open the platform. Thus he quietly encouraged a number of engineers to submit proposals on what direction they would take the platform in if given free rein. He came to favor the ideas of Mike Dhuey and Brian Berkeley, two young engineers who envisioned a machine with slots as plentiful and easily accessible as those of the Apple II or an IBM clone. Their “Little Big Mac” would be based around the 32-bit Motorola 68020 chip rather than the 16-bit 68000 of the current models, and would also sport color — another Jobsian heresy.

In May of 1985, Jobs made the mistake of trying to recruit Gassée into a rather clumsy conspiracy he was formulating to oust Sculley, with whom he was now in almost constant conflict. Rather than jump aboard the coup train, Gassée promptly blew the whistle to Sculley, precipitating an open showdown between Jobs and Sculley in which, much to Jobs’s surprise, the entirety of Apple’s board backed Sculley. Stripped of his power and exiled to a small office in a remote corner of Apple’s Cupertino campus, Jobs would soon depart amid recriminations and lawsuits to found a new venture called NeXT.

Gassée’s betrayal of Jobs’s confidence may have had a semi-altruistic motivation. Convinced that the Mac needed to open up to survive, perhaps he concluded that that would only happen if Jobs was out of the picture. Then again, perhaps it came down to a motivation as base as personal jealousy. With a penchant for leather and a love of inscrutable phraseology — “the Apple II smelled like infinity” is a typical phrase from his manifesto The Third Apple, “an invitation to voyage into a region of the mind where technology and poetry exist side by side, feeding each other” — Gassée seemed to self-consciously adopt the persona of a Gallic version of Jobs himself. But regardless, with Jobs now out of the picture Gassée was able to consolidate his own power base, taking over Jobs’s old role as leader of the Macintosh division. He went out and bought a personalized license plate for his sports car: “OPEN MAC.”

Coming some four months after Jobs’s final departure, the Mac Plus already included such signs of the changing times as a keyboard with arrow keys and a numeric keypad, anathema to Jobs’s old mouse-only orthodoxy. But much, much bigger changes were also well underway. Apple’s 1985 annual report, released in the spring of 1986, dropped a bombshell: a Mac with slots was on the way. Dhuey and Berkeley’s open Macintosh was now proceeding… well, openly.

The Macintosh II

When it debuted five months behind schedule in March of 1987, the Macintosh II was greeted as a stunning but welcome repudiation of much of what the Mac had supposedly stood for. In place of the compact all-in-one-case designs of the past, the new Mac was a big, chunky box full of empty space and empty slots — six of them altogether — with the monitor an item to be purchased separately and perched on top. Indeed, one could easily mistake the Mac II at a glance for a high-end IBM clone; its big, un-stylish case even included a cooling fan, an item that placed even higher than expansion slots and arrow keys on Steve Jobs’s old list of forbidden attributes.

Apple’s commitment to their new vision of a modular, open Macintosh was so complete that the Mac II didn’t include any on-board video at all; the buyer of the $6500 machine would still have to buy the video card of her choice separately. Apple’s own high-end video card offered display capabilities unprecedented in a personal computer: a palette of over 16 million colors, 256 of them displayable onscreen at any one time at resolutions as high as 640 × 480. And, in keeping with the philosophy behind the Mac II as a whole, the machine was ready and willing to accept a still more impressive graphics card just as soon as someone managed to make one. The Mac II actually represented colors internally using 48 bits, allowing some 281 trillion different shades. These idealized colors were then translated automatically into the closest approximations the actual display hardware could manage. This fidelity to the subtlest vagaries of color would make the Mac II the favorite of people working in many artistic and image-processing fields, especially when those aforementioned even better video cards began to hit the market in earnest. Even today no other platform can match the Mac in its persnickety attention to the details of accurate color reproduction.

Some of the Mac II’s capabilities truly were ahead of their time. Here we see a desktop extended across two monitors, each powered by its own video card.

The irony wasn’t lost on journalists or users when, just weeks after the Mac II’s debut, IBM debuted their new PS/2 line, marked by sleeker, slimmer cases and many features that would once have been placed on add-on-cards now integrated into the motherboards. While Apple was suddenly encouraging the sort of no-strings-attached hardware hacking on the Macintosh that had made their earlier Apple II so successful, IBM was trying to stamp that sort of thing out on their own heretofore open platform via their new Micro Channel Architecture, which demanded that anyone other than IBM who wanted to expand a PS/2 machine negotiate a license and pay for the privilege. “The original Mac’s lack of slots stunted its growth and forced Apple to expand the machine by offering new models,” wrote Byte. “With the Mac II, Apple — and, more importantly, third-party developers — can expand the machine radically without forcing you to buy a new computer. This is the design on which Apple plans to build its Macintosh empire.” It seemed like the whole world of personal computing was turning upside down, Apple turning into IBM and IBM turning into Apple.

The Macintosh SE

If so, however, Apple’s empire would be a very exclusive place. By the time you’d bought a monitor, video card, hard drive, keyboard — yes, even the keyboard was a separate item — and other needful accessories, a Mac II system could rise uncomfortably close to the $10,000 mark. Those who weren’t quite flush enough to splash out that much money could still enjoy a taste of the Mac’s new spirit of openness via the simultaneously released Mac SE, which cost $3699 for a hard-drive-equipped model. The SE was a 68000-based machine that looked much like its forefathers — built-in black-and-white monitor included — but did have a single expansion slot inside its case. The single slot was a little underwhelming in comparison to the Mac II, but it was better than nothing, even if Apple did still recommend that customers take their machines to their dealers if they wanted to actually install something in it. Apple’s not-terribly-helpful advice for those needing to employ more than one expansion card was to buy an “integrated” card that combined multiple functions. If you couldn’t find a card that happened to combine exactly the functions you needed, you were presumably just out of luck.

During the final years of the 1980s, Apple would continue to release new models of the Mac II and the Mac SE, now established as the two separate Macintosh flavors. These updates enhanced the machines with such welcome goodies as 68030 processors and more memory, but, thanks to the wonders of open architecture, didn’t immediately invalidate the models that had come before. The original Mac II, for instance, could be easily upgraded from the 68020 to the 68030 just by dropping a card into one of its slots.

The Steve Jobs-less Apple, now thoroughly under the control of the more sober and pragmatic John Sculley, toned down the old visionary rhetoric in favor of a more businesslike focus. Even the engineers dutifully toed the new corporate line, at least publicly, and didn’t hesitate to denigrate Apple’s erstwhile visionary-in-chief in the process. “Steve Jobs thought that he was right and didn’t care what the market wanted,” Mike Dhuey said in an interview to accompany the Mac II’s release. “It’s like he thought everyone wanted to buy a size-nine shoe. The Mac II is specifically a market-driven machine, rather than what we wanted for ourselves. My job is to take all the market needs and make the best computer. It’s sort of like musicians — if they make music only to satisfy their own needs, they lose their audience.” Apple, everyone was trying to convey, had grown up and left all that changing-the-world business behind along with Steve Jobs. They were now as sober and serious as IBM, their machines ready to take their places as direct competitors to those of Big Blue and the clonesters.

To a rather surprising degree, the world of business computing accepted Apple and the Mac’s new persona. Through 1986, the machines to which the Macintosh was most frequently compared were the Commodore Amiga and Atari ST. In the wake of the Mac II and Mac SE, however, the Macintosh was elevated to a different plane. Now the omnipresent point of comparison was high-end IBM compatibles; the Amiga and ST, despite their architectural similarities, seldom even saw their existence acknowledged in relation to the Mac. There were some good reasons for this neglect beyond the obvious ones of pricing and parent-company rhetoric. For one, the Macintosh was always a far more polished experience for the end user than either of the other 68000-based machines. For another, Apple had enjoyed a far more positive reputation with corporate America than Commodore or Atari had even well before any of the three platforms in question had existed. Still, the nature of the latest magazine comparisons was a clear sign that Apple’s bid to move the Mac upscale was succeeding.

Whatever one thought of Apple’s new, more buttoned-down image, there was no denying that the market welcomed the open Macintosh with a matching set of open arms. Byte went so far as to call the Mac II “the most important product that Apple has released since the original Apple II,” thus elevating it to a landmark status greater even than that of the first Mac model. While history hasn’t been overly kind to that judgment, the fact remains that third-party software and hardware developers, who had heretofore been stymied by the frustrating limitations of the closed Macintosh architecture, burst out now in myriad glorious ways. “We can’t think of everything,” said an ebullient Jean-Louis Gassée. “The charm of a flexible, open product is that people who know something you don’t know will take care of it. That’s what they’re doing in the marketplace.” The biannual Macworld shows gained a reputation as the most exciting events on the industry’s calendar, the beat to which every journalist lobbied to be assigned. The January 1988 show in San Francisco, the first to reflect the full impact of Apple’s philosophical about-face, had 20,000 attendees on its first day, and could have had a lot more than that had there been a way to pack them into the exhibit hall. Annual Macintosh sales more than tripled between 1986 and 1988, with cumulative sales hitting 2 million machines in the latter year. And already fully 200,000 of the Macs out there by that point were Mac IIs, an extraordinary number really given that machine’s high price. Granted, the Macintosh had hit the 2-million mark fully three years behind the pace Steve Jobs had foreseen shortly after the original machine’s introduction. But nevertheless, it did look like at least some of the more modest of his predictions were starting to come true at last.

An Apple Watch 27 years before its time? Just one example of the extraordinary innovation of the Macintosh market was the WristMac from Ex Machina, a “personal information manager” that could be synchronized with a Mac to take the place of your appointment calendar, to-do list, and Rolodex.

While the Macintosh was never going to seriously challenge the IBM standard on the desks of corporate America when it came to commonplace business tasks like word processing and accounting, it was becoming a fixture in design departments of many stripes, and the staple platform of entire niche industries — most notably, the publishing industry, thanks to the revolutionary combination of Aldus PageMaker (or one of the many other desktop-publishing packages that followed it) and an Apple LaserWriter printer (or one of the many other laser printers that followed it). By 1989, Apple could claim about 10 percent of the business-computing market, making them the third biggest player there after IBM and Compaq — and of course the only significant player there not running a Microsoft operating system. What with Apple’s premium prices and high profit margins, third place really wasn’t so bad, especially in comparison with the moribund state of the Macintosh of just a few years before.

Steve Jobs and John Sculley in happier times.

So, the Macintosh was flying pretty high as the curtain began to come down on the 1980s. It’s instructive and more than a little ironic to contrast the conventional wisdom that accompanied that success with the conventional wisdom of today. Despite the strong counterexample of Nintendo’s exploding walled garden over in the videogame-console space, the success the Macintosh had enjoyed since Apple’s decision to open up the platform was taken as incontrovertible proof that openness in terms of software and hardware alike was the only viable model for computing’s future. In today’s world of closed iOS and Android ecosystems and computing via disposable black boxes, such an assertion sounds highly naive.

But even more striking is the shift in the perception of Steve Jobs. In the late 1980s, he was loathed even by many strident Mac fans, whilst being regarded in the business and computer-industry press and, indeed, much of the popular press in general as a dilettante, a spoiled enfant terrible whose ill-informed meddling had very nearly sunk a billion-dollar corporation. John Sculley, by contrast, was lauded as exactly the responsible grown-up Apple had needed to scrub the company of Jobs’s starry-eyed hippie meanderings and lead them into their bright businesslike present. Today popular opinion on the two men has neatly reversed itself: Sculley is seen as the unimaginative corporate wonk who mismanaged Jobs’s brilliant vision, Jobs as the greatest — or at least the coolest — computing visionary of all time. In the end, of course, the truth must lie somewhere in the middle. Sculley’s strengths tended to be Jobs’s weaknesses, and vice versa. Apple would have been far better off had the two been able to find a way to continue to work together. But, in Jobs’s case especially, that would have required a fundamental shift in who these men were.

The loss among Apple’s management of that old Jobsian spirit of zealotry, overblown and impractical though it could sometimes be, was felt keenly by the Macintosh even during these years of considerable success. Only Jean-Louis Gassée was around to try to provide a splash of the old spirit of iconoclastic idealism, and everyone had to agree in the end that he made a rather second-rate Steve Jobs. When Sculley tried on the mantle of visionary — as when he named his fluffy corporate autobiography Odyssey and subtitled it “a journey of adventure, ideas, and the future” — it never quite seemed to fit him right. The diction was always off somehow, like he was playing a Silicon Valley version of Mad Libs. “This is an adventure of passion and romance, not just progress and profit,” he told the January 1988 Macworld attendees, apparently feeling able to wax a little more poetic than usual before this audience of true believers. “Together we set a course for the world which promises to elevate the self-esteem of the individual rather than a future of subservience to impersonal institutions.” (Apple detractors might note that elevating their notoriously smug users’ self-esteem did indeed sometimes seem to be what the company was best at.)

It was hard not to feel that the Mac had lost something. Jobs had lured Sculley from Pepsi because the latter was widely regarded as a genius of consumer marketing; the Pepsi Challenge, one of the most iconic campaigns in the long history of the cola wars, had been his brainchild. And yet, even before Jobs’s acrimonious departure, Sculley, bowing to pressure from Apple’s stockholders, had oriented the Macintosh almost entirely toward taking on the faceless legions of IBM and Compaq that dominated business computing. Consumer computing was largely left to take care of itself in the form of the 8-bit Apple II line, whose final model, the technically impressive but hugely overpriced IIGS, languished with virtually no promotion. Sculley, a little out of his depth in Silicon Valley, was just following the conventional wisdom that business computing was where the real money was. Businesspeople tended to be turned off by wild-eyed talk of changing the world; thus Apple’s new, more sober facade. And they were equally turned off by any whiff of fun or, God forbid, games; thus the old sense of whimsy that had been one of the original Mac’s most charming attributes seemed to leach away a little more with each successive model.

Those who pointed out that business computing had a net worth many times that of home computing weren’t wrong, but they were missing something important and at least in retrospect fairly obvious: namely, the fact that most of the companies who could make good use of computers had already bought them by now. The business-computing industry would doubtless continue to be profitable for many and even to grow steadily alongside the economy, but its days of untapped potential and explosive growth were behind it. Consumer computing, on the other hand, was still largely virgin territory. Millions of people were out there who had been frustrated by the limitations of the machines at the heart of the brief-lived first home-computer boom, but who were still willing to be intrigued by the next generation of computing technology, still willing to be sold on computers as an everyday lifestyle accessory. Give them a truly elegant, easy-to-use computer — like, say, the Macintosh — and who knew what might happen. This was the vision Jef Raskin had had in starting the ball rolling on the Mac back in 1979, the one that had still been present, if somewhat obscured even then by a high price, in the first released version of the machine with its “the computer for the rest of us” tagline. And this was the vision that Sculley betrayed after Jobs’s departure by keeping prices sky-high and ignoring the consumer market.

“We don’t want to castrate our computers to make them inexpensive,” said Jean-Louis Gassée. “We make Hondas, we don’t make Yugos.” Fair enough, but the Mac was priced closer to Mercedes than Honda territory. And it was common knowledge that Apple’s profit margins remained just about the fattest in the industry, thus raising the question of how much “castration” would really be necessary to make a more reasonably priced Mac. The situation reached almost surrealistic levels with the release of the Mac IIfx in March of 1990, an admittedly “wicked fast” addition to the product line but one that cost $9870 sans monitor or video card, thus replacing the metaphorical with the literal in Gassée’s favored comparison: a complete Mac IIfx system cost more than most actual brand-new Hondas. By now, the idea of the Mac as “the computer for the rest of us” seemed a bitter joke.

Apple was choosing to fight over scraps of the business market when an untapped land of milk and honey — the land of consumer computing — lay just over the horizon. Instead of the Macintosh, it was the IBM-compatible machines that lurched over in fits and starts to fill that space, adopting in the process most of the Mac’s best ideas, even if they seldom managed to implement those ideas quite as elegantly. By the time Apple woke up to what was happening in the 1990s and rushed to fill the gap with a welter of more reasonably priced consumer-grade Macs, it was too late. Computing as most Americans knew it was exclusively a Wintel world, with Macs dismissed as incompatible, artsy-fartsy oddballs. All but locked out of the fastest-growing sectors of personal computing, the very sectors the Macintosh had been so perfectly poised to absolutely own, Apple was destined to have a very difficult 1990s. So difficult, in fact, that they would survive the decade’s many lows only by the skin of their teeth.

This cartoon by Tom Meyer, published in the San Francisco Chronicle, shows the emerging new popular consensus about Apple by the early 1990s: increasingly overpriced, bloated designs and increasingly clueless management.

Now that the 68000 Wars have faded into history and passions have cooled, we can see that the Macintosh was in some ways almost as ill-served by its parent company as was the Commodore Amiga by its. Apple’s management in the post-Jobs era, like Commodore’s, seemed in some fundamental way not to get the very creation they’d unleashed on the world. And so, as with the Amiga, it was left to the users of the Macintosh to take up the slack, to keep the vision thing in the equation. Thankfully, they did a heck of a job with that. Something in the Mac’s DNA, something which Apple’s new sobriety could mask but never destroy, led it to remain a hotbed of inspiring innovations that had little to do with the nuts and bolts of running a day-to-day business. Sometimes seemingly in spite of Apple’s best efforts, the most committed Mac loyalists never forgot the Jobsian rhetoric that had greeted the platform’s introduction, continuing to see it as something far more compelling and beautiful than a tool for business. A 1988 survey by Macworld magazine revealed that 85 percent of their readers, the true Mac hardcore, kept their Macs at home, where they used them at least some of the time for pleasure rather than business.

So, the Mac world remained the first place to look if you wanted to see what the artists and the dreamers were getting up to with computers. We’ve already seen some examples of their work in earlier articles. In the course of the next few, we’ll see some more.

(Sources: Amazing Computing of February 1988, April 1988, May 1988, and August 1988; Info of July/August 1988; Byte of May 1986, June 1986, November 1986, April 1987, October 1987, and June 1990; InfoWorld of November 26 1984; Computer Chronicles television episodes entitled “The New Macs,” “Macintosh Business Software,” “Macworld Special 1988,” “Business Graphics Part 1,” “Macworld Boston 1988,” “Macworld San Francisco 1989,” and “Desktop Presentation Software Part 1”; the books West of Eden: The End of Innocence at Apple Computer by Frank Rose, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Computer Company by Owen W. Linzmayer, and Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything by Steven Levy; Andy Hertzfeld’s website Folklore.)

 

Posted by on September 16, 2016 in Digital Antiquaria, Interactive Fiction

 


So You Want to Be a Hero?

Lori Ann and Corey Cole

Rule #1 is “The Player Must have Fun.” It’s trivially easy for a game designer to “defeat” players. We have all the tools and all the power. The trick is to play on the same side as the players, to tell the story together, and to make them the stars.

That rule is probably the biggest differentiator that made our games special. We didn’t strive to make the toughest, hardest-to-solve puzzles. We focused on the characters, the stories, and making the player the star.

— Corey Cole

It feels thoroughly appropriate that Corey and Lori Ann Cole first met over a game of Dungeons & Dragons. The meeting in question took place at Westercon — the West Coast Science Fantasy Conference — in San Francisco in the summer of 1979. Corey was the Dungeon Master, leading a group of players through an original scenario of his own devising that he would later succeed in getting published as The Tower of Indomitable Circumstance. But on this occasion he found the pretty young woman who was sitting at his table even more interesting than Dungeons & Dragons. Undaunted by mere geography — Corey was programming computers for a living in Southern California while Lori taught school in Arizona — the two struck up a romantic relationship. Within a few years, they were married, settling eventually in San Jose.

They had much in common. As their mutual presence at a convention like Westercon will attest, both the current and the future Cole were lovers of science-fiction and fantasy literature and its frequent corollary, gaming, from well before their first meeting. Their earliest joint endeavor — besides, that is, the joint endeavor of romance — was The Spellbook, a newsletter for tabletop-RPG aficionados which they edited and self-published.

Corey also nurtured an abiding passion for computers that had long since turned into a career. After first learning to program in Fortran and COBOL while still in high school during the early 1970s, his subsequent experiences had constituted a veritable grand tour of some of the most significant developments of this formative decade of creative computing. He logged onto the ARPANET (predecessor of the modern Internet) from a terminal at his chosen alma mater, the University of California, Santa Barbara; played the original Adventure in the classic way, via a paper teletype machine; played games on the PLATO system, including the legendary proto-CRPGs Oubliette and DND that were hosted there. After graduating, he took a job with his father’s company, a manufacturer of computer terminals, traveling the country writing software for customers’ installations. By 1981, he had moved on to become a specialist in word-processing and typesetting software, all the while hacking code and playing games at home on his home-built CP/M computer and his Commodore PET.

When the Atari ST was introduced in 1985, offering an unprecedented amount of power for the price, Corey saw in it the potential to become the everyday computer of the future. He threw himself into this latest passion with abandon, becoming an active member of the influential Bay Area Atari Users Group, a contributor to the new ST magazine STart, and even the founder of a new company, Visionary Systems; the particular vision in question was that of translating his professional programming experience into desktop-publishing software for the ST.

Interestingly, Corey’s passion for computers and computer games was largely not shared by Lori. Like many dedicated players of tabletop RPGs, she always felt the computerized variety to be lacking in texture, story, and most of all freedom. She could enjoy games like Wizardry in bursts with friends, but ultimately found them far too constraining to sustain her interest. And she felt equally frustrated by the limitations of both the parser-driven text adventures of Infocom and the graphical adventures of Sierra. Her disinterest in the status quo of computer gaming would soon prove an ironic asset, prompting her to push her own and Corey’s games in a different and very worthwhile direction.

By early 1988, it was becoming clear that the Atari ST was doomed to remain a niche platform in North America, and thus that Corey’s plan to get rich selling desktop-publishing software for it wasn’t likely to pan out. Meanwhile his chronic asthma was making it increasingly difficult to live in the crowded urban environs of San Jose. The Coles were at one of life’s proverbial crossroads, unsure what to do next.

Then one day they got a call from Carolly Hauksdottir, an old friend from science-fiction fandom — she wrote and sang filk songs with them — who happened to be working now as an artist for Sierra On-Line down in the rural paradise of Oakhurst, California. It seemed she had just come out of a meeting with Sierra’s management in which Ken Williams had stated emphatically that they needed to get back into the CRPG market. Since their brief association with Richard Garriott, which had led to their releasing Ultima II and re-releasing Ultima I, Sierra’s presence in CRPGs had amounted to a single game called Wrath of Denethenor, a middling effort for the Apple II and Commodore 64 sold to them by an outside developer. As that meager record will attest, Ken had never heretofore made the genre much of a priority. But of late the market for CRPGs showed signs of exploding, as evidenced by the huge success of other publishers’ releases like The Bard’s Tale and Ultima IV. To get themselves a piece of that action, Ken stated in his typical grandiose style that Sierra would need to hire “a published, award-winning, tournament-level Dungeon Master” and set him loose with their latest technology. Corey and Lori quickly reasoned that The Tower of Indomitable Circumstance had been published by the small tabletop publisher Judges Guild, as had their newsletter by themselves; that Corey had once won a tournament at Gen Con as a player; and that together they had once created and run a tournament scenario for a Doctor Who convention. Between the two of them, then, they were indeed “published, award-winning, tournament-level Dungeon Masters.” Right?

Well, perhaps they weren’t quite what Ken had in mind after all. When Corey called to offer their services, at any rate, he sounded decidedly skeptical. He was much more intrigued by another skill Corey mentioned in passing: his talent for programming the Atari ST. Sierra had exactly one programmer who knew anything about the ST, yet the company was under the gun to get its new SCI engine ported over to that platform as quickly as possible. Ken wound up hiring Corey for this task, waving aside the initial reason for Corey’s call with the vague statement that “we’ll talk about game design later.”

What with Corey filling such an urgent need on another front, one can easily imagine Ken’s “later” never arriving at all. Corey, however, never stopped bringing up game design, and with it the talents of his wife that he thought would make her perfect for the role. While he thought that the SCI engine, despite its alleged universal applicability, could never be used to power a convincing hardcore CRPG of the Bard’s Tale/Ultima stripe, he did believe it could serve very well as the base for a hybrid game — part CRPG, part traditional Sierra adventure game. Such a hybrid would suit Lori just fine; her whole interest in the idea of designing computer games was “to bring storytelling and [the] interesting plot lines of books and tabletop role-playing into the hack-and-slash thrill of a computer game.” Given the technological constraints of the time, a hybrid actually seemed a far better vehicle for accomplishing that than a hardcore CRPG.

So, while Corey programmed in Sierra’s offices, Lori sat at home with their young son, sketching out a game. In fact, knowing that Sierra’s entire business model revolved around game series rather than one-offs, she sketched out a plan for four games, each taking place in a different milieu corresponding to one of the four points of the compass, one of the four seasons, and one of the four classical elements of Earth, Fire, Air, and Water. As was typical of CRPGs of this period, the player would be able to transfer the same character, evolving and growing in power all the while, into each successive game in the series.

With his established tabletop-RPG designer still not having turned up, Ken finally relented and brought Lori on to make her hybrid game. But the programmer with whom she was initially teamed was very religious, and refused to continue when he learned that the player would have the option of choosing a “thief” class. And so, after finishing up some of his porting projects, Corey joined her on what they were now calling Hero’s Quest I: So You Want to Be a Hero. Painted in the broadest strokes, he became what he describes as the “left brain” to Lori’s “right brain” on the project, focusing on the details of systems and rules while Lori handled the broader aspects of plot and setting. Still, these generalized roles were by no means absolute. It was Corey, for instance, an incorrigible punster — so don’t incorrige him! — who contributed most of the horrid puns that abound throughout the finished game.

Less than hardcore though they envisioned their hybrid to be, Lori and Corey nevertheless wanted to do far more than simply graft a few statistics and a combat engine onto a typical Sierra adventure game. They would offer their player the choice of three classes, each with its own approach to solving problems: through combat and brute force in the case of the fighter, through spells in the case of the magic user, through finesse and trickery in the case of the thief. This meant that the Coles would in effect have to design Hero’s Quest three times, twining together an intricate tapestry of differing solutions to its problems. Considering this reality, one inevitably thinks of what Ron Gilbert said immediately after finishing Maniac Mansion, a game in which the player could select her own team of protagonists but one notably free of the additional complications engendered by Hero’s Quest‘s emergent CRPG mechanics: “I’m never doing that again!” The Coles, however, would not only do it again — in fact, four times more — but they would consistently do it well, succeeding at the tricky task of genre blending where designers as talented as Brian Moriarty had stumbled.

Instead of thinking in terms of “puzzles,” the Coles preferred to think in terms of “problems.” In Hero’s Quest, many of these problems can be treated like a traditional adventure-game puzzle and overcome using your own logic. But it’s often possible to power through the same problem using your character’s skills and abilities. This quality makes it blessedly difficult to get yourself well-and-truly, permanently stuck. Let’s say you need to get a fish from a fisherman in order to get past the bear who’s blocking your passage across a river. You might, in traditional adventure-game style, use another item you found somewhere to repair his leaky boat, thus causing him to give you a fish as a small token of his appreciation. But you might also, if your character’s intelligence score is high enough, be able to convince him to give you a fish through logical persuasion alone. Or you might bypass the whole question of the fish entirely if your character is strong and skilled enough to defeat the bear in combat. Moriarty’s Beyond Zork tries to accomplish a superficially similar blending of the hard-coded adventure game and the emergent CRPG, but does so far less flexibly, dividing its problems rather arbitrarily into those soluble by adventure-game means and those soluble by CRPG means. The result for the player is often confusion, as things that ought to work fail to do so simply because a problem fell into the wrong category. Hero’s Quest was the first to get the blending right.
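The fisherman-and-bear example can be reduced to a toy dispatch over alternate solution routes. This is purely an illustrative sketch — the function, skill names, and thresholds below are invented, not anything from Hero’s Quest’s actual scripts:

```python
# Toy sketch of a multi-solution "problem" like the bear at the river.
# All names and numeric thresholds here are hypothetical illustrations.

def get_past_bear(skills, inventory):
    """Return the first viable route past the bear, or None if stuck."""
    if "boat patch" in inventory:
        # The classic adventure-game route: use an item found elsewhere.
        return "repair the boat; the grateful fisherman hands you a fish"
    if skills.get("intelligence", 0) >= 60:
        # The emergent-CRPG route: a stat check opens a dialogue solution.
        return "persuade the fisherman to part with a fish"
    if skills.get("weapons_use", 0) >= 70:
        # The brute-force route bypasses the fish question entirely.
        return "defeat the bear in combat"
    return None  # not stuck forever -- keep exploring or keep practicing

print(get_past_bear({"intelligence": 75}, []))
```

The point of the design, as the Coles describe it, is that every route leads somewhere: a player who lacks the item can fall back on stats, and vice versa, so no single failed check leaves the game unwinnable.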

Based on incremental skill and attribute improvements rather than employing the more monolithic level-based structure of Dungeons & Dragons, the core of the Hero’s Quest game system reached back to a system of tabletop rules the Coles had begun formulating years before setting to work on their first computer game. It has the advantage of offering nearly constant feedback, a nearly constant sense of progress even if you happen to be stuck on one of the more concrete problems in the game. Spend some time bashing monsters, and your character’s “weapons use” score along with his strength and agility will go up; practice throwing daggers on the game’s handy target range, and his “throwing” skill will increase a little with almost every toss. Although you choose a class for your character at the outset, there’s nothing preventing you from building up a magic user who’s also pretty handy with a sword, or a fighter who knows how to throw a spell or two. You’ll just have to sacrifice some points in the beginning to get a start in the non-core discipline, then keep on practicing.
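The skill-by-use loop described above might look something like this in miniature. The probabilities, cap, and skill names are invented for illustration; this is a toy model, not the Coles’ actual formula:

```python
import random

def practice(skills, skill, cap=100):
    """Each attempt has a chance to nudge the skill up by one point,
    with diminishing returns as the score approaches the cap.
    (A toy model of skill-by-use, not Hero's Quest's real rules.)"""
    score = skills[skill]
    if score < cap and random.random() < (cap - score) / cap:
        skills[skill] = score + 1

# A magic user starts strong in magic but can still train a non-core
# skill -- say, tossing daggers at the target range.
hero = {"magic": 25, "throwing": 0, "weapons_use": 5}
for _ in range(50):
    practice(hero, "throwing")
```

The diminishing-returns check is what produces the "nearly constant feedback" the paragraph describes: early practice pays off almost every attempt, while mastery takes ever longer to inch upward.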


If forced to choose one adjective to describe Hero’s Quest and the series it spawned as a whole, I would have to go with “generous” — not, as the regular readers among you can doubtless attest, an adjective I commonly apply to Sierra games in general. Hero’s Quest‘s generosity extends far beyond its lack of the sudden deaths, incomprehensible puzzles, hidden dead ends, and generalized player abuse that were so typical of Sierra designs. Departing from Sierra’s other series with their King Grahams, Rosellas, Roger Wilcos, and Larry Laffers, the Coles elected not to name or characterize their hero, preferring to let their players imagine and sculpt the character each of them wanted to play. Even within the framework of a given character class, alternate approaches and puzzle — er, problem — solutions abound, while the environment is fleshed-out to a degree unprecedented in a Sierra adventure game. Virtually every reasonable action, not to mention plenty of unreasonable ones, will give some sort of response, some acknowledgement of your cleverness, curiosity, or sense of humor. Almost any way you prefer to approach your role is viable. For instance, while it’s possible to leave behind a trail of monstrous carnage worthy of a Bard’s Tale game, it’s also possible to complete the entire game without taking a single life. The game is so responsive to your will that the few places where it does fall down a bit, such as in not allowing you to choose the sex of your character — resource constraints led the Coles to make the hero male-only — stand out more in contrast to how flexible the rest of this particular game is than they do in contrast to most other games of the period.

Hero’s Quest‘s message of positive empowerment feels particularly bracing in our current age of antiheroes and murky morality.

Indeed, Hero’s Quest is such a design outlier from the other Sierra games of its era that I contacted the Coles with the explicit goal of coming to understand just how it could have come out so differently. Corey took me back all the way to the mid-1970s, to one of his formative experiences as a computer programmer and game designer alike, when he wrote a simple player-versus-computer tic-tac-toe game for a time-shared system to which he had access. “Originally,” he says, “it played perfectly, always winning or drawing, and nobody played it for long. After I introduced random play errors by the computer, so that a lucky player could occasionally win, people got hooked on the game.” From this “ah-ha!” moment and a few others like it was born the Coles’ Rule #1 for game design: “The player must always have fun.” “We try to remember that rule,” says Corey, “every time we create a potentially frustrating puzzle.” The trick, as he describes it, is to make “the puzzles and challenges feel difficult, yet give the player a chance to succeed after reasonable effort.” Which leads us to Rule #2: “The player wants to win.” “We aren’t here to antagonize the players,” he says. “We work with them in a cooperative storytelling effort. If the player fails, everybody loses; we want to see everyone win.”

Although their professional credits in the realm of game design were all but nonexistent at the time they came to Sierra, the Coles were nevertheless used to thinking about games far more deeply than was the norm in Oakhurst. They were, for one thing, dedicated players of games, very much in tune with the experience of being a player, whether sitting behind a table or a computer. Ken Williams, by contrast, had no interest in tabletop games, and had sat down and played for himself exactly one computerized adventure game in his life (that game being, characteristically for Ken, the ribald original Softporn). While Roberta Williams had been inspired to create Mystery House by the original Adventure and some of the early Scott Adams games, her experience as a player never extended much beyond those primitive early text adventures; she was soon telling interviewers that she “didn’t have the time” to actually play games. Most of Sierra’s other design staff came to the role through the back door of being working artists, writers, or programmers, not through the obvious front door of loving games as a pursuit unto themselves. Corey states bluntly that “almost nobody there played [emphasis mine] games.” The isolation from the ordinary player’s experience that led to so many bad designs was actually encouraged by Ken Williams; he suggested that his staffers not look too much at what the competition was doing out of some misguided desire to preserve the “originality” of Sierra’s own games.

But the Coles were a little different than the majority of said staffers. Corey points out that they were both over thirty by the time they started at Sierra. They had, he notes, also “traveled a fair amount,” and “both the real-life experience and extensive tabletop-gaming experience gave [us] a more ‘mature’ attitude toward game development, especially that the designer is a partner to the player, not an antagonist to be overcome.” Given the wealth of experience with games and ideas about how games ought to be that they brought with them to Sierra, the Coles probably benefited as much from the laissez-faire approach to game-making engendered by Ken Williams as some of the other designers perhaps suffered from the same lack of direction. Certainly Ken’s personal guidance was only sporadic, and often inconsistent. Corey:

Once in a while, Ken Williams would wander through the development area — it might happen two or three times in a day, or more likely the visits might be three weeks apart. Everyone learned that it was essential to show Ken some really cool sequence or feature that he hadn’t seen before. You only showed him one such sequence because you needed to reserve two more in case he came back the same day.

Our first (and Sierra’s first) Producer, Guruka Singh Khalsa, taught us the “Ken Williams Rule” based on something Robert Heinlein wrote: “That which he tells you three times is true.” Ken constantly came up with half-baked ideas, some of them amazing, some terrible, and some impractical. If he said something once, you nodded in agreement. Twice, you sat up and listened. Anything he said three times was law and had to be done. While Ken mostly played a management role at Sierra, he also had some great creative ideas that really improved our games. Of course, it takes fifteen seconds to express an idea, and sometimes days or weeks to make it work. That’s why we ignored the half-baked, non-repeated suggestions.

The Coles had no affinity for any of Sierra’s extant games; they considered them “unfair and not much fun.” Yet the process of game development at Sierra was so unstructured that they had little sense of really reacting against them either.  As I mentioned earlier, Lori didn’t much care for any of the adventure games she had seen, from any company. She wouldn’t change her position until she played Lucasfilm Games’s The Secret of Monkey Island in 1990. After that experience, she became a great fan of the Lucasfilm adventures, enjoying them far more than the works of her fellow designers at Sierra. For now, though, rather than emulating existing computerized adventure or RPG games, the Coles strove to re-create the experience of sitting at a table with a great storytelling game master at its head.

Looking beyond issues of pure design, another sign of the Coles’ relatively broad experience in life and games can be seen in their choice of settings. Rather than settling for the generic “lazy Medieval” settings so typical of Dungeons & Dragons-derived computer games then and now, they planned their series as a hall of windows into some of the richest myths and legends of our own planet. The first game, which corresponded in Lori’s grand scheme for the series as a whole to the direction North, the season Spring, and the element Earth, is at first glance the most traditional of the series’s settings. This choice was very much a conscious one, planned to help the series attract an initial group of players and gain some commercial traction; bullish as he was on series in general, Ken Williams wasn’t particularly noted for his patience with new ones that didn’t start pulling their own weight within a game or two. Look a little closer, though, and even the first game’s lush fantasy landscape, full of creatures that seem to have been lifted straight out of a Dungeons & Dragons Monster Manual, stands out as distinctive. Inspired by an interest in German culture that had its roots in the year Corey had spent as a high-school exchange student in West Berlin back in 1971 and 1972, the Coles made their Medieval setting distinctly Germanic, as is highlighted by the name of the town around which the action centers: Spielburg. (Needless to say, the same name is also an example of the Coles’ love of puns and pop-culture in-joking.) Later games would roam still further afield from the lazy-Medieval norm. The second, for instance, moves into an Arabian Nights milieu, while still later ones explore the myths and legends of Africa, Eastern Europe, and Greece. The Coles’ determination to inject a little world music into the cultural soundtracks of their mostly young players stands out as downright noble. Their series doubtless prompted more than a few blinkered teenage boys to begin to realize what a big old interesting world there really is out there.


Of course, neither the first Hero’s Quest nor any of the later ones in the series would be entirely faultless. Sierra suffered from the persistent delusion that their SCI engine was a truly universal hammer suitable for every sort of nail, leading them to incorporate action sequences into almost every one of their adventure games. These are almost invariably excruciating, afflicted with slow response times and imprecise, clumsy controls. Hero’s Quest, alas, is no exception to this dubious norm. It has an action-oriented combat engine so unresponsive that no one I’ve ever talked to tries to do anything with it but just pound on the “attack” key until the monster is dead or it’s obviously time to run away. And then there are some move-exactly-right-or-you’re-dead sequences in the end game that are almost as frustrating as some of the ones found in Sierra’s other series. But still, far more important in the end are all the things Hero’s Quest does right, and more often than not in marked contrast to just about every other Sierra game of its era.

Hero’s Quest was slated into Sierra’s release schedule for Christmas 1989, part of a diverse lineup of holiday releases that also included the third Leisure Suit Larry game from the ever-prolific Al Lowe and something called The Colonel’s Bequest, a bit of a departure for Roberta Williams in the form of an Agatha Christie-style cozy murder mystery. With no new King’s Quest game on offer that year, Hero’s Quest, the only fantasy release among Sierra’s 1989 lineup, rather unexpectedly had to take up much of the slack. As pre-orders piled up to such an extent that Sierra projected needing to press 100,000 copies right off the bat just to meet the holiday demand, Corey struggled desperately with a sequence — the kobold cave, for those of you who have played the game already — that just wouldn’t come together. At last he brokered a deal to withhold only the disk that would contain that sequence from the duplicators. In a single feverish week, he rewrote it from scratch. The withheld disk was then duplicated in time to join the rest, and the game as a whole shipped on schedule, largely if not entirely bug-free. Even more impressively, it was, despite receiving absolutely no outside beta-testing — Sierra still had no way of seriously evaluating ordinary players’ reactions to a game before release — every bit as friendly, flexible, and soluble as the Coles had always envisioned it to be.


The game became the hit its pre-orders had indicated it would, its sales outpacing even the new Leisure Suit Larry and Roberta Williams’s new game by a comfortable margin that holiday season. The reviews were superlative; Questbusters‘s reviewer pronounced it “honestly the most fun I’ve had with any game in years,” and Computer Gaming World made it their “Adventure Game of the Year.” While it would be nice to attribute this success entirely to the public embracing its fine design sensibilities, which they had learned of via all the fine reviews, its sales numbers undoubtedly had much to do with its good fortune in being released during this year without a King’s Quest. Hero’s Quest was for many a harried parent and greedy child alike the closest analogue to Roberta Williams’s blockbuster series among the new releases on store shelves. The game sold over 130,000 copies in its first year on the market, about 200,000 copies in all in its original version, then another 100,000 copies when it was remade in 1992 using Sierra’s latest technology. Such numbers were, if not quite in the same tier as a King’s Quest, certainly nothing to sneeze at. In creative and commercial terms alike, the Coles’ series was off to a fine start.

At the instant of Hero’s Quest‘s release, Sierra was just embarking on a major transition in their approach to game-making. Ken Williams had decided it was time at last to make the huge leap from the EGA graphics standard, which could display just 16 colors onscreen from a palette of 64, to VGA, which could display 256 colors from a palette of 262,144. To help accomplish this transition, he had hired Bill Davis, a seasoned Hollywood animator, in the new role of Sierra’s “Creative Director” in July of 1989. Davis systematized Sierra’s heretofore laissez-faire approach to game development into a rigidly formulated Hollywood-style production pipeline. Under his scheme, the artists would now be isolated from the programmers and designers; inter-team communication would happen only through a bureaucratic paper trail.
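The EGA and VGA palette figures quoted above aren’t arbitrary; they fall straight out of the number of bits each standard allots to every color channel. A minimal sketch of the arithmetic (my own illustration, not anything from Sierra’s code):

```python
# Back-of-the-envelope check on the EGA and VGA palette sizes.

def palette_size(bits_per_channel: int) -> int:
    """Total distinct colors when red, green, and blue each get
    the given number of bits."""
    levels = 2 ** bits_per_channel  # intensity levels per channel
    return levels ** 3              # one level chosen per channel

# EGA drives each of R, G, B at 2 bits: 4 levels per channel.
print(palette_size(2))   # 64

# VGA's DAC accepts 6 bits per channel: 64 levels per channel.
print(palette_size(6))   # 262144
```

How many of those palette entries can appear onscreen at once is a separate limit, set by the framebuffer’s bits per pixel: 4 bits per pixel gives EGA its 16 simultaneous colors, 8 bits per pixel gives VGA’s mode 13h its 256.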

The changes inevitably disrupted Sierra’s game-making operation, which of late had been churning out new adventure games at a rate of about half a dozen per year. Many of the company’s resources for 1990 were being poured into King’s Quest V, which was intended, as had been the norm for that series since the beginning, to be the big showpiece game demonstrating all the company’s latest goodies, including not only 256-color VGA graphics but also a new Lucasfilm Games-style point-and-click interface in lieu of the old text parser. King’s Quest V would of course be Sierra’s big title for Christmas. They had only two other adventure games on their schedule for 1990, both begun using the older technology and development methodology well before the end of 1989 and both planned for release in the first half of the new year. One, an Arthurian game by an established writer for television named Christy Marx, was called Conquests of Camelot: The Search for the Grail (thus winning the prize of being the most strained application of Sierra’s cherished “Quest” moniker ever). The other, a foray into Tom Clancy-style techno-thriller territory by Police Quest designer Jim Walls, was called Code-Name: ICEMAN. Though they had every reason to believe that King’s Quest V would become another major hit, Sierra was decidedly uncertain about the prospects of these other two games. They felt they needed at least one more game in an established series if they hoped to maintain the commercial momentum they’d been building up in recent years. Yet it wasn’t clear where that game was to come from; one side effect of the transition to VGA graphics was that art took much longer to create, and games thus took longer to make. Lori and Corey were called into a meeting and given two options. 
One, which Lori at least remembers management being strongly in favor of, was to make the second game in their series using Sierra’s older EGA- and parser-driven technology, getting it out in time to become King’s Quest V‘s running mate for the Christmas of 1990. The other was to be moved in some capacity to the King’s Quest V project, with the opportunity to return to Hero’s Quest only at some uncertain future date. They chose — or were pushed into — the former.

Despite using the older technology, their second game was, at Davis’s insistence, created using the newer production methodology. This meant among other things that the artists, now isolated from the rest of the developers, had to create the background scenes on paper; their pictures were then scanned in for use in the game. I’d like to reserve the full details of Sierra’s dramatically changed production methodology for another article, where I can give them their proper due. Suffice for today to say that, while necessary in many respects for a VGA game, the new processes struck everyone as a strange way to create a game using the sharply limited EGA color palette. By far the most obvious difference they made was that everything seemed to take much longer. Lori Ann Cole:

We got the worst of both worlds. We got a new [development] system that had never been tried before, and all the bugs that went with that. And we were doing it under the old-school technology where the colors weren’t as good and all that. We were under a new administration with a different way of treating people. We got time clocks; we had to punch in a number to get into the office so that we would work the set number of hours. We had all of a sudden gone from this free-form company to an authoritarian one: “This is the hours you have to work. Programmers will work over here and artists will work over there, and only their bosses can talk to one another; you can’t talk to the artist that’s doing the art.”

Some of the Coles’ frustrations with the new regime came out in the game they were making. Have a close look at the name of Raseir, an oppressed city — sort of an Arabian Nights version of Nineteen Eighty-Four — where the climax of the game occurs.

Scheduled for a late September release, exactly one year after the release of the first Hero’s Quest, the second game shipped two months behind schedule, coming out far too close to Christmas to have a prayer of fully capitalizing on the holiday rush. And then when it did finally ship, it didn’t even ship as Hero’s Quest II.


In 1989, the same year that Sierra had released the first Hero’s Quest, the British division of the multi-national toys and games giant Milton Bradley had released HeroQuest, a sort of board-gameified version of Dungeons & Dragons. They managed to register their trademark on the name for Europe shortly before Sierra registered theirs for Europe and North America. After the board game turned into a big European success, Milton Bradley elected to bring it to North America the following year, whilst also entering into talks with some British developers about turning it into a computer game. Clearly something had to be done about the name conflict, and thanks to their having registered the trademark first, Milton Bradley believed they had the upper hand. When the bigger company’s lawyers came calling, Sierra, unwilling to get entangled in an expensive lawsuit they probably couldn’t win anyway, signed a settlement that not only demanded they change the name of their series but also stated that they couldn’t even continue using the old name long enough to properly advertise that “Hero’s Quest has a new name!” Thus when Hero’s Quest and its nearly finished sequel were hastily rechristened Quest for Glory, the event was announced only via a single four-sentence press release.

So, a veritable perfect storm of circumstances had conspired to undermine the commercial prospects of the newly rechristened Quest for Glory II: Trial by Fire. Sierra’s last parser-driven 16-color game, it was going head to head with the technological wonder that was King’s Quest V — another fantasy game to boot. Due to its late release, it lost the chance to pick up even the scraps that King’s Quest V might have left it. And finally, the name change meant that the very idea of a Quest for Glory II struck most Sierra fans as a complete non sequitur; they had no idea what game it was allegedly a sequel to. Under the circumstances, it’s remarkable that Quest for Glory II performed as well as it did. It sold an estimated 110,000 to 120,000 copies — not quite the hit its predecessor had been, but not quite the flop one could so easily imagine the newer game becoming under the circumstances either. Sales were still strong enough that this eminently worthy series was allowed to continue.


As a finished game, Quest for Glory II betrays relatively little sign of its difficult gestation, even if there are perhaps just a few more rough edges in comparison to its predecessor. The most common complaint is that the much more intricate and linear plot this time out can lead to a fair amount of time spent waiting for the next plot event to fire, with few concrete goals to achieve in the meantime. This syndrome can especially afflict those players who’ve elected to transfer in an established character from the first game, and thus have little need for the grinding with which newbies are likely to occupy themselves. At the same time, though, the new emphasis on plot isn’t entirely unwelcome in light of the almost complete lack of same in the first game, while the setting this time out of a desert land drawn from the Arabian Nights is even more interesting than was that of the previous game. The leisurely pace can make Quest for Glory II feel like a sort of vacation simulator, a relaxing computerized holiday spent chatting with the locals, sampling the cuisine, enjoying belly dances and poetry readings, and shopping in the bazaars. (Indeed, your first challenge in the game is one all too familiar to every tourist in a new land: converting the money you brought with you from Spielburg into the local currency.) I’ve actually heard Quest for Glory II described by a fair number of players as their favorite in the entire series. If push comes to shove, I’d probably have to say that I slightly prefer the first game, but I wouldn’t want to be without either of them. Certainly Quest for Glory II is about as fine a swan song for the era of parser-driven Sierra graphical adventures as one could possibly hope for.


The combat system used in the Quest for Glory games would change constantly from game to game. The one found in the second game is a little more responsive and playable than its predecessor.

Had more adventure-game designers at Sierra and elsewhere followed the Coles’ lead, the history of the genre might have been played out quite differently. As it is, though, we’ll have to be content with the games we do have. I’d hugely encourage any of you who haven’t played the Quest for Glory games to give them a shot — preferably in order, transferring the same character from game to game, just as the Coles ideally intended it. Thanks to our friends at GOG.com, they’re available for purchase today in a painless one-click install for modern systems. They remain as funny, likable, and, yes, generous as ever.

We’ll be returning to the Coles in due course to tell the rest of their series’s story. Next time, though, we’ll turn our attention to the Apple Macintosh, a platform we’ve been neglecting a bit of late, to see how it was faring as the 1990s were fast approaching.


(Sources: Questbusters of December 1989; Computer Gaming World of September 1990 and April 1991; Sierra’s newsletters dated Autumn 1989, Spring 1990, Summer 1991, Spring 1992, and Autumn 1992; Antic of August 1986; STart of Summer 1986; Dragon of October 1991; press releases and annual reports found in the Sierra archive at the Strong Museum of Play. Online sources include Matt Chats 173 and 174; Lori Ann Cole’s interview with Adventure Classic Gaming; Lori and Corey’s appearance on the Space Venture podcast; and various entries on the Coles’ own blog. But my biggest resource of all has been the Coles themselves, who took the time to patiently answer my many nit-picky questions at length. Thank you, Corey and Lori! And finally, courtesy of Corey and Lori, a little bonus for the truly dedicated among you who have read this far: some pages from an issue of their newsletter The Spell Book, including Corey’s take on “Fantasy Gaming Via Computer” circa summer 1982.)

 

Posted by on September 9, 2016 in Digital Antiquaria, Interactive Fiction

 


Sierra Gets Creative

Coming out of Sierra On-Line’s 1984 near-death experience, Ken Williams made a prognostication from which he would never waver: that the real future of home as well as business computing lay with the open, widely cloned hardware architecture of IBM’s computers, running Microsoft’s operating systems. He therefore established and nurtured a close relationship with Radio Shack, whose Tandy 1000 was by far the most consumer-friendly of the mid-1980s clones, and settled down to wait for the winds of the industry as a whole to start to blow his way. But that wait turned into a much longer one than he had ever anticipated. As each new Christmas approached, Ken predicted that this one must be the one where the winds would change, only to witness another holiday season dominated by the cheap and colorful Commodore 64, leaving the MS-DOS machines as relative afterthoughts.

MS-DOS was, mind you, a slowly growing afterthought, one on which Sierra was able to feed surprisingly well. Their unique relationship with Radio Shack in particular made them the envy of other publishers, allowing them as it did to sell their games almost without competition in thousands of stores nationwide. That strategic advantage among others helped Sierra to grow from $4.7 million in gross sales in the fiscal year ending on March 31, 1986, to almost $7 million the following fiscal year.


This sales history from a Sierra prospectus illustrates just how dramatically the company’s customer base changed when Ken Williams made the decision to abandon what he dismissed as the “toy computers” to concentrate on the Apple II and especially the MS-DOS markets.

Still, such incrementalism was hardly a natural fit for Ken Williams’s personality; he was always an entrepreneur after the big gains. It was excruciating waiting for the 8-bit generation of machines to just die already. When IBM debuted their PS/2 line in 1987, Ken, seeing the new machines’ lovely MCGA and VGA graphics and user-friendly mouse support, felt a bit like Noah must have when the first drops of rain finally began to fall. Yes, the machines were ridiculously expensive as propositions for the home, but prior experience said that, given time, their technology would trickle down into more affordable price brackets. If nothing else, the PS/2 line was at long last a start.

Indeed, Ken was so encouraged by the PS/2 line that he decided to pull the trigger on a fraught decision faced by every growing young company: that of whether and when to go public. He decided that October of 1987 would be the right moment, just as Sierra’s lineup of new software for Christmas began to hit the streets. After a frenzy of preparation, all was ready — but then the very week the IPO was to take place opened with Black Monday, the largest single-day stock-market collapse since the mother of all stock-market collapses back in 1929. Sierra quietly abandoned their plans, to little notice from prospective investors who suddenly had much bigger fish to fry.

Sierra had gotten very lucky, and in more ways than one. Had Black Monday been, say, a Black Friday instead, their newly issued shares must inevitably have gotten caught in its undertow, with potentially disastrous results. But even absent those concerns, going public in 1987 was probably jumping the gun just a little, banking on an MS-DOS market that wasn’t quite there yet. This reality was abundantly demonstrated by that Christmas of 1987, the latest to defy Ken’s predictions by voting for the Commodore 64 over MS-DOS — although by this time Commodore’s evergreen was in turn being overshadowed by a new quantity from Japan called the Nintendo Entertainment System.

In fact, the Christmas of 1987 would prove the last of the 64’s strong American holiday seasons. The stars were aligning to make 1988 through 1990 the breakthrough years for both Sierra and the MS-DOS platform to which Ken was so obstinately determined to keep hitching his wagon. In the meantime, the fiscal year ending on March 31, 1988 was nothing to sneeze at in its own right: thanks largely to the new hit Leisure Suit Larry in the Land of the Lounge Lizards and the perennially strong sales of all three extant King’s Quest games, gross sales topped $12 million, enough to satisfy even a greedy entrepreneur like Ken.

That year Sierra broke ground on a new office complex close to their old one in picturesque Oakhurst, California, “at the southern gate of Yosemite National Park,” as their press put it. The new building was made cheaply in comparison to the old one: 40,000 square feet of pre-fab metal that has been variously described as resembling either a warehouse or an aircraft hangar, both inside and out. It would prove a far less pleasant place to work than the lovely redwood building Sierra now abandoned, but that, it seemed, was the price of progress. (Ken claimed to have learned from a survey that his employees actually preferred a cheap building in the name of saving money in order to grow the company in more important ways, but there was considerable skepticism about the veracity of that claim among those selfsame employees.)

To accompany an IPO do-over they had tentatively planned for late in the year, Sierra would have some impressive new gaming technology as well as their impressive — or at least much bigger — new building to put on display. Back in 1986, Ken had made his first trip to Japan, where he’d been entranced by a domestic line of computers from NEC called the PC-9801 series. Although these machines were built around Intel processors and were capable of running MS-DOS, they weren’t hardware-compatible with the IBM standard, a situation that left them much more room for hardware innovation than that allowed to the American clonesters. In particular, the need to display the Japanese Kanji script had pushed their display technology far beyond that of their American counterparts. The top-of-the-line PC-9801VX, with 4096 colors, 1 MB of memory, and a 10 MHz 80286 processor, could rival the Commodore Amiga as a gaming computer. And, best of all, the Japanese accepted the NEC machines in this application; there was a thriving market in games for them. Ken saw in these Japanese machines a window on the future of the American MS-DOS machines, tangible proof of what he’d been saying already for so long about the potential of the IBM/Intel/Microsoft standard to become the dominant architecture in homes as well as businesses. Ken returned from Japan determined that Sierra must push their software forward to meet this coming hardware. Out of this epiphany was born the project to make the Sierra Creative Interpreter (SCI), the successor to the Adventure Game Interpreter (AGI) that had been used to build all of Sierra’s current lineup of adventure games.

On the surface, the specifications of the first version of SCI hardly overwhelm. The standard display resolution of the engine was doubled, from a rather horrid 160 X 200 to a more reasonable (for the era) 320 X 200, with better support being added for mice and more complex animation possibilities being baked in. Notably, the first version of SCI did not support the impressive but expensive new MCGA and VGA graphics standards; even the technically aggressive Ken Williams had to agree that it was just too soon to be worth the investment.

Under the hood, however, the changes were far more extensive than they might appear on the surface. Jeff Stephenson, Sierra’s longtime technology guru, had created AGI on IBM’s dime and IBM’s timetable, in order to implement the original King’s Quest on the ill-fated PCjr. It was a closed and thus a limited system, albeit one that had proved far more flexible and served Sierra far better and longer than anyone had anticipated at the time of its creation. Still, Stephenson envisioned SCI as something very different from its predecessor: a more open-ended, modular system that could grow alongside the hardware it targeted, supporting ever denser and more colorful displays, ever better sound, eventually entirely new technologies like CD-ROM. As indicated by its name, which dropped any specific mention of adventure games, SCI was intended to be a universal engine potentially applicable to many gaming genres. To facilitate such ambitions, Stephenson  completely rewrote the language used for programming the engine, going from a simplistically cryptic scripting language to a full-fledged modern programming language reminiscent of C++, incorporating all the latest thinking about object-oriented coding.

Forward-thinking though it was, SCI proved a hard sell to Sierra’s little cadre of game-makers, most of whom lacked the grounding in computer science enjoyed by Jeff Stephenson; they would have been perfectly happy to stick with their simple AGI scripts, thank you very much. But time would show Stephenson to have been correct in designing SCI for the future. The SCI engine, steadily evolving all the while, would last for the remainder of Sierra’s life as an independent company, the technological bedrock for dozens of games to come.

Sierra planned to release their first three SCI-based adventure games in time for Christmas 1988 and that planned-for second-chance IPO: King’s Quest IV, Leisure Suit Larry II, and Police Quest II, with Space Quest III to follow early in 1989. (This lineup says much about Ken Williams’s sequel-obsessed marketing strategy. As an annual report from the period puts it, “Sierra attempts to exploit and extend the effective market life of a successful product by creating sequels to that product and introducing them at planned intervals, thereby stimulating interest in both the sequels and the original product.”) Of this group, King’s Quest IV was always planned as the real showcase for Sierra’s evolving technology, the game for which they would really pull out all the stops — understandably so given that, despite some recent challenges from one (Leisure Suit) Larry Laffer, Roberta Williams’s series of family-friendly fairy-tale adventures remained the most popular games in the Sierra catalog. Indeed, King’s Quest IV marked the beginning of a new, more proactive stance on Sierra’s part when it came to turning the still largely bland beige world of the MS-DOS machines into the new standard for computer gaming. Simply put, with MS-DOS’s consumer uptake threatening to stall again in the wake of the high prices and poor reception of the PS/2 line, Sierra decided to get out and push.

King’s Quest IV‘s most notable shove to the industry’s backside began almost accidentally, with one of Ken’s crazy ideas. He’d decided he’d like to have a real, Hollywood-style soundtrack in this latest King’s Quest, something to emphasize Sierra’s increasingly cinematic approach to adventure gaming in general. Further, he’d love it if said soundtrack could be written by a real Hollywood composer. Never reluctant to liaison with Tinseltown — Sierra had eagerly jumped into relationships with the likes of Jim Henson and Disney during their first heyday years before — he pulled out his old Rolodex and started dialing agents. Most never bothered to return his calls, but at last one of them arranged a meeting with William Goldstein. A former Motown producer, a Grammy-nominated composer for a number of films, and an Emmy-nominated former musical director for the television series Fame, Goldstein also nurtured an interest in electronic music, having worked on several albums of same. He found the idea of writing music for a computer game immediately intriguing. He and Ken agreed that what they wanted for King’s Quest IV was not merely a few themes to loop in the background but a full-fledged musical score, arguably the first such ever to be written for a computer game. As Goldstein explained it to Ken, “the purpose of a score is to evoke emotion, not to be hummed. Sometimes the score consists only of some chord being held and slowly becoming louder in order to create a feeling of tenseness. In creating a score, the instrument(s) it is composed for can be as important as the score itself.”

And therein lay the rub. When Ken demonstrated for him the primitive bleeps and bloops an IBM clone’s speaker was capable of playing, Goldstein pronounced writing a score for that blunt instrument to be equivalent to trying to shoot flies with a shotgun. But then he had an idea. Thanks to his work in other forms of electronic music, Goldstein enjoyed a relationship with the Roland Corporation, a longstanding Japanese maker of synthesizers. Just recently, Roland had released a gadget called the MT-32, a nine-channel synthesizer that plugged into an ordinary IBM-compatible computer. Maybe, Goldstein mused, he could write his score for the MT-32.

At first blush, it seemed a very problematic proposal. The MT-32, which typically went for $550 or more, was hardly an everyday piece of kit; it was aimed at the professional or at least the very serious amateur musician, not at gamers. Yet Ken decided that, faced with a classic chicken-and-egg situation, he needed to do something to move the needle on the deplorable state of IBM-compatible sound hardware. A showpiece game, like King’s Quest IV might become, could show the market what it had been missing and generate demand that might lead to more affordable audio solutions. And so Ken set Goldstein to work on the MT-32.

At the Summer Consumer Electronics Show in June of 1988, Sierra gave a series of invitation-only audiences a sneak preview of King’s Quest IV in the form of a nearly ten-minute opening “movie” — people would soon be saying “cut scene” — enhanced by Goldstein’s score. Sierra legend has it that it moved at least one woman to tears. “I feel bad even saying it,” remarks Sierra’s marketing director (and Ken Williams’s little brother) John Williams, “but it was then that we knew we had a winner.”


Such an extreme reaction may be difficult to fathom today; even in King’s Quest IV‘s own time, it’s hard to imagine Amiga owners used to, say, Cinemaware games being quite so awed as this one lady apparently was. But nevertheless, King’s Quest IV and its first real soundtrack score stands as a landmark moment in the evolution of computer games. The game did indeed do much to break the chicken-and-egg conundrum afflicting MS-DOS audio. Only shortly after Roland had released the MT-32, a Canadian company called Ad Lib had released a “Personal Computer Music System” of their own at a price of just $245. It left much to be desired in comparison to the MT-32, but it was certainly worlds better than a simple beeper; Sierra duly added Ad Lib support to King’s Quest IV and all the other SCI games before they shipped. And for Space Quest III, they enlisted the services of another sort of star composer: Bob Siebenberg, drummer of the rock band Supertramp. Thanks in large degree to Sierra’s own determined intervention, in this area at least their chosen platform was becoming steadily more desirable as a game machine.

But King’s Quest IV also advanced the state of the art of adventure gaming in other, less tech-centric ways. As evidenced by its prominent subtitle The Perils of Rosella, its protagonist is female. Hard as it may seem to believe today, when more adventure games than not seem to star women, this fact made King’s Quest IV almost unique in its day; Infocom’s commercially unsuccessful but artistically brilliant interactive romance novel Plundered Hearts is just about the only point of comparison that leaps to mind. Roberta confessed to no small trepidation over the choice at the time of King’s Quest IV‘s release: “I know it will be just fine with the women and girls who play the game, but how it will go over with some of the men, I don’t know.” She also admitted to some ambivalence about her choice in purely practical terms, stemming from differing expectations that are embedded so deeply in our culture that they’re often hard to spot at all until we’re confronted with them.

I have a lot of deaths in my games. My characters always die from falling or being thrown into a cauldron or something. And I always like to have them die in a funny way. It didn’t seem right; I don’t know why. I guess it’s because she’s a girl, and you don’t think a girl should be treated that way. But I got used to that too, until there was one death I had to deal with last week that I was real uncomfortable with. Was it throwing her in the cauldron? I’m not sure, but it was some death that seemed particularly unfeminine, not right.

And girls die differently. I discovered lots of these things, like the way she falls, which has to be different from the way a guy falls. It’s been an experience. And I think that men will find it fun and different because it’s from a different point of view.

One could wish that Roberta’s ambivalence about killing her new female heroine at every possible juncture had led her to consider the wisdom of indulging in all that indiscriminate player-killing at all, but such was not to be. In the end, the most surprising thing about King’s Quest IV’s female protagonist would be how little remarked upon it was by players. Sounding almost disappointed, Roberta noted a few months after the game’s release that “I personally have not heard much about it.” “I thought it would get a lot of attention,” she went on. “It has gotten some, but nothing really dramatic”; “very few” of the letters she received about the game had anything at all to say about the female heroine.

But then, that non-reaction could of course be taken as a sign of progress in itself. One of the worthiest aspects of Sierra’s determination to turn computer gaming into a truly mainstream form of entertainment was their conviction that doing so must entail reaching far beyond the typical teenage-boy videogame demographic. Doubtless thanks to the relative paucity of hardcore action games and military simulations in their catalog as well as to their having a woman as their star designer, Sierra was always well ahead of most of the rest of their industry when it came to the diversity of their customer base. At a time when female players of other publishers’ games seldom got out of the single digits in percentage terms, Sierra could boast that fully one in four of their players was a woman or a girl; of other 1980s computer-game publishers, only Infocom could boast remotely comparable numbers. In the case of Roberta’s King’s Quest games, the number of female players rose as high as 40 percent, while women and girls wrote more than half of Roberta’s voluminous fan mail.

Sierra’s strides seem all the more remarkable in comparison to the benighted attitudes held by many other publishers. Mediagenic’s Bruce Davis, for instance, busy as usual formulating the modern caricature of the soulless videogame executive, declared vehemently that women and girls were “not a viable market” for games because of “profound” psychological differences that would always lead them to “shun” games. (One wonders what he makes of the modern gaming scene, vast swathes of which are positively dominated by female players.) The role model that Roberta Williams in particular became for many girls interested in games and/or computers should never be overlooked or minimized. Even as of this writing, eighteen years after Roberta published her last adventure game, John Williams tells me how people of a certain age “go crazy” upon learning he’s her brother-in-law, how he still gets at least two requests per week to put people in touch with her for an autograph, how there was an odd surge for a while there of newborn girls named Rosella and Roberta.

All of this only makes it tougher to reckon with the fact that Roberta’s actual games were so consistently poor in terms of fundamental design. King’s Quest IV is a particular lowlight in her checkered career, boasting some unfair howlers as bad as anything found in her legendarily insoluble Time Zone. At one point, you have to work your way through a horrendous sequence of random-seeming actions to wind up visiting an island, something you can do only one time. On this island is a certain magic bridle you’re going to need later in the game. But, incomprehensibly, the game not only never hints that the bridle may be present on the island, it literally refuses to show it to you even once you arrive there. The only way to find it is to walk around the island step by step, typing “look” again and again while facing in different directions, until you discover those pixels that should by all rights have depicted the bridle but for some reason don’t. Throw in climbing sequences that send you plummeting to your death if you move one pixel too far in the wrong direction, a brutal time limit, and plenty of other potential dead ends almost as heartless as the one just described, and King’s Quest IV becomes as unfair, unfun, frustrating, and downright torturous as any adventure game I’ve ever seen. It’s so bad that, rather than being dismissible as merely a disappointing game, it seems fundamentally broken, thereby raising a question of ethics. Did a player who had just paid $40 for the game not deserve a product that was in fact a soluble adventure game? Even the trade press of King’s Quest IV’s day, when not glorying over the higher-resolution graphics and especially that incredible soundtrack, had to acknowledge that the actual game underneath it all had some problems.
Scorpia, the respected voice of adventure gaming for Computer Gaming World, filled her article on the game with adjectives like “exasperating,” “irritating,” “tedious,” and “boring,” before concluding that “it’s a matter of personal taste” — about as close to an outright pan as most magazine reviewers dared get in those days.

Roberta Williams, an example of that rare species of adventure-game designers who don’t actually play adventure games, likely had little idea just how torturous an experience her games actually were. Taken as a whole, Roberta’s consistent failings as a designer seem to stem from that inability to place herself in her player’s shoes, and from her own evident lack of interest in improving upon her previous works in any terms but those of their surface bells and whistles. That said, however, King’s Quest IV’s unusually extreme failings, even by the standards of a Roberta Williams design, quite obviously stemmed from the frenzied circumstances of its creation as well.

I should note before detailing those circumstances that, by the time of King’s Quest IV, Sierra was finally beginning to change some of the processes that had spawned so many bad adventure games during the company’s earlier years. By 1988, they had the beginnings of a real quality-assurance process, dedicating three employees full-time to thrashing away at their games and other software. But, welcome as it was to see testing happening in any form, Sierra’s conception of same focused on the trees rather than the forest. The testers spent their time chasing outright bugs, glitches, and typos, but feedback on more holistic aspects of design wasn’t really part of their brief. In other words, they might spend a great deal of time ensuring that a given sudden death worked correctly without it ever even occurring to them to wonder whether that sudden death really needed to be there at all.

In the case of King’s Quest IV, however, even that circumscribed testing process broke down due to the pressure of external events. By the spring of 1988, Roberta had given her design for the game to the team of two artists and two programmers — all recent hires, more fruit of Sierra’s steady expansion — for implementation. Then, with IPO Attempt 2.0 now planned for October of that year and lots of other projects on the boil as well, nobody in management paid King’s Quest IV a whole lot more attention for quite some time, simply assuming that no news from its development team was good news and that it was coming along as expected. Al Lowe, who by the end of that summer had already finished designing and coding his Leisure Suit Larry sequel that was scheduled to ship shortly after King’s Quest IV, picks up the story from here:

King’s Quest IV was going to be the flagship product for the company when we went public. So, Ken and the money guys are busy going around the country, doing their dog-and-pony shows to Wall Street investors, saying, “This is a great company, you’re going to want to buy in, buy lots of stock. We’ve got this great product coming out that’s going to be the hit of the Christmas season.”

Finally, about the end of August, somebody said, “Has anybody looked at that game that’s supposed to be done in a month, that we’re supposed to ship in October? How’s it doing?” They went and looked at it, and the two programmers were lost. They had no clue. They had written a lot of code, but a lot of it was buggy, a lot of it didn’t take proper precautions. One of the big rules of programming is to never allow input at a time you don’t want it, but they had none of that. Everything was wide open. You could break it with a sneeze.

So, they called me and asked if I could come up that weekend — it was Labor Day weekend, Saturday — to look at the game. I did, and said, “Oh, my God, we’re in trouble.” I had a lot of stock options, and was hoping for a successful IPO myself. When I saw this, I said, “We’re in terrible shape. This isn’t going to make it.”

So, we devised a strategy over the weekend to bring every programmer in the company together on Labor Day Monday for a meeting. We said, “All hands are going to work on this title for the next month, and we’re going to finish this game in one month’s time because we’ve got to have it done by the end of September.” Do you remember the phrase from The Godfather, “We’ll go to the mattresses?” That’s what we did; we went to the mattresses. We all moved into the Sierra building. Everybody worked. They brought us food; they did our laundry; they got us hotel rooms. We basically just lived and ate and worked there, and when we needed to sleep we’d go to this hotel nearby. Then we’d get back up and do it again.

I took the lead on the project. I broke the game up into areas, and we assigned a programmer to each. As they finished their code, we had the whole company testing it. We’d distribute bug reports and talk about progress each morning. And by God, by the end of the month we had a game. It wasn’t perfect — it was a little buggy — but at least we had a game we could send out. And when we went public, it was a successful IPO.

Entertaining as this war story is, especially when told by a natural raconteur like Al Lowe, it could hardly result in anything but a bad adventure game. In a desperate flurry like this one, the first thing to fall by the wayside must be any real thoughtfulness about a game’s design or the player’s experience therein.

But despite its many design failings, King’s Quest IV did indeed deliver in spades as the discussion piece and IPO kick-starter it was intended to be. Sierra’s own promotional copy wasn’t shy about slathering on the purple prose in making the game’s case as a technical and aesthetic breakthrough. (In a first and only for Sierra, an AGI version of the game was also made for older systems, but it garnered little press interest and few sales in comparison to the “real” SCI version.)

King’s Quest IV sets a landmark in computer gaming with a new development system that transcends existing standards of computer graphics, sound, and animation. Powerfully dramatic, King’s Quest IV evokes emotion like no other computer game with unique combinations of lifelike animated personalities, beautiful landscapes, and soul-stirring music. Sierra has recreated the universe of King’s Quest to build a world that one moment will pull at your heartstrings, the next moment place terror in your heart.

Leveraging their best promotional asset, Sierra sent Roberta Williams, looking pretty, wholesome, and personable as ever, on a sort of “book tour” to software stores and media outlets across the country, signing autographs for long lines of fans everywhere she went. No one had attempted anything quite like this since the heyday of Trip Hawkins’s electronic artists/rock stars, and no one had ever done it so successfully. The proof was in the pudding: King’s Quest IV sold 100,000 copies in its first two weeks and received heaps of press coverage at a time when coverage of computer games in general was all but nonexistent in the Nintendo-obsessed mainstream media. Sales of the game may ultimately have reached as high as 500,000 copies. The IPO went off without a hitch this time on October 6, 1988: 1.4 million shares of common stock were issued at an opening price of $9 per share. Within a year, the stock would be flirting with a price of $20 per share.

In all their promotional efforts for King’s Quest IV and the rest of that first batch of SCI games, Sierra placed special emphasis on sound, the area where Ken Williams had chosen to try most aggressively to push the hardware forward. The relationship between Sierra and Roland grew so close that Thomas Beckmen, president of the latter company’s American division, joined Sierra’s board. But anyone from any of Roland’s rivals who feared that this relationship would lock them out needn’t have worried. Recognizing that even most purchasers of what they loved to describe as their “premium” products weren’t likely to splash out more than $500 on a high-end Roland synthesizer, Sierra pushed the cheaper Ad Lib alternative equally hard. In 1989, when the Singapore company Creative Technology entered the fray with a cheaper knock-off of the Ad Lib design which they called the Game Blaster, Sierra took it to their bosom as well. (In the end, it was Creative who would be the big winners in the sound-card wars. Their Sound Blaster line, the successor to the Game Blaster, would become the ubiquitous standard for PC gaming through much of the 1990s.) Ken Williams went so far as to compare the latest Sierra games with the first “talkies” to invade the world of silent cinema. Given the sound that users of computers like the Amiga had been enjoying well before Sierra jumped on the bandwagon, this was perhaps a stretch, but it certainly made for good copy.


As part of their aggressive push to get sound cards into the machines of their customers, Sierra started selling the products of all three rival makers directly through their own catalog.


Thanks to his own company’s efforts as much as those of anyone, Ken Williams was able to declare at the beginning of 1990 that during the previous year MS-DOS had become “the standard for entertainment software”; the cloudburst this latter-day Noah had been anticipating for so long had come at last. In a down year for the computer-game industry as a whole, which was suffering greatly under the Nintendo onslaught, MS-DOS and the Amiga had been the only platforms not to suffer a decline, with the former’s market share growing from 44 percent to 55 percent. Ken’s prediction that MS-DOS would go from being the majority platform to the absolutely dominant one as 1990 wore on would prove correct.

My guess is that as software publishers plan out their new year’s product schedules, versions of newer titles for machines which are in decline will either be shelved or delayed. Don’t be surprised if companies who traditionally have been strong Apple or Commodore publishers suddenly ship first on MS-DOS. Don’t be surprised if many new titles come out ONLY for MS-DOS next Christmas.

Ken’s emerging vision for Sierra saw his company as “part of the entertainment industry, not the computer industry.” An inevitable corollary to that vision, at least to Ken’s way of seeing things, was a focus on the “media” part of interactive media. In that spirit, he had hired in July of 1989 one Bill Davis, a director of more than 150 animated television commercials, for the newly created position of Sierra’s Creative Director. Davis introduced story-boarding and other new processes redolent of Hollywood, adding another largely welcome layer of systemization to Sierra’s traditionally laissez-faire approach to game development. But, tellingly, he had no experience working with games as games, and nothing much to say about the designs that lay underneath the surface of Sierra’s creations; these remained as hit-and-miss as ever.

The period between 1988 and 1998 or so — the heyday of MS-DOS gaming, before Windows 95/98 and its DirectX gaming layer changed the environment yet again — was one of enormous ferment in computer graphics and sound, when games could commercially thrive on surface sizzle alone. Ken Williams proved more adept at riding this wave than just about anyone else, hewing stolidly as ever to the ten-foot rule he’d formulated during his company’s earliest days: “If someone says WOW when they see the screen from ten feet away, you have them sold.” Sierra, like much of the rest of the industry, took all the wrong lessons from the many bad but pretty games that were so successful during this period, concluding that design could largely be left to take care of itself as long as a game looked exciting.

That Sierra games like King’s Quest IV did manage to be so successful despite their obvious underlying problems of design had much to do with the heady, unjaded times in which they were made — times in which a new piece of “bragware” for showing off one’s new hardware to best effect was worth a substantial price of admission quite apart from its value as a playable game. It also had something to do with Sierra’s masterful fan relations. The company projected an image as friendly and welcoming as their actual games were often unfriendly and obtuse. For instance, in another idea Ken nicked from Hollywood, by 1990 Sierra was offering free daily “studio tours” of their offices, complete with a slick pre-recorded “video welcome” from Roberta Williams herself, to any fan who happened to show up; for many a young fan, a visit to Sierra became the highlight of a family vacation to Yosemite. And of course the success of the King’s Quest games in particular had more than a little to do with the image of Roberta Williams, and the fact that the games were marketed almost as edutainment wares, drawing in a young, patient, and forgiving fan base who may not have fully comprehended that a King’s Quest was, at least theoretically, a game that could be won.

Still, these factors wouldn’t be enough to counterbalance fundamental issues of design forever. Well before the end of the 1990s, both Sierra and the adventure-gaming genre with which they would always be most identified would pay a steep price for too often making design an afterthought. Players, tired of being abused, bored with the lack of innovation in adventure-game design, and no longer quite so easy to wow with audiovisual flash alone, would begin to drift away; this trickle would become a flood which left the adventure genre commercially high and dry.

But all of that was still far in the future as of 1990. For now, Sierra was at the forefront of what they believed to be an emerging new form of mass entertainment, not quite a game, not quite a movie. Gross sales had risen to $21.1 million for the fiscal year ending March 31, 1989, then $29.1 million the following fiscal year. In 1990, they expanded their reach through the acquisition of Dynamix, a six-year-old Oregon-based development house with a rather odd mix of military simulations — after all, Sierra did want men as well as women to continue buying their products — and audio-visually rich if interactively problematic “interactive movies” in their portfolio. Sierra’s years in the MS-DOS wilderness were over; now that same MS-DOS represented the mainstream, soon virtually the only stream of American computer gaming. Some very, very good years lay ahead in commercial terms. And, it must be said, by no means would all of Sierra’s games be failures in terms of design; some talented and motivated designers would soon be using the company’s SCI technology to make interactive magic. So, having given poor King’s Quest IV such a hard time today, next time I’ll be kinder to a couple of other Sierra games that I really don’t like.

Nope… I love them.

(Sources: Computer Gaming World of December 1988; Byte of September 1987; Sierra’s newsletters dated Spring 1988, Winter 1988, Spring 1989, Autumn 1989, Spring 1990, Summer 1990; Sierra’s 10th Anniversary promotional brochure; press releases and annual reports found in the Sierra archive at the Strong Museum of Play. Much of this article is also drawn from personal email correspondence with John Williams and Corey Cole. And, last but far from least, Ken Gagne also shared with me the full audio of an interview he conducted with Al Lowe for Juiced.GS magazine. My huge thanks to John, Corey, and Ken!)
