
Turning on, Booting up, and Jacking into Neuromancer

When a novel becomes notably successful, Hollywood generally comes calling to secure the film rights. Many an author naïvely assumes that the acquisition of film rights means an actual film will get made, and in fairly short order at that. And thus is many an author sorely disappointed. Almost every popular novelist who’s been around for a while has stories to tell about Hollywood’s unique form of development purgatory. The sad fact is that the cost of acquiring the rights to even the biggest bestseller is a drop in the bucket in comparison to the cost of making a film out of it. Indeed, the cost is so trivial in terms of Hollywood budgets that many studios are willing to splash out for rights to books they never seriously envision doing anything productive with at all, simply to keep them out of the hands of rivals and protect their own properties in similar genres.

One could well imagine the much-discussed but never-made movie of William Gibson’s landmark cyberpunk novel Neuromancer falling into this standard pattern. Instead, though, its story is far, far more bizarre than the norm — and in its weird way far more entertaining.

Our story begins not with the power brokers of Hollywood, but rather with two young men at the very bottom of the Tinseltown social hierarchy. Ashley Tyler and Jeffrey Kinart were a pair of surfer dudes and cabana boys who worked the swimming pool of the exclusive Beverly Hills Hotel. Serving moguls and stars every day, they noticed that the things they observed their charges doing didn’t seem all that difficult. With a little luck and a little drive, even a couple of service workers like them could probably become players. Despite having no money, no education in filmmaking, and no real inroads with the people who tipped them to deliver poolside drinks, they hatched a plan in early 1985 to make a sequel to their favorite film of all time, the previous year’s strange postmodern action comedy The Adventures of Buckaroo Banzai Across the 8th Dimension.

The idea was highly problematic, not only for all of the reasons I’ve just listed but also because Buckaroo Banzai, while regarded as something of a cult classic today, had been a notorious flop in its own day, recouping barely a third of its production budget — hardly, in other words, likely sequel fodder. Nevertheless, Tyler and Kinart were able to recruit Earl Mac Rauch, the creator of the Buckaroo Banzai character and writer of the film’s screenplay, to join their little company-in-name-only, which they appropriately titled Cabana Boy Productions. As they made the rounds of the studios, the all-too-plainly clueless Tyler and Kinart didn’t manage to drum up much interest for their Buckaroo Banzai sequel, but the Hollywood establishment found their delusions of grandeur and surfer-boy personalities so intriguing that there was reportedly some talk of signing them to a deal — not to make a Buckaroo Banzai movie, but as the fodder for a television comedy, a sort of Beverly Hillbillies for the 1980s.

After some months, the cabana boys finally recognized that Buckaroo Banzai had little chance of getting resurrected, and moved on to wanting to make a movie out of the hottest novel in science fiction: William Gibson’s Neuromancer. Rauch’s own career wasn’t exactly going gangbusters; in addition to Buckaroo Banzai, he also had on his résumé New York, New York, mob-movie maestro Martin Scorsese’s misbegotten attempt to make a classic Hollywood musical. Thus he agreed to stick with the pair, promising to write the screenplay if they could secure the rights to Neuromancer. In the meantime, they continued to schmooze the guests at the Beverly Hills Hotel, making their revised pitch to any of them who would listen. Against the odds, they stumbled upon one guest who took them very seriously indeed.

As was all too easy to tell from her rictus smile, Deborah Rosenberg was the wife of a plastic surgeon. Her husband, Victor Rosenberg, had been in private practice in New York City since 1970, serving the rich, the famous, and the would-be rich and famous. He also enjoyed a profitable sideline as a writer and commentator on his field for the supermarket tabloids, the glossy beauty magazines, and the bored-housewife talk-show circuit, where he was a regular on programs like Live with Regis and Kathie Lee, The Oprah Winfrey Show, and Donahue. When business took him and his wife to Beverly Hills in late 1985, Deborah was left to loiter by the pool while her husband attended a medical convention. It was there that she made the acquaintance of Tyler and Kinart.

Smelling money, the cabana boys talked up their plans to her with their usual gusto despite her having nothing to do with the film industry. Unaccountably, Deborah Rosenberg thought the idea of making Neuromancer with them a smashing one, and convinced her husband to put up seed capital for the endeavor. Ashley Tyler actually followed the Rosenbergs back to New York and moved into their mansion as a permanent house guest while he and Deborah continued to work on their plans. There would be much speculation around both Hollywood and New York in the months to come about exactly what sort of relationship Deborah and Ashley had, and whether her husband a) was aware of Deborah’s possible extramarital shenanigans and b) cared if he was.

While the irony of Gibson’s book full of cosmetic surgeries and body modifications of all descriptions being adapted by a plastic surgeon would have been particularly rich, Victor took little active role in the project, seeming to regard it (and possibly Ashley?) primarily as a way to keep his high-maintenance wife occupied. He did, however, help her to incorporate Cabana Boy Productions properly in January of 1986, and a few weeks later, having confirmed that Neuromancer rather surprisingly remained un-optioned, offered William Gibson $100,000 for all non-print-media rights to the novel. Gibson was almost as naïve as Deborah and her cabana boys; he had never earned more than the most menial of wages before finishing the science-fiction novel of the decade eighteen months earlier. He jumped at the offer with no further negotiation whatsoever, mumbling something about using the unexpected windfall to remodel his kitchen. The film rights to the hottest science-fiction novel in recent memory were now in the hands of two California surfer dudes and a plastic surgeon’s trophy wife. And then, just to make the situation that much more surreal, Timothy Leary showed up.

I should briefly introduce Leary for those of you who may not be that familiar with the psychologist whom President Nixon once called “the most dangerous man in America.” At the age of 42 in 1963, the heretofore respectable Leary was fired from his professorship at Harvard, allegedly for skipping lectures but really for administering psychedelic drugs to students without proper authorization. Ousted by the establishment, he joined the nascent counterculture as an elder statesman and cool hippie uncle. Whilst battling unsuccessfully to keep LSD and similar drugs legal — by 1968, they would be outlawed nationwide despite his best efforts — Leary traveled the country delivering “lectures” that came complete with a live backing band, light shows, and more pseudo-mystical mumbo jumbo than could be found anywhere this side of a Scientology convention. In his encounters with the straight mainstream press, he strained to be as outrageous and confrontational as possible. His favorite saying became one of the most enduring of the entire Age of Aquarius: “Turn on, tune in, drop out.” Persecuted relentlessly by the establishment as the Judas who had betrayed their trust, Leary was repeatedly arrested for drug possession. This, of course, only endeared him that much more to the counterculture, who regarded each successive bust as another instance of his personal martyrdom for their cause. The Moody Blues wrote an oh-so-sixties anthem about him called “Legend of a Mind” and made it the centerpiece of their 1968 album In Search of the Lost Chord; the Beatles song “Come Together” was begun as a campaign anthem for Leary’s farcical candidacy for governor of California.

In January of 1970, Leary, the last person in the world on whom any judge was inclined to be lenient, was sentenced to ten years imprisonment by the state of California for the possession of two marijuana cigarettes. With the aid of the terrorist group the Weather Underground, he escaped from prison that September and fled overseas, first to Algeria, then to Switzerland, where, now totally out of his depth in the criminal underworld, he wound up being kept under house arrest as a sort of prize pet by a high-living international arms dealer. When he was recaptured by Swiss authorities and extradited back to the United States in 1972, it thus came as something of a relief for him. He continued to write books in prison, but otherwise kept a lower profile as the last embers of the counterculture burned themselves out. His sentence was commuted by California Governor Jerry Brown in 1976, and he was released.

Free at last, he was slightly at loose ends, being widely regarded as a creaky anachronism of a decade that already felt very long ago and far away; in the age of disco, cocaine was the wonderdrug rather than LSD. But in 1983, when he played Infocom’s Suspended, he discovered a new passion that would come to dominate the last thirteen years of his life. He wrote to Mike Berlyn, the author of the game, to tell him that Suspended had “changed his life,” that he had been “completely overwhelmed by the way the characters split reality into six pieces.” He had, he said, “not thought much of computers before then,” but Suspended “had made computers a reality” for him. Later that year, he visited Infocom with an idea for, as one employee of the company remembers it, “a personality that would sit on top of the operating system, observe what you did, and modify what the computer would do and how it would present information based on your personal history, what you’d done on the computer.” If such an idea seems insanely ambitious in the context of early 1980s technology, it perhaps points to some of the issues that would tend to keep Leary, who wasn’t a programmer and had no real technical understanding of how computers worked, at the margins of the industry. His flamboyance and tendency to talk in superlatives made him an uneasy fit with the more low-key personality of Infocom. Another employee remembers Leary as being “too self-centered to make a good partner. He wanted his name and his ideas on something, but he didn’t want us to tell him how to do it.”

Mind Mirror

His overtures to Infocom having come to naught, Leary moved on, but he didn’t forget about computers. Far from it. As the waves of hype about home computers rolled across the nation, Leary saw in them much the same revolutionary potential he had once seen in peace, love, and LSD — and he also saw in them, one suspects, a new vehicle to bring himself, an inveterate lover of the spotlight, back to a certain cultural relevance. Computers, he declared, were better than drugs: “the language of computers [gives] me the metaphor I was searching for twenty years ago.” He helpfully provided the media with a new go-to slogan to apply to his latest ideas, albeit one that would never quite catch on like the earlier one had: “Turn on, boot up, jack in.” “Who controls the pictures on the screen controls the future,” he said, “and computers let people control their own screen.”

In that spirit, he formed a small software developer of his own, which he dubbed Futique. Futique’s one tangible product was Mind Mirror, published by Electronic Arts in 1986. It stands to this day as the single strangest piece of software Electronic Arts has ever released. Billed as “part tool, part game, and part philosopher on a disk,” Mind Mirror was mostly incomprehensible — a vastly less intuitive Alter Ego with all the campy fun of that game’s terrible writing and dubious psychological insights leached out in favor of charts, graphs, and rambling manifestos. Electronic Arts found that Leary’s cultural cachet with the average computer user wasn’t as great as they might have hoped; despite their plastering his name and picture all over the box, Mind Mirror resoundingly flopped.

It was in the midst of all this activity that Leary encountered William Gibson’s novel Neuromancer. Perhaps unsurprisingly given the oft-cited link between Gibson’s vision of an ecstatic virtual reality called the Matrix and his earlier drug experiences, Leary became an instant cyberpunk convert, embracing the new sub-genre with all of his characteristic enthusiasm. Gibson, he said, had written “the New Testament of the 21st century.” Having evidently decided that the surest route to profundity lay in placing the prefix “cyber-” in front of every possible word, he went on to describe Neuromancer as “an encyclopedic epic for the cyber-screen culture of the immediate future, and an inspiring cyber-theology for the Information Age.” He reached out to the man he had anointed as the cyber-prophet behind this new cyber-theology, sparking up an acquaintance if never quite a real friendship. It was probably through Gibson — the chain of events isn’t entirely clear — that Leary became acquainted with the management of Cabana Boy Productions and their plans for a Neuromancer film. He promptly jumped in with them.

Through happenstance and sheer determination, the cabana boys now had a real corporation with at least a modicum of real funding, the rights to a real bestselling novel, and a real professional screenwriter — and the real Timothy Leary, for whatever that was worth. They were almost starting to look like a credible operation — until, that is, they started to talk.

Cabana Boy’s attempts to sell their proposed $20 million film to Hollywood were, according to one journalist, “a comedy of errors and naïveté — but what they lack in experience they are making up for in showmanship.” Although they were still not taken all that seriously by anyone, their back story and their personalities were enough to secure brief write-ups in People and Us, and David Letterman, always on the lookout for endearing eccentrics to interview and/or make fun of on his late-night talk show, seriously considered having them on. “My bet,” concluded the journalist, “is that they’ll make a movie about Cabana Boy before Neuromancer ever gets off the ground.”

Around the middle of 1986, Cabana Boy made a sizzle reel to shop around the Hollywood studios. William Gibson, his agent, and his publicist with Berkley Books were even convinced to show up and offer a few pleasantries. Almost everyone comes across as hopelessly vacuous in this, the only actual film footage Cabana Boy would ever manage to produce.


Shortly after the sizzle reel was made, Earl Mac Rauch split when he was offered the chance to work on a biopic about comedian John Belushi. No problem, said Deborah Rosenberg and Ashley Tyler, we’ll just write the Neuromancer script ourselves — this despite neither of them having ever written anything before, much less the screenplay to a proverbial “major motion picture.” At about the same time, Jeffrey Kinart had a falling-out with his old poolside partner — his absence from the promo video may be a sign of the troubles to come — and left as well. Tyler himself left at the end of 1987, marking the exit of the last actual cabana boy from Cabana Boy, even as Deborah Rosenberg remained no closer to signing the necessary contracts to make the film than she had been at the beginning of the endeavor. On the other hand, she had acquired two entertainment lawyers, a producer, a production designer, a bevy of “financial consultants,” offices in three cities for indeterminate purposes, and millions of dollars in debt. Still undaunted, on August 4, 1988, she registered her completed script, a document it would be fascinating but probably kind of horrifying to read, with the United States Copyright Office.

While all this was going on, Timothy Leary was obsessing over what may very well have been his real motivation for associating himself with Cabana Boy in the first place: turning Neuromancer into a computer game, or, as he preferred to call it, a “mind play” or “performance book.” Cabana Boy had, you’ll remember, picked up all electronic-media rights to the novel in addition to the film rights. Envisioning a Neuromancer game developed for the revolutionary new Commodore Amiga by his own company Futique, the fabulously well-connected Leary assembled a typically star-studded cast of characters to help him make it. It included David Byrne, lead singer of the rock band Talking Heads; Keith Haring, a trendy up-and-coming visual artist; Helmut Newton, a world-famous fashion photographer; Devo, the New Wave rock group; and none other than William Gibson’s personal literary hero William S. Burroughs to adapt the work to the computer.

This image created for Timothy Leary’s “mind play” of Neuromancer features the artist Keith Haring, who was to play the role of Case. Haring died of AIDS in 1990 at the age of just 31, but nevertheless left behind him a surprisingly rich legacy.

This image created for Timothy Leary’s “mind play” of Neuromancer features David Byrne of the band Talking Heads.

Leary sub-contracted the rights for a Neuromancer game from Cabana Boy, and was able to secure a tentative deal with Electronic Arts. But that fell through when Mind Mirror hit the market and bombed. Another tentative agreement, this time with Jim Levy’s artistically ambitious Activision, collapsed when the much more practical-minded Bruce Davis took over control of that publisher in January of 1987. Neuromancer was a property that should have had huge draw with the computer-game demographic, but everyone, it seemed, was more than a little leery of Leary and his avant-garde aspirations. For some time, the game project didn’t make much more headway than the movie.

Neuromancer the game was saved by a very unusual friendship. While Leary was still associated with Electronic Arts, an unnamed someone at the publisher had introduced him to the head of one of their best development studios, Brian Fargo of Interplay, saying that he thought the two of them “will get along well.” “Timothy and his wife Barbara came down to my office, and sure enough we all hit it off great,” remembers Fargo. “Tim was fascinated by technology; he thought about it and talked about it all the time. So I was his go-to guy for questions about it.”

Being friends with the erstwhile most dangerous man in America was quite an eye-opening experience for the clean-cut former track star. Leary relished his stardom, somewhat faded though its luster may have been by the 1980s, and gloried in the access it gave him to the trendy jet-setting elite. Fargo remembers that Leary “would take me to all the hottest clubs in L.A. I got to go to the Playboy Mansion when I was 24 years old; I met O.J. and Nicole Simpson at his house, and Devo, and David Byrne from Talking Heads. It was a good time.”

His deals with Electronic Arts and Activision having fallen through, it was only natural for Leary to turn at last to his friend Brian Fargo to get his Neuromancer game made. Accepting the project, hot property though Neuromancer was among science-fiction fans, wasn’t without risk for Fargo. Interplay was a commercially-focused developer whose reputation rested largely on their Bard’s Tale series of traditional dungeon-crawling CRPGs; “mind plays” hadn’t exactly been in their bailiwick. Nor did they have a great deal of financial breathing room for artistic experimentation. Interplay, despite the huge success of the first Bard’s Tale game in particular, remained a small, fragile company that could ill-afford an expensive flop. In fact, they were about to embark on a major transition that would only amplify these concerns. Fargo, convinced that the main reason his company wasn’t making more money from The Bard’s Tale and their other games was the lousy 15 percent royalty they were getting from Electronic Arts — a deal which the latter company flatly refused to renegotiate — was moving inexorably toward severing those ties and trying to go it alone as a publisher as well as a developer. Doing so would mean giving up the possibility of making more Bard’s Tale games; that trademark would remain with Electronic Arts. Without that crutch to lean on, an independent Interplay would need to make all-new hits right out of the gate. And, judging from the performance of Mind Mirror, a Timothy Leary mind play didn’t seem all that likely to become one.

Fargo must therefore have breathed a sigh of relief when Leary, perhaps growing tired of this project he’d been flogging for quite some time, perhaps made more willing to trust Fargo’s instincts by the fact that he considered him a friend, said he would be happy to step back into a mere “consulting” role. He did, however, arrange for William Gibson to join Fargo at his house one day to throw out ideas. Gibson was amiable enough, but ultimately just not all that interested, as he tacitly admitted: “I was offered a lot more opportunity for input than I felt capable of acting on. One thing that quickly became apparent to me was that I hadn’t the foggiest notion of the way an interactive computer game had to be constructed, the various levels of architecture involved. It was fascinating, but I felt I’d best keep my nose out of it and let talented professionals go about the actual business of making the game.” So, Fargo and his team, which would come to include programmer Troy A. Miles, artist Charles H.H. Weidman III, and writers and designers Bruce Balfour and Mike Stackpole, were left alone to make their game. While none of them was a William Gibson, much less a William S. Burroughs, they did have a much better idea of what made for a fun, commercially viable computer game than did anyone on the dream team Leary had assembled.

Three fifths of the team that wound up making Interplay’s Neuromancer: Troy Miles, Charles H.H. Weidman III, and Bruce Balfour.

One member of Leary’s old team did agree to stay with the project. Brian Fargo:

My phone rang one night at close to one o’clock in the morning. It was Timothy, and he was all excited that he had gotten Devo to do the soundtrack. I said, “That’s great.” But however I said it, he didn’t think I sounded enthused enough, so he started yelling at me that he had worked so hard on this, and he should get more excitement out of me. Of course, I literally had just woken up.

So, next time I saw him, I said, “Tim, you can’t do that. It’s not fair. You can’t wake me up out of a dead sleep and tell me I’m not excited enough.” He said, “Brian, this is why we’re friends. I really appreciate the fact that you can tell me that. And you’re right.”

But in the end, Devo didn’t provide a full soundtrack, only a chiptunes version of “Some Things Never Change,” a track taken from their latest album Total Devo, which plays over Neuromancer’s splash screen.

The opening of the game. Case, now recast as a hapless loser, not much better than a space janitor, wakes up face-down in a plate of “synth-spaghetti.”

As an adaptation of the novel, Neuromancer the game can only be considered a dismal failure. Like that of the book, the game’s story begins in a sprawling Japanese metropolis of the future called Chiba City, stars a down-on-his-luck console cowboy named Case, and comes to revolve around a rogue artificial intelligence named Neuromancer. Otherwise, though, the plot of the game has very little resemblance to that of the novel. Considered in any other light than the commercial, the license is completely pointless; this could easily have been a generic cyberpunk adventure.

The game’s tone departs, if anything, even further from its source material than does its plot. Out of a sense of obligation, it occasionally shoehorns in a few lines of Gibson’s prose, but, rather than even trying to capture the noirish moodiness of the novel, the game aims for considerably lower-hanging fruit. In what was becoming a sort of default setting for adventure-game protagonists by the late 1980s, Case is now a semi-incompetent loser whom the game can feel free to make fun of, inhabiting a science-fiction-comedy universe which has much more to do with Douglas Adams — or, to move the fruit just that much lower, Planetfall or Space Quest — than William Gibson. This approach tended to show up so much in adventure games for very practical reasons: it removed most of the burden from the designers of trying to craft really coherent, believable narratives out of the very limited suite of puzzle and gameplay mechanics at their disposal. Being able to play everything for laughs just made design so much easier. Cop-out though it kind of was, it must be admitted that some of the most beloved classics of the adventure-game genre use exactly this approach. Still, it does have the effect of making Neuromancer the game read almost like a satire of Neuromancer the novel, which can hardly be ideal, at least from the standpoint of the licenser.

And yet, when divorced from its source material and considered strictly as a computer game, Neuromancer succeeds rather brilliantly. It plays on three levels, only the first of which is open to you in the beginning. Those earliest stages confine you to “meat space,” where you walk around, talk with other characters, and solve simple puzzles. Once you find a way to get your console back from the man to whom you pawned it, you’ll be able to enter the second level. Essentially a simulation of the online bulletin-board scene of the game’s own time, it has you logging onto various “databases,” where you can download new programs to run on your console, piece together clues and passwords, read forums and email, and hack banks and other entities. Only around the midway point of the game will you reach the Matrix proper, a true virtual-reality environment. Here you’ll have to engage in graphical combat with ever more potent forms of ICE (“Intrusion Countermeasures Electronics”) to penetrate ever more important databases.

Particularly at this stage, the game has a strong CRPG component; not only do you need to earn money to buy ever better consoles, software, and “skill chips” that conveniently slot right into Case’s brain, but as Case fights ICE on the Matrix his core skills improve with experience. It’s a heady brew, wonderfully varied and entertaining. Despite the limitations of the Commodore 64, the platform on which it made its debut, Neuromancer is one of the most content-rich games of its era, with none of the endless random combats and assorted busywork that stretches the contemporaneous CRPGs of Interplay and others to such interminable lengths. Neuromancer ends just about when you feel it ought to end, having provided the addictive rush of building up a character from a weakling to a powerhouse without ever having bored you in the process.

Reading messages from the Scene… err, from Neuromancer’s version of the hacker underground.

One of the more eyebrow-raising aspects of Neuromancer is the obvious influence that the real underground world of the Scene had on it. The lingo, the attitudes… all of it is drawn from pirate BBS culture, circa 1988. Ironically, the game evokes the spirit of the Scene far better than it does anything from Gibson’s novel, serving in this respect as a time capsule par excellence. At least some people at Interplay, it seems, were far more familiar with that illegal world than any upstanding citizen ought to have been. Neuromancer is merely one more chapter in the long shared history of legitimate software developers and pirates, who were always more interconnected and even mutually dependent than the strident rhetoric of the Software Publishers Association might lead one to suspect. Richard Garriott’s Akalabeth was first discovered by his eventual publisher California Pacific via a pirated version someone brought into the office; Sid Meier ran one of the most prolific piracy rings in Baltimore before he became one of the most famous game designers in history… the anecdotes are endless. Just to blur the lines that much more, soon after Neuromancer some cracking groups would begin to go legitimate, becoming game makers in their own right.

Like other Interplay games from this period, Neuromancer is also notable for how far it’s willing to push the barriers of acceptability in what was still the games industry’s equivalent of pre-Hays Code Hollywood. There’s an online sex board you can visit, a happy-ending massage parlor, a whore wandering the streets. Still, and for all that it’s not exactly a comedic revelation, I find the writing in Neuromancer makes it a more likable game than, say, Wasteland, with its somewhat juvenile transgression for transgression’s sake. Neuromancer walks right up to that line on one or two occasions, but never quite crosses it in this critic’s opinion.

Of course, it’s not without some niggles. The interface, especially in the meat-space portions, is a little clunky; it looks like a typical point-and-click adventure game, but its control scheme is less intuitive than it appears, which can lead to some cognitive dissonance when you first start to play. But that sorts itself out once you get into the swing of things. Neuromancer is by far my favorite Interplay game of the 1980s, boldly original but also thoroughly playable — and, it should be noted, rigorously fair. Take careful notes and do your due diligence, and you can feel confident of being able to solve this one.

About to do battle with an artificial intelligence, the most fearsome of the foes you’ll encounter in the Matrix.

Neuromancer was released on the Commodore 64 and the Apple II in late 1988 as one of Interplay’s first two self-published games. The other, fortunately for Interplay but perhaps unfortunately for Neuromancer’s commercial prospects, was an Amiga game called Battle Chess. Far less conceptually ambitious than Neuromancer, Battle Chess was an everyday chess engine, no better or worse than dozens of other ones that could be found in the public domain, onto which Interplay had grafted “4 MB of animation” and “400 K of digitized sound” (yes, those figures were considered very impressive at the time). When you moved a piece on the board, you got to watch it walk over to its new position, possibly killing other pieces in the process. And that was it, the entire gimmick. But, in those days when games were so frequently purchased as showpieces for one’s graphics and sound hardware, it was more than enough. Battle Chess became just the major hit Interplay needed to establish themselves as a publisher, but in the process it sucked all of Neuromancer’s oxygen right out of the room. Despite the strength of the license, the latter game went comparatively neglected by Interplay, still a very small company with very limited resources, in the rush to capitalize on the Battle Chess sensation. Neuromancer was ported to MS-DOS and the Apple IIGS in 1989 and to the Amiga in 1990 — in my opinion this last is the definitive version — but was never a big promotional priority and never sold in more than middling numbers. Early talk of a sequel, to have been based on William Gibson’s second novel Count Zero, remained only that. Neuromancer is all but forgotten today, one of the lost gems of its era.

I always make it a special point to highlight games I consider to be genuine classics, the ones that still hold up very well today, and that goes double if they aren’t generally well-remembered. Neuromancer fits into both categories. So, please, feel free to download the Amiga version from right here, pick up an Amiga emulator if you don’t have one already, and have at it. This one really is worth it, folks.

I’ll of course have much more to say about the newly self-sufficient Interplay in future articles. But as for the other players in today’s little drama:

Timothy Leary remained committed to using computers to “express the panoramas of your own brain” right up until he died in 1996, although without ever managing to bring any of his various projects, which increasingly hewed to Matrix-like three-dimensional virtual realities drawn from William Gibson, into anything more than the most experimental of forms.

William Gibson himself… well, I covered him in my last article, didn’t I?

Deborah Rosenberg soldiered on for quite some time alone with the cabana-boy-less Cabana Boy; per contractual stipulation, the Neuromancer game box said that it was “soon to be a major motion picture from Cabana Boy Productions.” And, indeed, she at last managed to sign an actual contract with Tri-Star Pictures on June 2, 1989, to further develop her screenplay, at which point Tri-Star would, “at its discretion,” “produce the movie.” But apparently Tri-Star took discretion to be the better part of valor in the end; nothing else was ever heard of the deal. Cabana Boy was officially dissolved on March 24, 1993. There followed years of litigation between the Rosenbergs and the Internal Revenue Service; it seems the former had illegally deducted all of the money they’d poured into the venture from their tax returns. (It’s largely thanks to the paper trail left behind by the tax-court case, which wasn’t finally settled until 2000, that we know as much about the details of Cabana Boy as we do.) Deborah Rosenberg has presumably gone back to being simply the wife of a plastic surgeon to the stars, whatever that entails, her producing and screenwriting aspirations nipped in the bud and tucked back away wherever it was they came from.

Earl Mac Rauch wrote the screenplay for Wired, the biopic about John Belushi, only to see it greeted with jeers and walk-outs at the 1989 Cannes Film Festival. It went on to become a critical and financial disaster. Having collected three strikes in the form of New York, New York, Buckaroo Banzai, and now Wired, Rauch was out. He vanished into obscurity, although I understand he has resurfaced in recent years to write some Buckaroo Banzai graphic novels.

And as for our two cabana boys, Ashley Tyler and Jeffrey Kinart… who knows? Perhaps they’re patrolling some pool somewhere to this day, regaling the guests with glories that were or glories that may, with the right financial contribution, yet be.

(Sources: Computer Gaming World of September 1988; The Games Machine of October 1988; Aboriginal Science Fiction of October 1986; AmigaWorld of May 1988; Compute! of October 1991; The One of February 1989; Starlog of July 1984; Spin of April 1987. Online sources include the sordid details of the Cabana Boy tax case from the United States Tax Court archive, and a blog post by Alison Rhonemus on some of the contents of Timothy Leary’s papers, which are now held at the New York Public Library. I also made use of the Get Lamp interview archives which Jason Scott so kindly shared with me. Finally, my huge thanks to Brian Fargo for taking time from his busy schedule to discuss his memories of Interplay’s early days with me.)

 

The Prophet of Cyberspace

William Gibson

William Gibson was born on March 17, 1948, on the coast of South Carolina. An only child, he was just six years old when his father, a middle manager for a construction company, choked on his food and died while away on one of his many business trips. Mother and son moved back to the former’s childhood home, a small town in Virginia.

Life there was trying for the young boy. His mother, whom he describes today as “chronically anxious and depressive,” never quite seemed to get over the death of her husband, and never quite knew how to relate to her son. Gibson grew up “introverted” and “hyper-bookish,” “the original can’t-hit-the-baseball kid,” feeling perpetually isolated from the world around him. He found refuge, like so many similar personalities, in the shinier, simpler worlds of science fiction. He dreamed of growing up to inhabit those worlds full-time by becoming a science-fiction writer in his own right.

At age 15, desperate for a new start, Gibson convinced his mother to ship him off to a private school for boys in Arizona. It was by his account as bizarre a place as any of the environments that would later show up in his fiction.

It was like a dumping ground for chronically damaged adolescent boys. There were just some weird stories there, from all over the country. They ranged from a 17-year-old, I think from Louisiana, who was like a total alcoholic, man, a terminal, end-of-the-stage guy who weighed about 300 pounds and could drink two quarts of vodka straight up and pretend he hadn’t drunk any to this incredibly great-looking, I mean, beautiful kid from San Francisco, who was crazy because from age 10 his parents had sent him to plastic surgeons because they didn’t like the way he looked.

Still, the clean desert air and the forced socialization of life at the school seemed to do him good. He began to come out of his shell. Meanwhile the 1960s were starting to roll, and young William, again like so many of his peers, replaced science fiction with Beatles, Beats, and, most of all, William S. Burroughs, the writer who remains his personal literary hero to this day.

William Gibson on the road, 1967

As his senior year at the boys’ school was just beginning, Gibson’s mother died as abruptly as had his father. Left all alone in the world, he went a little crazy. He was implicated in a drug ring at his school — he still insists today that he was innocent — and kicked out just weeks away from graduation. With no one left to go home to, he hit the road like Burroughs and his other Beat heroes, hoping to discover enlightenment through hedonism; when required like all 18-year-olds to register for the draft, he listed as his primary ambition in life the sampling of every drug ever invented. He apparently made a pretty good stab at realizing that ambition, whilst tramping around North America and, a little later, Europe for years on end, working odd jobs in communes and head shops and taking each day as it came. By necessity, he learned the unwritten rules and hierarchies of power that govern life on the street, a hard-won wisdom that would later set him apart as a writer.

In 1972, he wound up married to a girl he’d met on his travels and living in Vancouver, British Columbia, where he still makes his home to this day. As determined as ever to avoid a conventional workaday life, he realized that, thanks to Canada’s generous student-aid program, he could actually earn more money by attending university than he could working some menial job. He therefore enrolled at the University of British Columbia as an English major. Much to his own surprise, the classes he took there and the people he met in them reawakened his childhood love of science fiction and the written word in general, and with them his desire to write. Gibson’s first short story was published in 1977 in a short-lived, obscure little journal occupying some uncertain ground between fanzine and professional magazine; he earned all of $27 from the venture. Juvenilia though it may be, “Fragments of a Hologram Rose,” a moody, plot-less bit of atmospherics about a jilted lover of the near future who relies on virtual-reality “ASP cassettes” to sleep, already bears his unique stylistic stamp. But after writing it he published nothing else for a long while, occupying himself instead with raising his first child and living the life of a househusband while his wife, now a teacher with a Master’s Degree in linguistics, supported the family. It seemed a writer needed to know so much, and he hardly knew where to start learning it all.

It was punk rock and its child post-punk that finally got him going in earnest. Bands like Wire and Joy Division, who proved you didn’t need to know how to play like Emerson, Lake, and Palmer to make daring, inspiring music, convinced him to apply the same lesson to his writing — to just get on with it. When he did, things happened with stunning quickness. His second story, a delightful romp called “The Gernsback Continuum,” was purchased by Terry Carr, a legendary science-fiction editor and taste-maker, for the 1981 edition of his long-running Universe series of paperback short-story anthologies. With that feather in his cap, Gibson began regularly selling stories to Omni, one of the most respected of the contemporary science-fiction magazines. The first story of his that Omni published, “Johnny Mnemonic,” became the manifesto of a whole new science-fiction sub-genre that had Gibson as its leading light. The small network of writers, critics, and fellow travelers sometimes called themselves “The Movement,” sometimes “The Mirrorshades Group.” But in the end, the world would come to know them as the cyberpunks.

If forced to name one thing that made cyberpunk different from what had come before, I wouldn’t point to any of the exotic computer technology or the murky noirish aesthetics. I’d rather point to eight words found in Gibson’s 1982 story “Burning Chrome”: “the street finds its own uses for things.” Those words signaled a shift away from past science fiction’s antiseptic idealized futures toward more organic futures extrapolated from the dirty chaos of the contemporary street. William Gibson, a man who out of necessity had learned to read the street, was the ideal writer to become the movement’s standard bearer. While traditional science-fiction writers were interested in technology for its own sake, Gibson was interested in the effect of technology on people and societies.

Cyberpunk, this first science fiction of the street, was responding to a fundamental shift in the focus of technological development in the real world. The cutting-edge technology of previous decades had been deployed as large-scale, outwardly focused projects, often funded with public money: projects like the Hoover Dam, the Manhattan Project, and that ultimate expression of macro-technology, the Apollo moon landing. Even our computers were things filling entire floors, to be programmed and maintained by a small army of lab-coated drones. Golden-age science fiction was right on-board with this emphasis on ever greater scope and scale, extrapolating grand voyages to the stars alongside huge infrastructure projects back home.

Not long after macro-technology enjoyed its greatest hurrah in the communal adventure that was Apollo, however, technology began to get personal. In the mid-1970s, the first personal computers began to appear. In 1979, in an event of almost equal significance, Sony introduced the Walkman, a cassette player the size of your hand, the first piece of lifestyle technology that you could carry around with you. The PC and the Walkman begat our iPhones and Fitbits of today. And if we believe what Gibson and the other cyberpunks were already saying in the early 1980s, those gadgets will in turn beget chip implants, nerve splices, body modifications, and artificial organs. The public has become personal; the outward-facing has become inward-facing; the macro spaces have become micro spaces. We now focus on making ever smaller gadgets, even as we’ve turned our attention away from the outer space beyond our planet in favor of drilling down ever further into the infinitesimal inner spaces of genes and cells, into the tiniest particles that form our universe. All of these trends first showed up in science fiction in the form of cyberpunk.

In marked contrast to the boldness of his stories’ content, Gibson was peculiarly cautious, even hesitant, when it came to the process of writing and of making a proper career out of the act. The fact that Neuromancer, Gibson’s seminal first novel, came into being when it did was entirely down to the intervention of Terry Carr, the same man who had kick-started Gibson’s career as a writer of short stories by publishing “The Gernsback Continuum.” When in 1983 he was put in charge of a new “Ace Specials” line of science-fiction paperbacks reserved exclusively for the first novels of up-and-coming writers, Carr immediately thought again of William Gibson. A great believer in Gibson’s talent and potential importance, he cajoled him into taking an advance and agreeing to write a novel; Gibson had considered himself still “four or five years away” from being ready to tackle such a daunting task. “It wasn’t that vast forces were silently urging me to write,” he says. “It’s just that Terry Carr had given me this money and I had to make up some kind of story. I didn’t have a clue, so I said, ‘Well, I’ll plagiarize myself and see what comes of it.'” And indeed, there isn’t that much in 1984’s Neuromancer that would have felt really new to anyone who had read all of the stories Gibson had written in the few years before it. As a distillation of all the ideas with which he’d been experimenting in one 271-page novel, however, it was hard to beat.

 

Neuromancer

The plot is never the most important aspect of a William Gibson novel, and this first one is no exception to that rule. Still, for the record…

Neuromancer takes place at some indeterminate time in the future, in a gritty society where the planet is polluted and capitalism has run amok, but the designer drugs and technological toys are great if you can pay for them. Our hero is Case, a former “console cowboy” who used to make his living inside the virtual reality, or “Matrix,” of a worldwide computer network, battling “ICE” (“Intrusion Countermeasures Electronics”) and pulling off heists for fun and profit. Unfortunately for him, an ex-employer with a grudge has recently fried those pieces of Case’s brain that interface with his console and let him inject himself into “cyberspace.” Left stuck permanently in “meat” space, as the novel opens he’s a borderline suicidal, down-and-out junkie. But soon he’s offered the chance to get his nervous system repaired and get back into the game by a mysterious fellow named Armitage, mastermind of a ragtag gang of outlaws who are investigating mysterious happenings on the Matrix. Eventually they’ll discover a rogue artificial intelligence behind it all — the titular Neuromancer.

Given that plot summary, we can no longer avoid addressing the thing for which William Gibson will always first and foremost be known, whatever his own wishes on the matter: he’s the man who invented the term “cyberspace,” as well as the verb “to surf” it and with them much of the attitudinal vector that accompanied the rise of the World Wide Web in the 1990s. It should be noted that both neologisms actually predate Neuromancer in Gibson’s work, dating back to 1982’s “Burning Chrome.” And it should most definitely be noted that he was hardly the first to stumble upon many of the ideas behind the attitude. We’ve already chronicled some of the developments in the realms of theory and practical experimentation that led to the World Wide Web. And in the realm of fiction, a mathematician and part-time science-fiction writer named Vernor Vinge had published True Names, a novella describing a worldwide networked virtual reality of its own, in 1981; its plot also bears some striking similarities to that of Gibson’s later Neuromancer. But Vinge was (and is) a much more prosaic writer than Gibson, hewing more to science fiction’s sturdy old school of Asimov, Clarke, and Heinlein. He could propose the idea of a worldwide network and then proceed to work it out with much more technical rigorousness than Gibson could ever dream of mustering, but he couldn’t hope to make it anywhere near as sexy.

For many the most inexplicable thing about Gibson’s work is that he should ever have come up with all this cyberspace stuff in the first place. As he took a certain perverse delight in explaining to his wide-eyed early interviewers, in his real-world life Gibson was something of a Luddite even by the standards of the 1980s. He had, for instance, never owned or used a computer at the time he wrote his early stories and Neuromancer; he wrote of his sleek high-tech futures on a clunky mechanical typewriter dating from 1927. (Gibson immortalized it within Neuromancer itself by placing it in disassembled form on the desk of Julius Deane, an underworld kingpin Case visits early in the novel.) And I’ve seen no evidence that Gibson was aware of True Names prior to writing “Burning Chrome” and Neuromancer, much less the body of esoteric and (at the time) obscure academic literature on computer networking and hypertext.

Typically, Gibson first conceived the idea of the Matrix not from reading tech magazines and academic journals, as Vinge did in conceiving his own so-called “Other Plane,” but on the street, while gazing through the window of an arcade. Seeing the rapt stares of the players made him think they believed in “some kind of actual space behind the screen, someplace you can’t see but you know is there.” In Neuromancer, he describes the Matrix as the rush of a drug high, a sensation with which his youthful adventures in the counterculture had doubtless left him intimately familiar.

He closed his eyes.

Found the ridged face of the power stud.

And in the bloodlit dark behind his eyes, silver phosphenes boiling in from the edge of space, hypnagogic images jerking past like film compiled from random frames. Symbols, figures, faces, a blurred, fragmented mandala of visual information.

Please, he prayed, now –

A gray disk, the color of Chiba sky.

Now –

Disk beginning to rotate, faster, becoming a sphere of paler gray. Expanding —

And flowed, flowered for him, fluid neon origami trick, the unfolding of his distanceless home, his country, transparent 3D chessboard extending to infinity. Inner eye opening to the stepped scarlet pyramid of the Eastern Seaboard Fission Authority burning beyond the green cubes of Mitsubishi Bank of America, and high and very far away he saw the spiral arms of military systems, forever beyond his reach.

And somewhere he was laughing, in a white-painted loft, distant fingers caressing the deck, tears of release streaking his face.

Much of the supposedly “futuristic” slang in Neuromancer is really “dope dealer’s slang” or “biker’s talk” Gibson had picked up on his travels. Aside from the pervasive role played by the street, he has always listed the most direct influences on Neuromancer as the cut-up novels of his literary hero William S. Burroughs, the noirish detective novels of Dashiell Hammett, and the deliciously dystopian nighttime neon metropolis of Ridley Scott’s film Blade Runner, which in its exploration of subjectivity, the nature of identity, and the influence of technology on same hit many of the same notes that became staples of Gibson’s work. That so much of the modern world seems to be shaped in Neuromancer’s image says much about Gibson’s purely intuitive but nevertheless prescient genius — and also something about the way that science fiction can be not only a predictor but a shaper of the future, an idea I’ll return to shortly.

But before we move on to that subject and others we should take just a moment more to consider how unique Neuromancer, a bestseller that’s a triumph of style as much as anything else, really is in the annals of science fiction. In a genre still not overly known for striking or elegant prose, William Gibson is one of the few writers immediately recognizable after just a paragraph or two. If, on the other hand, you’re looking for air-tight world-building and careful plotting, Gibson is definitely not the place to find it. “You’ll notice in Neuromancer there’s obviously been a war,” he said in an interview, “but I don’t explain what caused it or even who was fighting it. I’ve never had the patience or the desire to work out the details of who’s doing what to whom, or exactly when something is taking place, or what’s become of the United States.”

I remember standing in a record store one day with a friend of mine who was quite a good guitar player when Jimi Hendrix’s famous Woodstock rendition of “The Star-Spangled Banner” came over the sound system. “All he does is make a bunch of noise to cover it up every time he flubs a note,” said my friend — albeit, as even he had to agree, kind of a dazzling noise. I sometimes think of that conversation when I read Neuromancer and Gibson’s other early works. There’s an ostentatious, look-at-me! quality to his prose, fueled by, as Gibson admitted, his “blind animal panic” at the prospect of “losing the reader’s attention.” Or, as critic Andrew M. Butler puts it more dryly: “This novel demonstrates great linguistic density, Gibson’s style perhaps blinding the reader to any shortcomings of the novel, and at times distancing us from the characters and what Gibson the author may feel about them.” The actual action of the story, meanwhile, Butler sums up not entirely unfairly as, “Case, the hapless protagonist, stumbles between crises, barely knowing what’s going on, at risk from a femme fatale and being made offers he cannot refuse from mysterious Mr. Bigs.” Again, you don’t read William Gibson for the plot.

Which of course only makes Neuromancer’s warm reception by the normally plot-focused readers of science fiction all the more striking. But make no mistake: it was a massive critical and commercial success, winning the Hugo and Nebula Awards for its year and, as soon as word spread following its very low-key release, selling like crazy. Unanimously recognized as the science-fiction novel of 1984, it was being labeled the novel of the decade well before the 1980s were actually over; it was just that hard to imagine another book coming out that could compete with its influence. Gibson found himself in a situation somewhat akin to that of Douglas Adams during the same period, lauded by the science-fiction community but never quite feeling a part of it. “Everyone’s been so nice,” he said in the first blush of his success, “but I still feel very much out of place in the company of most science-fiction writers. It’s as though I don’t know what to do when I’m around them, so I’m usually very polite and keep my tie on. Science-fiction authors are often strange, ill-socialized people who have good minds but are still kids.” Politeness or no, descriptions like that weren’t likely to win him many new friends among them. And, indeed, there was a considerable backlash against him by more traditionalist writers and readers, couched in much the same rhetoric that had been deployed against science fiction’s New Wave of writers of twenty years before.

But if we wish to find reasons that so much of science-fiction fandom did embrace Neuromancer so enthusiastically, we can certainly find some that were very practical if not self-serving, and that had little to do with the literary stylings of William S. Burroughs or extrapolations on the social import of technological development. Simply put, Neuromancer was cool, and cool was something that many of the kids who read it decidedly lacked in their own lives. It’s no great revelation to say that kids who like science fiction were and are drawn in disproportionate numbers to computers. Prior to Neuromancer, such kids had few media heroes to look up to; computer hackers were almost uniformly depicted as socially inept nerds in Coke-bottle glasses and pocket protectors. But now along came Case, and with him a new model of the hacker as rock star, dazzling with his Mad Skillz on the Matrix by day and getting hot and heavy with his girlfriend Molly Millions, who seemed to have walked into the book out of an MTV music video, by night. For the young pirates and phreakers who made up the Scene, Neuromancer was the feast they’d never realized they were hungry for. Cyberpunk ideas, iconography, and vocabulary were quickly woven into the Scene’s social fabric.

Like much about Neuromancer’s success, this way of reading it, which reduced it down to a stylish exercise in escapism, bothered Gibson. His book was, he insisted, not about how cool it was to be “hard and glossy” like Case and Molly, but about “what being hard and glossy does to you.” “My publishers keep telling me the adolescent market is where it’s at,” he said, “and that makes me pretty uncomfortable because I remember what my tastes ran to at that age.”

While Gibson may have been uncomfortable with the huge appetite for comic-book-style cyberpunk that followed Neuromancer‘s success, plenty of others weren’t reluctant to forgo any deeper literary aspirations in favor of piling the casual violence and casual sex atop the casual tech. As the violence got ever more extreme and the sex ever more lurid, cyberpunk risked turning into the most insufferable of clichés.

Sensation though cyberpunk was in the rather insular world of written science fiction, William Gibson and the sub-genre he had pioneered filtered only gradually into the world outside of that ghetto. The first cyberpunk character to take to the screen was arguably, in what feels like a very appropriate gesture, a character who allegedly lived within a television: Max Headroom, a curious computerized talking head who became an odd sort of cultural icon for a few years there during the mid- to late-1980s. Invented for a 1985 low-budget British television movie called Max Headroom: 20 Minutes into the Future, Max went on to host his own talk show on British television, to become an international spokesman for the ill-fated New Coke, and finally to star in an American dramatic series which managed to air 14 episodes on ABC during 1987 and 1988. While they lacked anything precisely equivalent to the Matrix, the movie and the dramatic series otherwise trafficked in themes, dystopic environments, and gritty technologies of the street not far removed at all from those of Neuromancer. The ambitions of Max's creators were constantly curtailed by painfully obvious budgetary limitations as well as the pop-cultural baggage carried by the character himself; by the time of the 1987 television series he had become more associated with camp than serious science fiction. Nevertheless, the television series in particular makes for fascinating viewing for any student of cyberpunk history. (The series endeared itself to Commodore Amiga owners in another way: Amigas were used to create many of the visual effects seen on the show, although not, as was occasionally reported, to render Max Headroom himself. He was actually played by an actor wearing a prosthetic mask, with various visual and auditory effects added in post-production to create the character's trademark tics.)

There are other examples of cyberpunk's slowly growing influence to be found in the film and television of the late 1980s and early 1990s, such as the street-savvy, darkly humorous low-budget action flick RoboCop. But William Gibson's elevation to the status of Prophet of Cyberspace in the eyes of the mainstream really began in earnest with a magazine called Wired, launched in 1993 by an eclectic mix of journalists, entrepreneurs, and academics. Envisioned as a glossy lifestyle magazine for the hip and tech-conscious — the initial pitch labeled it "the Rolling Stone of technology" — Wired's aesthetics were to a large degree modeled on William Gibson. When they convinced him to contribute a rare non-fiction article (on Singapore, which he described as "Disneyland with the death penalty") to the fourth issue, the editors were so excited that they stuck the author rather than the subject of the article on their magazine's cover.


Well-funded and editorially polished in all the ways that traditional technology journals weren’t, Wired was perfectly situated to become mainstream journalism’s go-to resource for understanding the World Wide Web and the technology bubble expanding around it. It was largely through Wired that “cyberspace” and “surfing” became indelible parts of the vocabulary of the age, even as both neologisms felt a long, long way in spirit from the actual experience of using the World Wide Web in those early days, involving as it did mostly text-only pages delivered to the screen at a glacial pace. No matter. The vocabulary surrounding technology has always tended to be grounded in aspiration rather than reality, and perhaps that’s as it should be. By the latter 1990s, Gibson was being acknowledged by even such dowdy organs as The New York Times as the man who had predicted it all five years before the World Wide Web was so much as a gleam in the eye of Tim Berners-Lee.

To ask whether William Gibson deserves his popular status as a prophet is, I would suggest, a little pointless. Yes, Vernor Vinge may have a better claim to the title in the realm of fiction, and certainly people like Vannevar Bush, Douglas Engelbart, Ted Nelson, and even Bill Atkinson of Apple have huge claims on the raw ideas that turned into the World Wide Web. Even within the oeuvre of William Gibson himself, his predictions in other areas of personal technology and society — not least his anticipation of globalization and its discontents — strike me as actually more prescient than his rather vague vision of a global computerized Matrix.

Yet, whether we like it or not, journalism and popular history do tend to condense complexities down to single, easily graspable names, and in this case the beneficiary of that tendency is William Gibson. And it’s not as if he didn’t make a contribution. Whatever the rest did, Gibson was the guy who made the idea of a networked society — almost a networked consciousness — accessible, cool, and fun. In doing so, he turned the old idea of science fiction as prophecy on its head. Those kids who grew up reading Neuromancer became the adults who are building the technology of today. If, with the latest developments in virtual reality, we seem to be inching ever closer to a true worldwide Matrix, we can well ask ourselves who is the influenced and who is the influencer. Certainly Neuromancer‘s effect on our popular culture has been all but incalculable. The Matrix, the fifth highest-grossing film of 1999 and a mind-expanding pop-culture touchstone of its era, borrowed from Gibson to the extent of naming itself after his version of virtual reality. In our own time, it’s hard to imagine current too-cool-for-school television hits like Westworld, Mr. Robot, and Black Mirror existing without the example of Neuromancer (or, at least, without The Matrix and thus by extension Neuromancer). The old stereotype of the closeted computer nerd, if not quite banished to the closet from which it came, does now face strong competition indeed. Cyberpunk has largely faded away as a science-fiction sub-genre or even just a recognized point of view, not because the ideas behind it died but because they’ve become so darn commonplace.

You may have noticed that up to this point I’ve said nothing about the books William Gibson wrote after Neuromancer. That it’s been so easy to avoid doing so says much about his subsequent career, doomed as it is always to be overshadowed by his very first novel. For understandable reasons, the situation hasn’t always sat well with Gibson himself. Already in 1992, he could only wryly reply, “Yeah, and they’ll never let me forget it,” when introduced as the man who invented cyberspace — this well before his mainstream fame as the inventor of the word had really even begun to take off. Writing a first book with the impact of Neuromancer is not an unalloyed blessing.

That said, one must also acknowledge that Gibson didn’t do his later career any favors in getting out from under Neuromancer‘s shadow. Evincing that peculiar professional caution that always sat behind his bold prose, he mined the same territory for years, releasing a series of books whose titles — Count Zero, Mona Lisa Overdrive, Virtual Light — seem as of a piece as their dystopic settings and their vaguely realized plots. It’s not that these books have nothing to say; it’s rather that almost everything they do say is already said by Neuromancer. His one major pre-millennial departure from form, 1990’s The Difference Engine, is an influential exercise in Victorian steampunk, but also a book whose genesis owed much to his good friend and fellow cyberpunk icon Bruce Sterling, with whom he collaborated on it.

Here’s the thing, though: as he wrote all those somewhat interchangeable novels through the late 1980s and 1990s, William Gibson was becoming a better writer. His big breakthrough came with 2003’s Pattern Recognition, in my opinion the best pure novel he’s ever written. Perhaps not coincidentally, Pattern Recognition also marks the moment when Gibson, who had been steadily inching closer to the present ever since Neuromancer, finally decided to set a story in our own contemporary world. His prose is as wonderful as ever, full of sentence after sentence I can only wish I’d come up with, yet now free of the look-at-me! ostentation of his early work. One of the best ways to appreciate how much subtler a writer Gibson has become is to look at his handling of his female characters. Molly Millions from Neuromancer was every teenage boy’s wet dream come to life. Cayce, the protagonist of Pattern Recognition — her name is a sly nod back to Neuromancer‘s Case — is, well, just a person. Her sexuality is part of her identity, but it’s just a part. A strong, capable, intelligent character, she’s not celebrated by the author for any of these qualities. Instead she’s allowed just to be. This strikes me as a wonderful sign of progress — for William Gibson, and perhaps for all of us.

Which isn’t to say that Gibson’s dystopias have turned into utopias. While his actual plots remain as underwhelming as ever, no working writer of today that I’m aware of captures so adroitly the sense of dislocation and isolation that has become such a staple of post-millennial life — paradoxically so in this world that’s more interconnected than ever. If some person from the future or the past asked you how we live now, you could do a lot worse than to simply hand her one of William Gibson’s recent novels.

Whether Gibson is still a science-fiction writer is up for debate and, like so many exercises in labeling, ultimately inconsequential. There remains a coterie of old fans unhappy with the new direction, who complain about every new novel he writes because it isn’t another Neuromancer. By way of compensation, Gibson has come to be widely accepted as a writer of note outside of science-fiction fandom — a writer of note, that is, for something more than being the inventor of cyberspace. That of course doesn’t mean he will ever write another book with the impact of Neuromancer, but Gibson, who never envisioned himself as anything more than a cult writer in the first place, seems to have made his peace at last with the inevitability of the phrases “author of Neuromancer” and “coiner of the term ‘cyberspace'” appearing in the first line of his eventual obituary. Asked in 2007 by The New York Times whether he was “sick of being known as the writer who coined the word ‘cyberspace,'” he said he thought he’d “miss it if it went away.” In the meantime, he has more novels to write. We may not be able to escape our yesterdays, but we always have our today.

(Sources: True Names by Vernor Vinge; Conversations with William Gibson, edited by Patrick A. Smith; Bruce Sterling and William Gibson’s introductions to the William Gibson short-story collection Burning Chrome; Bruce Sterling’s preface to the cyberpunk short-story anthology Mirrorshades; “Science Fiction from 1980 to the Present” by John Clute and “Postmodernism and Science Fiction” by Andrew M. Butler, both found in The Cambridge Companion to Science Fiction; Spin of April 1987; Los Angeles Times of September 12 1993; The New York Times of August 19 2007; William Gibson’s autobiography from his website; “William Gibson and the Summer of Love” from the Toronto Dream Project; and of course the short stories and novels of William Gibson.)

 
 


How Jordan Mechner Made a Different Sort of Interactive Movie (or, The Virtues of Restraint)

One can learn much about the state of computer gaming in any given period by looking to the metaphors its practitioners are embracing. In the early 1980s, when interfaces were entirely textual and graphics crude or nonexistent, text adventures like those of Infocom were heralded as the vanguard of a new interactive literature destined to augment or entirely supersede non-interactive books. That idea peaked with the mid-decade bookware boom, when just about every entertainment-software publisher (and a few traditional book publishers) was rushing to sign established authors and books to interactive projects. It then proceeded to collapse just as quickly under the weight of its own self-importance when the games proved less compelling and the public less interested than anticipated.

Prompted by new machines like the Commodore Amiga with their spectacular graphics and sound, the industry reacted to that failure by turning to the movies for media mentorship. This relationship would prove more long-lasting. By the end of the 1980s, companies like Cinemaware and Sierra were looking forward confidently to a blending of Hollywood and Silicon Valley that they believed might just replace the conventional non-interactive movie, not to mention computer games as people had known them to that point. Soon most of the major publishers would be conducting casting calls and hiring sound stages, trying literally to make games out of films. It was an approach fraught with problems — problems that were only slowly and grudgingly acknowledged by these would-be unifiers of Southern and Northern Californian entertainment. Before it ran its course, it spawned lots of really terrible games (and, it must be admitted, against all the odds the occasional good one as well).

Given the game industry’s growing fixation on the movies as the clock wound down on the 1980s, Jordan Mechner would seem the perfect man for the age. Struggling with the blessing or curse of an equally abiding love for both mediums, his professional life had already been marked by constant vacillation between movies and games. Inevitably, his love of film influenced him even when he was making games. But, perhaps because that love was so deep and genuine, he accomplished the blending in a more even-handed, organic way than would most of the multi-CD, multi-gigabyte interactive movies that would soon be cluttering store shelves. Mechner’s most famous game, by contrast, filled just two Apple II disk sides — less than 300 K in total. And yet the cinematic techniques it employs have far more in common with those found in the games of today than do those of its more literal-minded rivals.


 

As a boy growing up in the wealthy hamlet of Chappaqua, New York, Jordan Mechner dreamed of becoming “a writer, animator, or filmmaker.” But those ambitions got modified if not discarded when he discovered computers at his high school. Soon after, he got his hands on his own Apple II for the first time. Honing his chops as a programmer, he started contributing occasional columns on BASIC to Creative Computing magazine at the age of just 14. Yet fun as it was to be the magazine’s youngest contributor, his real reason for learning programming was always to make games. “Games were the only kind of software I knew,” he says. “They were the only kind that I enjoyed. At that time, I didn’t really see any use for a word processor or a spreadsheet.” He fell into the throes of what he describes as an “obsession” to get a game of his own published.

Initially, he did what lots of other game programmers were doing at the time: cloning the big standup-arcade hits for fun and (hopefully) profit. He made a letter-perfect copy of Atari’s Asteroids, changed the titular space rocks to bright bouncing balls in the interest of plausible deniability, and sent the resulting Deathbounce off to Brøderbund for consideration; what with Brøderbund having been largely built on the back of Apple Galaxian, an arcade clone which made no effort whatsoever to conceal its source material, the publisher seemed a very logical choice. But Doug Carlston was now trying to distance his company from such fare for reasons of reputation as well as his fear of Atari’s increasingly aggressive legal threats. Nice guy that he was, he called Mechner personally to explain why Deathbounce wasn’t for Brøderbund. He promised to send Mechner a free copy of Brøderbund’s latest hit, Choplifter, suggesting he think about whether he might be able to apply the programming chops he had demonstrated in Deathbounce to a more original game, as Choplifter‘s creator Dan Gorlin had done. Mechner remembers the conversation as well-nigh life-changing. He had been so immersed in the programming side of making games that the idea of doing an original design had never really occurred to him before: “I didn’t have to copy someone else’s arcade game. I was allowed to design my own!”

Carlston's phone call came in May of 1982, when Mechner was finishing up his first year at Yale University; undecided about his major as he was so much else in his life at the time, he would eventually wind up with a bachelor's degree in psychology. We're granted an unusually candid and personal glimpse into his life between 1982 and 1993 thanks to his private journals, which he published (doubtless in a somewhat expurgated form) in 2012. The early years paint a picture of a bright, sensitive young man born into a certain privilege that carries with it the luxury of putting off adulthood for quite some time. He romanticizes chance encounters ("I saw a heartbreakingly beautiful young blonde out of the corner of my eye. She was wearing a blue down vest. As she passed, our eyes met. She smiled at me. As I went out I held the door for her; her fingers grazed mine. Then she was gone."); frets frequently about cutting classes and generally not being the man he ought to be ("I think Ben is the only person who truly comprehends the depths of how little classwork I do."); alternates between grand plans accompanied by frenzies of activity and indecision accompanied by long days of utter sloth ("Here's what I do do: listen to music. Browse in record stores. Read newspapers, magazines, play computer games, stare out the windows. See a lot of movies."); muses with all the self-obliviousness of youth on whether he would prefer "writing a bestselling novel or directing a blockbusting film," as if attaining fame and fortune were as simple as deciding on one or the other.

At Yale, film, that other constant of his creative life, came to the fore. He joined every film society he stumbled upon, signed up for every film-studies course in the catalog, and set about “trying to see in four years every film ever made”; Akira Kurosawa’s classic adventure epic Seven Samurai (a major inspiration behind Star Wars among other things) emerged as his favorite of them all. He also discovered an unexpected affinity for silent cinema, which naturally led him to compare that earliest era of film with the current state of computer games, a medium that seemed in a similar state of promising creative infancy. All of this, combined with the example of Choplifter and the karate lessons he was sporadically attending, led to Karateka, the belated fruition of his obsession with getting a game published.

To a surprising degree given his youth and naïveté, Mechner consciously designed Karateka as the proverbial Next Big Thing in action games after the first wave of simple quarter munchers, whose market he watched collapse over the two-plus years he spent intermittently working on it. Plenty of fighting games had appeared on the Apple II and other platforms before, some of them very playable; Mechner wasn't sure he could really improve on their templates when it came to pure game play. What he could do, however, was give his game some of the feel and emotional resonance of cinema. Reasoning that computer games were technically on par with the first decade or two of film in terms of the storytelling tools at his disposal, he mimicked the great silent-film directors in building his story out of the broadest archetypal elements: an unnamed hero must assault a mountain fortress to rescue an abducted princess, fighting through wave after wave of enemies, culminating in a showdown with the villain himself. He energetically cross-cut the interactive fighting sequences with non-interactive scenes of the villain issuing orders to his minions while the princess looks around nervously in her cell — a suspense-building technique from cinema dating back to The Birth of a Nation. He mimicked the horizontal wipes Kurosawa used for transitions in Seven Samurai, and mimicked the scrolling textual prologue from Star Wars. When the player lost or won, he printed "THE END" on the screen in lieu of "GAME OVER." And, indeed, he made it possible, although certainly not easy, to win Karateka and carry the princess off into the sunset. The player was, in other words, playing for bigger stakes than a new high score.


The most technically innovative aspect of Karateka — suggested, like much in the game, by Mechner’s very supportive father — involved the actual people on the screen. To make his fighters move as realistically as possible, Mechner made use for the first time in a computer game of an old cartoon-animation technique known as rotoscoping. After shooting some film footage of his karate instructor in action, doing various kicks and punches, Mechner used an ancient Moviola editing machine that had somehow wound up in the basement of the family home to isolate and make prints out of every third frame. He imported the figure at the center of each print into his Apple II by tracing it on a contraption called the VersaWriter. Flipped through in sequence, the resulting sprites appeared to “move” in an unusually fluid and realistic fashion. “When I saw that sketchy little figure walk across the screen,” he wrote in his journal, “looking just like Dennis [his karate instructor], all I could say was ‘ALL RIGHT!’ It was a glorious moment.”
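The payoff of all that tracing was simple persistence of vision: run the captured frames in sequence at a steady rate, and the figure appears to move on its own. Here is a minimal sketch of that playback principle, written in Python purely for readability; Mechner of course worked in 6502 assembly on the Apple II, and every name and "frame" below is invented for illustration.

import time

def play_cycle(frames, fps=15, repeats=3):
    # Flip through pre-drawn sprite frames at a fixed rate. At around
    # 15 frames per second -- the rate Mechner mentions in his journal --
    # the eye reads the sequence as continuous motion.
    delay = 1.0 / fps
    for _ in range(repeats):
        for frame in frames:
            print("\033[2J" + frame)  # crude "blit": clear the terminal, draw the frame
            time.sleep(delay)

# Three hand-traced "frames" of a walking figure, standing in for the
# sprites Mechner traced from film with the VersaWriter.
WALK_FRAMES = [
    " o \n/|\\\n/ \\",
    " o \n/|\\\n | ",
    " o \n/|\\\n \\ \\",
]

play_cycle(WALK_FRAMES)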


Doug Carlston, who clearly saw something special in this earnest kid, was gently encouraging and almost infinitely patient with him. When it looked like Mechner had come up with something potentially great at last, Carlston signed him to a contract and flew him out to California in the summer of 1984 to finish it up with the help of Brøderbund's in-house staff. Released just a little too late to fully capitalize on the 1984 Christmas rush, Karateka started slowly but gradually turned into a hit, especially once the Commodore 64 port dropped in June of 1985. Once ported to Nintendo's Famicom console for the domestic Japanese market, it proceeded to sell many hundreds of thousands of units, making Jordan Mechner a very flush young man indeed.

So, Mechner, about to somehow manage to graduate despite all the missed assignments and the classes cut in favor of working on Karateka, seemed poised for a fruitful career making games. Yet he continued to vacillate between his twin obsessions. Even as his game, the most significant accomplishment of his young life and one of which anyone could justly be proud, had entered the homestretch, he had written, "I definitely want my next project to be film-related. Videogames have taken up enough of my time for now." In the wake of his game's release, the steady stream of royalties therefrom only made it easier to dabble in film.

Mechner spent much of the year after graduating from university back at home in Chappaqua working on his first screenplay. In between writing dialog and wracking himself with doubt over whether he really wanted to do another game at all, he occasionally turned his attention to the idea of a successor to Karateka. Already during that first summer after Yale, he and Gene Portwood, a Brøderbund executive, dreamed up a scenario for just such a beast: an Arabian Nights-inspired story involving an evil sultan, a kidnapped princess, and a young man — the player, naturally — who must rescue her. Karateka in Middle Eastern clothing though it may have been in terms of plot, that was hardly considered a drawback by Brøderbund, given the success of Mechner’s first game.

Seven frames of animation ready to be photocopied and digitized.

Determined to improve upon the rotoscoping of Karateka, Mechner came up with a plan to film a moving figure and use a digitizer to capture the frames into the computer, rather than tracing the figure using the VersaWriter. He spent $2500 on a high-end VCR and video camera that fall, knowing he would return them before his month's grace period was out ("I feel so dishonest," he wrote in his journal). The technique he had in the works may have been an improvement over what he had done for Karateka, but it was still very primitive and hugely labor-intensive. After shooting his video, he would play it back on the VCR, pausing it on each frame he wanted to capture. Then he would take a picture of the screen using an ordinary still camera and get the film developed. The next step was to trace the outline of the figure in the photograph using Magic Marker and fill him in using White-Out. Then he would Xerox the doctored photograph to get a black-and-white version with a very clear silhouette of the figure. Finally, he would digitize the photocopy to import it into his Apple II, and erase everything around the figure by hand on the computer to create a single frame of sprite animation. He would then get to go through this process a few hundred more times to get the prince's full repertoire of movements down.
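For the curious, the final computer-side step of that pipeline, isolating the silhouette and discarding everything around it, is the sort of thing a few lines of code handle today. What follows is a rough modern stand-in using the Pillow imaging library; it bears no resemblance to Mechner's actual Apple II tooling, and the file names are invented.

from PIL import Image

def extract_silhouette(path, threshold=128):
    # Load the digitized photocopy as a grayscale image.
    gray = Image.open(path).convert("L")
    # Pixels darker than the threshold become the (white) figure in a
    # 1-bit mask, so that getbbox() below finds the figure rather than
    # the paper around it.
    mask = gray.point(lambda p: 255 if p < threshold else 0, mode="1")
    bbox = mask.getbbox()  # bounding box of the figure's pixels
    return mask.crop(bbox) if bbox else mask

frame = extract_silhouette("frame_017.png")  # hypothetical file name
frame.save("sprite_017.png")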


On October 20, 1985, Jordan Mechner did his first concrete work on the game that would become Prince of Persia, using his ill-gotten video camera to film his 16-year-old brother David running and jumping through a local parking lot. When he finally got around to buying a primitive black-and-white image digitizer for his trusty Apple II more than six months later, he quickly determined that the footage he’d shot was useless due to poor color separation. Nevertheless, he saw potential magic.

I still think this can work. The key is not to clean up the frames too much. The figure will be tiny and messy and look like crap… but I have faith that, when the frames are run in sequence at 15 fps, it’ll create an illusion of life that’s more amazing than anything that’s ever been seen on an Apple II screen. The little guy will be wiggling and jiggling like a Ralph Bakshi rotoscope job… but he’ll be alive. He’ll be this little shimmering beacon of life in the static Apple-graphics Persian world I’ll build for him to run around in.

For months after that burst of enthusiasm, however, he did little more with the game.

At last in September of 1986, having sent his screenplay off to Hollywood and thus with nothing more to do on that front but wait, Mechner moved out to San Rafael, California, close to Brøderbund’s offices, determined to start in earnest on Prince of Persia. He spent much time over the next few months refining his animation technique, until by Christmas everyone who saw the little running and jumping figure was “bowled over” by him. Yet after that progress again slowed to a crawl, as he struggled to motivate himself to turn his animation demos into an actual game.

And then, on May 4, 1987, came the phone call that would stop the little running prince in his tracks for the better part of a year. A real Hollywood agent called to tell him she "loved" his script for Birthstone, a Spielbergian supernatural comedy/thriller along the lines of Gremlins or The Goonies. Within days of her call, the script was optioned by Larry Turman, a major producer with films like The Graduate on his résumé. For months Mechner fielded phone calls from a diverse cast of characters with a diverse cast of suggestions, did endless rewrites, and tried to play the Hollywood game, schmoozing and negotiating and trying not to appear to be the awkward, unworldly kid he still largely was. Only when Birthstone seemed permanently stuck in development hell — "Hollywood's the only town where you can die of encouragement," he says wryly, quoting Pauline Kael — did he give up and turn his attention back to games. Mechner notes today that just getting as far as he did with his very first script was a huge achievement and a great start in itself. After all, he was, if not quite hobnobbing with the Hollywood elite, at least getting rejection letters from such people as Michael Apted, Michael Crichton, and Henry Winkler; such people were reading his script. But he had been spoiled by the success of Karateka. If he wrote another screenplay, there was no guarantee it would get even as far as his first had. If he finished Prince of Persia, on the other hand, he knew Brøderbund would publish it.

And so, in 1988, it was back to games, back to Prince of Persia. Inspired by “puzzly” 8-bit action games like Doug Smith’s Lode Runner and Ed Hobbs’s The Castles of Dr. Creep, his second game was shaping up to be more than just a game of combat. Instead his prince would have to make his way through area after area full of tricks, traps, and perilous drops. “What I wanted to do with Prince of Persia,” Mechner says, “was a game which would have that kind of logical, head-scratching, fast-action, Lode Runner-esque puzzles in a level-based game but also have a story and a character that was trying to accomplish a recognizable human goal, like save a princess. I was trying to merge those two things.” Ideally, the game would play like the iconic first ten minutes of Raiders of the Lost Ark, in which Indiana Jones runs and leaps and dodges and sometimes outwits rather than merely outruns a series of traps. For a long while, Mechner planned to make the hero entirely defenseless, as a sort of commentary on the needless ultra-violence found in so many other games. In the end, he didn’t go that far — the allure of sword-fighting, not to mention commercial considerations, proved too strong — but Prince of Persia was nevertheless shaping up to be a far more ambitious, multi-faceted work than Karateka, boasting much more than just improved running and jumping animations.

With just 128 K of memory to work with on the Apple II, Mechner was forced to make Prince of Persia a modular design, relying on a handful of elements which are repeatedly reused and recombined. Take, for instance, the case of the loose floorboards. The first time they appear, they’re a simple trap: you have to jump over a section of the floor to avoid falling into a pit. Later, they appear on the ceiling, as part of the floor above your own; caught in an apparent cul de sac, you have to jump up and bash the ceiling to open an escape route. Still later, they can be used strategically: to kill guards below you by dropping the floorboards on their heads, or to hold down a pressure plate below you that opens a door on the level on which you’re currently standing. It’s a fine example of a constraint in game design turning into a strength. “There’s a certain elegance to taking an element the player is already familiar with,” says Mechner, “and challenging him to think about it in a different way.”
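To make the idea concrete, here is a toy sketch of such a modular element, in Python for readability; the names and structure are mine, invented for illustration, and Mechner's actual Apple II implementation looked nothing like this.

class Guard:
    def crush(self):
        print("The guard below is flattened.")

class PressurePlate:
    is_pressure_plate = True
    def press(self):
        print("A door grinds open on this level.")

class LooseFloorboard:
    # One element, one rule: disturb it and it drops onto whatever lies
    # below. Every distinct use described above -- trap, escape route,
    # weapon, door-opener -- comes from placement, not from new code.
    def __init__(self):
        self.fallen = False

    def disturb(self):
        # Stepped on from above, or bashed loose by a jump from below.
        self.fallen = True

    def land_on(self, target):
        if hasattr(target, "crush"):
            target.crush()  # dropped on a guard's head
        elif getattr(target, "is_pressure_plate", False):
            target.press()  # holds a pressure plate down
        else:
            print("The board shatters, leaving a gap in the floor above.")

board = LooseFloorboard()
board.disturb()
board.land_on(Guard())  # the same element becomes a weapon by placement alone

The point of the sketch is the asymmetry: the element's code never grows, while the design space around it does.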


On July 14, 1989, Mechner shot the final footage for Prince of Persia: the denouement, showing the prince — now played by the game’s project manager at Brøderbund, Brian Ehler — embracing the rescued princess — played by Tina LaDeau, the 18-year-old daughter of another Brøderbund employee, in her prom dress. (“Man, she is a fox,” Mechner wrote in his journal. “Brian couldn’t stop blushing when I had her embrace him.”)

The game shipped for the Apple II on October 6, 1989. And then, despite a very positive review in Computer Gaming World — Charles Ardai called it nothing less than “the Star Wars of its field,” music to the ears of a movie buff like Mechner — it proceeded to sell barely at all: perhaps 500 units a month. It was, everyone at Brøderbund agreed, at least a year too late to hope to sell significant numbers of a game like this on the Apple II, whose only remaining commercial strength was educational software, thanks to the sheer number of the things still installed in American schools. Mechner’s procrastination and vacillation had spoiled this version’s commercial prospects entirely.

Thankfully, the Apple II version wasn’t to be the only one. Brøderbund already had programmers and artists working on ports to MS-DOS and the Amiga, the last two truly viable computer-gaming platforms in North America. Mechner as well turned his attention to the versions for these more advanced machines as soon as the Apple II version was finished. And once again his father pitched in, composing a lovely score for the luxuriously sophisticated sound hardware now at the game’s disposal. “This is going to be the definitive version of Prince of Persia,” Mechner enthused over the MS-DOS version. “With VGA [graphics] and sound card, on a fast machine, it’ll blow the Apple away. It looks like a Disney film. It’s the most beautiful game I’ve ever seen.” Reworked though they were in almost all particulars, at the heart of the new versions lay the same digitized film footage that had made the 8-bit prince run and leap so fluidly.


And yet, after it shipped on April 19, 1990, the MS-DOS version also disappointed. Mechner chafed over his publisher's indifference toward promoting the game; they seemed on the verge of writing it off, noting how the vastly superior MS-DOS version was being regarded as just another port of an old 8-bit game, and thus would likely never be given a fair shake by press or public. True as ever to the bifurcated pattern of his life, he decided to turn back to film. Having tried and failed to get into New York University film school, he resorted to working as a production assistant in movies by way of supporting himself and trying to drum up contacts in the film-making community of New York. Thus the first anniversary of Prince of Persia's original release on the Apple II found him schlepping crates around New York City. His career as a game developer seemed to be behind him, and truth be told his prospects as a filmmaker didn't look a whole lot brighter.

The situation began to reverse itself only after the Amiga version was finished — programmed, as it happened, by Dan Gorlin, the very fellow whose Choplifter had first inspired Mechner to look at his own games differently. In Europe, the Amiga's stronghold, Prince of Persia was free of the baggage which it carried in North America — few in Europe had much idea of what an Apple II even was — and doubtless benefited from a much deeper and richer tradition on European computers of action-adventures and platform puzzlers. It received ebullient reviews and turned into a big hit on European Amigas, and its reputation gradually leaked back across the pond to turn it at last into a hit in its homeland as well. Thus did Prince of Persia become a slowly growing international sensation — a very unusual phenomenon in the hits-driven world of videogames, where shelf lives are usually short and retailer patience shorter. Soon came the console releases, along with releases for various other European and Japanese domestic computers, sending total sales soaring to over 2 million units.

By the beginning of 1992, Mechner was far removed from his plight of just eighteen months before. He was drowning in royalties, consulting intermittently with Brøderbund on a Prince of Persia 2 — it was understood that his days in the programming trenches were behind him — and living a globetrotting lifestyle, jaunting from Paris to San Rafael to Madrid to New York as whim and business took him. He was also planning his first film, a short documentary to be shot in Cuba, and already beginning to mull over what would turn into his most ambitious and fascinating game production of all, known at this point only as “the train game.”

Prince of Persia, which despite the merits of that eventual "train game" is and will likely always remain Mechner's signature work, strikes me most of all as a triumph of presentation. The actual game play is punishingly difficult. Each of its twelve levels is essentially an elaborate puzzle that can only be worked out by dying many times, when you aren't getting trapped in one of its far too many dead ends. Even once you think you have it all worked out, you still need to execute every step with perfect precision, no mean feat in itself. Messing up at any point in the process means starting that level over again from the beginning. And, because you only have one hour of real time to rescue the princess, every failure is extremely costly; a perfect playthrough, accomplished with absolute surety and no hesitations, takes about half an hour, leaving precious little margin for error. At least there is a "save" feature that will let you bookmark each level starting with the third, so you don't have to replay the whole game every time you screw up — which, believe me, you will, hundreds if not thousands of times before you finally rescue the princess. Beating Prince of Persia fair and square is a project for a summer vacation in those long-gone adolescent days when responsibilities were few and distractions fewer. As a busy adult, I find it too repetitive and too reliant on rote patterns, as well as — let's be honest here — just too demanding on my aging reflexes. In short, the effort-to-reward ratio strikes me as way out of whack. Of course, I'm sure that, given Prince of Persia's status as a beloved icon of gaming, many of you have a different opinion.

So, let's turn back to something on which we can hopefully all agree: the brilliance of that aforementioned presentation, which brings to aesthetic maturity many of the techniques Mechner had first begun to experiment with in Karateka. Rather than using filmed footage as a tool for achieving fluid, lifelike motion, as Mechner had, games during the years immediately following Prince of Persia would be plastered with jarring chunks of poorly acted, poorly staged "full-motion video." Such spectacles look far more dated today than the restrained minimalism of Prince of Persia. The industry as a whole would take years to wind up back at the place where Jordan Mechner had started: appropriating some of the language of cinema in the service of telling a story and building drama, without trying to turn games into literal interactive movies. Mechner:

Just as theater is its own thing — with its own conventions, things that it does well, things it does badly — so is film, and so [are] computer games. And there is a way to borrow from one medium to another, and in fact that’s what an all-new medium does when it’s first starting out. Film, when it was new, looked like someone set up a camera front and center and filmed a staged play. Then the things that are specific to film — like the moving camera, close-ups, reaction shots, dissolves — all these kinds of things became part of the language of cinema. It’s the same with computer games. To take a long film sequence and to play that on your TV screen is the bad way to make a game cinematic. The computer game is not a VCR. But if you can borrow from the knowledge that we all carry inside our heads of how cuts work, how reaction shots work, what a low angle means dramatically, what it means when the camera suddenly pulls back… We’ve got this whole collective unconscious of the vocabulary of film, and that’s a tremendously valuable tool to bring into computer gaming.

In a medium that has always struggled to tamp down its instinct toward aesthetic maximalism, Mechner’s games still stand out for their concern with balance and proportion. Mechner again:

Visuals are [a] component where it’s often tempting to compromise. You think, “Well, we could put a menu bar across here, we could put a number in the upper right-hand corner of the screen representing how many potions you’ve drunk,” or something. The easy solution is always to do something that as a side effect is going to make the game look ugly. So I took as one of the ground rules going in that the overall screen layout had to be pleasing, had to be strong and simple. So that somebody who was not playing the game but who walked into the room and saw someone else playing it would be struck by a pleasing composition and could stop to watch for a minute, thinking, “This looks good, this looks as if I’m watching a movie.” It really forces you as a designer to struggle to find the best solution for things like inventory. You can’t take the first solution that suggests itself, you have to try to solve it within the constraints you set yourself.

Mechner’s take on visual aesthetics can be seen as a subversion of Ken Williams’s old “ten-foot rule,” which, as you might remember, stated that every Sierra game ought to be visually arresting enough to make someone say “Wow!” when glimpsing it from ten feet away across a crowded shop. Mechner believed that game visuals ought to be more than just striking; they ought to be aesthetically good by the more refined standards of film and the other, even older visual arts. All that time Mechner spent obsessing over films and film-making, which could all too easily be labeled a complete waste of time, actually allowed him to bring something unique to the table, something that made him different from virtually all of his many contemporaries in the interactive-movie business.

There are various ways to situate Jordan Mechner’s work in general and Prince of Persia in particular within the context of gaming history. It can be read as the last great swan song of the Apple II and, indeed, of the entire era of 8-bit computer gaming, at least in North America. It can be read as yet one more example of Brøderbund’s downright bizarre commercial Midas touch, which continued to yield a staggering number of hits from a decidedly modest roster of new releases (Brøderbund also released SimCity in 1989, thus spawning two of the most iconic franchises in gaming history within bare months of one another). It can be read as the precursor to countless cinematic action-adventures and platformers to come, many of whose designers would acknowledge it as a direct influence. In its elegant simplicity, it can even be read as a fascinating outlier from the high-concept complexity that would come to dominate American computer gaming in the very early 1990s. But the reading that makes me happiest is to simply say that Prince of Persia showed how less can be more.

(Sources: Game Design Theory and Practice by Richard Rouse III; The Making of Karateka and The Making of Prince of Persia by Jordan Mechner; Creative Computing of March 1979, September 1979, and May 1980; Next Generation of May 1998; Computer Gaming World of December 1989; Jordan Mechner’s Prince of Persia postmortem from the 2011 Game Developers Conference; “Jordan Mechner: The Man Who Would Be Prince” from Games™; the Jordan Mechner and Brøderbund archives at the Strong Museum of Play.)

 
 


Cinemaware’s Year in the Desert

The last year of the 1980s was also the last that the Commodore Amiga would enjoy as the ultimate American game machine. Even as the low-end computer-game market was being pummeled into virtual nonexistence by the Nintendo Entertainment System, leaving the Amiga with little room into which to expand downward, the heretofore business-centric world of MS-DOS was developing rapidly on the high end, with VGA graphics and sound cards becoming more and more common. The observant could already recognize that these developments, combined with Commodore’s lackadaisical attitude toward improving their own technology, must spell serious trouble for the Amiga in the long run.

But for now, for this one more year, things were still going pretty well. Amiga zealots celebrated loudly and proudly at the beginning of 1989 when news broke that the platform had pushed past the magic barrier of 1 million machines sold. As convinced as ever that world domination was just around the corner for their beloved "Amy," they believed that number would have to lead to her being taken much more seriously by the big non-gaming software houses. While that, alas, would never happen, sales were just beginning to take off in many of the European markets that would sustain the Amiga well into the 1990s.

This last positive development fed directly into the bottom line of Cinemaware, the American software house most closely identified with the Amiga, even, to a large extent, in Europe. Cinemaware's founder Bob Jacob wisely forged close ties with the exploding European Amiga market via a partnership with the British publisher Mirrorsoft. In this way he got Cinemaware's games wide distribution and promotion throughout Europe, racking up sales across the pond under the Mirrorsoft imprint that often dramatically exceeded those Cinemaware was able to generate under their own label in North America. The same partnership led to another welcome revenue stream: the importation of European games into Cinemaware's home country. Games like Speedball, by the rockstar British developers The Bitmap Brothers, didn't have much in common with Cinemaware's usual high-concept fare, but did feed the appetite for splashy, frenetic, often ultra-violent action among American youngsters who had recently found Amiga 500s under their Christmas trees.

Yet Cinemaware's biggest claim to fame remained their homegrown interactive movies — which is not to say that everyone was a fan of the cinematic approach to game-making enshrined in their very name. A steady drumbeat of criticism, much of it far from unjustified, had accompanied the release of each new interactive movie since the days of Defender of the Crown. Take away all of the music and pretty pictures that surrounded their actual game play, went the standard line of attack, and these games were nothing but shallow if not outright broken exercises in strategy attached to wonky, uninteresting action mini-games. Cinemaware clearly took the criticism to heart despite the sales success they continued to enjoy. Indeed, the second half of the company's rather brief history can to a large extent be read as a series of reactions to that inescapable negative drumbeat, a series of attempts to show that they could make good games as well as pretty ones.

At first, the new emphasis on depth led to decidedly mixed results. Conflating depth with difficulty in a manner akin to the way that so many adventure-game designers conflate difficulty with unfairness, Cinemaware gave the world Rocket Ranger as their second interactive movie of 1988. It had all the ingredients to be great, but was undone by balance issues exactly the opposite of those which had plagued the prototypical Cinemaware game, Defender of the Crown. In short, Rocket Ranger was just too hard, a classic game-design lesson in the dangers of overcompensation and the importance of extensive play-testing to get that elusive balance just right. With two more new interactive movies on the docket for 1989, players were left wondering whether this would be the year when Cinemaware would finally get it right.


Certainly they showed no sign of backing away from their determination to bring more depth to their games. On the contrary, they pushed that envelope still harder with Lords of the Rising Sun, their first interactive movie of 1989. At first glance, it was a very typical Cinemaware confection, a Defender of the Crown set in feudal Japan. Built like that older game from the tropes and names of real history without bothering to be remotely rigorous about any of it, Lords of the Rising Sun is also another strategy game broken up by action-oriented minigames — the third time already, following Defender of the Crown and Rocket Ranger, that Cinemaware had employed this template. This time, however, a concerted effort was made to beef up the strategy game, not least by making it into a much more extended affair. Lords of the Rising Sun became just the second interactive movie to include a save-game feature, and in this case it was absolutely necessary; a full game could absorb many hours. It thus departed more markedly than anything the company had yet done from Bob Jacob's original vision of fast-playing, non-taxing, ultra-accessible games. Indeed, with a thick manual and a surprising amount of strategic and tactical detail to keep track of, Lords of the Rising Sun can feel more like an SSI wargame than a typical Cinemaware game once you look past its beautiful audiovisual presentation. Reaching for the skies if not punching above their weight, Cinemaware even elected to include the option of playing the game as an exercise in pure strategy, with the action sequences excised.


But sadly, the strategy aspect is as inscrutable as a Zen koan. While Rocket Ranger presents with elegance and grace a simple strategy game that would be immensely entertaining if it weren't always kicking your ass, Lords of the Rising Sun is just baffling. You're expected to move your armies over a map of Japan, recruiting allies where possible, fighting battles to subdue enemies where not. Yet it's all but impossible to divine any real sense of the overall situation from the display. This would-be strategy game ends up feeling more random than anything else, as you watch your banners wander around seemingly of their own volition, bumping occasionally into other banners that may represent enemies or friends. It suffers mightily from a lack of clear status displays, making it really, really hard to keep track of who wants to do what to whom. If you have the mini-games turned on, the bird's-eye view is broken up by arcade sequences that are at least as awkward as the strategy game. In the end, Lords of the Rising Sun is just no fun at all.

While it's very pretty, Lords of the Rising Sun's animated, scrolling map is nicer to look at than it is a practical tool for strategizing.

Press and public alike were notably unkind to Lords of the Rising Sun. Claims like Bob Jacob’s that “there is more animation in Lords than has ever been done in any computer game” — a claim as unquantifiable as it was dubious, especially in light of some of Sierra’s recent efforts — did nothing to shake Cinemaware’s reputation for being all sizzle, no steak. Ken St. Andre of Tunnels & Trolls and Wasteland fame, reviewing the game for Questbusters magazine, took Cinemaware to task on its every aspect, beginning with the excruciating picture on the box of a cowering maiden about to fall out of her kimono; he deemed it “an insult to women everywhere and to Japanese culture in particular.” (Such a criticism sounds particularly forceful coming from St. Andre; Wasteland with its herpes-infested prostitutes and all the rest is hardly a bastion of political correctness.) He concluded his review with a zinger so good I wish I’d thought of it: he called the game “a Japanese Noh play.”

Many other reviewers, while less boldly critical, seemed nonplussed by the whole experience — a very understandable reaction to the strategy game’s vagaries. Sales were disappointing in comparison to those of earlier interactive movies, and the game has gone down in history alongside the equally underwhelming S.D.I. as perhaps the least remembered of all the Cinemaware titles.


So, what with the game-play criticisms beginning to affect the bottom line, Cinemaware really needed to deliver something special for their second game of 1989. Thankfully, It Came from the Desert would prove to be the point where they finally got this interactive-movie thing right, delivering at long last a game as nice to play as it is to look at.


It Came from the Desert was the first of the interactive movies not to grow from a seed of an idea planted by Bob Jacob himself. Its originator was instead David Riordan, a newcomer to the Cinemaware fold with an interesting career in entertainment already behind him. As a very young man, he'd made a go of it in rock music, enjoying his biggest success in 1970 with a song called "Green-Eyed Lady," a #3 hit he co-wrote for the (briefly) popular psychedelic band Sugarloaf. That song remains a perennial on Boomer radio to this day, and its royalties doubtless went a long way toward letting him explore his other creative passions after his music career wound down. He worked in movies for a while, and then worked with MIT on a project exploring the interactive potential of laser discs. After that, he worked briefly for Lucasfilm Games during their heady early days with Peter Langston at the helm. And from there, he moved on to Atari, where he worked on laser-disc-driven stand-up arcade games until it became obvious that Dragon's Lair and its spawn had been the flashiest of flashes in the pan.

David Riordan on the job at Cinemaware.

Riordan's résumé points to a clear interest in blending cinematic approaches with interactivity. It thus comes as little surprise that he was immediately entranced when he first saw Defender of the Crown one day at his brother-in-law's house. It had, he says, "all the movie attributes and approaches that I had been trying to get George Lucas interested in" while still with Lucasfilm. He wrote to Cinemaware, striking up a friendship with Bob Jacob that led him to join the company in 1988. Seeing in Riordan a man who very much shared his own vision for Cinemaware, Jacob relinquished a good deal of the creative control onto which he had heretofore held so tightly. Riordan was placed in charge of the company's new "Interactive Entertainment Group," which was envisioned as a production line for cranking out new interactive movies of far greater sophistication than those Cinemaware had made to date. These latest and greatest efforts were to be made available on a whole host of platforms, from their traditional bread and butter the Amiga to the much-vaunted CD-based platforms now in the offing from a number of hardware manufacturers. If all went well, It Came from the Desert would mark the beginning of a whole new era for Cinemaware.

Here we can see — just barely; sorry for this picture's terrible fidelity — Cinemaware's interactive-movie scripting tool, which they dubbed MasterPlan, running in HyperCard.

Cinemaware spent months building the technology that would allow them to make It Came from the Desert. Riordan's agenda can best be described as a desire to free game design from the tyranny of programmers. If this new medium was to advance sufficiently to tell really good, interesting interactive stories, he reasoned, its tools would have to become something that non-coding "real" writers could successfully grapple with. Continuing to advance Cinemaware's movie metaphors, his team developed a game engine that could largely be "scripted" in point-and-click fashion in HyperCard rather than needing to be programmed in any conventional sense. Major changes to the structure of a game could be made without ever needing to write a line of code, simply by editing the master plan of the game in a HyperCard tool Cinemaware called, appropriately enough, MasterPlan. The development process leveraged the best attributes of a number of rival platforms: Amigas ran the peerless Deluxe Paint for the creation of art; Macs ran HyperCard for the high-level planning; fast IBM clones served as the plumbing of the operation, churning through compilations and compressions. It was by anyone's standards an impressive collection of technology — so impressive that the British magazine ACE, after visiting a dozen or more studios on a sort of grand tour of the American games industry, declared Cinemaware's development system the most advanced of them all. Cinemaware had come a long way from the days of Defender of the Crown, whose development process had consisted principally of locking programmer R.J. Mical into his office with a single Amiga and a bunch of art and music and not letting him out again until he had a game. "If we ever get a real computer movie," ACE concluded, "this is where it's going to come from."
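The underlying idea, separating a game's structure from the engine that runs it, is easy to illustrate. Here is a toy sketch of that division of labor in Python, with scene names borrowed from the game's setting; this is my own illustration of the principle, emphatically not Cinemaware's actual HyperCard format.

# The game's structure lives in plain data that a writer can edit;
# the generic engine below merely interprets it.
SCENES = {
    "cabin": {
        "text": "Morning at your cabin outside Lizard Breath.",
        "choices": {"drive into town": "town", "hike to the meteor site": "crater"},
    },
    "town": {
        "text": "Main Street. The mayor doesn't believe a word you say.",
        "choices": {"head back to the cabin": "cabin"},
    },
    "crater": {
        "text": "Something very large has been digging here.",
        "choices": {"hurry into town": "town"},
    },
}

def run(scenes, start="cabin"):
    current = start
    while True:
        scene = scenes[current]
        print("\n" + scene["text"])
        options = list(scene["choices"])
        for i, label in enumerate(options, 1):
            print(f"  {i}. {label}")
        pick = input("> ")
        if not pick.isdigit() or not 1 <= int(pick) <= len(options):
            break  # anything else quits
        current = scene["choices"][options[int(pick) - 1]]

run(SCENES)

Restructuring the game means editing SCENES, not run(); that, in miniature, is the promise Riordan was chasing.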


While it’s debatable whether It Came from the Desert quite rises to that standard, it certainly is Cinemaware’s most earnest and successful attempt at crafting a true interactive narrative since King of Chicago. The premise is right in their usual B-movie wheelhouse. Based loosely on the campy 1950s classic Them!, the game takes place in a small desert town with the charming appellation of Lizard Breath that’s beset by an alarming number of giant radioactive ants, product of a recent meteor strike. You play a geologist in town; “the most interesting rocks always end up in the least interesting places,” notes the introduction wryly. Beginning in your cabin, you can move about the town and its surroundings as you will, interacting with its colorful cast of inhabitants via simple multiple-choice dialogs and getting into scrapes of various sorts which lead to the expected Cinemaware action sequences. Your first priority is largely to convince the townies that they have a problem in the first place; this task you can accomplish by collecting enough evidence of the threat to finally gain the attention of the rather stupefyingly stupid mayor. Get that far, and you’ll be placed in charge of the town’s overall defense, at which point a strategic aspect joins the blend of action and adventure to create a heady brew indeed. Your ultimate goal, which you have just fifteen days in total to accomplish, is to find the ants’ main nest and kill the queen.

It Came from the Desert excels in all the ways that most of Cinemaware’s interactive movies excel. The graphics and sound were absolutely spectacular in their day, and still serve very well today; you can well-nigh taste the gritty desert winds. What makes it a standout in the Cinemaware catalog, however, is the unusual amount of attention that’s been paid to the design — to your experience as the player. A heavily plot-driven game like this could and usually did go only one way in the 1980s. You probably know what I’m picturing: a long string of choke points requiring you to be in just the right place at just the right time to avoid being locked out of victory. Thankfully, It Came from the Desert steers well away from that approach. The plot is a dynamic thing rolling relentlessly onward, but your allies in the town are not entirely without agency of their own. If you fail to accomplish something, someone else might just help you out — perhaps not as quickly or efficiently as one might ideally wish, but at least you still feel you have a shot.

And even without the townies’ help, there are lots of ways to accomplish almost everything you need to. The environment as a whole is remarkably dynamic, far from the static set of puzzle pieces so typical of more traditional adventure games of this era and our own. There’s a lot going on under the hood in this one, far more than Cinemaware’s previous games would ever lead one to expect. Over the course of the fifteen days, the town’s inhabitants go from utterly unconcerned about the strange critters out there in the desert to full-on, backs-against-the-wall, fight-or-flight panic mode. By the end, when the ants are roaming at will through the rubble that once was Lizard Breath destroying anything and anyone in their path, the mood feels far more apocalyptic than that of any number of would-be “epic” games. One need only contrast the frantic mood at the end of the game with the dry, sarcastic tone of the beginning — appropriate to an academic stranded in a podunk town — to realize that one really does go on a narrative journey over the few hours it takes to play.

Which brings me to another remarkable thing: you can’t die in It Came from the Desert. If you lose at one of the action games, you wake up in the hospital, where you have the option of spending some precious time recuperating or trying to escape in shorter order via another mini-game. (No, I have no idea why a town the size of Lizard Breath should have a hospital.) In making sure that no individual challenge or decision becomes an all-or-nothing proposition, It Came from the Desert leaves room for the sort of improvisational derring-do that turns a play-through into a memorable, organic story. It’s not precisely that knowledge of past lives isn’t required; you’re almost certain to need several tries to finally save Lizard Breath. Yet each time you play you get to live a complete story, even if it is one that ends badly. Meanwhile you’re learning the lay of the land, learning to play more efficiently, and getting steadily better at the action games, which are themselves unusually varied and satisfying by Cinemaware’s often dodgy standards. There are not just many ways to lose It Came from the Desert but also many paths to victory. Win or lose, your story in It Came from the Desert is your story; you get to own it. There’s a save-game feature, but I don’t recommend that you use it except as a bookmark when you really do need to do something else for a while. Otherwise just play along and let the chips fall where they may. At last, here we have a Cinemaware interactive movie that’s neither too easy nor too hard; this one is just right, challenging but not insurmountable.


It Came from the Desert evolves into a strategy game among other things, as you deploy the town’s forces to battle each new ant infestation while you continue the search for the main hive.

Widely and justifiably regarded among the old-school Amiga cognoscenti of today as Cinemaware’s finest hour, It Came from the Desert was clearly seen as something special within Cinemaware as well back in the day; one only has to glance at contemporary comments from those who worked on the game to sense their pride and excitement. There was a sense both inside and outside their offices that Cinemaware was finally beginning to crack a nut they’d been gnawing on for quite some time. Even Ken St. Andre was happy this time. “Cinemaware’s large creative team has managed to do a lot of things very well indeed in this game,” he wrote, “and as a result they have produced a game that looks great, sounds great, moves along at a rapid pace, is filled with off-the-wall humor without being dumb, and is occasionally both gripping and exciting.”

When It Came from the Desert proved a big commercial success, Cinemaware pulled together some ideas that had been left out of the original game due to space constraints, combined them with a plot involving the discovery of a second ant queen, and made it all into a sequel subtitled Ant-Heads!. Released at a relatively low price only as an add-on for the original game — thus foreshadowing a practice that would get more and more popular as the 1990s wore on — Ant-Heads! was essentially a new MasterPlan script that utilized the art and music assets from the original game, a fine demonstration of the power of Cinemaware’s new development system. It upped the difficulty a bit by straitening the time limit from fifteen days to ten, but otherwise played much like the original — which, considering how strong said original had been, suited most people just fine.

It Came from the Desert, along with the suite of tools used to create it, might very well have marked the start of exactly the new era of more sophisticated Cinemaware interactive movies that David Riordan had intended it to. As things shook out, however, it would have more to do with endings than beginnings. Cinemaware would manage just one more of these big productions before being undone by bad decisions, bad luck, and a changing marketplace. We’ll finish up with the story of their visionary if so often flawed games soon. In the meantime, by all means go play It Came from the Desert if time and motivation allow. I was frankly surprised at how well it still held up when I tackled it recently, and I think it just might surprise you as well.

(Sources: The One from April 1989, June 1989, and June 1990; ACE from April 1990; Commodore Magazine from November 1988; Questbusters from September 1989, February 1990, and May 1990; Matt Barton’s interview with Bob Jacob on Gamasutra.)

 
 


The Manhole


Because the CD-ROM version of The Manhole sold in relatively small numbers in comparison to the original floppy version, the late Russell Lieblich’s surprisingly varied original soundtrack is too seldom heard today. So, in the best tradition of multimedia computing (still a very new and sexy idea in the time about which I’m writing), feel free to listen while you read.




Were HyperCard “merely” the essential bridge between Ted Nelson’s Xanadu fantasy and the modern World Wide Web, it would stand as one of the most important pieces of software of the 1980s. But, improbably, HyperCard was even more than that. It’s easy to get so dazzled by its early implementation of hypertext that one loses track entirely of the other part of Bill Atkinson’s vision for the environment. True to the Macintosh, “the computer for the rest of us,” Atkinson designed HyperCard as a sort of computerized erector set for everyday users who might not care a whit about hypertext for its own sake. With HyperCard, he hoped, “a whole new body of people who have creative ideas but aren’t programmers will be able to express their ideas or expertise in certain subjects.”

He made good on that goal. An incredibly diverse group of people worked with HyperCard, a group in which traditional hackers were very much the minority. Danny Goodman, the man who became known as the world’s foremost authority on HyperCard programming, was actually a journalist whose earlier experiences with programming had been limited to a few dabblings in BASIC. In my earlier article about hypertext and HyperCard, I wrote how “a professor of music converted his entire Music Appreciation 101 course into a stack.” Well, readers, I meant that literally. He did it himself. Industry analyst and HyperCard zealot Jan Lewis:

You can do things with it [HyperCard] immediately. And you can do sexy things: graphics, animation, sound. You can do it without knowing how to program. You get immediate feedback; you can make a change and see or hear it immediately. And as you go up on the learning curve — let’s say you learn how to use HyperTalk [the bundled scripting language] — again, you can make changes easily and simply and get immediate feedback. It just feels good. It’s fun!

And yet HyperCard most definitely wasn’t a toy. People could and did make great, innovative, commercial-quality software using it. Nowhere is the power of HyperCard — a cultural as well as a technical power — illustrated more plainly than in the early careers of Rand and Robyn Miller.


Rand and Robyn had a very unusual upbringing. The first and third of the four sons of a wandering non-denominational preacher, they spent their childhoods moving wherever their father’s calling took him: from Dallas to Albuquerque, from Hawaii to Haiti to Spokane. They were a classic pairing of left brain and right brain. Rand had taken to computers from the instant he was introduced to them via a big time-shared system whilst still in junior high, and had made programming them into his career. By 1987, the year HyperCard dropped, he was to all appearances settled in life: 28 years old, married with children, living in a small town in East Texas, working for a bank as a programmer, and nurturing a love for the Apple Macintosh (he’d purchased his first Mac within days of the machine’s release back in 1984). He liked to read books on science. His brother Robyn, seven years his junior, was still trying to figure out what to do with his life. He was attending the University of Washington in somewhat desultory fashion as an alleged anthropology major, but devoted most of his energy to drawing pictures and playing the guitar. He liked to read adventure novels.

HyperCard struck Rand Miller, as it did so many, with all the force of a revelation. While he was an accomplished enough programmer to make a living at it, he wasn’t one who particularly enjoyed the detail work that went with the trade. “There are a lot of people who love digging down into the esoterics of compilers and C++, getting down and dirty with typed variables and all that stuff,” he says. “I wanted a quick return on investment. I just wanted to get things done.” HyperCard offered the chance to “get things done” dramatically faster and more easily than any programming environment he had ever seen. He became an immediate convert.


With two small girls of his own, Rand felt keenly the lack of quality children’s software for the Macintosh. He hit upon the idea of making a sort of interactive storybook using HyperCard, a very natural application for a hypertext tool. Lacking the artistic talent to make a go of the pictures, he thought of his little brother Robyn. The two men, so far apart in years and geography and living such different lives, weren’t really all that close. Nevertheless, Rand had a premonition that Robyn would be the perfect partner for his interactive storybook.

But Robyn, who had never owned a computer and had never had any interest in doing so, wasn’t immediately enticed by the idea of becoming a software developer. Getting him just to consider the idea took quite a number of letters and phone calls. At last, however, Robyn made his way down to the Macintosh his parents kept in the basement of the family home in Spokane and loaded up the copy of HyperCard his brother had sent him. There, like so many others, he was seduced by Bill Atkinson’s creation. He started playing around, just to see what he could make. What he made right away became something very different from the interactive storybook, complete with text and metaphorical pages, that Rand had envisioned. Robyn:

I started drawing this picture of a manhole — I don’t even know why. You clicked on it and the manhole cover would slide off. Then I made an animation of a vine growing out. The vine was huge, “Jack and the Beanstalk”-style. And then I didn’t want to turn the page. I wanted to be able to navigate up the vine, or go down into the manhole. I started creating a navigable world by using the very simple tools [of HyperCard]. I created this place.  I improvised my way through this world, creating one thing after another. Pretty soon I was creating little canals, and a forest with stars. I was inventing it as I went. And that’s how the world was born.

For his part, Rand had no problem accepting the change in approach:

Immediately you are enticed to explore instead of turning the page. Nobody sees a hole in the ground leading downward and a vine growing upward and in the distance a fire hydrant that says, “Touch me,” and wants to turn the page. You want to see what those things are. Instead of drawing the next page [when the player clicked a hotspot], he [Robyn] drew a picture that was closer — down in the manhole or above on the vine. It was kind of a stream of consciousness, but it became a place instead of a book. He started sending me these images, and I started connecting them, trying to make them work, make them interactive.


In this fashion, they built the world of The Manhole together: Robyn pulling its elements from the flotsam and jetsam of his consciousness and drawing them on the screen, Rand binding it all together into a contiguous place, and adding sound effects and voice snippets here and there. If they had tried to make a real game of the thing, with puzzles and goals, such a non-designed approach to design would likely have gone badly wrong in a hurry.

Luckily, puzzles and goals were never the point of The Manhole. It was intended always as just an endlessly interesting space to explore. As such, it would prove capable of captivating children and the proverbial young at heart for hours, full as it was of secrets and Easter eggs hidden in the craziest of places. One can play with The Manhole on and off for literally years, and still continue to stumble upon the occasional new thing. Interactions are often unexpected, and unexpectedly delightful. Hop in a rowboat to take a little ride and you might emerge in a rabbit’s teacup. Start watching a dragon’s television — Why does a dragon have a television? Who knows! — and you can teleport yourself into the image shown on the screen to emerge at the top of the world. Search long enough, and you might just discover a working piano you can actually play. The spirit of the thing is perhaps best conveyed by the five books you find inside the friendly rabbit’s home: Alice in Wonderland; The Wind in the Willows; The Lion, the Witch, and the Wardrobe; Winnie the Pooh; and Metaphors of Intercultural Philosophy (“This book isn’t about anything!”). Like all of those books excepting, presumably, the last, The Manhole is pretty wonderful, a perfect blend of sweet cuteness and tart whimsy.


With no contacts whatsoever within the Macintosh software industry, the brothers decided to publish The Manhole themselves via a tiny advertisement in the back of Macworld magazine, taken out under the auspices of Prolog, a consulting company Rand had founded as a moonlighting venture some time before. They rented a tiny booth to show The Manhole publicly for the first time at the Hyper Expo in San Francisco in June of 1988. (Yes, HyperCard mania had gotten so intense that there were entire trade shows dedicated just to it.) There they were delighted to receive a visit from none other than HyperCard’s creator Bill Atkinson, with his daughter Laura in tow; not yet five years old, she had no trouble navigating through their little world. Incredibly, Robyn had never even heard the word “hypertext” prior to the show, had no idea about the decades of theory that underpinned the program he had used, savant-like, to create The Manhole. When he met a band of Ted Nelson’s disgruntled Xanadu disciples on the show floor, come to crash the HyperCard party, he had no idea what they were on about.

But the brothers’ most important Hyper Expo encounter was a meeting with Richard Lehrberg, Vice President for Product Development at Mediagenic,[1] who took a copy of The Manhole away with him for evaluation. Lehrberg showed it to William Volk, whom he had just hired away from the small Macintosh and Amiga publisher Aegis to become Mediagenic’s head of technology; he described it to Volk unenthusiastically as “this little HyperCard thing” done by “two guys in Texas.” Volk was much more impressed. He was immediately intrigued by one aspect of The Manhole in particular: the way that it used no buttons or conventional user-interface elements at all. Instead, the pictures themselves were the interface; you could just click where you would and see what happened. It was perhaps a product of Robyn Miller’s sheer naïveté as much as anything else; seasoned computer people, so used to conventional interface paradigms, just didn’t think like that. But regardless of where it came from, Volk thought it was genius, a breaking down of a wall that had heretofore always separated the user from the virtual world. Volk:

The Miller brothers had come up with what I call the invisible interface. They had gotten rid of the idea of navigation buttons, which was what everyone was doing: go forward, go backward, turn right, turn left. They had made the scenes themselves the interface. You’re looking at a fire hydrant. You click on the fire hydrant; the fire hydrant sprays water. You click on the fire hydrant again; you zoom in to the fire hydrant, and there’s a little door on the fire hydrant. That was completely new.

Of course, other games did have you clicking “into” their world to make things happen; the point-and-click adventure genre was evolving rapidly during this period to replace the older parser-driven adventure games. But even games like Déjà Vu and Maniac Mansion, brilliantly innovative though they were, still surrounded their windows into their worlds with a clutter of “verb” buttons, legacies of the genre’s parser-driven roots. The Manhole, however, presented the player with nothing but its world. What with its defiantly non-Euclidean — not to say nonsensical — representation of space and its lack of goals and puzzles, The Manhole wasn’t a conventional adventure game by any stretch. Nevertheless, it pointed the way to what the genre would become, not least in the later works of the Miller brothers themselves.
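To illustrate what Volk means, here is a tiny sketch of the idea in Python. None of this is Mediagenic’s or the Millers’ actual code; the scene, the coordinates, and the messages are all invented. The essence is simply that a click gets hit-tested against regions of the current picture, front to back, with no buttons or verb menus anywhere in sight.

```python
from typing import Callable, NamedTuple

class Hotspot(NamedTuple):
    rect: tuple[int, int, int, int]   # (left, top, right, bottom) in pixels
    action: Callable[[], str]         # what happens when this region is clicked

# One scene of an imaginary Manhole-like world. The entire picture is "live";
# regions are listed front to back, so small details sit before the larger
# shapes that contain them.
FIRE_HYDRANT_SCENE = [
    Hotspot((140, 150, 180, 190), lambda: "You zoom in: there is a little door."),
    Hotspot((120, 80, 260, 220), lambda: "The fire hydrant sprays water."),
]

def on_click(scene: list[Hotspot], x: int, y: int) -> str:
    """The invisible interface: the first region containing the click wins."""
    for spot in scene:
        left, top, right, bottom = spot.rect
        if left <= x <= right and top <= y <= bottom:
            return spot.action()
    return "Nothing happens."

print(on_click(FIRE_HYDRANT_SCENE, 160, 170))  # lands on the little door
print(on_click(FIRE_HYDRANT_SCENE, 130, 100))  # lands on the hydrant itself
```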

Much of Volk’s working life for the next two years would be spent on The Manhole, by the end of which period he would quite possibly be more familiar with its many nooks and crannies than its own creators were. He became The Manhole‘s champion inside Mediagenic, convincing his colleagues to publish it, thereby bringing it to a far wider audience than the Miller brothers could ever have reached on their own. Released by Mediagenic under their Activision imprint, it became a hit by the modest standards of the Macintosh consumer-software market. Macworld magazine named The Manhole the winner of their “Wild Card” category in a feature article on the best HyperCard stacks, while the Software Publishers Association gave it an “Excellence in Software” award for “Best New Use of a Computer.”


Well aware that The Manhole was acquiring a certain computer-chic cachet, Mediagenic/Activision didn’t hesitate to play that angle up in their advertising.

Had that been the end of it, The Manhole would remain historically interesting as both a delightful little curiosity of its era and as the starting point of the hugely significant game-development careers of the Miller brothers. Yet there’s more to the story.

William Volk, frustrated with the endless delays of CD-I and the state of paralysis the entire industry was in when it came to the idea of publishing entertainment software on CD, had been looking for some time for a way to break the logjam. It was Stewart Alsop, an influential tech journalist, who first suggested to Volk that the answer to his dilemma was already part of Mediagenic’s catalog — that The Manhole would be perfect for CD-ROM. Volk was just the person to see such a project through, having already experimented extensively with CD-ROM and CD-I at Aegis as well as at Mediagenic. With the permission of the Miller brothers, he recruited Russell Lieblich, Mediagenic’s longstanding guru in all things music- and sound-related, to compose and perform a soundtrack for The Manhole which would play from the CD as the player explored.

An important difference separates the way the music worked in the CD-ROM version of The Manhole from the way it worked in virtually all computer games to appear before it. The occasional brief digitized snippet aside, music in computer games had always been generated on the computer, whether by sound chips like the Commodore 64’s famous SID or entire sound boards like the top-of-its-class Roland MT-32 (we shall endeavor to forget the horrid beeps and squawks that issued from the IBM PC and Apple II’s native sound hardware). But The Manhole‘s music, while having been originally generated entirely or almost entirely on computers in Lieblich’s studio, was then recorded onto CD for digital playback, just like a song on a music CD. This method, made possible only by evolving computer sound hardware and, most importantly, by the huge storage capacity of a CD-ROM, would in the years to come slowly become simply the way that computer-game music was done. Today many big-budget titles hire entire orchestras to record soundtracks as elaborate and ambitious as the ones found in big Hollywood feature films, whilst also including digitized recordings of voices, squealing tires, explosions, and all the inevitable rest. In fact, surprisingly little of the sound present in most modern games is synthesized sound, a situation that has long since relegated elaborate setups like the Roland MT-32 to the status of white elephants; just pipe your digitized recording through a digital-to-analog converter and be done with it already.
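To make the contrast concrete, consider this minimal sketch. It has nothing to do with Lieblich’s or Mediagenic’s actual tools; the note data and file name are invented. The old way, the program computes every sample of the music from compact note data at run time; the new way, the music already exists as recorded samples, and the program’s only job is to keep shoveling chunks of them from storage to the sound hardware.

```python
import math
import struct
import wave

SAMPLE_RATE = 22050  # a plausible playback rate for late-1980s hardware

def synthesize(notes, path):
    """Old approach in miniature: generate every sample from note data,
    here given as (frequency in Hz, duration in seconds) pairs."""
    with wave.open(path, "wb") as out:
        out.setnchannels(1)
        out.setsampwidth(2)              # 16-bit signed samples
        out.setframerate(SAMPLE_RATE)
        frames = bytearray()
        for freq, seconds in notes:
            for i in range(int(seconds * SAMPLE_RATE)):
                sample = int(12000 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
                frames += struct.pack("<h", sample)
        out.writeframes(bytes(frames))

def stream(path, play_chunk):
    """New approach: the music is already recorded; the program's job
    reduces to keeping the playback buffer fed without gaps."""
    with wave.open(path, "rb") as src:
        while True:
            chunk = src.readframes(4096)   # one buffer's worth of PCM
            if not chunk:
                break
            play_chunk(chunk)              # hand off to the audio hardware

# Write a three-note jingle the "old" way, then "stream" it back.
synthesize([(440, 0.2), (554, 0.2), (659, 0.4)], "jingle.wav")
stream("jingle.wav", play_chunk=lambda pcm: None)  # stub standing in for a DAC
```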

As the very first title to go all digitized all the time, The Manhole didn’t have a particularly easy time of it; getting the music to play without breaking up or stuttering as the player explored presented a huge challenge on the Macintosh, a machine whose minimalist design burdened the CPU with all of the work of sound generation. However, Volk and his colleagues got it going at last. Published in the spring of 1989, the CD-ROM version of The Manhole marked a major landmark in the history of computing, the first American game — or, at least, software toy (another big buzzword of the age, as it happens) — to be released on CD-ROM.[2] Volk, infuriated with Philips for the chaos and confusion CD-I’s endless delays had wrought in an industry he believed was crying out for the limitless vistas of optical storage, sent them a copy of The Manhole along with a curt note: “See! We did it! We’re tired of waiting!”

And they weren’t done yet. Having gotten The Manhole working on CD-ROM on the Macintosh, Volk and his colleagues at Mediagenic next tackled the daunting task of porting it to the most popular platform for consumer software, MS-DOS — a platform without HyperCard. To address this lack, Mediagenic developed a custom engine for CD-ROM titles on MS-DOS, dubbing it the Multimedia Applications Development Environment, or MADE.[3] Mediagenic’s in-house team of artists redrew Robyn Miller’s original black-and-white illustrations in color, and The Manhole on CD-ROM for MS-DOS shipped in 1990.


In my opinion, The Manhole lost some of its unique charm when it was colorized for MS-DOS. The VGA graphics, impressive in their day, look just a bit garish and overdone today in comparison to the classic pen-and-ink style of the original.

The Manhole, idiosyncratic piece of artsy children’s software that it was, could hardly have been expected to break the industry’s optical logjam all on its own. In the end, one has to acknowledge that its CD-ROM incarnation was little more than the floppy version with a soundtrack playing in the background — a nice addition certainly, but perhaps not quite the transformative experience which all of the rhetoric surrounding CD-ROM’s potential might have led one to expect. It would take another few excruciating years for a CD-ROM drive to become a must-have accessory for everyday American computers. Yet every revolution has to start somewhere, and William Volk deserves his full measure of credit for doing what he could to push this one forward in the only way that could ultimately matter: by stepping up and delivering a real, tangible product at long last. As Steve Jobs used to say, “Real artists ship.”

The importance of The Manhole, existing as it does right there at the locus of so much that was new and important in computing in the late 1980s, can be read in so many ways that there’s always a danger of losing some of them in the shuffle. But it should never be forgotten whilst trying to sort through the tangle that this astonishingly creative little world was principally designed by someone who had barely touched a computer in his life before he sat down with HyperCard. That he wound up with something so fascinating is a huge tribute not just to Robyn Miller and his enabling brother Rand, but also to Bill Atkinson’s HyperCard itself. Apple has long since abandoned HyperCard, and we enjoy no precise equivalent to it today. Indeed, its vision of intuitive, non-pretentious, fun programming is one that we’re in danger of losing altogether. Being one who loves the computer most of all as the most exciting tool for creation ever invented, I can’t help but see that as a horrible shame.

The Miller brothers had, as most of you reading this probably know, a far longer future in front of them than HyperCard would get to enjoy. Already well before 1988 was through they had rechristened themselves Cyan Productions, a name that felt much more appropriate for a creative development house than the businesslike Prolog. As Cyan, they made two more pieces of children’s software, Cosmic Osmo and the Worlds Beyond the Mackerel and Spelunx and the Caves of Mr. Seudo. Both were once again made using HyperCard, and both were very much made in the spirit of The Manhole. And like The Manhole both were published on CD-ROM as well as floppy disk; the Miller brothers, having learned much from Mediagenic’s process of moving their first title to CD-ROM, handled the CD-ROM as well as the floppy versions themselves when it came to these later efforts. Opinions are somewhat divided on whether the two later Cyan children’s titles fully recapture the magic that has led so many adults and children alike over the years to spend so much time plumbing the depths of The Manhole. None, however, can argue with the significance of what came next, the Miller brothers’ graduation to games for adults — and, as it happens, another huge milestone in the slow-motion CD-ROM revolution. But that story, like so many others, is one that we’ll have to tell at another time.

(Sources: Amstrad Action of January 1990; Macworld of July 1988, October 1988, November 1988, March 1989, April 1989, and December 1989; Wired of August 1994 and October 1999; The New York Times of November 28, 1989. Also the books Myst and Riven: The World of the D’ni by Mark J.P. Wolf and Prima’s Official Strategy Guide: Myst by Rick Barba and Rusel DeMaria, and the Computer Chronicles television episodes entitled “HyperCard,” “MacWorld Special 1988,” “HyperCard Update,” and “Hypertext.” Online sources include Robyn Miller’s Myst postmortem from the 2013 Game Developers Conference; Richard Moss’s Ludiphilia podcast; and a blog post by Robyn Miller. Finally, my huge thanks to William Volk for sharing his memories and impressions with me in an interview and for sending me an original copy of The Manhole on CD-ROM for my research.

The original floppy-disk-based version of The Manhole can be played online at archive.org. The Manhole: Masterpiece Edition, a remake supervised by the Miller brothers in 1994 which sports much-improved graphics and sound, is available for purchase on Steam.)

Footnotes
1 Activision was renamed Mediagenic at almost the very instant that Lehrberg first met the Miller brothers. When the name change was greeted with universal derision, Activision/Mediagenic CEO Bruce Davis quickly began backpedaling on his hasty decision. The Manhole, for instance, was released by Mediagenic under their “Activision” label — which was odd because under the new ordering said label was supposed to be reserved for games, and The Manhole was considered children’s software, not a traditional game. I just stick with the name “Mediagenic” in this article as the least confusing way to address a confusing situation.
2 The first CD-based software to reach European consumers says worlds about the differences that persisted between American and European computing, and about the sheer can-do ingenuity that so often allowed British programmers in particular to squeeze every last ounce of potential out of hardware that was usually significantly inferior to that enjoyed by their American counterparts. Codemasters, a budget software house based in Warwickshire, came up with a very unique shovelware package for the 1989 Christmas season. They transferred thirty old games from cassette to a conventional audio CD, which they then sold along with a special cable to run the output from an ordinary music-CD player into a Sinclair or Amstrad home computer. “Here’s your CD-ROM,” they said. “Have a ball.” By all accounts, Codemasters’s self-proclaimed “CD revolution,” kind of hilarious and kind of brilliant, did quite well for them. When it came to doing more with less in computing, you never could beat the Brits.
3 MADE’s scripting language was to some extent based on AdvSys, a language for amateur text-adventure creation that never quite took off like the contemporaneous AGT.
 
 
