What’s in a Subtitle?

Sharp-eyed readers may have already noticed that I’ve changed the subtitle of this blog from “a history of computer entertainment” to “a history of computer entertainment and digital culture.” This is not so much indicative of any change in focus as it is a better description of what this blog has always been. I’ve always made space for aspects of what we might call “creative computing” that aren’t games, from electronic literature to the home-computer wars, from the birth of hypertext to early online culture, from influential science fiction to important developments in programming, and that will of course continue.

That is all. Carry on.


Ebooks and Future Plans

I’m afraid I don’t have a standard article for you this week. I occasionally need to skip a Friday to store up an independent writer’s version of vacation time, and the beginning of a five-Friday month like this one is a good time to do that. That said, this does make a good chance to give you some updates on the latest goings-on here at Digital Antiquarian World Headquarters, and to solicit some feedback on a couple of things that have been on my mind of late. So, let me do that today, and I’ll be back with the usual fare next Friday. (Patreon supporters: don’t worry, this meta-article’s a freebie!)

First and foremost, I’m pleased to be able to release the latest volume of the growing ebook collection compiling the articles on this site, this one centering roughly — even more roughly than usual, in fact — on 1991. Volume 13 has been a long time coming because the last year has brought with it a lot of longer, somewhat digressive series on topics like Soviet computing and the battle over Tetris, the metamorphosis of Imagine Software into Psygnosis, the world of pre-World Wide Web commercial online services, and of course my recently concluded close reading of Civilization, along with the usual singletons on individual games and related topics. This ebook is by far the fattest one yet, and I think it contains some of the best work I’ve ever done; these are certainly, at any rate, some of the articles I’ve poured the most effort into. As usual, it exists only thanks to the efforts of Richard Lindner. He’s outdone himself this time, even providing fresh cover art to suit what he described to me as the newly “glamorous, visual” era of the 1990s. If you appreciate being able to read the blog in this way, feel free to send him a thank-you note at the email address listed on the title page of the ebook proper.

Next, I want to take this opportunity to clear up the current situation around Patreon, something I’ve neglected to do for an unconscionably long time. Many of you doubtless remember the chaos of last December, when Patreon suddenly announced changes to their financial model that would make a blog like this one, which relies mostly on small donations, much less tenable. I scrambled to find alternatives to Patreon for those who felt (justifiably) betrayed by the changes, and had just about settled on a service called Memberful when Patreon reversed course and went back to the old model after a couple of weeks of huge public outcry.

Despite sending some mixed messages in the weeks that followed that reversal, I haven’t ever implemented Memberful as an alternative funding model due to various nagging concerns: I’m worried about the tech-support issues that must come with a bespoke solution, not happy about being forced to sell monthly rather than per-article subscriptions (meaning I have to feel guilty if due to some emergency I can’t publish four articles in any given month), and concerned about the complication and confusion of offering two separate subscription models — plus PayPal! — as funding solutions (just writing a FAQ to explain it all would take a full day or two!). In addition, a hard look at the numbers reveals that a slightly higher percentage of most pledges would go to third parties when using Memberful than happens with Patreon. It’s for all these reasons that, after much agonized back-and-forthing, I’ve elected to stay the course with Patreon alone as my main funding mechanism, taking them at their word that they’ll never again do anything like what they did last December.

I do understand that some of you are less inclined to be forgiving, which is of course your right. For my part, even the shenanigans of last December weren’t quite enough to destroy the good will I have toward Patreon for literally changing my life by allowing me to justify devoting so much time and energy to this blog. (They were of course only the medium; I’m even more grateful to you readers!) At any rate, know that except for that one blip Patreon has always treated me very well, and that their processing fees are lower than I would pay using any other subscription service. And yeah, okay… maybe also keep your fingers crossed that I’ve made the right decision in giving them a second chance before I hit the panic button. Fool me once…

So, that’s where we stand with the Patreon situation, which can be summed up as sticking with the status quo for now.  But it’s not the only thing I’ve been a bit wishy-washy about lately…

As a certain recent ten-article series will testify, I fell hard down the Civilization rabbit hole when I first began to look at that game a year or so ago. I’ve spent quite some time staring at that Advances Chart, trying to decide what might be there for me as a writer. I’m very attracted to the idea of writing some wider-scale macro-history in addition to this ongoing micro-history of the games industry, as I am by the idea of writing said history in terms of achievement and (largely) peaceful progress as opposed to chronicles of wars and battles won and lost.  Still, I’ve struggled to figure out what form it all should take.

My first notion was to start a second blog. It would be called — again, no surprise here for readers of my Civilization articles! — The Narrative of Progress, and would be structured around an Advances Chart similar but not identical to the one in the Civilization box. (Intriguing as it is, the latter also has some notable oddities, such as its decision to make “Alphabet” and “Writing” into separate advances; how could you possibly have one without the other?) I even found a web developer who did some work on prototyping an interactive, dynamically growing Advances Chart with links to individual articles. But we couldn’t ever come up with anything that felt more intuitive and usable than a traditional table of contents, so I gave up on that idea. I was also concerned about whether I could possibly handle the research burden of so many disparate topics in science, technology, and sociology — a concern which the Civilization close reading, over the course of which I made a few embarrassing gaffes which you readers were kind enough to point out to me, has proved were justified.

But still I remain attracted to the idea of doing a different kind of history in addition to this gaming history. Lately, I’ve gravitated to the Wonders of the World. In fact, Civilization prompted my wife Dorte and me to take a trip to Cairo just a month ago — a crazy place, let me tell you! — to see the Pyramids, the Egyptian Museum, and other ancient sites. I think I could do a great job with these topics, as they’re right in my writerly wheelhouse of readable narrative history, and it would be hard to go wrong with stories as fascinating as these. Up until just a couple of weeks ago I had schemed about doing these kinds of stories on this site, but finally had to give up on that as well, judging it the wrong approach. I would have to set up a second Patreon anyway, as I couldn’t possibly expect people who signed up to support a “history of interactive entertainment” to support this other stuff as well, and running two Patreons and two parallel tracks out of a single WordPress blog would just be silly.

All of which is to say that I’m as undecided as ever about this stuff. I know I’d like to do some wider-frame historical writing at some point, almost certainly hosted at a different site, but I don’t know exactly when that will be or what form it will take. Would you be interested in reading such a thing? I’d be interested to hear your opinions and suggestions, whether in the comments below or via email.

Whatever happens, rest assured that I remain committed to this ongoing history as well; the worst that might result from a second writing project would be a somewhat slower pace here. I’m occasionally asked how far I intend to go with this history, and I’ve never had a perfect answer. A few years ago, I thought 1993’s Doom might be a good stopping place, as it marked the beginning of a dramatic shift in the culture of computer games. But the problem with that, I’ve come to realize, is that it did indeed only mark the beginning of a shift, and to stop there would be to leave countless threads dangling. These days, the end of the 1990s strikes me as a potential candidate, but we’ll see. At any rate, I don’t have plans for stopping anytime soon — not as long as you’re still willing to read and support this work. Who knows, maybe we’ll make it all the way to 2018 someday.

In the meantime, here’s a quick rundown of coming attractions for the historical year of 1992. (If you want to be completely surprised every week, skip this list!)

  • Jeff Tunnell’s hugely influential physics puzzler The Incredible Machine
  • the seminal platformer Another World, among other things a beautiful example of lyrical nonverbal storytelling
  • a series on the evolution of Microsoft Windows, encompassing the tangled story of OS/2, the legal battle with Apple over look-and-feel issues, and those Windows time-wasters, like Solitaire, Minesweeper, and Hearts, that became some of the most-played computer games in history
  • William Gibson’s experimental poem-that-destroys-itself Agrippa
  • Shades of Gray, an underappreciated literary statement in early amateur interactive fiction which came up already in my conversation with Judith Pintar, but deserves an article of its own
  • Legend’s two Gateway games
  • Indiana Jones and the Fate of Atlantis
  • Electronic Arts in the post-“rock-star” years, Trip Hawkins’s departure, and the formation of 3DO
  • The Lost Files of Sherlock Holmes, which might just be my all-time favorite Holmes game
  • Interplay’s two Star Trek graphic adventures
  • the adventures in Sierra’s Discovery line of games for children, which were better than most of their adult adventure games during this period
  • Quest for Glory III and IV
  • the strange story behind the two Dune games which were released back-to-back in 1992
  • Star Control II
  • Ultima Underworld and Ultima VII
  • Darklands

Along with all that, I’ve had a great suggestion from Casey Muratori — who, incidentally, was also responsible for my last article by first suggesting I take a closer look at Dynamix’s legacy in narrative games — to write something about good puzzles in adventure games. I’ve long been conscious of spending a lot more time describing bad puzzles in detail than I do good ones. The reason for this is simply that I hesitate to spoil the magic of the good puzzles for you, but feel far less reluctance with regard to the bad ones. Still, it does rather throw things out of balance, and perhaps I should do something about that. Following Casey’s suggestion, I’ve been thinking of an article describing ten or so good puzzles from classic games, analyzing how they work in detail and, most importantly, why they work.

That’s something on which I could use your feedback as well. When you think of the games I’ve written about so far on this blog, whether textual or graphical, is there a puzzle that immediately springs to mind as one that you just really, really loved for one reason or another? (For me, just for the record, that puzzle is the T-removing machine from Leather Goddesses of Phobos.) If so, feel free to send it my way along with a sentence or two telling me why, once again either in the comments below or via private email. I can’t promise I can get to all of them, but I’d like to assemble a reasonable selection of puzzles that delight for as many different reasons as possible.

Finally, please do remember that I depend on you for support in order to continue doing this work. If you enjoy and/or find something of value in what I do here, if you’re lucky enough to have disposable income, and if you haven’t yet taken the plunge, please do think about signing up as a Patreon supporter at whatever level strikes you as practical and warranted. I run what seems to be one of the last “clean” sites on the Internet — no advertisements, no SEO, no personal-data-mining, no “sponsored articles,” just the best content I can provide — but that means that I have to depend entirely upon you to keep it going. With your support, we can continue this journey together for years to come.

And with that, I’ll say thanks to all of you for being the best readers in the world and wish you a great weekend. See you next week with a proper article!


The Dynamic Interactive Narratives of Dynamix

By 1990, the world of adventure games was coming to orient itself around the twin poles of Sierra Online and LucasFilm Games. The former made a lot of games, on diverse subjects and of diverse quality, emphasizing always whatever new audiovisual flash was made possible by the very latest computing technology. The latter, on the other hand, made far fewer and far less diverse but more careful games, fostering a true designer’s culture that emphasized polish over flash. In their attitudes toward player-character death and dead ends, toward puzzle design, toward graphics style, each company had a distinct personality, and adventure-game fans lined up, as they continue to do even today, as partisans of one or the other.

Yet in the vast territory between these two poles were many other developers experimenting with the potential of adventure games, and in many cases exploring approaches quite different from either of the two starring players. One of the more interesting of these supporting players was the Oregon-based Dynamix, who made five adventure or vaguely adventure-like games between 1988 and 1991 — as many adventure-like games, in fact, as LucasFilm Games themselves managed to publish during the same period. Despite this relative prolificacy, Dynamix was never widely recognized as an important purveyor of adventures; they enjoyed their greatest fame in the very different realm of 3D vehicular simulations. There are, as we’ll see, some pretty good reasons for that to be the case; for all their surprisingly earnest engagement with interactive narrative, none of the five games in question managed to rise to the level of a classic. Still, they all are, to a one, interesting at the very least, which is a track record few other dabblers in the field of adventure games can lay claim to.


Arcticfox, Dynamix’s breakout hit, arrived when Electronic Arts was still nursing the remnants of Trip Hawkins’s original dream of game developers as rock stars, leading to lots of strange photos like this one. This early incarnation of Dynamix consisted of (from left to right) Kevin Ryan, Jeff Tunnell, Damon Slye, and Richard Hicks.

Like a number of other early software developers, Dynamix was born on the floor of a computer shop. The shop in question was The Computer Tutor of Eugene, Oregon, owned and operated in the early 1980s by a young computer fanatic named Jeff Tunnell (the last name is pronounced with the accent on the second syllable, like “Raquel” — not like “tunnel”). He longed to get in on the creative end of software, but had never had the patience to progress much beyond BASIC as a programmer in his own right. Then came the day when one of his regular customers, a University of Oregon undergraduate named Damon Slye, showed him a really hot Apple II action game he was working on. Tunnell knew opportunity when he saw it.

Thus was born a potent partnership, one not at all removed from the similar partnership that had led to MicroProse Software on the other coast. Jeff Tunnell was the Wild Bill Stealey of this pairing: ambitious, outgoing, ready and willing to tackle the business side of software development. Damon Slye was the Sid Meier: quiet, a little shy, but one hell of a game programmer.

Tunnell and Slye established their company in 1983, at the tail end of the Ziploc-bag era of software publishing, under the dismayingly generic name of The Software Entertainment Company. They started selling the game that had sparked the company, which Slye had now named Stellar 7, through mail order. The first-person shoot-em-up put the player in charge of a tank lumbering across the wire-frame surface of an enemy-infested planet. It wasn’t the most original creation in the world, owing a lot to the popular Atari quarter-muncher Battlezone, but 3D games of this sort were unusual on the Apple II, and this one was executed with considerable aplomb. A few favorable press notices led to it being picked up by Penguin Software in 1984, which in turn led to Tunnell selling The Computer Tutor in order to concentrate on his new venture. (Unbeknownst to Tunnell and Slye at the time, Stellar 7 was purchased and adored by Tom Clancy, author of one of the most talked-about books of the year. “It is so unforgiving,” he would later say. “It is just like life. It’s just perfect to play when I’m exercising. I get on my exercycle, start pedaling, pick up the joystick, and I’m off…”)

But the life of a small software developer just as the American home-computer industry was entering its first slump wasn’t an easy one. Tunnell signed contracts wherever he could find them to keep his head above water: releasing Sword of Kadash, an adventure/CRPG/platformer hybrid masterminded by another kid from The Computer Tutor named Chris Cole; writing a children’s doodler called The Electronic Playground himself with a little help from Damon Slye; even working on a simple word processor for home users.

In fact, it was this last which led to the company’s big break. They had chosen to write that program in C, a language which wasn’t all that common on the first generation of 8-bit microcomputers but which was officially blessed as the best way to program a new 16-bit audiovisual powerhouse called the Commodore Amiga. Their familiarity with C gave Tunnell’s company, by now blessedly renamed Dynamix, an in with Electronic Arts, the Amiga’s foremost patron in the software world, who were looking for developers to create products for the machine while it was still in the prototype phase. Damon Slye thus got started programming Arcticfox on an Amiga that didn’t even have a functioning operating system, writing and compiling his code on an IBM PC and sending it over to the Amiga via cable for execution.

Conceptually, Arcticfox was another refinement on the Battlezone/Stellar 7 template, another tooling-around-and-shooting-things-in-a-tank game. As a demonstration of the Amiga’s capabilities, however, it was impressive, replacing its predecessors’ monochrome wire-frame graphics with full-color solids. Reaching store shelves in early 1986 as part of the first wave of Amiga games, Arcticfox was widely ported and went on to sell over 100,000 copies in all, establishing Dynamix’s identity as a purveyor of cutting-edge 3D graphics. In that spirit, the next few years would bring many more 3D blast-em games, with names like Skyfox II, F-14 Tomcat, Abrams Battle Tank, MechWarrior, Deathtrack, and A-10 Tank Killer.

Yet even in the midst of all these adrenaline-gushers, Jeff Tunnell was nursing a quiet interest in the intersection of narrative with interactivity, even as he knew that he didn’t want to make a traditional adventure game of either the text or the graphical stripe. Like many in his industry by the second half of the 1980s, he believed the parser was hopeless as a long-term sell to the mass market, while the brittle box of puzzles that was the typical graphic adventure did nothing for him either. He nursed a dream of placing the player in a real unfolding story, partially driving events but partially being driven by them, like in a movie. Of course, he was hardly alone at the time in talking about “interactive movies”; the term was already becoming all the rage. But Dynamix’s first effort in that direction certainly stood out from the pack — or would have, if fate had been kinder to it.

Jeff Tunnell still calls Project Firestart the most painful single development project he’s ever worked on over the course of more than three decades making games. Begun before Arcticfox was published, it wound up absorbing almost three years at a time when the average game was completed in not much more than three months. By any sane standard, it was just way too much game for the Commodore 64, the platform for which it was made. It casts the player as a “special agent” of the future named Jon Hawking, sent to investigate a spaceship called the Prometheus that had been doing controversial research into human genetic manipulation but has suddenly stopped communicating. You can probably guess where this is going; I trust I won’t be spoiling too much to reveal that zombie-like mutants now roam the ship after having killed most of the crew. The influences behind the story Tunnell devised aren’t hard to spot — the original Alien movie being perhaps foremost among them — but it works well enough on its own terms.

In keeping with Tunnell’s commitment to doing something different with interactive narrative, Project Firestart doesn’t present itself in the guise of a traditional adventure game. Instead it’s an action-adventure, an approach that was generally more prevalent among British than American developers. You explore the ship’s rooms and corridors in real time, using a joystick to shoot or avoid the monsters who seem to be the only life remaining aboard the Prometheus. What makes it stand out, however, is the lengths Dynamix went to to make it into a real unfolding story with real stakes. As you explore, you come across computer terminals holding bits and pieces of what has happened here, and of what you need to do to stop the contagion aboard from spreading further. You have just two hours, calculated in real playing time, to gather up all of the logs you can for the benefit of future researchers, make contact with any survivors from the crew who might have managed to hole up somewhere, set the ship’s self-destruct mechanism, and escape. You’re personally expendable; if you exceed the time limit, warships that are standing by will destroy the Prometheus with you aboard.

Throughout the game, cinematic touches are used to build tension and drama. For example, when you step out of an elevator to the sight of your first dead body, a stab of music gushes forth and the “camera” cuts to a close-up of the grisly scene. Considering what a blunt instrument Commodore 64 graphics and sound are, the game does a rather masterful job of ratcheting up the dread, whilst managing to sneak in a plot twist or two that even people who have seen Alien won’t be able to anticipate. Ammunition is a scarce commodity, leaving you feeling increasingly hunted and desperate as the ship’s systems begin to fail and the mutants relentlessly hunt you down through the claustrophobic maze of corridors. And yet, tellingly, Project Firestart diverges from the British action-adventure tradition in not being all that hard of a game in the final reckoning. You can reasonably expect to win within your first handful of tries, if perhaps not with the most optimal ending. It’s clearly more interested in giving you a cinematic experience than it is in challenging you in purely ludic terms.

Project Firestart was finally released in 1988, fairly late in the day for the Commodore 64 in North America and just as Tunnell was terminating his publishing contract with Electronic Arts under less-than-entirely-amicable terms and signing a new deal with Mediagenic. It thus shipped as one of Dynamix’s last games for Electronic Arts, received virtually no promotion, and largely vanished without a trace; what attention it did get came mostly from Europe, where this style of game was more popular in general and where the Commodore 64 was still a strong seller. But in recent years it’s enjoyed a reevaluation in the gaming press as, as the AV Club puts it, “a forgotten ’80s gem” that “created the formula for video game horror.” It’s become fashionable to herald it as the great lost forefather of the survival-horror genre that’s more typically taken to have been spawned by the 1992 Infogrames classic Alone in the Dark.

Such articles doubtless have their hearts in the right place, but in truth they rather overstate Project Firestart‘s claim to such a status at the same time that they rather understate its weaknesses. While the mood of dread the game manages to evoke with such primitive graphics and sound is indeed remarkable, it lacks any implementation of line of sight, and thus allows for no real stealth or hiding; the only thing to do if you meet some baddies you don’t want to fight is to run into the next room. If it must be said to foreshadow any future classic, my vote would go to Looking Glass Studio’s 1994 System Shock rather than Alone in the Dark; System Shock too sees you gasping with dread as you piece together bits of a sinister story from computer terminals, even as the monsters of said story hunt you down. But even on the basis of that comparison Project Firestart remains more of a formative work than a classic in its own right. Its controls are awkward; you can’t even move and shoot at the same time. And, rather than consisting of a contiguous free-scrolling world, its geography is, due to technical limitations, segmented into rooms which give the whole a choppy, disconnected feel, especially given that they must each be loaded in from the Commodore 64’s achingly slow disk drive.

Accessing a shipboard computer in Project Firestart.

Perhaps unsurprisingly given Project Firestart‘s protracted and painful gestation followed by its underwhelming commercial performance, Dynamix themselves never returned to this style of game. Yet it provided the first concrete manifestation of Jeff Tunnell’s conception of game narrative as — appropriately enough given the name of his company — a dynamic thing which provokes the player as much as it is provoked by her. Future efforts would gradually hew closer, on a superficial level at least, to the form of more traditional adventure games without abandoning this central conceit.

That said, the next narrative-oriented Dynamix game would still be an oddball hybrid by anyone’s standard. By 1989, Dynamix, like an increasing number of American computer-game developers, had hitched their wagon firmly to MS-DOS, and thus David Wolf: Secret Agent appeared only on that platform. It was intended to be an interactive James Bond movie.

But intention is not always result. To accept Dynamix’s own description of David Wolf as an interactive movie is to be quite generous indeed. It’s actually a non-interactive story, presented as a series of still images with dialog overlaid, interspersed with a handful of vehicular action games that feel like fragments Dynamix just happened to have lying around the office: a hang-glider sequence, a jet-fighter sequence, a car chase, the old jumping-out-of-an-airplane-without-a-parachute-just-behind-a-villain-who-does-have-one gambit. If you succeed at these, you get to watch more static story; if you fail, that’s that. Or maybe not: in a telling statement as to what was really important in the game, Dynamix made it possible to bypass any minigame at which you failed with the click of a button and keep right on trucking with the story. In a perceptive review for Computer Gaming World magazine, Charles Ardai compared David Wolf to, of all things, the old arcade game Ms. Pac-Man. The latter featured animated “interludes” every few levels showing the evolving relationship between Mr. and Mrs. Pac-Man. These served, Ardai noted, as the icing on the cake, a little bonus to reward the player’s progress. But David Wolf inverted that equation: the static story scenes were now the cake. The game existed “just for the sheer joy of seeing digitized images on your PC.”

Our hero David Wolf starts salivating over the game’s lone female as soon as he sees her picture during his mission briefing, and he and the villains spend most of the game passing her back and forth like a choice piece of meat.

These days, of course, seeing pixelated 16-color digitizations on the screen prompts considerably less joy, and the rest of what’s here is so slight that one can only marvel at Dynamix’s verve in daring to slap a $50 suggested list price on the thing. The whole game is over within an hour or so, and a cheesy hour it is at that; it winds up being more Get Smart than James Bond, with dialog that even Ian Fleming would have thought twice about before committing to the page. (A sample: “Garth, I see your temper is still intact. Too bad I can’t say the same for your sense of loyalty.”) It’s difficult to tell to what extent the campiness is accidental and to what extent it’s intentional. Charles Ardai:

The viewer isn’t certain how to take the material. Is it a parody of James Bond (which is, by now, self-parodic), a straight comic adventure (imitation Bond as opposed to parody), or a serious thriller? It is hard to take the strictly formula plot seriously, but several of the scenes suggest that one is supposed to. I suspect that the screenwriters never quite decided which direction to take, and hoped to be able to do with a little of each. This can’t possibly work. You can’t both parody a genre and, at the same time, place yourself firmly within that genre because the resulting self-parody looks embarrassingly unwitting. Certainly you can’t do this and expect to be taken seriously. Airplane! couldn’t ask us to take seriously its disaster plot and Young Frankenstein didn’t try to make viewers cry over the monster’s plight, but this is what the designers of David Wolf seem to be doing.

Such wild vacillations in tone are typical of amateur writers who haven’t yet learned to control their wayward pens. They’re thus all too typical as well of the “programmer-written” era of games, before hiring real writers became standard industry practice. David Wolf wouldn’t be the last Dynamix game to suffer from the syndrome.

The cast of David Wolf manages the neat trick of coming off as terrible actors despite having only still images to work with. Here they’re trying to look shocked upon being surprised by villains pointing guns at them.

But for all its patent shallowness, David Wolf is an interesting piece of gaming history for at least a couple of reasons. Its use of digitized actors, albeit only in still images, presaged the dubious craze for so-called “full-motion-video” games that would dominate much of the industry for several years in the 1990s. (Tellingly, during its opening credits David Wolf elects to list its actors, drawn along with many of the props they used from the University of Oregon’s theatrical department, in lieu of the designers, programmers, and artists who actually built the game; they have to wait until the end scroll for recognition.) And, more specifically to the context of this article, David Wolf provides a further illustration of Jeff Tunnell’s vision of computer-game narratives that weren’t just boxes of puzzles.

Much of the reason David Wolf wound up being such a constrained experience was down to Dynamix being such a small developer with fairly scant resources. Tunnell was therefore thrilled when an arrangement with the potential to change that emerged.

At some point in late 1989, Ken Williams of Sierra paid Dynamix a visit. Flight simulations and war games of the sort in which Dynamix excelled were an exploding market (no pun intended!) at the time, one which would grow to account for 35.6 percent of computer-game sales by the second half of 1990, dwarfing the 26.2 percent that belonged to Sierra’s specialty of adventure games. Williams wanted a piece of that exploding market. He was initially interested only in licensing some of Dynamix’s technology as a leg-up. But he was impressed enough by what he saw there — especially by a World War I dog-fighting game that the indefatigable Damon Slye had in the works — that the discussion of technology licensing turned into a full-on acquisition pitch. For his part, Jeff Tunnell, recognizing that the games industry was becoming an ever more dangerous place for a small company trying to go it alone, listened with interest. On March 27, 1990, Sierra acquired Dynamix for $1.5 million.

In contrast to all too many such acquisitions, neither party would come to regret the deal. Even in the midst of a sudden, unexpected glut in World War I flight simulators, Damon Slye’s Red Baron stood out from the pack with a flight model that struck the perfect balance between realism and fun. (MicroProse had planned to use the same name for their own simulator, but were forced to go with the less felicitous Knights of the Sky when Dynamix beat them to the trademark office by two weeks.) Over the Christmas 1990 buying season, Red Baron became the biggest hit Dynamix had yet spawned, proving to Ken Williams right away that he had made the right choice in acquiring them.

Williams and Tunnell maintained a relationship of real cordiality and trust, and Dynamix was given a surprising amount of leeway to set their own agenda from offices that remained in Eugene, Oregon. Tunnell was even allowed to continue his experiments with narrative games, despite the fact that Sierra, who were churning out a new adventure game of their own every couple of months by this point, had hardly acquired Dynamix with an eye to publishing still more of them.

And so Dynamix’s first full-fledged adventure game, with real interactive environments, puzzles, and dialog menus, hit the market not long after the acquisition was finalized. Rise of the Dragon had actually been conceived by Jeff Tunnell before David Wolf was made, only to be shelved as too ambitious for the Dynamix of 1988. But the following year, with much of the technical foundation for a real adventure game already laid down by David Wolf, they had felt ready to give it a go.

Rise of the Dragon found Dynamix once again on well-trodden fictional territory, this time going for a Blade Runner/Neuromancer cyberpunk vibe; the game’s setting is the neon-lit Los Angeles of a dystopic future of perpetual night and luscious sleaze. You play a fellow stamped with the indelible name of Blade Hunter, a former cop who got himself kicked off the force by playing fast and loose with the rules. Now, he works as a private detective for whoever can pay him. When the game begins, he’s just been hired by the mayor to locate his drug-addicted daughter, who has been swallowed up by the city’s underworld. As a plot like that would indicate, this is a game that very much wants to be edgy. King’s Quest it isn’t.

There are a lot of ideas in Rise of the Dragon, some of which work better than others, but all of which reflect a real, earnest commitment to a more propulsive form of interactive narrative than was typical of the games of their new parent company Sierra. Jeff Tunnell:

Dynamix adventures have an ongoing story that will unfold even if the player does nothing. The player needs to interact with the game world to change the outcome of that story. For example, if the player does nothing but sit in the first room of Dragon, he will observe cinematic “meanwhile cutaways” depicting the story of drug lord Deng Hwang terrorizing the futuristic city of Los Angeles with tainted drug patches that cause violent mutations. So the player’s job is to interact with the world and change the outcome of the story to one that is more pleasing and heroic.

The entire game runs in real time, with characters coming and going around the city on realistic schedules. Dialog is at least as important as object-based puzzle-solving, and characters remember how you treat them to an impressive degree. This evolving web of relationships, combined with a non-linear structure and multiple solutions to most dilemmas, creates a possibility space far greater than that of the typical adventure game, all set inside a virtual world which feels far more alive.

The interface as well goes its own way. The game uses a first-person rather than third-person perspective, a rarity in graphic adventures of this period. And, at a time when both Sierra and Lucasfilm Games were still presenting players with menus of verbs to click on, Rise of the Dragon debuted a cleaner single-click system: right-clicking on an object will examine it, left-clicking will take whatever action is appropriate to that object. Among its other virtues, the interface frees up screen real estate to present the striking graphics to maximum effect. Instead of continuing to rely on live actors, Dynamix hired veteran comic-book illustrator Robert Caracol to draw most of the scenery with pen and ink for digitization. Combined with the jump from 16-color EGA to 256-color VGA graphics, the new approach results in art worthy of a glossy, stylized graphic novel. Computer Gaming World gave the game a well-deserved “Special Award for Artistic Achievement” in their “Games of the Year” awards for 1990.

Just to remind us of who made the game, a couple of action sequences do pop up, neither of them all that notably good or bad. But, once again, failing at one of them brings an option to skip it and continue with the story as if you’d succeeded. Indeed, the game as a whole evinces a forgiving nature that’s markedly at odds both with its hard-bitten setting and with those other adventure games being made by Dynamix’s parent company. It may be possible to lock yourself out of victory or run out of time to solve the mystery, but you’d almost have to be willfully trying to screw up in order to do so. That Dynamix was able to combine this level of forgivingness with so much flexibility in the narrative is remarkable.

If you really screw up, Rise of the Dragon is usually kind enough to tell you so.

But there are flies in the ointment that hold the game back from achieving classic status. Perhaps inevitably given the broad possibility space, it’s quite a short game on any given playthrough, and once the story is known the potential interest of subsequent playthroughs is, to say the least, limited. Of course, this isn’t so much of a problem today as it was back when the game was selling for $40 or more. Other drawbacks, however, remain as problematic as ever. The interface, while effortless to use in many situations, is weirdly obtuse in others. The inventory system in particular, featuring a paper-doll view of Blade Hunter and employing two separate windows in the form of a “main” and a “quick” inventory, is far too clever for its own good. I also find it really hard to understand where room exits are and how the environment fits together. And the writing is once again on the dodgy side; it’s never entirely clear whether Blade Hunter is supposed to be a real cool cat (like the protagonist of Neuromancer the novel) or a lovable (?) loser (like the protagonist of Neuromancer the game). Add to these issues an impossible-to-take-seriously plot that winds up revolving around an Oriental death cult, plus some portrayals of black and Chinese people that border on the outright offensive, and we’re a long way from even competent comic-book fiction.

Still, Rise of the Dragon in my opinion represents the best of Jeff Tunnell’s experiments with narrative games. If you’re interested in exploring this odd little cul-de-sac in adventure-gaming history, I recommend it as the place to start, as it offers by far the best mix of innovation and playability, becoming in the process the best all-around expression of just where Tunnell was trying to go with interactive narrative.

Heart of China, the early 1991 follow-up to Rise of the Dragon, superficially appears to be more of the same in a different setting. The same engine and interface are used, including a couple more action-based mini-games to play or skip, with the genre dance taking us this time to a 1930s pulp-adventure story in the spirit of Indiana Jones. The most initially obvious change is the return to a heavy reliance on digitized “actors.” Dynamix wound up employing some 85 separate people on the business end of their cameras in a production which overlapped with that of Rise of the Dragon, with its beginning phases stretching all the way back into 1989. Thankfully, the integration of real people with computer graphics comes off much better than it does in David Wolf, evincing much more care on the part of the team responsible. Heart of China thus manages to become one of the less embarrassing examples of a style of graphics that was all but predestined to look hopelessly cheesy about five minutes after hitting store shelves.

I’m not sure if this is Really Bad Writing that expects to be taken seriously or Really Bad Writing that’s trying (and failing) to be funny. I’m quite sure, however, that it’s Really Bad Writing of some sort.

When you look more closely at the game’s design, however, you see a far more constrained approach to interactive storytelling than that of its predecessor. You play a down-on-his-luck ex-World War I flying ace named “Lucky” Jake Masters. (If there was one thing Dynamix knew how to do, it was to create stereotypically macho names.) He’s running a shady air-courier-cum-smuggling business out of Hong Kong when he’s enlisted by a wealthy “international business tycoon and profiteer” to rescue his daughter, who’s been kidnapped by a warlord deep inside the Chinese mainland. (If there was one thing Dynamix didn’t know how to do, it was to create plots that didn’t involve rescuing the daughters of powerful white men from evil Chinese people.) The story plays out in a manner much more typical of a plot-heavy adventure game — i.e., as a linear series of acts to be completed one by one — than does that of its predecessor. Jeff Tunnell’s commitment to his original vision for interactive narrative was obviously slipping in the face of resource constraints and a frustration, shared by some of his contemporaries who were also interested in more dynamic interactive storytelling, that gamers didn’t really appreciate the extra effort anyway. His changing point of view surfaces in a 1991 interview:

From a conceptual standpoint, multiple plot points are exciting. But when you get down to the implementation, they can make game development an absolute nightmare. Then, after all of the work to implement these multiple paths and endings, we’ve found that most gamers never even discover them.

When the design of Heart of China does allow for some modest branching, Dynamix handles it in rather hilariously passive-aggressive fashion: big red letters reading “plot branch” appear on the screen. Take that, lazy gamers!

That Lucky’s a real charmer, alright.

Heart of China bears all the signs of a project that was scaled back dramatically in the making, a game which wound up being far more constrained and linear than had been the original plan. Yet it’s not for this reason that I find it to be one of the more unlikable adventure games I’ve ever played. The writing is so ludicrously terrible that one wants to take the whole thing as a conscious B-movie homage of the sort Cinemaware loved to make. But darned if Dynamix didn’t appear to expect us to take it seriously. “It has more depth and sensibility than I’ve ever seen in a computer storytelling game,” said Tunnell. It’s as if he thought he had made Heart of Darkness when he’d really made Tarzan the Ape-Man. The ethnic stereotyping manages to make Rise of the Dragon look culturally sensitive, with every Chinese character communicating in the same singsong broken English. And as for Lucky Jake… well, he’s evidently supposed to be a charming rogue just waiting for Harrison Ford to step into the role, but hitting those notes requires far, far more writerly deftness than anyone at Dynamix could muster. Instead he just comes off as a raging asshole — the sort of guy who creeps out every woman he meets with his inappropriate comments; the sort of guy who warns his friend-with-benefits that she’s gained a pound or two and thus may soon no longer be worthy of his Terrible Swift Sword. For all these reasons and more, I can recommend Heart of China only to the truly dedicated student of adventure-game history.

The third and final point-and-click adventure game created by Jeff Tunnell and Dynamix is in some ways the most impressive of them all and in others the most disappointing, given that it turns into such a waste of potential. Tunnell took a new tack with The Adventures of Willy Beamish, deciding to create a light-hearted comedic adventure that would play like a Saturday-morning cartoon. Taking advantage of a lull in Hollywood’s cartoon-production factories, he hired a team of professional animators of the old-school cel-based stripe, veterans of such high-profile productions as The Little Mermaid, Jonny Quest, and The Simpsons, along with a husband-and-wife team of real, honest-to-God professional television writers to create a script. Dynamix’s offices came to look, as a Computer Gaming World preview put it, like a “studio in the golden age of animation,” with animators “etching frantically atop the light tables” while “pen-and-pencil images of character studies, backgrounds, and storyboard tests surround them on the office walls.” The team swelled to some fifty people before all was said and done, making Willy Beamish by far the most ambitious and expensive project Dynamix had ever tackled.

Willy Beamish looked amazing in its time, and still looks just fine today, especially given that its style of hand-drawn cel-based animation is so seldom seen in the post-Pixar era. Look at the amount of detail in this scene!

What Tunnell got for his money was, as noted, darned impressive at first glance. Many companies — not least Sierra with their latest King’s Quest — were making noises about bringing “interactive cartoons” to computer monitors, but Dynamix was arguably the first to really pull it off. Switching to a third-person perspective in keeping with the cartoon theme, every frame was fussed over to a degree that actually exceeded the likes of The Simpsons, much less the typical Saturday-morning rush job. Tunnell would later express some frustration that the end result may have been too good; he suspected that many people were mentally switching gears and subconsciously seeing it the way they might a cartoon on their television screen, not fully registering that everything they were seeing was playing on their computer. Today, all of this is given a further layer of irony by the way that 3D-rendered computer animation has all but made the traditional cel-based approach to cartoon animation used by Willy Beamish into a dead art. How odd to think that a small army of pencil-wielding illustrators was once considered a sign of progress in computer animation!

The game’s story is a deliberately modest, personal one — which in an industry obsessed with world-shaking epics was a good thing. The young Willy Beamish, a sort of prepubescent version of Ferris Bueller, wants to compete for the Nintari Championship of videogaming, but he and his dubiously named pet frog Horny are at risk of being grounded thanks to a bad mark in his music-appreciation class. From this simple dilemma stem several days of comedic chaos, including a variety of other story beats that involve his whole family. The tapestry was woven together with considerable deftness by the writers, whose experience with prime-time sitcoms served them well. It’s always a fine line between a precocious, smart-alecky little boy and a grating, bratty one, but The Adventures of Willy Beamish mostly stays on the right side of it. For once, in other words, a Dynamix game has writing to match its ambition.

Horny the Frog springs into action.

Unfortunately, the game finds a new way to go off the rails. Rise of the Dragon and Heart of China had combined smart design sensibilities with dodgy writing; Willy Beamish does just the opposite. Like Rise of the Dragon, it runs in real time; unlike Rise of the Dragon, you are given brutally little time to accomplish what you need to in each scene. The game winds up playing almost like a platformer; you have to repeat each scene again and again to learn what to do, then execute perfectly in order to progress. Worse, it’s possible to progress without doing everything correctly, only to be stranded somewhere down the line. The experience of playing Willy Beamish is simply infuriating, a catalog of all the design sins Dynamix adventure games had heretofore been notable for avoiding. I have to presume that all those animators and writers caused Dynamix to forget that they were making an interactive work. As was being proved all over the games industry at the time, that was all too easy to do amidst all the talk about a grand union of Silicon Valley and Hollywood.

Sierra had been a little lukewarm on Dynamix’s previous adventure games, but they gave Willy Beamish a big promotional push for the Christmas 1991 buying season, even grabbing for it the Holy Grail of game promotion: a feature cover of Computer Gaming World. While I’m tempted to make a rude comment here about Dynamix finally making a game that embraced Sierra’s own design standards and them getting excited about that, the reality is of course that the game just looked too spectacular to do anything else. Sierra and Dynamix were rewarded with a solid hit that racked up numbers in the ballpark of one of the former’s more popular numbered adventure series. In 1993, Willy Beamish would even get a re-release as a full-fledged CD-ROM talkie, albeit with some of the most annoying children’s voices ever recorded.

Willy Beamish has a “trouble meter” that hearkens back to Bureaucracy‘s blood-pressure monitor, except this time it’s presumably measuring the blood pressure of the adults around you. If you let it get too high, you get shipped off to boarding school.

But it had been a hugely expensive product to create, and it’s questionable whether its sales, strong though they were, were actually strong enough to earn much real profit. At any rate, Jeff Tunnell, the primary driver behind Dynamix’s sideline in adventure games, suddenly decided he’d had enough shortly after Willy Beamish was finished. It seems that the experience of working with such a huge team had rubbed him the wrong way. He therefore resigned the presidency of Dynamix to set up a smaller company, Jeff Tunnell Productions, “to return to more hands-on work with individual products and to experiment in product genres that do not require the large design teams necessitated by [his] last three designs.” It was all done thoroughly amicably, and was really more of a role change than a resignation; Jeff Tunnell Productions would release their games through Dynamix (and, by extension, Sierra). But Tunnell would never make another adventure game. “After doing story-based games for a while,” he says today, “I realized it wasn’t something I wanted to continue to do. I think there is a place for story in games, but it’s… hard.” Dynamix’s various experiments with interactive narrative, none of them entirely satisfying, apparently served to convince him that his talents were better utilized elsewhere. Ironically, he made that decision just as CD-ROM, a technology which would have gone a long way toward making his “interactive movies” more than just an aspiration, was about to break through into the mainstream at last.

Still, and for all that it would have been nice to see everything come together at least once for him, I don’t want to exaggerate the tragedy, especially given that the new Jeff Tunnell Productions would immediately turn out a bestseller and instant classic of a completely different type. (More on that in my next article!) If the legacy of Dynamix story games is a bit of a frustrating one, Tunnell’s vision of interactive narratives that are more than boxes of puzzles would eventually prove its worth across a multiplicity of genres. For this reason at the very least, the noble failures of Dynamix are worth remembering.

(Sources: Computer Gaming World of July 1988, December 1989, May 1990, February 1991, September 1991, October 1991, November 1991, March 1992, February 1994, and May 1994; Sierra’s news magazines of Summer 1990, Spring 1991, Summer 1991, Fall 1991, and June 1993; InfoWorld of March 5 1984; Apple Orchard of December 1983; Zzap! of July 1989; Questbusters of August 1991; Video Games and Computer Entertainment of May 1991; Dynamix’s hint books for Rise of the Dragon, Heart of China, and The Adventures of Willy Beamish; Matt Barton’s interviews with Jeff Tunnell in Matt Chat 199 and 200; press releases, annual reports, and other internal and external documents from the Sierra archive at the Strong Museum of Play.

You can download emulator-ready Commodore 64 disk images of Project Firestart and a version of David Wolf: Secret Agent that’s all ready to go under DOSBox from right here. Rise of the Dragon, Heart of China, and The Adventures of Willy Beamish — and for that matter Red Baron — are all available for purchase on GOG.com.)


The Game of Everything, Part 10: Civilization and the Limits of Progress

To listen to what Sid Meier says about his most famous achievement today, my writing all of these articles on Civilization has been like doing a deep reading of an episode of The Big Bang Theory; there just isn’t a whole lot of there there. Meier claims that the game presents at best a children’s-book view of history, that the only real considerations that went into it were what would be fun and what wouldn’t. I don’t want to criticize him for that stance here, any more than I want to minimize the huge place that fun or the lack thereof really did fill in the decisions that he and his partner Bruce Shelley made about Civilization. I understand why he says what he says: he’s a commercial game designer, not a political pundit, and he has no desire to wade into controversy — and possibly shrink his customer base — by taking public positions on the sorts of fractious topics I’ve been addressing over the course of these articles. If he should need further encouragement to stay well away from those topics, he can find it in the many dogmatic academic critiques of Civilization which accuse it of being little more than triumphalist propaganda. He’d rather spend his time talking about game design, which strikes me as perfectly reasonable.

Having said all that, it’s also abundantly clear to me that Civilization reflects a much deeper and more earnest engagement with the processes of history than Meier is willing to admit these days. This is, after all, a game which cribs a fair amount of its online Civilopedia directly from Will Durant, author of the eleven-volume The Story of Civilization, the most ambitious attempt to tell the full story of human history to date. And it casually name-drops the great British historian Arnold J. Toynbee, author of the twelve-volume A Study of History, perhaps the most exhaustive — and certainly the most lengthy — attempt ever to construct a grand unified theory of history. These are not, needless to say, books which are widely read by children. There truly is a real theory of history to be found in Civilization as well, one which, if less thoroughly worked-out than what the likes of Toynbee have presented in book form, is nevertheless worth examining and questioning at some length.

The heart of Civilization‘s theory of history is of course the narrative of progress. In fact, the latter is so central to the game that it’s joined it as the second of our lodestars throughout this series of articles. And so, as we come to the end of the series, it seems appropriate to look at what the game and the narrative of progress have to say about one another one last time, this time in the context of a modern society like the ones in which we live today. Surprisingly given how optimistic the game’s take on history generally is, it doesn’t entirely ignore the costs that have all too clearly been shown to be associated with progress in this modern era of ours.

Meier and Shelley were already working on Civilization when the first international Earth Day was held on April 22, 1990, marking the most important single event in the history of the environmental movement since the publication of Rachel Carson’s Silent Spring back in 1962. Through concerts, radio and television programs, demonstrations, and shrewd publicity stunts like a Mount Everest “Peace Climb” including American, Soviet, and Chinese climbers roped together in symbolic co-dependence, Earth Day catapulted the subject of global warming among other environmental concerns into the mass media, in some cases for the first time.

Whether influenced by this landmark event or not, Civilization as well manifests a serious concern for the environment in the later, post-Industrial Revolution stages of the game. Coal- and oil-fired power plants increase the productivity of your factories dramatically, but also spew pollution into the air which you must struggle to clean up. Nuclear power plants, while the cheapest, cleanest, and most plentiful sources of energy most of the time, can occasionally melt down with devastating consequences to your civilization. Large cities generate pollution of their own even absent factories and power plants, presumably as a result of populations that have discovered the joy of automobiles. Too much pollution left uncleaned will eventually lead not only to sharply diminished productivity for your civilization but also to global warming, making Civilization one of the first works of popular entertainment to acknowledge the growing concern surrounding the phenomenon already among scientists of the early 1990s.

In fighting your rearguard action against these less desirable fellow travelers on the narrative of progress, you have various tools at your disposal. To clean up pollution that’s already occurred, you can build and deploy settler units to the affected areas. To prevent some pollution from occurring at all, you can invest in hydroelectric plants in general and/or the Wonder of the World that is the Hoover Dam. And/or you can build mass-transit systems to wean your people away from their precious cars, and/or build recycling centers to prevent some of their trash from winding up in landfills.

Interestingly, the original Civilization addresses the issues of environment and ecology that accompany the narrative of progress with far more earnestness than any of its sequels — another fact that rather gives the lie to Meier’s assertion that the game has little to do with the real world. Although even the first game’s implementation of pollution is far from unmanageable by the careful player, it’s something that most players just never found to be all that much fun, and this feedback caused the designers who worked on the sequels to gradually scale back its effects.

In the real world as well, pollution and the threat of global warming aren’t much fun to talk or think about — so much so that plenty of people, including an alarming number of those in positions of power, have chosen to stick their heads in the sand and pretend they don’t exist. None of us enjoy having our worldviews questioned in the uncomfortable ways that discussions of these and other potential limits of progress — progress as defined in terms of Francis Fukuyama’s explicit and Civilization‘s implicit ideals of liberal, capitalistic democracy — tend to engender.

As Adam Smith wrote in the pivotal year of 1776 and the subsequent centuries of history quite definitively proved, competitive free markets do some things extraordinarily well. The laws of supply and demand conspire to ensure that a society’s resources are allocated to those things its people actually need and want, while the profit motive drives innovation in a way no other economic system has ever come close to equaling. The developed West’s enormous material prosperity — a prosperity unparalleled in human history — is thanks to capitalism and its kissing cousin, democracy.

Yet unfettered capitalism, that Platonic ideal of libertarian economists, has a tendency to go off the rails if not monitored and periodically corrected by entities who are not enslaved by the profit motive. The first great crisis of American capitalism could be said to have taken place as early as the late 1800s, during the “robber baron” era of monopolists who discovered a way to cheat the law of supply and demand by cornering entire sectors of the market to themselves. Meanwhile the burgeoning era of mass production and international corporations, so dramatically different from Adam Smith’s world of shopkeepers and village craftsmen, led to the mass exploitation of labor. The response from government was an ever-widening net of regulations to keep corporations honest, while the response from workers was to unionize for the same purpose. Under these new, more restrictive conditions, capitalism continued to hum along, managing to endure another, still greater crisis of confidence in the form of the Great Depression, which led to the idea of a taxpayer-funded social safety net for the weak and the unlucky members of society.

The things that pure capitalism doesn’t do well, like providing for the aforementioned weak and unlucky who lack the means to pay for goods and services, tend to fall under the category that economists call “externalities”: benefits and harms that aren’t encompassed by Adam Smith’s supposedly all-encompassing law of supply and demand. In Smith’s world of shopkeepers, what was best for the individual supplier was almost always best for the public at large: if I sold you a fine plow horse for a reasonable price, I profited right then and there, and also knew that you were likely to tell your friends about it and to come back yourself next year when you needed another. If I sold you a lame horse, on the other hand, I’d soon be out of business. But if I’m running a multinational oil conglomerate in the modern world, that simple logic of capitalism begins to break down in the face of a much more complicated web of competing concerns. In this circumstance, the best thing for me to do in order to maximize my profits is to deny that global warming exists and do everything I can to fight the passage of laws that will hurt my business of selling people viscous black gunk to burn inside dirty engines. This, needless to say, is not in the public’s long-term interest; it’s an externality that could quite literally spell the end of human civilization. So, government must step in — hopefully! — to curb the burning of dirty fuels and address the effects of those fossil fuels that have already been burned.

But externalities are absolutely everywhere in our modern, interconnected, globalized world of free markets. Just as there’s no direct financial benefit in an unfettered free market for a doctor to provide years or decades worth of healthcare to a chronically sick person who lacks the means to pay for it, there’s no direct financial harm entailed in a factory dumping its toxic effluent into the nearest lake. There is, of course, harm in the abstract, but that harm is incurred by the people unlucky enough to live by the lake rather than by the owners of the factory. The trend throughout the capitalist era has therefore been for government to step in more and more; every successful capitalist economy in the world today is really a mixed economy, to a degree that would doubtless have horrified Adam Smith. As externalities continue to grow in size and scope, governments are forced to shoulder a bigger and bigger burden in addressing them. At what points does that burden become unbearable?

One other internal contradiction of modern capitalism, noticed by Karl Marx already in the nineteenth century, has come to feel more real and immediate than ever before in the years since the release of Civilization. The logic of modern finance demands yearly growth — ever greater production, ever greater profits. Just holding steady isn’t good enough; if you doubt my word, consider what your pension fund will look like come retirement time if the corporations in which you’ve invested it are content to merely hold steady. Up to this point, capitalism’s efficiency as an economic system has allowed it to deliver this growth on a decade-by-decade if not always year-by-year basis. But the earth’s resources are not unlimited. At some point, constant growth — the constant demand for more, more, more — must become unsustainable. What happens to capitalism then?

Exactly the future that believers in liberal democracy and capitalism claim to be the best one possible — that the less-developed world remakes itself in the mold of North America and Western Europe — would appear to be literally impossible in reality. The United States alone, home to 6 percent of the world’s population, consumes roughly 35 percent of its resources. One doesn’t need to be a statistician or an ecologist to understand that the rest of the world simply cannot become like the United States without ruining a global ecosystem that already threatens to collapse under the weight of 7.5 billion souls — twice the number of just fifty years ago. Humans are now the most common mammal on the planet, outnumbering even the ubiquitous mice and rats. Two-thirds of the world’s farmland is already rated as “somewhat” or “strongly” degraded by the Organization for Economic Cooperation and Development. Three-quarters of the world’s biodiversity has been lost since 1900, and 50 percent of all remaining plant and animal species are expected to go extinct before 2100. And hovering over it all is the specter of climate change; the polar ice caps have melted more in the last 20 years than they did in the previous 12,000 years since the end of the last ice age.

There’s no doubt about it: these are indeed uncomfortable conversations to have. Well before the likes of Brexit and President Donald Trump, even before the events of September 11, 2001, Western society was losing the sense of triumphalism that had marked the time of the original Civilization, replacing it with a jittery sense that humanity was packed too closely together on an overcrowded and overheating little planet, that the narrative of progress was rushing out of control toward some natural limit point that was difficult to discern or describe. The first clear harbinger of the generalized skittishness to come was perhaps the worldwide angst that accompanied the turn of the millennium — better known as “Y2K,” a fashionable brand name for disaster that smacked of Hollywood, thereby capturing the strange mixture of gloom and mass-media banality that would come to characterize much of post-millennial life. The historian of public perception David Lowenthal, writing in 2015:

Events spawned media persistently catastrophic in theme and tone, warning of the end of history, the end of humanity, the end of nature, the end of everything. Millennial prospects in 2000 were lacklustre and downbeat; Y2K seemed a portent of worse to come. Not even post-Hiroshima omens of nuclear annihilation unleashed such a pervasive glum foreboding. Today’s angst reflects unexampled loss of faith in progress: fears that our children will be worse off than ourselves, doubts that neither government nor industry, science nor technology, can set things right.

The turn of the millennium had the feeling of an end time, yet none of history’s more cherished eschatologies seemed to be coming true: not Christianity’s Rapture, not Karl Marx’s communist world order, not Georg Wilhelm Friedrich Hegel’s or Francis Fukuyama’s liberal-democratic end of history, certainly not Sid Meier and Bruce Shelley’s trip to Alpha Centauri. Techno-progressives began to talk more and more of a new secular eschatology in the form of the so-called Singularity, the point where, depending on the teller, artificial intelligence would either merge with human intelligence to create a new super-species fundamentally different from the humans of prior ages, or our computers would simply take over the world, wiping out their erstwhile masters or relegating them to the status of pets. And that was one of the more positive endgames for humanity that came to be batted around. Others nursed apocalyptic visions of a world ruined by global warming and the rising sea levels associated with it — a secular version of the Biblical Flood — or completely overrun by Islamic Jihadists, those latest barbarians at the gates of civilization heralding the next Dark Ages. Our television shows and movies turned increasingly dystopic, with anti-heroes and planet-encompassing disasters coming to rule our prime-time entertainment.

The last few years in particular haven’t been terribly good ones for believers in the narrative of progress and the liberal-democratic world order it has done so much to foster. The Arab Spring, touted for a time as a backward region’s belated awakening to progress, collapsed without achieving much of anything at all. Britain is leaving the European Union; the United States elected Donald Trump; Russia is back to relishing the role of the Evil Empire, prime antagonist to the liberal-democratic West; China has gone a long way toward consummating a marriage once thought impossible: the merging of an autocratic, human-rights-violating government with an economy capable of competing with the best that democratic capitalism can muster. Our politicians issue mealy-mouthed homages to “realism” and “transactional diplomacy,” ignoring the better angels of our nature. Everywhere nativism and racism seem to be on the rise. Even in the country where I live now, the supposed progressive paradise of Denmark, the Danish People’s Party has won considerable power in the government by sloganeering that “Denmark is not a multicultural society,” by drawing lines between “real” Danes and those of other colors and other religions. In my native land of the United States, one side of the political discourse, finding itself unable to win a single good-faith argument on the merits, has elected to simply lie about the underlying facts, leading some to make the rather chilling assertion that we now live in a “post-truth” world. (How ironic that the American right, long the staunchest critic of postmodernism, should have been the ones to turn its lessons about the untenability of objective truth into an electoral strategy!)

And then there’s the incoming fire being taken by the most sacred of all of progress’s sacred cows, as The Economist‘s latest Democracy Index announces that it “continues its disturbing retreat.” In an event redolent with symbolism, the same index in 2016 changed the classification of the United States, that beacon of democracy throughout its history, from a “Full Democracy” to a “Flawed Democracy.” Functioning as both cause and symptom of this retreat is the old skepticism about whether democracy is just too chaotic to efficiently run a country, whether people who can so easily be duped by Facebook propaganda and email chain letters can really be trusted to decide their countries’ futures.

Looming over such discussions of democracy and its efficacy is the specter of China. When Mao Zedong’s Communist Party seized power there in 1949, the average Chinese citizen earned just $448 per year in inflation-adjusted terms, making it one of the poorest countries in the world. Mao’s quarter-century of orthodox communist totalitarianism, encompassing the horrors of the Great Leap Forward and the Cultural Revolution, managed to improve that figure only relatively slowly; average income had increased to $978 by 1978. But, following Mao’s death, his de facto successor Deng Xiaoping began to depart from communist orthodoxy, turning from a centrally-managed economy to the seemingly oxymoronic notion of “market-oriented communism” — effectively a combination of authoritarianism with capitalism. Many historians and economists — not least among them Francis Fukuyama — have always insisted that a non-democracy simply cannot compete with a democracy on economic terms over a long span of time. Yet the economy of post-Mao China has seemingly grown at a far more impressive rate than their theories allow, with average income reaching $6048 by 2006, then $16,624 by 2017. China today would seem to be a compelling rebuttal to all those theories about the magic conjunction of personal freedoms and free markets.
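Just to keep myself honest about “far more impressive”: the growth rates implied by those income figures are easy enough to check with a few lines of Python. (The dollar amounts are the ones quoted above; treating them as endpoints of smooth compound growth is of course my own simplification.)

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# The income figures quoted above, treated as endpoints of smooth
# compound growth -- a simplification, but good enough for the point.
mao_era = cagr(448, 978, 1978 - 1949)     # Mao's quarter-century and change
deng_era = cagr(978, 6048, 2006 - 1978)   # the reform era
recent = cagr(6048, 16624, 2017 - 2006)   # the recent boom

print(f"1949-1978: {mao_era:.1%} per year")
print(f"1978-2006: {deng_era:.1%} per year")
print(f"2006-2017: {recent:.1%} per year")
```

The exercise bears out the contrast: growth of well under 3 percent per year under Mao gives way to roughly 7 and then nearly 10 percent per year in the decades that followed.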

But is it really? We should be careful not to join some of our more excitable pundits in getting ahead of the real facts of the case. China’s economic transformation, remarkable as it’s been, has only elevated it to the 79th position among all the world’s nations in terms of GDP per capita. Its considerable economic clout in the contemporary world, in other words, has a huge amount to do with the fact that it’s the most populous country in the world. Further, the heart of its economy is manufacturing, as all of those “Made in China” tags on hard goods of every description sold all over the world attest. China is still a long, long way from joining the vanguard of post-industrial knowledge economies. To a large extent, economic innovation still comes from those knowledge economies; China then does the grunt work of manufacturing the products that the innovators design.

Of course, authoritarianism does have its advantages. China’s government, which doesn’t need to concern itself with elections every set number of years, can set large national projects in motion, such as a green energy grid spanning the entire country or even a manned trip to Mars, and see them methodically through over the course of decades if need be. But can China under its current system of government produce a truly transformative, never-seen-or-imagined-anything-like-it product like the Apple iPhone and iPad, the World Wide Web, or the Sony Walkman? It isn’t yet clear to me that it can move beyond being an implementor of brilliant ideas — thanks to all those cheap and efficient factories — to being an originator of same. So, personally, I’m not quite ready to declare the death of the notion that a country requires democracy to join the truly top-tier economies of the world. The next few decades should be very interesting in one way or another — whether because China does definitively disprove that notion, because its growth tops out, or, most desirably, because a rising standard of living there and the demands of a restive middle class bring an end at last to China’s authoritarian government.

Still, none of these answers to The China Puzzle will do anything to help us with the fundamental limit point of the capitalistic world order: the demand for infinite economic growth in a world of decidedly finite resources. Indeed, the Chinese outcome I just named as the most desirable — that of a democratic, dynamic China free of the yoke of its misnamed Communist Party — only places our poor, suffering global ecosystem under that much more strain. For this reason, economists today have begun to speak more and more of a “crisis of capitalism,” to question whether Adam Smith’s brilliant brainchild is now entering its declining years. For a short time, the “Great Recession” of 2007 and 2008, when some of the most traditionally rock-solid banks and corporations in the world teetered on the verge of collapse, seemed like it might be the worldwide shock that signaled the beginning of the end. Desperate interventions by governments all over the world managed to save the capitalists from themselves at the last, but even today, when the economies of most Western nations are apparently doing quite well, the sense of unease that was engendered by that near-apocalypse of a decade ago has never fully disappeared. The feeling remains widespread that something has to give sooner or later, and that that something might be capitalism as we know it today.

But what would a post-capitalist world look like? Aye, there’s the rub. Communism, capitalism’s only serious challenger over the course of the last century, would seem to have crashed and burned a long time ago as a practical way of ordering an economy. Nor, based on the horrid environmental record of the old Soviet bloc, is it at all clear that it would have proved any better a caretaker of our planet than capitalism even had it survived.

One vision for the future, favored by the anarchist activists whom we briefly met in an earlier article, entails a deliberate winding down of the narrative of progress before some catastrophe or series of catastrophes does it for us. It’s claimed that we need to abandon globalization and re-embrace localized, self-sustaining ways of life; it’s thus perhaps not so much a complete rejection of capitalism as a conscious return to Adam Smith’s era of shopkeepers and craftsmen. The prominent American anarchist Murray Bookchin dreams of a return to “community, decentralization, self-sufficiency, mutual aid, and face-to-face democracy” — “a serious challenge to [globalized] society with its vast, hierarchical, sexist, class-ruled state apparatus and militaristic history.” Globalization, he and other anarchists note, often isn’t nearly as efficient as its proselytizers claim. In fact, the extended international supply chains it fosters for even the most basic foodstuffs are often absurdly wasteful in terms of energy and other resources, and brittle to boot, vulnerable to the slightest shock to the globalized system. Why should potatoes which can be grown in almost any back garden in the world need to be shipped in via huge, fuel-guzzling jet airplanes and forty-ton semis? Locally grown agriculture, anarchists point out, can provide eight units of food energy for every one unit of fossil-fuel energy needed to bring it to market, while in many cases exactly the opposite ratio holds true for internationally harvested produce.

But there’s much more going on here philosophically than a concern with the foodstuff supply chain. Modern anarchist thought reflects a deep discomfort with consumer culture, a strand of philosophy we’ve met before in the person of Jean-Jacques Rousseau and his “noble savage.” In truth, Rousseau noted, the only things a person really, absolutely needs to survive are food and shelter. All else is, to paraphrase the Bible, vanity, and all too often brings only dissatisfaction. Back in the eighteenth century, Rousseau could already describe the collector who is never satisfied by the collection he’s assembled, only dissatisfied by its gaps.

What would he make of our times? Today’s world is one of constant beeping enticements — cars, televisions, stereos, computers, phones, game consoles — that bring only the most ephemeral bursts of happiness before we start craving the upgraded model. The anarchist activist Peter Harper:

People aspire to greater convenience and comfort, more personal space, easy mobility, a sense of expanding possibilities. This is the modern consumerist project: what modern societies are all about. It is a central feature of mainstream politics and economics that consumerist aspirations are not seriously challenged. On the contrary, the implied official message is “Hang on in there: we will deliver.” The central slogan is brutally simple: MORE!

Harper claims that, as the rest of the world continues to try and fail to find happiness in the latest shiny objects, anarchists will win them over to their cause by example. For those who reject materialist culture “will quite visibly be having a good time: comfortable, with varied lives and less stress, healthy and fit, having rediscovered the elementary virtues of restraint and balance.”

Doubtless we could all use a measure of restraint and balance in our lives, but the full anarchist project for happiness and sustainability through a deliberate deconstruction of the fruits of progress is so radical — entailing as it does the complete dissolution of nation-states and a return to decentralized communal living — that it’s difficult to fully envision how it could happen absent the sort of monumental precipitating global catastrophe that no one can wish for. While human nature will always be tempted to cast a wistful eye back to an imagined simpler, more elemental past, another, perhaps nobler part of our nature will always look forward with an ambitious eye to a bolder, more exciting future. The oft-idealized life of a tradesman prior to the Industrial Revolution, writes Francis Fukuyama, “involved no glory, dynamism, innovation, or mastery; you just plied the same traditional markets or crafts as your father and grandfather.” For many or most people that may be a fine life, and more power to them. But what of those with bigger dreams, who would spur humanity on to bigger and better things? That is to say, what of the authors of the narrative of progress of the past, present, and future, who aren’t willing to write the whole thing off as fun while it lasted and return to the land? The builders among us will never be satisfied with a return to some agrarian idyll.

The world’s current crisis of faith in progress and in the liberal-democratic principles that are so inextricably bound up with it isn’t the first or the worst of its kind. Not that terribly long ago, Nazi Germany and Imperial Japan posed a far more immediate and tangible threat to liberal democracy all over the world than anything we face today; the American Nazi party was once strong enough to rent and fill Madison Square Garden, a fact which does much to put the recent disconcerting events in Charlottesville in perspective. And yet liberal democracy got through that era all right in the end.

Even in 1983, when the Soviet Union was already teetering on the verge of economic collapse, an unknowing Jean-François Revel could write that “democracy may, after all, turn out to have been an historical accident, a brief parenthesis that is closing before our eyes.” The liberal West’s periods of self-doubt have always seemed to outnumber and outlast its periods of triumphalism, and yet progress has continued its march. During the height of the fascist era, voting rights in many democratic countries were being expanded to include all of their citizens at long last; amidst the gloominess about the future that has marked so much of post-millennial life, longstanding prejudices toward gay and lesbian people have fallen away so fast in the developed West that it’s left even many of our ostensibly progressive politicians scrambling to keep up.

Of course, the fact still remains that our planet’s current wounds are real, and global warming may in the long run prove to be the most dangerous antagonist humanity has ever faced. If we’re unwilling to accept giving up the fruits of progress in the name of healing our planet, where do we go from here? One thing that is clear is that we will have to find different, more sustainable ways of ordering our economies if progress is to continue its march. Capitalism is often praised for its ability to sublimate what Friedrich Nietzsche called the megalothymia of the most driven souls among us — the craving for success, achievement, recognition, victory — into the field of business rather than the field of battle. Would other megalothymia sublimators, such as sport, be sufficient in a post-capitalist world? What would a government/economy look like that respects people’s individual freedoms but avoids the environment-damaging, resource-draining externalities of capitalism? No one — certainly not I! — can offer entirely clear answers to these questions today. This is not so much a tribute to anything unique about our current times as it is a tribute to the nature of history itself. Who anticipated Christianity? Who anticipated that we would use the atomic bomb only twice? Who, for that matter, anticipated a President Donald Trump?

One possibility, at least in the short term, is to rejigger the rules of capitalism to bring its most problematic externalities back under the umbrella of the competitive marketplace. Experiments in cap-and-trade, which turn environment-ruining carbon emissions into a scarce commodity that corporations can exchange among themselves, have shown promising results.

But in the longer term, still more than just our economics will have to change. Because the problems of ecology and environment are global problems of a scope we’ve never faced before, we will need to think of ourselves more and more as a global society in order to solve them. In time, the nation-states in which we still invest so much patriotic fervor today may need to go the way of the scattered, self-sufficient settlements of a few dozens or hundreds that marked the earliest stages of the earliest civilizations. In time, the seeds that were planted with the United Nations in the aftermath of the bloodiest of all our stupid, pointless wars may flower into a single truly global civilization.

Really, though, I can’t possibly predict how humanity will progress its way out of its current set of predicaments. I can only have faith in the smarts and drive that have brought us this far. The best we can hope for is probably to muddle through by the skin of our teeth — but then, isn’t that what we’ve always been doing? The first civilizations began as improvised solutions to the problem of a changing climate, and we’ve been making it up as we go along ever since. So, maybe the first truly global civilization will also arise as, you guessed it, an improvised solution to the problem of a changing climate. Even if we’ve met our match with our latest nemesis of human-caused climate change, perhaps it really is better to burn out than to fade away. Perhaps it’s better to go down swinging than to survive at the cost of the grand dream of an eventual trip to Alpha Centauri.

The game which has the fulfillment of that dream as its most soul-stirring potential climax has been oft-chided for promoting a naive view of history — for being Western- and American-centric, for ignoring the plights of the vast majority of the people who have ever inhabited this planet of ours, for ignoring the dangers of the progress it celebrates. It is unquestionably guilty of all these things in whole or in part, and guilty of many more sins against history besides. But I haven’t chosen to emphasize overmuch its many problems in this series of articles because I find its guiding vision of a human race capable of improving itself down through the millennia so compelling and inspiring. Human civilization needs its critics, but it needs its optimists perhaps even more. So, may the optimistic outlook of the narrative of progress last as long as our species, and may we always have to go along with it the optimism of the game of Civilization — or of a Civilization VI, Civilization XVI, or Civilization CXVI — to exhort us to keep on keeping on.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The End of History and the Last Man by Francis Fukuyama, Democracy: A Very Short Introduction by Bernard Crick, Anarchism: A Very Short Introduction by Colin Ward, Environmental Economics: A Very Short Introduction by Stephen Smith, Globalization: A Very Short Introduction by Manfred B. Steger, Economics: A Very Short Introduction by Partha Dasgupta, Global Economic History: A Very Short Introduction by Robert C. Allen, Capital by Karl Marx, The Social Contract by Jean-Jacques Rousseau, The Genealogy of Morals by Friedrich Nietzsche, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, The Wealth of Nations by Adam Smith, How Democracies Perish by Jean-François Revel, and The Past is a Foreign Country by David Lowenthal.)

 


The Game of Everything, Part 9: Civilization and Economics

If the tailor goes to war against the baker, he must henceforth bake his own bread.

— Ludwig von Mises

There’s always the danger that an analysis of a game spills into over-analysis. Some aspects of Civilization reflect conscious attempts by its designers to model the processes of history, while some reflect unconscious assumptions about history; some aspects represent concessions to the fact that it first and foremost needs to work as a playable and fun strategy game, while some represent sheer random accidents. It’s important to be able to pull these things apart, lest the would-be analyzer wander into untenable terrain.

Any time I’m tempted to dismiss that prospect, I need only turn to Johnny L. Wilson and Alan Emrich’s ostensible “strategy guide” Civilization: or Rome on 640K a Day, which is actually far more interesting as the sort of distant forefather of this series of articles — as the very first attempt ever to explore the positions and assumptions embedded in the game. Especially given that it is such an early attempt — the book was published just a few months after the game, being largely based on beta versions of same that MicroProse had shared with the authors — Wilson and Emrich do a very credible job overall. Yet they do sometimes fall into the trap of seeing what their political beliefs make them wish to see, rather than what actually existed in the minds of the designers. The book doesn’t explicitly credit which of the authors wrote what, but one quickly learns to distinguish their points of view. And it turns out that Emrich, whose arch-conservative worldview is on the whole more at odds with that of the game than Wilson’s liberal-progressive view, is particularly prone to projection. Among the most egregious and amusing examples of him using the game as a Rorschach test is his assertion that the economy-management layer of Civilization models a rather dubious collection of ideas that have more to do with the American political scene in 1991 than they do with any proven theories of economics.

We know we’re in trouble as soon as the buzzword “supply-side economics” turns up prominently in Emrich’s writing. It burst onto the stage in a big way in the United States in 1980 with the election of Ronald Reagan as president, and has remained to this day one of his Republican party’s main talking points on the subject of economics in general. Its central, counter-intuitive claim is that tax revenues can often be increased by cutting rather than raising tax rates. Lower taxes, goes the logic, provide such a stimulus to the economy as a whole that people wind up making a lot more money. And this in turn means that the government, even though it brings in less taxes per dollar, ends up bringing in more taxes in the aggregate.

In seeing what he wanted to see in Civilization, Alan Emrich decided that it hewed to contemporary Republican orthodoxy not only on supply-side economics but also on another subject that was constantly in the news during the 1980s and early 1990s: the national debt. The Republican position at the time was that government deficits were always bad; government should be run like a business in all circumstances, went their argument, with an orderly bottom line.

But in the real world, supply-side economics and a zero-tolerance policy on deficits tend to be, shall we say, incompatible with one another. Since the era of Ronald Reagan, Republicans have balanced these oil-and-water positions against one another by prioritizing tax cuts when in power and wringing their hands over the deficit — lamenting the other party’s supposedly out-of-control spending on priorities other than their own — when out of power. Emrich, however, sees in Civilization‘s model of an economy the grand unifying theory of his dreams.

Let’s quickly review the game’s extremely simplistic handling of the economic aspects of civilization-building before we turn to his specific arguments, such as they are. The overall economic potential of your cities is expressed as a quantity of “trade arrows.” As leader, you can devote the percentage of trade arrows you choose to taxes, which add money to your treasury for spending on things like the maintenance costs of your buildings and military units and tributes to other civilizations; research, which lets you acquire new advances; and, usually later in the game, luxuries, which help to keep your citizens content. There’s no concept of deficit spending in the game; if ever you don’t have enough money in the treasury to maintain all of your buildings and units at the end of a turn, some get automatically destroyed. This, then, leads Emrich to conclude that the game supports his philosophy on the subject of deficits in general.
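For the programmatically inclined, the end-of-turn bookkeeping I’ve just described can be sketched in a few lines of Python. This is strictly a toy model of the rules as stated — the numbers are invented, and it certainly isn’t MicroProse’s actual implementation:

```python
def end_of_turn(trade_arrows, tax_rate, lux_rate, treasury, maintenance_costs):
    """Toy model of Civilization's per-turn economy: trade arrows are split
    according to the rates you set, maintenance is paid out of the treasury,
    and anything you can't pay for is lost -- there is no deficit spending."""
    taxes = int(trade_arrows * tax_rate)
    luxuries = int(trade_arrows * lux_rate)
    research = trade_arrows - taxes - luxuries   # the remainder goes to science

    treasury += taxes
    lost = []
    for item, upkeep in maintenance_costs.items():
        if treasury >= upkeep:
            treasury -= upkeep                   # upkeep paid; the item survives
        else:
            lost.append(item)                    # can't pay: the item is destroyed
    return treasury, research, luxuries, lost

# A civilization with 20 trade arrows at the default 50% tax rate:
treasury, research, lux, lost = end_of_turn(
    20, 0.5, 0.0, treasury=2,
    maintenance_costs={"Granary": 1, "Barracks": 1, "City Walls": 1})
```

Run the same function with a 10 percent tax rate and an empty treasury, and that Granary vanishes at the end of the turn — the game’s blunt enforcement of its no-deficits rule.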

But the more entertaining of Emrich’s arguments are the ones he deploys to justify supply-side economics. At the beginning of a game of Civilization, you have no infrastructure to support, and thus you have no maintenance costs at all — and, depending on which difficulty level you’ve chosen to play at, you may even start with a little bit of money already in the treasury. Thus it’s become standard practice among players to reduce taxes sharply from their default starting rate of 50 percent, devoting the bulk of their civilization’s economy early on to research on basic but vital advances like Pottery, Bronze Working, and The Wheel. With that in mind, let’s try to follow Emrich’s thought process:

To maximize a civilization’s potential for scientific and technological advancement, the authors recommend the following exercise in supply-side economics. Immediately after founding a civilization’s initial city, pull down the Game Menu and select “Tax Rate.” Reduce the tax rate from its default 50% to 10% (90% Science). This reduced rate will allow the civilization to continue to maintain its current rate of expenditure while increasing the rate at which scientific advancements occur. These advancements, in turn, will accelerate the wealth and well-being of the civilization as a whole.

In this way, the game mechanics mirror life. The theory behind tax reduction as a spur to economic growth is built on two principles: the multiplier and the accelerator. The multiplier effect is abstracted out of Sid Meier’s Civilization because it is a function of consumer spending.

The multiplier effect says that each tax dollar cut from a consumer’s tax burden and actually spent on consumer goods will net an additional 50 cents at a second stage of consumer spending, an additional 25 cents at a third stage, an additional 12.5 cents at a fourth stage, etc. Hence, economists claim that the full progression nets a total of two dollars for each extra consumer dollar spent as a result of a tax cut.

The multiplier effect cannot be observed in the game because it is only presented indirectly. Additional consumer spending causes a flash point where additional investment takes place to increase, streamline, and advance production capacity and inventory to meet the demands of the increased consumption. Production increases and advances, in turn, have an additional multiplier effect beyond the initial consumer spending. When the scientific advancements occur more rapidly in Sid Meier’s Civilization, they reflect that flash point of additional investment and allow civilizations to prosper at an ever accelerating rate.

Wow. As tends to happen a lot after I’ve just quoted Mr. Emrich, I’m not quite sure where to start. But let’s begin with his third paragraph, in particular with a phrase which is all too easy to overlook: that for this to work, the dollar cut must “actually be spent on consumer goods.” When tax rates for the wealthy are cut, the lucky beneficiaries don’t tend to go right out and spend their extra money on consumer goods. The most direct way to spur the economy through tax cuts thus isn’t to slash the top tax bracket, as Republicans have tended to do; it’s to cut the middle and lower tax brackets, which puts more money in the pockets of those who don’t already have all of the luxuries they could desire, and thus will be more inclined to go right out and spend their windfall.
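For what it’s worth, the two-dollar figure in Emrich’s account is just the standard geometric-series arithmetic of the Keynesian multiplier, easy enough to verify in a few lines of Python (the 50-percent respending rate is the one his example assumes):

```python
def spending_multiplier(mpc, rounds=50):
    """Total spending generated by one extra dollar when a fraction `mpc`
    of each round of spending is re-spent in the next round.
    The infinite series converges to 1 / (1 - mpc)."""
    return sum(mpc ** n for n in range(rounds))

# Emrich's example: half of each round is re-spent, so the series is
# 1 + 0.50 + 0.25 + 0.125 + ... which converges toward two dollars.
total = spending_multiplier(0.5)
```

The catch, of course, is that the whole series exists only if each round really is spent rather than saved — which is precisely the assumption that breaks down when the tax cut goes to those who already have everything they need.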

But, to give credit where it’s due, Emrich does at least include that little phrase about the importance of spending on consumer goods, even if he does rather bury the lede. His last paragraph is far less defensible. To appreciate its absurdity, we first have to remember that he’s talking about “consumer spending” in a Stone Age economy of 4000 BC. What are these consumers spending on? Particularly shiny pieces of quartz? And for that matter, what are they spending, considering that your civilization hasn’t yet developed currency? And how on earth can any of this be said to justify supply-side economics over the long term? You can’t possibly maintain your tax rate of 10 percent forever; as you build up your cities and military strength, your maintenance costs steadily increase, forcing you back toward that starting default rate of 50 percent. To the extent that Civilization can be said to send any message at all on taxes, said message must be that a maturing civilization will need to steadily increase its tax rate as it advances toward modernity. And indeed, as we learned in an earlier article in this series, this is exactly what has happened over the long arc of real human history. Your economic situation at the beginning of a game of Civilization isn’t some elaborate testimony to supply-side economics; it just reflects the fact that one of the happier results of a lack of civilization is the lack of a need to tax anyone to maintain it.

In reality, then, the taxation model in the game is a fine example of something implemented without much regard for real-world economics, simply because it works in the context of a strategy game like this one. Even the idea of such a centralized system of rigid taxation for a civilization as a whole is a deeply anachronistic one in the context of most societies prior to the Enlightenment, for whose people local government was far more important than some far-off despot or monarch. Taxes, especially at the national level, tended to come and go prior to AD 1700, depending on the immediate needs of the government, and lands and goods were more commonly taxed than income, which in the era before professionalized accounting was hard for the taxpayer to calculate and even harder for the tax collector to verify. In fact, a fixed national income tax of the sort on which the game’s concept of a “tax rate” seems to be vaguely modeled didn’t come to the United States until 1913. Many ancient societies — including ones as advanced as Egypt during its Old Kingdom and Middle Kingdom epochs — never even developed currency at all. Even in the game, Currency is an advance which you need to research; the cognitive dissonance inherent in earning coins for your treasury when your civilization lacks the concept of money is best just not thought about.

Let’s take a moment now to see if we can make a more worthwhile connection between real economic history and luxuries, that third category toward which you can devote your civilization’s economic resources. You’ll likely have to begin doing so only if and when your cities start to grow to truly enormous sizes, something that’s likely to happen only under the supercharged economy of a democracy. When all of the usual bread and circuses fail, putting resources into luxuries can maintain the delicate morale of your civilization, keeping your cities from lapsing into revolt. There’s an historical correspondence here that actually does seem apt; the economies of modern Western democracies, by far the most potent the world has ever known, are indeed driven almost entirely by a robust consumer market in houses and cars, computers and clothing. Yet it’s hard to know where to really go with Civilization‘s approach to luxuries beyond that abstract statement. At most, you might put 20 or 30 percent of your resources into them, leaving the rest to taxes and research, whereas in a modern developed democracy like the United States those proportions tend to be reversed.

Ironically, the real-world economic system to which Civilization‘s overall model hews closest is actually a centrally planned communist economy, where all of a society’s resources are the property of the state — i.e., you — which decides how much to allocate to what. But Sid Meier and Bruce Shelley would presumably have run screaming from any such association — not to mention our friend Mr. Emrich, who would probably have had a conniption. It seems safe to say, then, that what we can learn from the Civilization economic model is indeed sharply limited, and that most of it is there simply as a way of making a playable game.

Still, we might usefully ask whether there’s anything in the game that does seem like a clear-cut result of its designers’ attitudes toward real-world economics. We actually have seen some examples of that already in the economic effects that various systems of government have on your civilization, from the terrible performance of despotism to the supercharging effect of democracy. And there is one other area where Civilization stakes out some clear philosophical territory: in its attitude toward trade between civilizations, a subject that’s been much in the news in recent years in the West.

In the game, your civilization can reap tangible benefits from its contact with other civilizations in two ways. For one, you can use special units called caravans, which become available after you’ve researched the advance of Trade, to set up “trade routes” between your cities and those of other civilizations. Both then receive a direct boost to their economies, the magnitude of which depends on their distance from one another — farther is better — and their respective sizes. A single city can set up such mutually beneficial arrangements with up to five other cities, and see them continue as long as the cities in question remain in existence.
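For the programmatically inclined, that mechanic can be sketched in a few lines of Python. To be clear, the formula below is my own illustrative stand-in — nothing here documents Civilization‘s actual trade arithmetic — but it captures the rules as described: the bonus grows with distance and with the sizes of the two cities, and a single city supports at most five routes.

```python
# Illustrative sketch of a Civilization-style trade-route bonus.
# The exact formula is hypothetical; only the qualitative rules --
# farther and bigger is better, five routes per city at most --
# come from the game as described above.

MAX_ROUTES_PER_CITY = 5

def route_bonus(dist, size_a, size_b):
    """Trade bonus for one route: grows with distance and city sizes."""
    return (dist + size_a + size_b) // 4

def city_trade_income(routes):
    """Sum the bonuses of a city's best routes, capped at five."""
    best = sorted(routes, key=lambda r: route_bonus(*r), reverse=True)
    return sum(route_bonus(*r) for r in best[:MAX_ROUTES_PER_CITY])

# Two routes: a distant, large partner outearns a nearby small one.
routes = [(20, 8, 10), (4, 8, 3)]
print(city_trade_income(routes))  # 9 + 3 = 12
```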

In addition to these arrangements, you can horse-trade advances directly with the leaders of other civilizations, giving your counterpart one of your advances in exchange for one you haven’t yet acquired. It’s also possible to take advances from other civilizations by conquering their cities or demanding tribute, but such hostile approaches have obvious limits to which a symbiotic trading relationship isn’t subject; fighting wars is expensive in terms of blood and treasure alike, and you’ll eventually run out of enemy cities to conquer. If, on the other hand, you can set up warm relationships with four or five other civilizations, you can positively rocket up the Advances Chart.

The game’s answer to the longstanding debate between free trade and protectionism — between, to put a broader framing on it, a welcoming versus an isolationist attitude toward the outside world — is thus clear: those civilizations which engage economically with the world around them benefit enormously and get to Alpha Centauri much faster. Such a position is very much in line with the liberal-democratic theories of history that were being espoused by thinkers like Francis Fukuyama at the time Meier and Shelley were making the game — thinkers whose point of view Civilization adopts, whether unconsciously or knowingly.

As has become par for the course by now, I believe that the position Civilization and Fukuyama alike take on this issue is quite well-supported by the evidence of history. To see proof, one doesn’t have to do much more than look at where the most fruitful early civilizations in history were born: near oceans, seas, and rivers. Egypt was, as the ancient historian Herodotus so famously put it, “the gift of the Nile”; Athens was born on the shores of the Mediterranean; Rome on the east bank of the wide and deep Tiber River. In ancient times, when overland travel was slow and difficult, waterways were the superhighways of their era, facilitating the exchange of goods, services, and — just as importantly — ideas over long distances. It’s thus impossible to imagine these ancient civilizations reaching the heights they did without this access to the outside world. Even today port cities are often microcosms of the sort of dynamic cultural churn that spurs civilizations to new heights. Not for nothing does every player of the game of Civilization want to found her first city next to the ocean or a river — or, if possible, next to both.

To better understand how these things work in practice, let’s return one final time to the dawn of history for a narrative of progress involving one of the greatest of all civilizations in terms of sheer longevity.

Egypt was far from the first civilization to spring up in the Fertile Crescent, that so-called “cradle of civilization.” The changing climate that forced the hunter-gatherers of the Tigris and Euphrates river valleys to begin to settle down and farm as early as 10,000 BC may not have forced the peoples roaming the lands near the Nile to do the same until as late as 4000 BC. Yet Egyptian civilization, once it took root, grew at a crazy pace, going from primitive hunter-gatherers to a culture that eclipsed all of its rivals in grandeur and sophistication in less than 1500 years. How did Egypt manage to advance so quickly? Well, there’s strong evidence that it did so largely by borrowing from the older, initially wiser civilizations to its east.

Writing is among the most pivotal advances for any young civilization; it allows the tallying of taxes and levies, the inventorying of goods, the efficient dissemination of decrees, the beginning of contracts and laws and census-taking. It was, if anything, even more important in Egypt than in other places, for it facilitated a system of strong central government that was extremely unusual in the world prior to the Enlightenment of many millennia later. (Ancient Egypt at its height was, in other words, a marked exception to the rule about local government being more important than national prior to the modern age.) Yet there’s a funny thing about Egypt’s famous system of hieroglyphs.

In nearby Sumer, almost certainly the very first civilization to develop writing, archaeologists have traced the gradual evolution of cuneiform writing by fits and starts over a period of many centuries. But in Egypt, by contrast, writing just kind of appears in the archaeological record, fully-formed and out of the blue, around 3000 BC. Now, it’s true that Egypt didn’t simply take the Sumerian writing system; the two use completely different sets of symbols. Yet many archaeologists believe that Egypt did take the idea of writing from Sumer, with whom they were actively trading by 3000 BC. With the example of a fully-formed vocabulary and grammar, all translated into a set of symbols, the actual implementation of the idea in the context of the Egyptian language was, one might say, just details.

How long might it have taken Egypt to make the conceptual leap that led to writing without the Sumerian example? Too long, one suspects, to have built the Pyramids of Giza by 2500 BC. Further, we see other diverse systems of writing spring up all over the Mediterranean and Middle East at roughly the same time. Writing was an idea whose time had come, thanks to trading contacts. Trade meant that every new civilization wasn’t forced to reinvent every wheel for itself. It’s since become an axiom of history that an outward-facing civilization is synonymous with youth and innovation and vigorous growth, an inward-turning civilization synonymous with age and decadence and decrepit decline. It happened in Egypt; it happened in Greece; it happened in Rome.

But, you might say, the world has changed a lot since the heyday of Rome. Can this reality that ancient civilizations benefited from contact and trade with one another really be applied to something like the modern debate over free trade and globalization? It’s a fair point. To address it, let’s look at the progress of global free trade in times closer to our own.

In the game of Civilization, you won’t be able to set up a truly long-distance, globalized trading network with other continents until you’ve acquired the advance of Navigation, which brings with it the first ships that are capable of transporting your caravan units across large tracts of ocean. In real history, the first civilizations to acquire such things were those of Europe, in the late fifteenth century AD. Economists have come to call this period “The First Globalization.”

And, tellingly, they also call this period “The Great Divergence.” Prior to the arrival of ships capable of spanning the Atlantic and Pacific Oceans, several regions of the world had been on a rough par with Europe in terms of wealth and economic development. In fact, at least one great non-European civilization — that of China — was actually ahead; roughly one-third of the entire world’s economic output came from China alone, outdistancing Europe by a considerable margin. But, once an outward-oriented Europe began to establish itself in the many less-developed regions of the world, all of that changed, as Europe surged forward to the leading role it would enjoy for the next several centuries.

How did the First Globalization lead to the Great Divergence? Consider: when the Portuguese explorer Vasco da Gama reached India in 1498, he found he could buy pepper there, where it was commonplace, for a song. He could then sell it back in Europe, where it was still something of a delicacy, for roughly 25 times what he had paid for it, all while still managing to undercut the domestic competition. Over the course of thousands of similar trading arrangements, much of the rest of the world came to supply Europe with the cheap raw materials which were eventually used to fuel the Industrial Revolution and to kick the narrative of progress into overdrive, making even tiny European nations like Portugal into deliriously rich and powerful entities on the world stage.

And what of the great competing civilization of China? As it happens, it might easily have been China instead of Europe that touched off the First Globalization and thereby separated itself from the pack of competing civilizations. By the early 1400s, Chinese shipbuilding had advanced enough that its ships were regularly crisscrossing the Indian Ocean between established trading outposts on the east coast of Africa. If the arts of Chinese shipbuilding and navigation had continued to advance apace, it couldn’t have been much longer until its ships crossed the Pacific to discover the Americas. How different would world history have been if they had? Unfortunately for China, the empire’s leaders, wary of supposedly corrupting outside influences, made a decision around 1450 to adopt an isolationist posture. Existing trans-oceanic trade routes were abandoned, and China retreated behind its Great Wall, leaving Europe to reap the benefits of global trade. By 1913, China’s share of the world’s economy had dropped to 4 percent. The most populous country in the world had become a stagnant backwater in economic terms. So, we can say that Europe’s adoption of an outward-facing posture just as China did the opposite at this critical juncture became one of the great difference-makers in world history.

We can already see in the events of the late fifteenth century the seeds of the great debate over globalization that rages as hotly as ever today. While it’s clear that the developed countries of Europe got a lot out of their trading relationships, it’s far less clear that the less-developed regions of the world benefited to anything like the same extent — or, for that matter, that they benefited at all.

This first era of globalization was the era of colonialism, when developed Europe freely exploited the non-developed world by toppling or co-opting whatever forms of government already existed among its new trading “partners.” The period brought a resurgence of the unholy practice of slavery, along with forced religious conversions, massacres, and the theft of entire continents’ worth of territory. Much later, over the course of the twentieth century, Europe gradually gave up most of its colonies, allowing the peoples of its former overseas possessions their ostensible freedom to build their own nations. Yet the fundamental power imbalances that characterized the colonial period have never gone away. Today the developing world of poor nations trades with the developed world of rich nations under the guise of being equal sovereign entities, but the former still feeds raw materials to the industrial economies of the latter — or, increasingly, developing industrial economies feed finished goods to the post-industrial knowledge economies of the ultra-developed West. Proponents of economic globalization argue that all of this is good for everyone concerned, that it lets each country do what it does best, and that the resulting rising economic tide lifts all their boats. And they argue persuasively that the economic interconnections globalization has brought to the world have been a major contributing factor to the unprecedented so-called “Long Peace” of the last three quarters of a century, in which wars between developed nations have not occurred at all and war in general has become much less frequent.

But skeptics of economic globalism have considerable data of their own to point to. In 1820, the richest country in the world on a per-capita basis was the Netherlands, with an inflation-adjusted average yearly income of $1838, while the poorest region of the world was Africa, with an average income of $415. In 2017, the Netherlands had an average income of $53,582, while the poorest country in the world for which data exists was in, you guessed it, Africa: it was the Central African Republic, with an average income of $681. The richest countries, in other words, have seen exponential economic growth over the last two centuries, while some of the poorest have barely moved at all. This pattern is by no means entirely consistent; some countries of Asia in particular, such as Taiwan, South Korea, Singapore, and Japan, have done well enough for themselves to join the upper echelon of highly-developed post-industrial economies. Yet it does seem clear that the club of rich nations has grown to depend on at least a certain quantity of nations remaining poor in order to keep down the prices of the raw materials and manufactured goods they buy from them. If the rising tide lifted these nations’ boats to equality with those of the rich, the asymmetries on which the whole world economic order runs today wouldn’t exist anymore. The very stated benefits of globalization carry within them the logic for keeping the poor nations’ boats from rising too high: if everyone has a rich, post-industrial economy, who’s going to do the world’s grunt work? This debate first really came to the fore in the 1990s, slightly after the game of Civilization, as anti-globalization became a rallying cry of much of the political left in the developed world, who pointed out the seemingly inherent contradictions in the idea of economic globalization as a universal force for good.
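To make that gap concrete, here is the back-of-the-envelope arithmetic on the figures just quoted. (Bear in mind that the 1820 number is an African regional average while the 2017 number is for the Central African Republic specifically, so the comparison is a rough one.)

```python
# Growth multiples implied by the income figures quoted above
# (inflation-adjusted average yearly incomes).
netherlands_1820, netherlands_2017 = 1_838, 53_582
africa_1820, car_2017 = 415, 681

nl_growth = netherlands_2017 / netherlands_1820   # roughly 29-fold growth
car_growth = car_2017 / africa_1820               # roughly 1.6-fold growth
print(round(nl_growth, 1), round(car_growth, 1))  # 29.2 1.6
```

Two centuries, in other words, multiplied the richest country’s incomes some 29 times over while the poorest region’s incomes didn’t even double.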

Do note that I referred to “economic globalization” there. We should do what we can to separate it from the related concepts of political globalization and cultural globalization, even as the trio can often seem hopelessly entangled in the real world. Still, political globalization, in the form of international bodies like the United Nations and the International Court of Justice, is usually if not always supported by leftist critics of economic globalization.

But cultural globalization is decried to almost an equal degree, being sometimes described as the “McDonaldization” of the world. Once-vibrant local cultures all over the world, goes the claim, are being buried under the weight of an homogenized global culture of consumption being driven largely from the United States. Kids in Africa who have never seen a baseball game rush out to buy the Yankees caps worn by the American rap stars they worship, while gangsters kill one another over Nike sneakers in the streets of China. Developing countries, the anti-globalists say, first get exploited to produce all this crap, then get the privilege of having it sold back to them in ways that further eviscerate their cultural pride.

And yet, as always with globalization, there’s also a flip side. A counter-argument might point out that at the end of the day people have a right to like what they like (personally, I have no idea why anyone would eat a McDonald’s hamburger, but tastes evidently vary), and that cultures have blended with and assimilated one another from the days when ancient Egypt traded with ancient Sumer. Young people in particular in the world of today have become crazily adept at juggling multiple cultures: getting married in a traditional Hindu ceremony on Sunday and then going to work in a smart Western business suit on Monday, listening to Beyoncé on their phone as they bike their way to sitar lessons. Further, the emergence of new forms of global culture, assisted by the magic of the Internet, have already fostered the sorts of global dialogs and global understandings that can help prevent wars; it’s very hard to demonize a culture which has produced some of your friends, or even just creative expressions you admire. As the younger generations who have grown up as members of a sort of global Internet-enabled youth culture take over the levers of power, perhaps they will become the vanguard of a more peaceful, post-nationalist world.

The debate about economic globalization, meanwhile, has shifted in some surprising ways in recent years. Once a cause associated primarily with the academic left, cosseted in their ivory towers, the anti-globalization impulse has now become a populist movement that has spread across the political spectrum in many developed countries of the West. Even more surprisingly, the populist debate has come to center not on globalization’s effect on the poor nations on the wrong side of the power equation but on those rich nations who would seem to be its clear-cut beneficiaries. In just the last couple of years as of this writing, blue-collar workers who feel bewildered and displaced by the sheer pace of an ever-accelerating narrative of progress in an ever more multicultural world were a driving force behind the Brexit vote in Britain and the election of Donald Trump to the presidency of the United States. The understanding of globalization which drove both events was simplistic and confused — trade deficits are no more always a bad thing for any given country than is a national tax deficit — but the visceral anger behind them was powerful enough to shake the established Western world order more than any event since the World Trade Center attack of 2001. It should become more clear in the next decade or so whether, as I suspect, these movements represent a reactionary last gasp of the older generation before the next, more multicultural and internationalist younger generation takes over, or whether they really do herald a more fundamental shift in geopolitics.

As for the game of Civilization: to attempt to glean much more from its simple trading mechanisms than we already have would be to fall into the same trap that ensnared Alan Emrich. A skeptic of globalization might note that the game is written from the perspective of the developed world, and thus assumes that your civilization is among the privileged ranks for whom globalization on the whole has been — sorry, Brexiters and Trump voters! — a clear benefit. This is true even if the name of the civilization you happen to be playing is the Aztecs or the Zulus, peoples for whom globalization in the real world meant the literal end of their civilizations. As such examples prove, the real world is far more complicated than the game makes it appear. Perhaps the best lesson to take away — from the game as well as from the winners and arguable losers of globalization in our own history — is that it really does behoove a civilization to actively engage with the world. Because if it doesn’t, at some point the world will decide to engage with it.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The End of History and the Last Man by Francis Fukuyama, Economics by Paul Samuelson, The Rise and Fall of Ancient Egypt by Toby Wilkinson, Enlightenment Now: The Case for Reason, Science, Humanism, and Progress by Steven Pinker, Global Economic History: A Very Short Introduction by Robert C. Allen, Globalization: A Very Short Introduction by Manfred B. Steger, Taxation: A Very Short Introduction by Stephen Smith, and Guns, Germs, and Steel: The Fates of Human Societies by Jared Diamond.)

