
Peter Molyneux’s Kingdom in a Box

Peter Molyneux, circa 1990.

I have this idea of a living world, which I have never achieved. It’s based upon this picture in my head, and I can see what it’s like to play that game. Every time I do it, then it maybe gets closer to that ideal. But it’s an ambitious thing.

— Peter Molyneux

One day as a young boy, Peter Molyneux stumbled upon an ant hill. He promptly did what young boys do in such situations: he poked it with a stick, watching the inhabitants scramble around as destruction rained down from above. But then, Molyneux did something that set him apart from most young boys. Feeling curious and maybe a little guilty, he gave the ants some sugar for energy and watched quietly as they methodically undid the damage to their home. Just like that, he woke up to the idea of little living worlds with lots of little living inhabitants — and to the idea of he himself, the outsider, being able to affect the lives of those inhabitants. The blueprint had been laid for one of the most prominent and influential careers in the history of game design. “I have always found this an interesting mechanic, the idea that you influence the game as opposed to controlling the game,” he would say years later. “Also, the idea that the game can continue without you.” When Molyneux finally grew bored and walked away from the ant hill on that summer day in his childhood, it presumably did just that, the acts of God that had nearly destroyed it quickly forgotten. Earth — and ants — abide.

Peter Molyneux was born in the Surrey town of Guildford (also hometown of, read into it what you will, Ford Prefect) in 1959, the son of an oil-company executive and a toy-shop proprietor. To hear him tell it, he was qualified for a career in computer programming largely by virtue of being so hopeless at everything else. Being dyslexic, he found reading and writing extremely difficult, a handicap that played havoc with his marks at Bearwood College, the boarding school in the English county of Berkshire to which his family sent him for most of his teenage years. Meanwhile his less than imposing physique boded ill for a career in the military or manual labor. Thankfully, near the end of his time at Bearwood the mathematics department acquired a Commodore PET, while the student union almost simultaneously installed a Space Invaders machine. Seeing a correspondence between these two pieces of technology that eluded his fellow students, Molyneux set about trying to program his own Space Invaders on the PET, using crude character glyphs to represent the graphics that the PET, being a text-only machine, couldn’t actually draw. No matter. A programmer had been born.

These events, followed shortly by Molyneux’s departure from Bearwood to face the daunting prospect of the adult world, were happening at the tail end of the 1970s. Like so many of the people I’ve profiled on this blog, Molyneux was thus fortunate enough to be born not only into a place and circumstances that would permit a career in games, but at seemingly the perfect instant to get in on the ground floor as well. But, surprisingly for a fellow who would come to wear his huge passion for the medium on his sleeve — often almost as much to the detriment as to the benefit of his games and his professional life — Molyneux took a meandering path, needing fully another decade to rise to prominence in the field. Or, to put it less kindly: he failed, repeatedly and comprehensively, at every venture he tried for most of the 1980s before he finally found the one that clicked.

Perhaps inspired by his mother’s toy shop, his original dream was to be not so much a game designer as a computer entrepreneur. After earning a degree in computer science from Southampton University, he found himself a job working days as a systems analyst for a big company. By night, he formed a very small company called Vulcan in his hometown of Guildford to implement a novel scheme for selling blank disks. He wrote several simple programs: a music creator, some mathematics drills, a business simulator, a spelling quiz. (The last, having been created by a dyslexic and terrible speller in general, was a bit of a disaster.) For every ten disks you bought for £10, you would get one of the programs for free along with your blank disks. After placing his tiny advertisement in a single magazine, Molyneux was so confident of the results that he told his local post office to prepare for a deluge of mail, and bought a bigger mailbox for his house to hold it all. He got five orders in the first ten days, fewer than fifty in the scheme’s total lifespan — along with about fifty more inquiries from people who had no interest in the blank disks but just wanted to buy his software.

Taking their interest to heart, Molyneux embarked on Scheme #2. He improved the music creator and the business simulator and tried to sell them as products in their own right. Even years later he would remain proud of the latter in particular — his first original game, which he named Entrepreneur: “I really put loads of features into it. You ran a business and you could produce anything you liked. You had to do things like keep the manufacturing line going, set the price for your product, decide what advertising you wanted, and these random events would happen.” With contests all the rage in British games at the time, he offered £100 to the first person to make £1 million in Entrepreneur. The prize went unclaimed; the game sold exactly two copies despite being released near the zenith of the early-1980s British mania for home computers. “Everybody around me was making an absolute fortune,” Molyneux remembers. “You had to be a complete imbecile in those days not to make a fortune. Yet here I was with Entrepreneur and Composer, making nothing.” He wasn’t, it appeared, very good at playing his own game of entrepreneurship; his own £1 million remained far out of reach. Nevertheless, he moved on to the next scheme.

Scheme #3 was to crack the business and personal-productivity markets via a new venture called Taurus, initiated by Molyneux and his friend Les Edgar, who were later joined by one Kevin Donkin. Molyneux having studied accounting at one time in preparation for a possible career in the field (“the figures would look so messy that no one would ever employ me”), it was decided that Taurus would initially specialize in financial software with exciting names like Taurus Accounts, Taurus Invoicing, and Taurus Stock Control. Those products, like all the others Molyneux had created, went nowhere. But now came a bizarre story of mistaken identity that… well, it wouldn’t make Molyneux a prominent game designer just yet, but it would move him further down the road to that destination.

Commodore was about to launch the Amiga in Britain, and, this being early on when they still saw it as potential competition for the IBMs of the world, was looking to convince makers of productivity software to write for the machine. They called up insignificant little Taurus of all people to request a meeting to discuss porting the “new software” the latter had in the works to the Amiga. Molyneux and Edgar assumed Commodore must have somehow gotten wind of a database program they were working on. In a state of no small excitement, they showed up at Commodore UK’s headquarters on the big day and met a representative. Molyneux:

He kept talking about “the product,” and I thought they were talking about the database. At the end of the meeting, they say, “We’re really looking forward to getting your network running on the Amiga.” And it suddenly dawned on me that this guy didn’t know who we were. Now, we were called Taurus, as in the star sign. He thought we were Torus, a company that produced networking systems. I suddenly had this crisis of conscience. I thought, “If this guy finds out, there go my free computers down the drain.” So I just shook his hand and ran out of that office.

An appropriately businesslike advertisement for Taurus’s database manager gives no hint of what actually lies in the company’s future…

By the time Commodore figured out they had made a terrible mistake, Taurus had already been signed as official Amiga developers and given five free Amigas. They parlayed those things into a two-year career as makers of somewhat higher-profile but still less than financially successful productivity software for the Amiga. After the database, which they named Acquisition and declared “the most complete database system conceived on any microcomputer” — Peter Molyneux’s habit of over-promising, which gamers would come to know all too well, was already in evidence — they started on a computer-aided-design package called X-CAD Designer. Selling in the United States for the optimistic prices of $300 and $500 respectively, both programs got lukewarm reviews; they were judged powerful but kind of incomprehensible to actually use. But even had the reviews been better, high-priced productivity software was always going to be a hard sell on the Amiga. There were just three places to really make money in Amiga software: in personal-creativity software like paint programs, in video-production tools, and, most of all, in games. In spite of all of Commodore’s earnest efforts to the contrary, the Amiga had by now become known first and foremost as the world’s greatest gaming computer.

The inspiration for the name of Bullfrog Software.

Molyneux and his colleagues therefore began to wind down their efforts in productivity software in favor of a new identity. They renamed their company Bullfrog after a ceramic figurine they had lying around in the “squalor” of what Molyneux describes as their “absolutely shite” office in a Guildford pensioner’s attic. Under the new name, they planned to specialize in games — Scheme #4 for Peter Molyneux. “We had a simple choice of hitting our head against a brick wall with business software,” he remembers, “or doing what I really wanted to do with my life anyway, which was write games.” Yet having made the choice to turn Bullfrog into a game developer, they released as their first product not a game at all but a simple drum sequencer for the Amiga called A-Drum. Hobgoblins and little minds and all the rest. When A-Drum duly flopped, they finally got around to games.

A friend of Molyneux’s had written a budget-priced action-adventure for the Commodore 64 called Druid II: Enlightenment, and was looking for someone to do an Amiga conversion. Bullfrog jumped at the chance, even though Molyneux, who would always persist in describing himself as a “rubbish” programmer, had very little idea how to program an action game. When asked by Enlightenment‘s publisher Firebird whether he could do the game in one frame — i.e., whether he could update everything onscreen within a single pass of the electron gun painting the screen to maintain the impression of smooth, fluid movement — an overeager Molyneux replied, “Are you kidding me? I can do it in ten frames!” It wasn’t quite the answer Firebird was looking for. But in spite of it all, Bullfrog somehow got the job, producing what Molyneux describes as a “technically rather poor” port of what had been a rather middling game in the first place. (Molyneux’s technique for getting everything drawn in one frame was to simply keep shrinking the size of the display until even his inefficient routines could do the job.) And then, as usual for everything Molyneux touched, it flopped. But Bullfrog did get two important things out of the project: they learned much about game programming, and they recruited as artist for the project one Glenn Corpes, who was not only a talented pixel pusher but also a talented programmer and fount of ideas almost the equal of Molyneux.

Despite the promising addition of Corpes, the first original game conjured up by the slowly expanding Bullfrog fared little better than Enlightenment. Corpes and Kevin Donkin turned out a very of-its-time top-down shoot-em-up called Fusion, which Electronic Arts agreed to release. Dismissed as “a mixture of old ideas presented in a very unexciting manner” by reviewers, Fusion was even less impressive technically than had been the Enlightenment port, being plagued by clashing colors and jittery scrolling — not at all the sort of thing to impress the notoriously audiovisually-obsessed Amiga market. Thus Fusion flopped as well, keeping Molyneux’s long record of futility intact. But then, unexpectedly from this group who’d shown so little sign of ever rising above mediocrity, came genius.

To describe Populous as a stroke of genius would be a misnomer. It was rather a game that grew slowly into its genius over a considerable period of time, a game that Molyneux himself considers more an exercise in evolution than conscious design. “It wasn’t an idea that suddenly went ‘Bang!'” he says. “It was an idea that grew and grew.” And its genesis had as much to do with Glenn Corpes as it did with Peter Molyneux.

Every Populous world is built out of combinations of just 56 blocks.

It all began when Corpes started showing off a routine he had written which let him build isometric landscapes out of three-dimensional blocks, like a virtual Lego set. You could move the viewpoint about the landscape, raising and lowering the land by left-clicking to add new blocks, right-clicking to remove them. Molyneux was immediately sure there was a game in there somewhere. His childhood memory of the ant hill leaping to mind, he said, “Let’s have a thousand people running around on it.”

Populous thus began with those little people in lieu of ants, wandering independently over Corpes’s isometric landscapes in real time. When they found a patch they liked, they would settle down, building little huts. Since the player, this being a computer game, would obviously need something to do as well, Molyneux started adding ways for you, as a sort of God on high, to influence the people’s behavior in indirect ways. He added something he called a “Papal Magnet,” a huge ankh you could place in the world to draw your people toward a given spot. But there would come a problem if the way to the ankh happened to be blocked by, say, a lake. Molyneux claims he added Populous‘s most basic mechanic, the thing you spend by far the most time doing when playing the game, as a response to his “incompetence” as a coder and resulting inability to write a proper path-finding algorithm: when your people get stuck somewhere, you can, subject to your mana reserves — even gods have limits — raise or lower the land to help them out. With that innovation, Populous from the player’s perspective became largely an exercise in terraforming, creating smooth, even landscapes on which your people can build their huts, villages, and eventually castles. As your people become fruitful and multiply, their prayers fuel your mana reserves.
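To make that core loop concrete, here is a minimal sketch of the raise-and-lower mechanic in Python: a grid of corner heights, a flatness test for building, and a mana cost per change. All names and numbers here are illustrative assumptions, not Bullfrog’s actual code.

```python
# A minimal sketch of the raise/lower-land mechanic described above.
# The rules are illustrative assumptions: the world is a grid of corner
# heights, a tile is buildable when its four corners are level, and
# each terraforming step costs a fixed amount of mana.

SIZE = 16          # hypothetical world edge length, in tiles
MANA_COST = 5      # hypothetical mana cost per raise/lower

# (SIZE + 1) x (SIZE + 1) corner heights, initially flat
heights = [[0] * (SIZE + 1) for _ in range(SIZE + 1)]
mana = 100

def is_buildable(tx, ty):
    """Followers can settle a tile only when it is perfectly flat."""
    corners = {heights[ty][tx], heights[ty][tx + 1],
               heights[ty + 1][tx], heights[ty + 1][tx + 1]}
    return len(corners) == 1

def terraform(cx, cy, delta):
    """Raise (+1) or lower (-1) a single corner, spending mana."""
    global mana
    if mana < MANA_COST:
        return False   # even gods have limits
    heights[cy][cx] += delta
    mana -= MANA_COST
    return True

heights[1][1] = 1                # a bump blocks settlement of tile (0, 0)
print(is_buildable(0, 0))        # False
terraform(1, 1, -1)              # the god smooths the land
print(is_buildable(0, 0), mana)  # True 95: a hut can go up here now
```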

Next, Molyneux added warfare to the picture. Now you would be erecting mountains and lakes to protect your people from their enemies, who start out walking about independently on the other side of the world. The ultimate goal of the game, of course, is to use your people to wipe out your enemy’s people before they do the same to you; this is a very Old Testament sort of religious experience. To aid in that goal, Molyneux gradually added lots of other godly powers to your arsenal, more impressive than the mere raising and lowering of land if also far more expensive in terms of precious mana: flash floods, earthquakes, volcanic eruptions, etc. You know, all your standard acts of God, as found in the Bible and insurance claims.

Lego Populous. Bullfrog had so much fun with this implementation of the idea that they seriously discussed trying to turn it into a commercial board game.

Parts of Populous were prototyped on the tabletop. Bullfrog used Lego bricks to represent the landscapes, a handy way of implementing the raising-and-lowering mechanic in a physical space. They went so far as to discuss a license with Lego, only to be told that Lego didn’t support “violent games.” Molyneux admits that the board game, while playable, was very different from the computerized Populous, playing out as a slow-moving, chess-like exercise in strategy. The computer Populous, by contrast, can get as frantic as any action game, especially in the final phase when all the early- and mid-game maneuvering and feinting comes down to the inevitable final genocidal struggle between Good and Evil.

Bullfrog. From left: Glenn Corpes (artist and programmer), Shaun Cooper (artist and tester), Peter Molyneux (designer and programmer), Kevin Donkin (designer and programmer), Les Edgar (office manager), Andy Jones (artist and tester).

Ultimately far more important to the finished product than Bullfrog’s Lego Populous were the countless matches Molyneux played on the computer against Glenn Corpes. Apart from all of its other innovations in helping to invent the god-game and real-time-strategy genres, Populous was also a pioneering effort in online gaming. Multi-player games — the only way to play Populous for many months — took place between two people seated at two separate Amigas, connected together via modem or, if together in the same room as Molyneux and Corpes were, via a cable. Vanishingly few other designers were working in this space at the time, for understandable reasons: even leaving aside the fact that the majority of computer owners didn’t own modems, running a multi-player game in real time over a connection as slow as 1200 baud was a programming challenge not for the faint of heart. The fact that it works at all in Populous rather puts the lie to Molyneux’s self-deprecating description of himself as a “rubbish” coder.
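Some back-of-the-envelope arithmetic shows just how daunting that link was. The map and command sizes below are illustrative assumptions rather than measurements of Populous itself, but they make clear why a real-time game at 1200 baud all but has to keep both simulations running deterministically in parallel and exchange only the players’ commands.

```python
# Why 1200 baud pushes a game toward a commands-only protocol: rough
# arithmetic with hypothetical sizes, not measurements of Populous.

BAUD = 1200
BYTES_PER_SEC = BAUD // 10         # ~10 bits per byte with start/stop bits

map_bytes = 128 * 128              # a hypothetical one-byte-per-tile world
print(map_bytes / BYTES_PER_SEC)   # ~136 seconds just to send the map once

command_bytes = 8                  # a hypothetical packed player command
print(BYTES_PER_SEC // command_bytes)  # ~15 commands per second of headroom
```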

You draw your people toward different parts of the map by placing the Papal Magnet. The first one to touch it becomes the leader. There are very few words in the game, which only made it that much easier for Electronic Arts to localize and popularize across Europe. Everything is instead done using the initially incomprehensible suite of icons near the bottom of the screen. Populous does become intuitive in time, but it’s not without a learning curve.

Development of Populous fell into a comfortable pattern. Molyneux and Corpes would play together for several hours every evening, then nip off to the pub to talk about their experiences. Next day, they’d tweak the game, then they’d go at it again. It’s here that we come to the beating heart of Molyneux’s description of Populous as a game evolved rather than designed. Almost everything in the finished game beyond the basic concept was added in response to Molyneux and Corpes’s daily wars. For instance, Molyneux initially added knights, super-powered individuals who can rampage through enemy territory and cause a great deal of havoc in a very short period of time, to prevent their games from devolving into endless stalemates. “A game could get to the point where both players had massive populations,” he says, “and there was just no way to win.” With knights, the stronger player “could go and massacre the other side and end the game at a stroke.”

A constant theme of all the tweaking was to make a more viscerally exciting game that played more quickly. For commercial as well as artistic reasons — Amiga owners weren’t particularly noted for their patience with slow-paced, cerebral games — this was considered a priority. Over the course of development, the length of the typical game Molyneux played with Corpes shrank from several hours to well under one.

Give them time, and your people will turn their primitive huts into castles.

Even tweaked to play quickly and violently, Populous was quite a departure from the tried-and-true Amiga fare of shoot-em-ups, platformers, and action-adventures. The unenviable task of trying to sell the thing to a publisher was given to Les Edgar. After visiting about a dozen publishers, he convinced Electronic Arts to take a chance on it. Bullfrog promised EA a finished Populous in time for Christmas 1988. By the time that deadline arrived, however, it was still an online multiplayer-only game, a prospect EA knew to be commercially untenable. Molyneux and his colleagues thus spent the next few months creating Populous‘s single-player “Conquest Mode.”

In addition to the green and pleasant land of the early levels, there are also worlds of snow and ice, desert worlds, and even worlds of fire and lava to conquer.

Perilously close to being an afterthought to the multi-player experience though it was, Conquest Mode would be the side of the game that the vast majority of its eventual players would come to know best if not exclusively. Rather than design a bunch of scenarios by hand, Bullfrog wrote an algorithm to procedurally generate 500 different “worlds” for play against a computer opponent whose artificial intelligence also had to be created from scratch during this period. This method of content creation, used most famously by Ian Bell and David Braben in Elite, was something of a specialty and signpost of British game designers, who, plagued by hardware limitations far more stringent than their counterparts in the United States, often used it as a way to minimize the space their games consumed in memory and on disk. Most recently, Geoff Crammond’s hit game The Sentinel, published by Firebird, had used a similar scheme. Glenn Corpes believes it may have been an EA executive named Joss Ellis who first suggested it to Bullfrog.

Populous‘s implementation is fairly typical of the form. Each of the 500 worlds except the first is protected by a password that is, like everything else, itself procedurally generated. When you win at a given level, you’re given the password to a higher, harder level; whether and how many levels you get to skip is determined by how resounding a victory you’ve just managed. It’s a clever scheme, packing a hell of a lot of potential gameplay onto a single floppy disk and even making an effort to avoid boring the good player — and all without forcing Bullfrog to deal with the complications of actually storing any state whatsoever onto disk.
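Here is a hypothetical sketch in Python of how such a password scheme can work: derive each world and its password deterministically from the level number alone, so nothing ever has to be written to disk. It illustrates the general technique only; Bullfrog’s actual generator and password format were surely different.

```python
import random
import string

# Hypothetical reconstruction of a password-protected procedural-level
# scheme: every world and every password is regenerated on demand from
# the level number, so no state is ever saved.

LEVELS = 500

def world_for_level(level):
    """Derive a world's parameters deterministically from its number."""
    rng = random.Random(level)            # the level number is the seed
    return {
        "terrain": rng.choice(["grass", "snow", "desert", "lava"]),
        "land_fraction": rng.uniform(0.2, 0.8),
        "enemy_aggression": rng.uniform(0.1, 1.0),
    }

def password_for_level(level):
    """Derive the level's password the same deterministic way."""
    rng = random.Random(level * 7919)     # arbitrary fixed multiplier
    return "".join(rng.choice(string.ascii_uppercase) for _ in range(8))

def level_for_password(password):
    """Find which world a password unlocks, if any."""
    for level in range(LEVELS):
        if password_for_level(level) == password:
            return level
    return None

print(world_for_level(42))
print(password_for_level(42))                      # an eight-letter code
print(level_for_password(password_for_level(42)))  # 42
```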

It inevitably all comes down to a frantic final free-for-all between your people and those of your enemy.

Given their previous failures, Bullfrog understandably wasn’t the most confident group when a well-known British games journalist named Bob Wade, who had already played a pre-release version of the game, came by for a visit. For hours, Molyneux remained too insecure to actually ask Wade the all-important question of what he thought of the game. At last, after Wade had joined the gang for “God knows how many” pints at their local, Molyneux worked up the courage to pop the question. Wade replied that it was the best game he’d ever played, and he couldn’t wait to get back to it — prompting Molyneux to think he must have made some sort of mistake, and that under no circumstances should he be allowed to play another minute of it in case his opinion should change. It was Wade and the magazine he was writing for at the time, ACE (Advanced Computer Entertainment), who coined the term “god game” in the glowing review that followed, the first trickle of a deluge of praise from the gaming press in Britain and, soon enough, much of the world.

Bullfrog’s first royalty check for Populous was for a modest £13,000. Their next was for £250,000, prompting a naive Les Edgar to call Electronic Arts about it, sure it was a mistake. It was no mistake; Populous alone reportedly accounted for one-third of EA’s revenue during its first year on the market. That Bullfrog wasn’t getting even bigger checks was a sign only of the extremely unfavorable deal they’d signed with EA from their position of weakness. Populous finally and definitively ended the now 30-year-old Peter Molyneux’s long run of obscurity and failure at everything he attempted. In his words, he went overnight from “urinating in the sink” and “owing more money than I could ever imagine paying back” to “an incredible life” in games. Port after port came out for the next couple of years, each of them becoming a bestseller on its platform. Populous was selected to become one of the launch titles for the Super Nintendo console in Japan, spawning a full-blown fad there that came to encompass comic books, tee-shirts, collectibles, and even a symphony concert. When they visited Japan for the first time on a promotional tour, Molyneux and Les Edgar were treated like… well, appropriately enough, like gods. Populous sold 3 million copies in all according to some reports, an almost inconceivable figure for a game during this period.

Amidst all its other achievements, Populous was also something of a pioneer in the realm of e-sports. The One magazine and Electronic Arts hosted a tournament to find the best Populous player in Britain.

While a relatively small percentage of Populous players played online, those who did became pioneers of sorts in their own right. Some bulletin-board systems set up matchmaking services to pair up players looking for a game, any time, day or night; the resulting connections sometimes spanned national borders or even oceans. The matchmakers were aided greatly by Bullfrog’s forward-thinking decision to make all versions of Populous compatible with one another in terms of online play. In making it so quick and easy to find an online opponent, these services prefigured the modern world of Internet-enabled online gaming. Molyneux pronounced them “pretty amazing,” and at the time they really were. In 1992, he spoke excitedly of a recent trip to Japan, where he’d seen a town “with 10,000 homes all linked together. You can play games with anybody in the place. It’s enormous, really enormous, and it’s growing.” If only he’d known what online gaming would grow into in the next decade or two…

A youngster named Andrew Reader wound up winning the tournament, only to get trounced in an exhibition match by the master, Peter Molyneux himself. There was talk of televising a follow-up tournament on Sky TV, but it doesn’t appear to have happened.

The original Amiga version of Populous had been released all but simultaneously with the Amiga version of SimCity. Press and public alike immediately linked the two games together; AmigaWorld magazine, for instance, went so far as to review them jointly in a single article. Both Will Wright of SimCity fame and Peter Molyneux were repeatedly asked in interviews whether they’d played the other’s game. Wright was polite but, one senses, a little uninterested in Populous, saying he “liked the idea of playing God and having a population follow you,” but “sort of wish they’d gone for a slightly more educational angle.” Molyneux was much more enthusiastic about his American counterpart’s work, repeatedly floating a scheme to somehow link the two games together in more literal fashion for online play. He claimed at one point that Maxis (developers of SimCity) and his own Bullfrog had agreed on a liaison “to go backwards and forwards” between their two companies to work on linking their games. The liaison, he claimed, had “the Populous landscape moving to and from SimCity,” and a finished product would be out sometime in 1992. Like quite a number of the more unbelievable schemes Molyneux has floated over the years, it never happened.

The idea of a linkage between SimCity and Populous, whether taking place online or in the minds of press and public, can seem on the face of it an exceedingly strange one today. How would the online linkage actually work anyway? Would the little Medieval warriors from Populous suddenly start attacking SimCity‘s peaceful modern utopias? Or would Wright’s Sims plop themselves down in the middle of Molyneux’s apocalyptic battles and start building stadiums and power plants? These were very different games: Wright’s a noncompetitive, peaceful exercise in urban planning with strong overtones of edutainment; Molyneux’s a zero-sum game of genocidal warfare that aspired to nothing beyond entertainment. Knowing as we do today the future paths of these two designers — i.e., ever further in the directions laid down by these their first significant works — only heightens the seeming dichotomy.

That said, there actually were and are good reasons to think of SimCity and Populous as two sides of the same coin. For us today, the list begins with simple historical concordance: each game marks the coming-out party of one of the most important game designers of all time, and the two appeared within bare weeks of one another.

But of course the long-term importance of these two designers to their field wasn’t yet evident in 1989; obviously players were responding to something else in associating their games with one another. Once you stripped away their very different surface trappings and personalities, the very similar set of innovations at the heart of each was laid bare. AmigaWorld said it very well in that joint review: “The real joy of these programs is the interlocking relationships. Sure, you’re a creator, but even more a facilitator, influencer, and stage-setter for little computer people who act on your wishes in their own time and fashion.” It’s no coincidence that, just as Peter Molyneux was partly inspired by an ant hill to create Populous, one of Will Wright’s projects of the near future would be the virtual ant farm SimAnt. In creating the first two god games, the two were indeed implementing a very similar core idea, albeit each in his own very different way.

Joel Billings of SSI, the king of American strategy games, had founded his company back in 1979 with the explicit goal of making computerized versions of the board games he loved. SimCity and Populous can be seen as the point when computer strategy games transcended that traditional approach. The real-time nature of these games makes them impossible to conceive of as anything other than computer-based works, while their emergent complexity makes them objects of endless fascination for their designers as much as, or more than, for their players.

In winning so many awards and entrancing so many players for so long, SimCity and Populous undoubtedly benefited hugely from their sheer novelty. Their flaws stand out more clearly today. With its low-resolution graphics and without the aid of modern niceties like tool tips and graphical overlays, SimCity struggles to find ways to communicate vital information about what your city is really doing and why, making the game into something of an unsatisfying black box unless and until you devote a lot of time and effort to understanding what affects what. Populous has many of the same interface frustrations, along with other problems that feel still more fundamental and intractable, especially if you, like the vast majority of players back in its day, experience it through its single-player Conquest Mode. Clever as they are, the procedurally generated levels combined with the fairly rudimentary artificial intelligence of your computer opponent introduce a lot of infelicities. Eventually you begin to realize that one level is pretty much the same as any other; you just need to execute the same set of strategies and tactics more efficiently to have success at the higher levels.

Both Will Wright and Peter Molyneux are firm adherents of the experimental, boundary-pushing school of game design — an approach that yields innovative games but not necessarily holistically good games every time out. And indeed, over their long careers each of them has produced at least as many misses as hits, even if we dismiss the complaints of curmudgeons like me and lump SimCity and Populous into the category of the hits. Both designers have often fallen into the trap, if trap it be, of making games that are more interesting for creators and commentators than they are fun for actual players. And certainly both have, like all of us, their own blind spots: in relying so heavily on scientific literature to inform his games, Wright has often produced end results with something of the feel of a textbook, while Molyneux has often lacked the discipline and gravitas to fully deliver on his most grandiose schemes.

But you know what? It really doesn’t matter. We need our innovative experimentalists to blaze new trails, just as we need our more sober, holistically-minded designers to exploit the terrain they discover. SimCity and Populous would be followed by decades of games that built on the possibilities they revealed — many of which I’d frankly prefer to play today over these two original ground-breakers. But, again, that reality doesn’t mean we should celebrate SimCity and Populous one iota less, for both resoundingly pass the test of historical significance. The world of gaming would be a much poorer place without Will Wright and Peter Molyneux and their first living worlds inside a box.

(Sources: The Official Strategy Guide for Populous and Populous II by Laurence Scotford; Master Populous: Blueprints for World Power by Clayton Walnum; Amazing Computing of October 1989; Next Generation of November 1998; PC Review of July 1992; The One of April 1989, September 1989, and May 1991; Retro Gamer 44; AmigaWorld of December 1987, June 1989, and November 1989; The Games Machine of November 1988; ACE of April 1989; the bonus content to the film From Bedrooms to Billions. Archived online sources include features on Peter Molyneux and Bullfrog for Wired Online, GameSpot, and Edge Online. Finally, Molyneux’s postmortem on Populous at the 2011 Game Developers Conference.

Populous is available for purchase from GOG.com.)

 


Acorn and Amstrad

…he explains to her that Sinclair, the British inventor, had a way of getting things right, but also exactly wrong. Foreseeing the market for affordable personal computers, Sinclair decided that what people would want to do with them was to learn programming. The ZX81, marketed in the United States as the Timex 1000, cost less than the equivalent of a hundred dollars, but required the user to key in programs, tapping away on that little motel keyboard-sticker. This had resulted both in the short market-life of the product and, in Voytek’s opinion, twenty years on, in the relative preponderance of skilled programmers in the United Kingdom. They had had their heads turned by these little boxes, he believes, and by the need to program them. “Like hackers in Bulgaria,” he adds, obscurely.

“But if Timex sold it in the United States,” she asks him, “why didn’t we get the programmers?”

“You have programmers, but America is different. America wanted Nintendo. Nintendo gives you no programmers…”

— William Gibson, Pattern Recognition

A couple of years ago I ventured out of the man cave to give a talk about the Amiga at a small game-development conference in Oslo. I blazed through as much of the platform’s history as I could in 45 minutes or so, emphasizing for my audience of mostly young students from a nearby university the Amiga’s status as the preeminent gaming platform in Europe for a fair number of years. They didn’t take much convincing; even this crowd, young as they were, had their share of childhood memories involving Amiga 500s and 1200s. Mostly they seemed surprised that the Amiga hadn’t ever been all that terribly popular in the United States. During the question-and-answer session, someone asked a question that stopped me short: if American kids hadn’t been playing games on their Amigas, just what the hell had they been playing on?

The answer itself wasn’t hard to arrive at: the sorts of kids who migrated from 8-bit Sinclairs, Acorns, Amstrads, and Commodores to 16-bit Amigas and Atari STs in Britain made a much more lateral move in the United States, migrating to the 8-bit Nintendo Entertainment System.

More complex and interesting are the ramifications of these trends. Because the Atari VCS console was never a major presence in Britain and the rest of Europe during its heyday, and because Nintendo arrived only very belatedly, for many years videogames played in the home there meant games played on home computers. One could say much about how the European market was shaped by having as its favored platform a device useful for creation as well as consumption. The magazines were filled with stories of bedroom gamers who had become bedroom coders and finally Software Stars. Such stories make a marked contrast to an American console-gaming magazine like Nintendo Power, all about consumption without the accompanying ethos of creation.

But most importantly for our purposes today, the relative neglect of Britain in particular by the big computing powers in the United States and Japan — for many years, Commodore was the only company of either nation to make a serious effort to sell their machines into British homes — gave space for a flourishing domestic trade in homegrown machines. When Britain became the nation with the most computers per capita on the planet at mid-decade, most of the computers in question bore the logo of either Acorn or Sinclair, the two great rivals at the heart of the young British microcomputer industry.

Acorn, co-founded by Clive Sinclair’s former right-hand man Chris Curry and an Austrian academic named Hermann Hauser, was an archetypal example of an engineering-driven company. Their machines were a little more baroque, a little better built, and consequently a little more expensive than they needed to be, while their public persona was reserved and just a little condescending, much like that of the BBC that had given its official imprimatur to Acorn’s most popular machine, the BBC Micro. Despite “Uncle Clive’s” public reputation as the British Inspector Gadget, the company bearing his name was just the opposite of Acorn: cheap and cheerful, with the common touch. Acorns sold to the educators, to the serious hobbyists, and to the posh, while Sinclairs dominated with the masses.

Yet Acorn and Sinclair were similar in one important respect: they were both in their own ways very poorly managed companies. When the British home-computer market hit an iceberg in 1985, both were caught in untenable positions, drowning in excess inventory. Acorn — quintessentially British, based in the storied heart of Britain’s “Silicon Fen” of Cambridge — was faced with a choice between dissolution and selling themselves to the Italian typewriter manufacturer Olivetti; after some hand-wringing, they chose the latter course. Sinclair also sold out: to the new kid on the block of British computing, Amstrad, owned by a gruff Cockney with a penchant for controversy named Alan Sugar who was well on his way to becoming the British Donald Trump.

Ever mindful of the practical concerns of their largely working-class customers, Amstrad made much of the CPC’s bundled monitor in their advertising, noting that Junior could play on the CPC without tying up the family television.

Amstrad had already been well-established as a maker of inexpensive stereo equipment and other consumer electronics when their first computers, the CPC (“Colour Personal Computer”) line, debuted in June of 1984. The CPC range was created and sold as a somewhat more capable Sinclair Spectrum. It consisted of well-built and smartly priced if technically unimaginative computers that were fine choices for gaming, boasting as they did reasonably good if hardly revolutionary graphics and sound. Like most Amstrad products, they strove to be as easy to use as possible, shipping as complete units — tape or disk drive and monitor included — at a time when virtually all of their rivals had to be assembled piece by piece via separate purchases.

The CPC line did very well from the outset, even as Acorn and Sinclair were soon watching their own sales implode. Pundits attributed the line’s success to what they called “the Amstrad Effect”: Alan Sugar’s instinct for delivering practical products at a good price at the precise instant when the technology behind them was ready for the mass market — i.e., was about to become desirable to his oft-stated target demographic of “the truck driver and his wife.” Sugar preferred to let others advance the technical state of the art, then swoop in to reap the rewards of their innovations when the time was right. The CPC line was a great example of him doing just that.

But the most dramatic and surprising iteration of the Amstrad Effect didn’t just feed the existing market for colorful game machines; it found an entirely new market segment, one that Amstrad’s competitors had completely missed until now. The story of the creation of the Amstrad PCW line is a classic tale of Alan Sugar, a man who knew almost nothing about computers but knew all he needed to about the people who bought them.

One day just a few months after the release of the first CPC machines, Sugar found himself in an airplane over Asia with Bob Watkins, one of his most trusted executives. A restless Sugar asked Watkins for a piece of paper, and proceeded to draw on it a contraption that included a computer, a monitor, a disk drive, and a printer, all in one unit. Looking at the market during the run-up to the CPC launch, Sugar had recognized that the only true mainstream uses for the current generation of computers in the home were as game machines and word processors. With the CPC, he had the former application covered. But what about the latter? All of the inexpensive machines currently on the market, like the Sinclair Spectrum, were oriented toward playing games rather than word processing, trading the possibility of displaying crisp 80-column text for colorful graphics in lower resolutions. Meanwhile all of the more expensive ones, like the BBC Micro, were created by and for hardcore techies rather than Sugar’s truck drivers. If they could apply their patented technology-for-the-masses approach to a word processor for the home and small business — making a cheap, well-built, all-in-one design emphasizing ease of use for the common person — Amstrad might just have another hit on their hands, this time in a market of their own utterly without competition. Internally, the project was named after Sugar’s secretary Joyce, since it would hopefully make her job and those of many like her much easier. It would eventually come to market as the “PCW,” or “Personal Computer Word Processor.”

The first Amstrad PCW machine, complete with bundled printer. Note how the disk drive and the computer itself are built into the same case as the monitor, a very unusual design for the period.

Even more so than the CPC, the PCW was a thoroughly underwhelming package for technophiles. It was built around the tried-and-true Z80 8-bit CPU and ran CP/M, an operating system already considered obsolete by big business, MS-DOS having become the standard in the wake of the IBM PC. The bundled word-processing software, contracted out to a company called Locomotive Software, wasn’t likely to impress power users of WordStar or WordPerfect overmuch — but it was, in keeping with the Amstrad philosophy, unusually friendly and easy to use. Sugar knew his target customers, knew that they “didn’t give a shit whether there was an elastic band or an 8086 or a 286 driving the thing. They wouldn’t know what you were talking about.”

As usual, most of Amstrad’s hardware-engineering efforts went into packaging and cost-cutting. It was decided that the printer would have to be housed separately from the system unit for technical reasons, but otherwise the finished machine conformed remarkably well to Sugar’s original vision. Best of all, it had a price of just £399. By way of comparison, Acorn’s most recent BBC Micro Model B+ had half as much memory and no disk drive, monitor, or printer included — and was priced at £499.

Nervous as ever about intimidating potential customers, Amstrad was at pains to market the PCW first and foremost as a turnkey word-processing solution for homes and small businesses, as a general-purpose computer only secondarily if at all. “It’s more than a word processor for less than most typewriters,” ran their tagline. At the launch event in the heart of the City in August of 1985, three female secretaries paraded across the stage: a snooty one who demanded one of the competition’s expensive computer systems; a tarty one who said a typewriter was more than good enough; and a smart, reasonable one who naturally preferred the PCW. Man-of-the-people Sugar crowed extravagantly that Amstrad had “brought word-processing within the reach of every small business, one-man band, home-worker, and two-finger typist in the country.” Harping on one of his favorite themes, he noted that once again Amstrad had “produced what the customer wants and not a boffin’s ego trip.”

Sugar’s aggressive manner may have grated on many buttoned-down trade journalists, but few could deny that he might just open up a whole new market for computers with the PCW. Electrical Retailer and Trader was typical, calling the PCW “a grown-up computer that does something people want, packaged and sold in a way they can understand, at a price they’ll accept.” But even that note of optimism proved far too mild for the reality of the machine’s success. The PCW exploded out of the gate, selling 350,000 units in the first eight months. It probably could have sold a lot more than that, but Amstrad, caught off-guard by the sales numbers despite their founder’s own bullishness on the product, couldn’t make and ship them fast enough.

Level 9’s Time and Magik text adventure running on a PCW.

Surprisingly for such a utilitarian package, the PCW garnered considerable loyalty and even love among the millions in Britain and all across Europe who eventually bought one. Their enthusiasm was enough to sustain a big, glossy newsstand magazine dedicated to the PCW alone — an odd development indeed for this machine that seemed on the face of it to be anything but a hacker’s darling. A thriving software ecosystem that reached well beyond word processing sprang up around the machine. Despite the PCW’s monochrome display and virtually nonexistent animation and sound capabilities, even games were far from unheard of on the platform. For obvious reasons, text adventures in particular became big favorites of PCW owners; with its comfortable full-travel keyboard, its fast disk drive, its relatively cavernous 256 K of memory, and its 80-column text display, a PCW was actually a far better fit for the genre than the likes of a Sinclair Spectrum. The PCW market for text adventures was strong enough to quite possibly allow companies like Magnetic Scrolls and Level 9 to hang on a year or two longer than they might otherwise have managed.

So, Amstrad was already soaring on the strength of the CPC and especially the PCW when they shocked the nation and cemented their position as the dominant force in mainstream British computing with the acquisition of Sinclair in April of 1986. Eminently practical man of business that he was, Sugar bought Sinclair partly to eliminate a rival, but also because he realized that, home-computer slump or no, the market for a machine as popular as the Sinclair Spectrum wasn’t likely to just disappear overnight. He could pick up right where Uncle Clive had left off, selling the existing machine just as it was to new buyers who wanted access to the staggering number of cheap games available for the platform. Sugar thought he could make a hell of a lot of money this way while needing to expend very little effort.

Once again, time proved him more correct than even he had ever imagined. Driven by that huge base of games, demand for new Spectrums persisted into the 1990s. Amstrad repackaged the technology from time to time and, perhaps most importantly, dramatically improved on Sinclair’s infamously shoddy quality control. But they never seriously re-imagined the Spectrum. It was now what Sugar liked to call “a commodity product.” He compared it to suntan lotion of all things: the department stores “put it in their window in July and August and they take it away in the winter.” The Spectrum’s version of July and August was of course November and December; every Christmas sparked a new rush of sales to the parents of a new group of youngsters just coming of age and discovering the magic of videogames.

A battered and uncertain Acorn, now a subsidiary of Olivetti, faced a formidable rival indeed in Alan Sugar’s organization. In a sense, the fundamental dichotomies hadn’t changed that much since Amstrad took Sinclair’s place as the yin to Acorn’s yang. Acorn remained as technology-driven as ever, while Amstrad was all about giving the masses what they craved in the form of cheap computers that were technically just good enough. Amstrad, however, was a much more dangerous form of people’s computer company than had been their predecessor in the role. After releasing some notoriously shoddy stereo equipment under the Amstrad banner in the 1970s and paying the price in returns and reputation, Alan Sugar had learned a lesson that continued to elude Clive Sinclair: that selling well-built, reliable products, even at a price of a few more quid on the final price tag and/or a few less in the profit margin, pays off more than corner-cutting in the long run. Unlike Uncle Clive, who had bumbled and stumbled his way to huge success and just as quickly back to failure, Sugar was a seasoned businessman and a master marketer. The diffident boffins of Acorn looked destined to have a hard time against a seasoned brawler like Sugar, raised on the mean streets of the cutthroat Tottenham Court Road electronics trade. It hardly seemed a fair fight at all.

But then, in the immediate wake of the acquisition by Olivetti, nothing boded particularly well for Acorn. New hardware releases were limited to enhanced versions of the 1981-vintage, 8-bit BBC Micro line that were little more ambitious than Amstrad’s re-packagings of the Spectrum. It was an open secret that Acorn was putting much effort into designing a new CPU in-house to serve as the heart of their eventual next-generation machine, an unprecedented step in an industry where CPU-makers and computer-makers had always been separate entities. For many, it seemed yet one more example of Acorn’s boffinish tendencies getting the best of them, causing them to laboriously reinvent the wheel rather than do what the rest of the microcomputer world was doing: grabbing a 68000 from Motorola or an 80286 from Intel and just getting on with the 16-bit machine their customers were clamoring for. While Acorn dithered with their new chip, they continued to fall further and further behind Amstrad, who in the wake of the Sinclair acquisition had now gone from a British home-computer market share of 0 to 60 percent in less than two years. Acorn was beginning to look downright irrelevant to many Britons in the market for the sorts of affordable, practical computer systems Amstrad was happily providing them with by the bucketful.

Measured in terms of public prominence, Acorn’s best days were indeed already behind them; they would never recapture those high-profile halcyon days of the early 1980s, when the BBC Micro had first been anointed as the British establishment’s officially designated choice for those looking to get in on the ground floor of the computer revolution. Yet the new CPU they were now in the midst of creating, far from being a pointless boondoggle, would ultimately have a far greater impact than anything they’d done before — and not just in Britain but over the entire world. For the CPU architecture Acorn was creating in those uncertain mid-1980s was the one that has gone on to become the most popular ever: the ubiquitous ARM. Since retrofitted into “Advanced RISC Machine,” “ARM” originally stood for “Acorn RISC Machine.” Needless to say, no one at Acorn had any idea of the monster they were creating. How could they?

ARM, the chip that changed the world.

“RISC” stands for “Reduced Instruction Set Computer.” The idea didn’t originate with Acorn, but had already been kicking around American university and corporate engineering departments for some time. (As Hermann Hauser later wryly noted, “Normally British people invent something, and the exploitation is in America. But this is a counterexample.”) Still, the philosophy behind ARM was adhered to by only a strident minority before Acorn first picked it up in 1983.

The overwhelming trend in commercial microprocessor design up to that point had been for chips to offer ever larger and more complex instruction sets. By making “opcodes” — single instructions issued directly to the CPU — capable of doing more in a single step, machine-level code could be made more comprehensible for programmers and the programs themselves more compact. RISC advocates came to call this traditional approach to CPU architecture “CISC,” or “Complex Instruction Set Computing.” They believed that CISC was becoming increasingly counterproductive with each new generation of microprocessors. Seeing how the price and size of memory chips continued to drop significantly almost every year, they judged — in the long term, correctly — that memory usage would become much less important than raw speed in future computers. They therefore also judged that it would be more than acceptable in the future to trade smaller programs for faster ones. And they judged that they could accomplish exactly that trade-off by traveling directly against the prevailing winds in CPU design — by making a CPU that offered a radically reduced instruction set of extremely simple opcodes that were each ruthlessly optimized to execute very, very quickly.

A program written for a RISC processor might need to execute far more opcodes than the same program written for a CISC processor, but those opcodes would execute so quickly that the end result would still be a dramatic increase in throughput. Yes, it would use more memory, and, yes, it would be harder to read as machine code — but already fewer and fewer people were programming computers at such a low level anyway. The trend, which they judged likely only to accelerate, was toward high-level languages that abstracted away the details of processor design. In this prediction again, time would prove the RISC advocates correct. Programs may not even need to be as much larger as one might think; RISC advocates argued, with some evidence to back up their claims, that few programs really took full advantage of the more esoteric opcodes of the CISC chips, that the CISC chips were in effect being programmed as if they were RISC chips much of the time anyway. In short, then, a definite minority, though not an insubstantial one, of academic and corporate researchers were beginning to believe that the time was ripe to replace CISC with RISC.

And now Acorn was about to act on that belief. In typical boffinish fashion, ARM began as essentially a personal passion project of Roger Wilson[1] and Steve Furber, two key engineers behind the original BBC Micro. Hermann Hauser admits that for quite some time he gave them “no people” and “no money” to help with the work, making ARM “the only microprocessor ever to be designed by just two people.” When talks began with Olivetti in early 1985, ARM remained such a back-burner long shot that Acorn never even bothered to tell their potential saviors about it. But as time went on the ARM chip came more and more to the fore as potentially the best thing Acorn had ever done. Having, almost perversely in the view of many, refused for so long to produce a 16-bit replacement for the BBC Micro line, Acorn now proposed to leapfrog that generation entirely; the ARM, you see, was a 32-bit chip. Early tests of the first prototype in April of 1985 showed that at 8 MHz it yielded an average throughput of about 3.5 MIPS, compared to 2.5 MIPS at 10 MHz for the 68020, the first 32-bit entry in Motorola’s popular 68000 line of CISC processors. And the ARM was much, much cheaper and simpler to produce than the 68020. It appeared that Wilson and Furber’s shoestring project had yielded a world-class microprocessor.
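
Those benchmark figures are easier to compare when normalized to instructions per clock tick, using nothing but the numbers just quoted:

```python
# Instructions per clock cycle, derived from the quoted throughput figures.
arm_ipc = 3.5 / 8      # prototype ARM: 3.5 MIPS at 8 MHz  -> ~0.44
m68020_ipc = 2.5 / 10  # Motorola 68020: 2.5 MIPS at 10 MHz -> 0.25
print(f"ARM: ~{arm_ipc:.2f} vs 68020: {m68020_ipc:.2f} instructions/cycle")
```

By that rough figure of merit, the prototype ARM was extracting about 75 percent more work from each tick of its clock than Motorola’s flagship.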

ARM made its public bow via a series of little-noticed blurbs that appeared in the British trade press around October of 1985, even as the stockbrokers in the City and BBC Micro owners in their homes were still trying to digest the news of Acorn’s acquisition by Olivetti. Acorn was testing a new “super-fast chip,” announced the magazine Acorn User, which had “worked the first time”: “It is designed to do a limited set of tasks very quickly, and is the result of the latest thinking in chip design.” From such small seeds do great empires grow.

The Acorn Archimedes

The machine that Acorn designed as a home for the new chip was called the Acorn Archimedes — or at times, because Acorn had been able to retain the official imprimatur of the BBC, the BBC Archimedes. It was on the whole a magnificent piece of kit, in a different league entirely from the competition in terms of pure performance. It was, for instance, several times faster than a 68000-based Amiga, Macintosh, or Atari ST in many benchmarks despite running at a clock speed of just 8 MHz, roughly the same as all of the aforementioned competitors. Its graphics capabilities were almost as impressive, offering 256 colors onscreen at once from a palette of 4096 at resolutions as high as 640×512. So, Acorn had the hardware side of the house well in hand. The problem was the software.

Graphical user interfaces being all the rage in the wake of the Apple Macintosh’s 1984 debut, Acorn judged that the Archimedes, too, had to be so equipped. Deciding to go to the source of the world’s very first GUI, they opened a new office for operating-system development a long, long way from their Cambridge home: right next door to Xerox’s famed Palo Alto Research Center, in the heart of California’s Silicon Valley. But the operating-system team’s progress was slow. Communication and coordination were difficult over such a distance, and the team seemed to be infected with the same preference for abstract research over practical product development that had always marked Xerox’s own facility in Palo Alto. The new operating system, to be called ARX, lagged far behind hardware development. “It became a black hole into which we poured effort,” remembers Wilson.

At last, with the completed Archimedes hardware waiting only on some software to make it run, Acorn decided to replace ARX with something they called Arthur, a BASIC-based operating environment very similar to the old BBC BASIC with a rudimentary GUI stuck on top. “All operating-system geniuses were firmly working on ARX,” says Wilson, “so we couldn’t actually spare any of the experts to work on Arthur.” The end result did indeed look like something put together by Acorn’s B team. Parts of Arthur were actually written in interpreted BASIC, which Acorn was able to get away with thanks to the blazing speed of the Archimedes hardware. Still, running Arthur on hardware designed for a cutting-edge Unix-like operating system with preemptive multitasking and the whole lot was rather like dropping a two-speed gearbox into a Lamborghini; it got the job done, after a fashion, but felt rather against the spirit of the thing.

When the Archimedes debuted in August of 1987, its price tag of £975 and up along with all of its infelicities on the software side gave little hope to those not blinded with loyalty to Acorn that this extraordinary machine would be able to compete with Amstrad’s good-enough models. The Archimedes was yet another Acorn machine for the boffins and the posh. Most of all, though, it would be bought by educators who were looking to replace aging BBC Micros and might still be attracted by the BBC branding and the partial compatibility of the new machine with the old, thanks to software emulators and the much-loved BBC BASIC still found as the heart of Arthur.

Even as Amstrad continued to dominate the mass market, a small but loyal ecosystem sprang up around the Archimedes, enough to support a software scene strong on educational software and technical tools for programming and engineering, all a natural fit for the typical Acorn user. And, while the Archimedes was never likely to become the first choice for pure game lovers, a fair number of popular games did get ported. After all, even boffins and educators — or, perhaps more likely, their students — liked to indulge in a bit of pure fun sometimes.

In April of 1989, after almost two long, frustrating years of delays, Acorn released a revision of Arthur comprehensive enough to be given a whole new name. The new RISC OS incorporated many if not all of the original ambitions for ARX, at last providing the Archimedes with an attractive modern operating system worthy of its hardware. But by then, of course, it was far too late to capture the buzz a more complete Archimedes package might have garnered at its launch back in 1987.

Much to the frustration of many of their most loyal customers, Acorn still seemed not so much inept at marketing their wares to the common person as completely uninterested in doing so. It was as if they felt themselves somehow above it all. Perhaps they had taken a lesson from their one earlier attempt to climb down from their ivory tower and sell a computer for the masses. That attempt had taken the form of the Acorn Electron, a cut-down version of the BBC Micro released in 1983 as a direct competitor to the Sinclair Spectrum. Poor sales and overproduction of the Electron had been the biggest single contributor to Acorn’s mid-decade financial collapse and the loss of their independence to Olivetti. Having survived that trauma (after a fashion), Acorn seemed content to tinker away with technology for its own sake and to let the chips fall where they would when it came to actually selling the stuff that resulted.

Alan Sugar shows off the first of his new line of PC clones.

If it provided any comfort to frustrated Acorn loyalists, Amstrad also began to seem more and more at sea after their triumphant first couple of years in the computer market. In September of 1986, they added a fourth line of computers to their catalog with the release of the PC — as opposed to PCW — range. The first IBM clones targeted at the British mass market, the Amstrad PC line might have played a role in its homeland similar to that of the Tandy 1000 in the United States, popularizing these heretofore business-centric machines among home users. As usual with Amstrad, the price certainly looked right for the task. The cheapest Amstrad PC model, with a generous 512 K of memory but no hard drive, cost £399; the most expensive, which included a 20 MB hard drive, £949. Before the Amstrad PC’s release, the cheapest IBM clone on the British market had retailed for £1429.

But, while not a flop, the PC range never took off quite as meteorically as some had expected. For months the line was dogged by reports of overheating brought on by the machines’ lack of a fan (shades of the Apple III fiasco) that may or may not have had a firm basis in fact. Alan Sugar himself was convinced that the reports could be traced back to skulduggery by IBM and other clone manufacturers trying to torpedo his cheaper machines. When he finally bowed to the pressure to add a fan, he did so about as gracelessly as can be imagined.

I’m a realistic person and we are a marketing organization, so if it’s the difference between people buying the machine or not, I’ll stick a bloody fan in it. And if they say they want bright pink spots on it, I’ll do that too. What is the use of me banging my head against a brick wall and saying, “You don’t need the damn fan, sunshine?”

But there were other problems as well, problems that were less easily fixed. Amstrad struggled to source hard disks, which had proved a far more popular option than expected, resulting in huge production backlogs on many models. And, worst of all, they found that they had finally overreached themselves by setting the prices too low to be realistically sustainable; prices began to creep upward almost immediately.

For that matter, prices were creeping upward across Amstrad’s entire range of computers. In 1986, after years of controversy over the alleged dumping of memory chips into the international market on the part of the Japanese semiconductor industry, the United States pressured Japan into signing a trade pact that would force them to throttle back their production and increase their prices. Absent the Japanese deluge, however, there simply weren’t enough memory chips being made in the world to fill an ever more voracious demand. By 1988, the situation had escalated into a full-blown crisis for volume computer manufacturers like Amstrad, who couldn’t find enough memory chips to build all the computers their customers wanted — and certainly not at the prices their customers were used to paying for them. Amstrad’s annual sales declined for the first time in a long time in 1988 after they were forced to raise prices and cut production dramatically due to the memory shortage. Desperate to secure a steady supply of chips so he could ramp up production again, Sugar bought into Micron Technology, one of only two American firms making memory chips, in October of 1988 to the tune of £45 million. But within a year the memory-chip crisis, anticipated by virtually everyone at the time of the Micron buy-in to go on for years yet, petered out when factories in other parts of Asia began to come online with new technologies to produce memory chips more cheaply and quickly than ever. Micron’s stock plummeted, another major loss for Amstrad. The buy-in hadn’t been “the greatest deal I’ve ever done,” admitted Sugar.

Many saw in the Amstrad of these final years of the 1980s an all too typical story in business: that of a company that had been born and grown wildly as a cult of personality around its founder, until one day it got too big for any one man to oversee. The founder’s vision seemed to bleed away as the middle managers and the layers of bureaucracy moved in. Seduced by the higher profit margins enjoyed by business computers, Amstrad strayed ever further from Sugar’s old target demographic. New models in the PC range crept north of £1000, even £2000 for the top-of-the-line machines, while the more truck-driver-focused PCW and CPC lines were increasingly neglected. The CPC line would be discontinued entirely in 1990, leaving only the antique Spectrum to soldier on for a couple more years for Amstrad in the role of general-purpose home computer. It seemed that Amstrad at some fundamental level didn’t really know how to go about producing a brand new machine in the spirit of the CPC in this era when making a new home computer was much more complicated than plugging together some off-the-shelf chips and hiring a few hackers to knock out a BASIC for the thing. Amstrad would continue to make computers for many years to come, but by the time the 1990s dawned their short-lived glory days of 60 percent market share were already fading into the rosy glow of nostalgia.

For all their very real achievements over the course of a very remarkable decade in British computing, Acorn and Amstrad each had their own unique blind spot that kept them from achieving even more. In the Archimedes, Acorn had a machine that was a match for any other microcomputer in the world in any application you cared to name, from games to business to education. Yet they released it in half-baked form at too high a price, then failed to market it properly. In their various ranges, Amstrad had the most comprehensive lineup of computers of anyone in Britain during the mid- to late 1980s. Yet they lacked the corporate culture to imagine what people would want five years in the future in addition to what they wanted today. The world needs visionaries and commodifiers alike. What British computing lacked in the 1980s was a company capable of integrating the two.

That lack left wide open a huge gap in the market: space for a next-generation home computer with a lot more power and much better graphics and sound than the likes of the old Sinclair Spectrum, but that still wouldn’t cost a fortune. Packaged, priced, and marketed differently, the Archimedes might have been that machine. As it was, buyers looked to foreign companies to provide. Neglected as Europe still was by the console makers of Japan, the British punters’ choice largely came down to one of two American imports, the Commodore Amiga and the Atari ST. Both — especially the former — would live very well in this gap, which neither Acorn nor Amstrad deigned to fill for far too long. Acorn did belatedly try with the release of the Archimedes A3000 model in mid-1989 — laid out in the all-in-one-case, disk-drive-on-the-side fashion of an Amiga 500, styled to resemble the old BBC Micro, and priced at a more reasonable if still not quite reasonable enough £745. But by that time the Archimedes’s fate as a boutique computer for the wealthy, the dedicated, and the well-connected was already decided. As the decade ended, an astute observer could already detect that the wild and woolly days of British computing as a unique culture unto itself were numbered.

The Archimedes A3000 marked the end of an era, the last Acorn machine to bear the BBC logo.

And that would be that, but for one detail: the fairly earth-shattering detail of ARM. The ARM CPU’s ability to get extraordinary performance out of a relatively low clock speed had a huge unintended benefit that was barely even noticed by Acorn when they were in the process of designing it. In the world of computer engineering, higher clock speeds translate quite directly into higher power usage. Thus the ARM chip could do more with less power, a quality that, along with its cheapness and simplicity, made it the ideal choice for an emerging new breed of mobile computing devices. In 1990 Apple Computer, hard at work on a revolutionary “personal digital assistant” called the Newton, came calling on Acorn. A new spinoff was formed in November of 1990, a partnership among Acorn, Apple, and the semiconductor firm VLSI Technology, who had been fabricating Acorn’s ARM chips from the start. Called simply ARM Holdings, it was intended as a way to popularize the ARM architecture, particularly in the emerging mobile space, among end-user computer manufacturers like Apple who might be leery of buying chips directly from a competitor like Acorn.

And popularize it has. To date about ten ARM CPUs have been made for every man, woman, and child on the planet, and the numbers look likely to continue to soar almost exponentially for many years to come. ARM CPUs are found today in more than 95 percent of all mobile phones. Throw in laptops (even laptops built around Intel processors usually boast several ARM chips as well), tablets, music players, cameras, GPS units… well, you get the picture. If it’s portable and it’s vaguely computery, chances are there’s an ARM inside. ARM, the most successful CPU architecture the world has ever known, looks likely to continue to thrive for many, many years to come, a classic example of unintended consequences and unintended benefits in engineering. Not a bad legacy for an era, is it?

(Sources: the book Sugar: The Amstrad Story by David Thomas; Acorn User of July 1985, October 1985, March 1986, September 1986, November 1986, June 1987, August 1987, September 1987, October 1988, November 1988, December 1988, February 1989, June 1989, and December 1989; Byte of November 1984; 8000 Plus of October 1986; Amstrad Action of November 1985; interviews with Hermann Hauser, Sophie Wilson, and Steve Furber at the Computer History Museum.)

Footnotes
1 Roger Wilson now lives as Sophie Wilson. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.

One is Enough For SimCity

It’s been a mixed week here in the man cave. On the one hand, last week’s article on SimCity blew up pretty big on social media, attracting lots of positive comments in the process; believe me, a writer can never tire of adjectives like “wonderful,” “fantastic,” and “hugely interesting.” But on the other, I made one of the more embarrassing errors in this blog’s history with the video clip I embedded into the same article, identifying the speaker in that video as Will Wright when it was actually his racing partner Rick Doherty. Then, just to compound the error, I decided to get all cute about it: “Wright shows off some of the RX-7’s gadgetry using the same rapid-fire, jargon-laden diction that journalists and tech-conference attendees would later come to know if not always love.” Ouch. That’ll teach me to try to be too clever.

But most of all this week, I’ve been struggling with my planned second article about SimCity. Actually, I’ve been struggling with my coverage of the game in general on and off for months now. My original plan was to do a deep dive, to try to draw a lot of connections to the history of urban planning and the lives of cities in general. I never could quite figure out how to do that in an interesting way, however, especially as I spent more time with the simulation itself and had to face how limited it ultimately is. My big plans got pared down to a couple of articles, one more factual and historical, one more critical and philosophical.

And then this week I couldn’t seem to get even the second of those articles to come together. My work kept devolving into a nitpicky poking of holes in the simulation, which isn’t really fair given the constraints under which it was produced. Or it became an extended critique of Will Wright for failing to make the sorts of games I personally most enjoy playing, which is still less fair, especially given that his work on SimCity cracked open the door for so many later games I do unabashedly love.

At some point in all of this, I realized something: that the first article stood alone just fine. It says everything I want and need to say about SimCity‘s history and its importance to gaming. Should there be any doubt, the latter will be reemphasized again and again in future articles, as I write about some of the countless games that bear the stamp of Will Wright’s original innovation — not least many of the games designed by my personal hero Sid Meier. So, I’ve decided largely to leave well enough alone. I’ve tinkered a bit with the first — now only — article to highlight a few points, but rereading it is probably only for the extremely dedicated among you.

I feel very good about the decision, but all of this wheel-spinning does mean that I can’t give you a new article this week, for which my apologies. As the writers among you can doubtless attest, sometimes you don’t really know what you already have until you try to add to it. What can I say? It’s a process.

Next week we’ll be continuing with our previously planned programming, returning to the British scene for a few more articles after our brief sojourn back to the United States for SimCity. I’ve got some very interesting material in the oven — and material which I thankfully know exactly what to do with. So, catch you then.

In the meantime, thanks a million as always for reading and supporting this work!

 

Will Wright’s City in a Box

Will Wright, 1990

In “The Seventh Sally,” a story by the great Polish science-fiction writer Stanislaw Lem, a god-like “constructor” named Trurl comes upon a former tyrant named Excelsius, now exiled to a lonely asteroid by the peoples of the planets he used to terrorize. Upon learning of Trurl’s powers, Excelsius demands that he restore him to his throne. Trurl, however, is wise enough to consider what suffering Excelsius’s reinstatement would bring to his subjects. So, he instead fashions an intricate simulacrum of a kingdom for Excelsius to rule over.

And all of this, connected, mounted, and ground to precision, fit into a box, and not a very large box, but just the size that could be carried about with ease. This Trurl presented to Excelsius, to rule and have dominion over forever; but first he showed him where the input and output of his brand-new kingdom were, and how to program wars, quell rebellions, exact tribute, collect taxes, and also instructed him in the critical points and transition states of that microminiaturized society — in other words the maxima and minima of palace coups and revolutions — and explained everything so well that the king, an old hand in the running of tyrannies, instantly grasped the directions and, without hesitation, while the constructor watched, issued a few trial proclamations, correctly manipulating the control knobs, which were carved with imperial eagles and regal lions. These proclamations declared a state of emergency, martial law, a curfew, and a special levy. After a year had passed in the kingdom, which amounted to hardly a minute for Trurl and the king, by an act of the greatest magnanimity — that is, by a flick of the finger at the controls — the king abolished one death penalty, lightened the levy, and deigned to annul the state of emergency, whereupon a tumultuous cry of gratitude, like the squeaking of tiny mice lifted by their tails, rose up from the box, and through its curved glass cover one could see, on the dusty highways and along the banks of lazy rivers that reflected the fluffy clouds, the people rejoicing and praising the great and unsurpassed benevolence of their sovereign lord.

And so, though at first he had felt insulted by Trurl’s gift, in that the kingdom was too small and very like a child’s toy, the monarch saw that the thick glass lid made everything inside seem large; perhaps too he duly understood that size was not what mattered here, for government is not measured in meters and kilograms, and emotions are somehow the same, whether experienced by giants or dwarfs — and so he thanked the constructor, if somewhat stiffly. Who knows, he might even have liked to order him thrown in chains and tortured to death, just to be safe — that would have been a sure way of nipping in the bud any gossip about how some common vagabond tinkerer presented a mighty monarch with a kingdom. Excelsius was sensible enough, however, to see that this was out of the question, owing to a very fundamental disproportion, for fleas could sooner take their host into captivity than the king’s army seize Trurl. So with another cold nod, he stuck his orb and scepter under his arm, lifted the box kingdom with a grunt, and took it to his humble hut of exile. And as blazing day alternated with murky night outside, according to the rhythm of the asteroid’s rotation, the king, who was acknowledged by his subjects as the greatest in the world, diligently reigned, bidding this, forbidding that, beheading, rewarding — in all these ways incessantly spurring his little ones on to perfect fealty and worship of the throne.

When first published in 1965, Lem’s tale was the most purely speculative of speculative fictions, set as it was thousands if not millions of years in the future. Yet it would take just another quarter of a century before real-world Excelsiuses got the chance to play with little boxed kingdoms of their own, nurturing their subjects and tormenting them as the mood struck. The new strain of living, dynamic worlds filled with apparently living, dynamic beings was soon given the name of “god game” to distinguish it from the more static games of war and grand strategy that had preceded it.

The first of the great god-game constructors, the one whose name would always be most associated with the genre, was a hyperactive, chain-smoking, chain-talking Southerner named Will Wright. This is the story of him and his first living world — or, actually, living city — in a box.


 

Will Wright has always been a constructor. As a boy in the 1960s and 1970s, he built hundreds of models of ships, cars, and planes. At age 10, he made a replica of the bridge of the Enterprise out of balsa wood and lugged it to a Star Trek convention; it won a prize there, the first of many Wright would get to enjoy during his life. When developments in electronics miniaturization made it possible, he started making his creations move, constructing primitive robots out of Lego bricks, model kits, and the contents of his local Radio Shack’s wall of hobbyist doodads. In 1980, the 20-year-old Wright and his partner Rick Doherty won the U.S. Express, an illegal coast-to-coast automobile race created by the organizer of the earlier Cannonball Run. A fighter jet’s worth of electronics allowed them to drive from New York City to Santa Monica in 33 hours and 39 minutes in a Mazda RX-7, cruising for long stretches of time at 120 miles per hour.

Wright was able to indulge these passions and others thanks to his late father, a materials engineer who invented a lucrative new process for manufacturing plastic packaging before dying of leukemia when his son was just 9 years old. His widow was very patient with her eccentric tinkerer of a son, similar in some ways to his practical-minded father but in others very different. Wright spent five years at various universities in and out of his home state of Louisiana, excelling in the subjects that caught his fancy — like architecture, economics, mechanical engineering, and military history — while ignoring entirely all the others. Through it all, his mother never put any undue pressure on him to settle on something, buckle down, and get an actual degree. When he told her in no uncertain terms that he wouldn’t be taking over the family business his father had left in trust for him, she accepted that as well. Yet even she must have struggled to accept the notion of her 22-year-old son running off to California with Joell Jones, a painter 11 years his senior; the two had bonded when Jones severed a nerve in her wrist and Wright built a gadget out of metal and rubber bands to allow her to continue to paint. The two would marry in 1984.

Given his love for electronic gadgetry, it will likely come as no surprise that Wright was snared quickly by the nascent PC revolution. Already by 1980 he had added an Apple II to his collection of toys, and with it computer programming and computer gaming to his long list of hobbies; his first computerized love was Bruce Artwick’s primitive original Flight Simulator. But it was only after moving to Oakland with Jones that he started thinking seriously about writing a game of his own. This first and arguably last entirely practical, commercial project of his life was apparently prompted by his now living permanently away from home, an adult at last. At some point even a dreamer has to do something with his life, and making computer games seemed as good a choice as any.

His first game was in some ways the antithesis of everything he would do later: a conventional experience in a proven genre, a game designed to suit the existing market rather than a game designed to create its own new market, and the only Will Wright game that can actually be won in the conventional sense. Like many games of its era, its design was inspired by a technical trick. Wright, who had moved on from his Apple II to a Commodore 64 by this time, had figured out a way to scroll smoothly over what appeared to be a single huge background image. “I knew the Apple couldn’t begin to move that much in the way of graphics around the screen that quickly,” he says. “So I designed the game around that feature.”
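
The specifics of Wright’s routine haven’t been documented, but the standard trick of the era is easy to sketch: rather than storing one enormous bitmap, a game keeps a compact map of tile indices in memory and redraws only a screen-sized viewport each frame. Here is a minimal, purely illustrative version — every name and number in it is an assumption, not a reconstruction of Wright’s Commodore 64 code:

```python
# Generic tile-map scrolling sketch: the "huge background image" is really
# a grid of tile indices; scrolling just moves a camera over that grid.
# Purely illustrative -- not a reconstruction of Wright's C64 routine.

MAP_W, MAP_H = 256, 256   # world size, in tiles
VIEW_W, VIEW_H = 40, 25   # visible window, in tiles (a C64-like text screen)

# Fill the world with an arbitrary repeating pattern of tile indices.
world = [[(x * 31 + y * 17) % 4 for x in range(MAP_W)] for y in range(MAP_H)]

def visible_tiles(cam_x, cam_y):
    """Return the VIEW_W x VIEW_H slice of the world under the camera."""
    return [row[cam_x:cam_x + VIEW_W]
            for row in world[cam_y:cam_y + VIEW_H]]

# Scrolling is just moving the camera and redrawing the slice; a bitmap of
# the entire world never needs to exist anywhere in memory.
frame = visible_tiles(cam_x=100, cam_y=60)
print(len(frame), len(frame[0]))  # 25 40
```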

Raid on Bungeling Bay on the Commodore 64

Raid on Bungeling Bay owed a lot to Choplifter and a little to Beach-Head, sending you off in a futuristic helicopter to strike at the heart of the evil Bungeling Empire, returning when necessary to your home base for repairs and more ammunition. The most impressive aspect of the game, even more so than its graphical tricks, was the sophisticated modeling of the enemy forces. The Bungeling factories would turn out more advanced hardware as time went on, while your ability and need to disrupt supply lines and to monitor and attack the enemy on multiple fronts demanded at least a modicum of strategy as well as reflexes.

Wright sold Raid on Bungeling Bay to Brøderbund Software, who published it in 1984, whereupon it sold a reasonable if hardly overwhelming 30,000 copies on the Commodore 64. But, in contrast to so many of its peers, that wasn’t the end of the story. Hudson Soft in Japan took note of the game, paying Brøderbund and Wright for the right to make it into a cartridge for the Nintendo Entertainment System. Wright claims it sold an astonishing 750,000 copies on the NES in Japan and later the United States, giving him a steady income while he played around with the ideas that would become his next project, the one that would really make his name.

As it happened, the first project merged into the second almost seamlessly. Wright had written a tool for his own use in creating the Bungeling Empire’s cities, a little world editor that would let him scroll around a virtual space, laying down tiles to represent land and sea, factories and gun turrets. He realized at some point — perhaps after his game had shipped and yet he was still tinkering with his world inside the editor — that he found this task of creation much more compelling than the act of destruction that was actually playing the game. Might there be others who felt like him? Based on the success of Electronic Arts’s Pinball Construction Set, a program he hugely admired, he thought there just might be.

One fateful day Wright shared his world editor and his still half-baked ideas about what to do with it with his neighbor Bruce Joffe. An established architect and urban planner, Joffe had studied under Jay Wright Forrester at MIT, generally regarded as the founder of the entire field of system dynamics — i.e., using a computer to simulate a complex, dynamic reality. When he saw Wright’s little Bungeling Empire cities, Joffe was immediately reminded of Forrester’s work. He wasted no time in telling his friend that he really needed to check this guy out.

Even though the two have never to my knowledge met, Jay Wright Forrester and Will Wright were a match made in heaven; they shared much beyond the name of “Wright.” Both, to name one example, got their start in the field of simulation with a flight simulator, Jay Wright Forrester trying to build one and Will Wright trying to figure out how Bruce Artwick’s Flight Simulator really worked.

Driven by his desire to make a flight simulator, Forrester had been instrumental in the creation of Whirlwind, the first real computer, in the sense that we understand the term today, to be built in the United States.[1] The flight simulator never quite came together, but an undaunted Forrester moved on to Project SAGE, an air-defense early-warning system that became easily the most elaborate computing project of the 1950s. From there, he pioneered economic and industrial modeling on computers, and finally, in the late 1960s, arrived at what he called “urban dynamics.” Forrester’s urban modeling created a firestorm of controversy among city planners and social activists; as he put it in his dry way, it “was the first of my modeling work that produced strong, emotional reactions.” He was accused of everything from incompetence to racism when his models insisted that low-cost urban public housing, heretofore widely regarded as a potent tool for fighting poverty, was in reality “a powerful tool for creating poverty, not alleviating it.”

Of more immediate interest to us, however, is the reaction one Will Wright had to Forrester’s work many years after all the controversy had died away. The jacket copy of Forrester’s book Urban Dynamics reads like a synopsis of the simulation Wright was now about to create on a microcomputer: “a computer model describing the major internal forces controlling the balance of population, housing, and industry within an urban area,” which “simulates the life cycle of a city and predicts the impact of proposed remedies on the system.” When Wright’s neighbor Joffe had studied under Forrester in the 1970s, the latter had been constructing physical scale models of his urban subjects, updating them as time went on with the latest data extracted from his computer programs. If he could build a similar program to live behind his graphical Bungeling Empire cities, Wright would have found a much easier way to study the lives of cities. At about the same time that he had that initial conversation with Joffe, Wright happened to read the Stanislaw Lem story that opened this article. If he needed further inspiration to create his own city in a box, he found plenty of it there.

Never one to shy away from difficult or esoteric academic literature, Wright plunged into the arcane theoretical world of system dynamics. He wound up drawing almost as much from John Horton Conway’s 1970 Game of Life, another major landmark in the field, as he did from Forrester. Wright:

System dynamics is a way to look at a system and divide it into, basically, stocks and flows. Stocks are quantities, like population, and flows are rates, like the death rate, the birth rate, immigration. You can model almost anything using those two features. That was how he [Forrester] started system dynamics and that was the approach he took to his modeling. I uncovered his stuff when I started working on SimCity and started teaching myself modeling techniques. I also came across the more recent stuff with cellular automata [i.e., Conway’s Game of Life], and SimCity is really a hybrid of those two approaches. Because his [Forrester’s] approach was not spatial at all, whereas the cellular automata gives you a lot of really interesting spatial tools for propagation, network flow, proximity, and so forth. So the fact that pollution starts here, spreads over here, and slowly gets less and less, and you can actually simulate propagation waves through these spatial structures. So SimCity in some sense is like a big three-dimensional cellular automata, with each layer being some feature of the landscape like crime or pollution or land value. But the layers can interact on the third dimension. So the layers of crime and pollution can impact the land-value layer.
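
To make Wright’s description concrete, here is a minimal sketch of such a hybrid: a Forrester-style stock updated by flow rates, alongside a cellular-automaton pollution layer that diffuses spatially and drags down a land-value layer. Every rate, weight, and grid size below is invented for illustration; none of it reflects SimCity’s actual internals:

```python
# Toy hybrid of stocks-and-flows and a cellular automaton, in the spirit
# of Wright's description. All constants are invented for illustration.

SIZE = 8
pollution = [[0.0] * SIZE for _ in range(SIZE)]
land_value = [[1.0] * SIZE for _ in range(SIZE)]
population = 1000.0        # a Forrester-style "stock"
pollution[4][4] = 8.0      # a point source -- say, a factory

def step():
    global population
    # Stocks and flows: the population stock changes via two flow rates.
    population += population * (0.012 - 0.008)   # birth rate - death rate

    # Cellular automaton: pollution diffuses to the four neighboring cells.
    new = [[0.0] * SIZE for _ in range(SIZE)]
    for y in range(SIZE):
        for x in range(SIZE):
            neighbors = [pollution[y + dy][x + dx]
                         for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                         if 0 <= y + dy < SIZE and 0 <= x + dx < SIZE]
            new[y][x] = 0.6 * pollution[y][x] + 0.1 * sum(neighbors)

    # Layer interaction: local pollution drags down the land-value layer.
    for y in range(SIZE):
        for x in range(SIZE):
            pollution[y][x] = new[y][x]
            land_value[y][x] = max(0.0, 1.0 - 0.1 * pollution[y][x])

for _ in range(10):
    step()
print(round(population))            # the stock has grown slightly
print(round(land_value[4][4], 2),   # value is depressed near the source...
      round(land_value[0][0], 2))   # ...but barely touched at the far corner
```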

This description subtly reveals something about the eventual SimCity that is too often misunderstood. The model of urban planning that underpins Wright’s simulation is grossly simplified and, often, grossly biased to match its author’s own preexisting political views. SimCity is far more defensible as an abstract exploration of system dynamics than as a concrete contribution to urban planning. All this talk about “stocks” and “flows” illustrates where Wright’s passion truly lay. For him the what that was being simulated was less interesting than the way it was being simulated. Wright:

I think the primary goal of this [SimCity] is to show people how intertwined such things can get. I’m not so concerned with predicting the future accurately as I am with showing which things have influence over which other things, sort of a chaos introduction, where the system is so complex that it can get very hard to predict the future ramifications of a decision or policy.

After working on the idea for about six months, Wright brought a very primitive SimCity to Brøderbund, who were intrigued enough to sign him to a contract. But over the next year or so of work a disturbing trend manifested itself. Each time Wright would bring the latest version to Brøderbund, they’d nod approvingly as he showed all the latest features, only to ask, gently but persistently, a question Wright learned to loathe: when would he be making an actual game out of the simulation? You know, something with a winning state, perhaps with a computer opponent to play against?

Even as it was, SimCity was hardly without challenge. You had to plan and manage your city reasonably well or it would go bankrupt or drown in a sea of crime or other urban blights and you, the mayor, would get run out of town on a rail. Yet it was also true that there wasn’t a conventional winning screen to go along with all those potential losing ones. Wright tried to explain that the simulation was the game, that the fun would come from trying things out in this huge, wide-open possibility space and seeing what happened. He thought he had ample evidence from his friends that he wasn’t the only one who liked to play this way. They would dutifully build their cities to a point and then, just like Excelsius in the story, would have just as much fun tearing them down, just to see what happened. Indeed, they found the virtual destruction so enjoyable that Wright added disasters to the program — fires, earthquakes, tornadoes, even a rampaging Godzilla monster — that they could unleash at will. As with everything else in SimCity, the motivation for a player consciously choosing to destroy all her labor was just to see what would happen. After all, you could always save the game first. Wright:

When I first started showing the Commodore version, the only thing that was in there was a bulldozer, basically to erase mistakes. So if you accidentally built a road or a building in the wrong place you could erase it with the bulldozer. What I found was that, invariably, in the first five minutes people would discover the bulldozer, and they would blow up a building with it by accident. And then they would laugh. And then they would go and attack the city with the bulldozer. And they’d blow up all the buildings, and they’d be laughing their heads off. And it really intrigued me because it was like someone coming across an ant pile and poking it with a stick to see what happens. And they would get that out of their system in about ten minutes, and then they would realize that the hard part wasn’t destroying, but building it back up. And so people would have a great time destroying the city with a bulldozer, and then they would discover, “Wow, the power’s out. Wow, there’s a fire starting.” And that’s when they would start the rebuilding process, and that’s what would really hook them. Because they would realize that the destruction was so easy in this game, it was the creation that was the hard part. And this is back when all games were about destruction. After seeing that happen with so many people, I finally decided, “Well I might as well let them get it out of their systems. I’ll add disasters to the game.” And that’s what gave me the idea for the disasters menu.

Wright asked Brøderbund to look at his “game” not as a conventional zero-sum ludic experience, but as a doll house or a train set, an open-ended, interactive creative experience — or, to use the term the market would later choose, as a “sandbox” for the player. Wright:

I think it [sandbox gaming] attracts a different kind of player. In fact, some people play it very goal-directed. What it really does is force you to determine the goals. So when you start SimCity, one of the most interesting things that happens is that you have to decide, “What do I want to make? Do I want to make the biggest possible city, or the city with the happiest residents, or the most parks, or the lowest crime?” Every time you have to idealize in your head, “What does the ideal city mean to me?” It requires a bit more motivated player. What that buys you in a sense is more replayability because we aren’t enforcing any strict goal on you. We could have said, “Get your city to 10,000 people in ten years or you lose.” And you would always have to play that way. And there would be strategies to get there, and people would figure out the strategies, and that would be that. By leaving it more open-ended, people can play the game in a lot of different ways. And that’s where it becomes more like a toy.

But Brøderbund just couldn’t seem to understand what he was on about. At last, Wright and his publisher parted ways in a haze of mutual incomprehension. By the time they did so, the Commodore 64 SimCity was essentially complete; it would finally be released virtually unchanged more than two years later.

SimCity on the Commodore 64

For the moment, though, nobody seemed interested at all. After halfheartedly shopping SimCity around to some other publishers (among them Cinemaware) without a bite, Wright largely gave up on the idea of ever getting it released. But then in early 1987, with SimCity apparently dead in the water, he was invited to a pizza party for game developers hosted by a young businessman named Jeff Braun. Braun, who envisioned himself as the next great software entrepreneur, had an ulterior motive: he was looking for the next great game idea. “Will is a very shy guy, and he was sitting by himself, and I felt sorry for him,” Braun says. In marked contrast to Brøderbund, Braun saw the appeal of SimCity before he ever even saw the program in action, as soon as a very reluctant, thoroughly dispirited Wright started to tell him about it. His interest was piqued despite Wright being far from a compelling pitchman: “Will kept saying that this won’t work, that no one likes it.”

Braun nevertheless suggested that he and Wright found their own little company to port the program from the Commodore 64 to the Apple Macintosh and Commodore Amiga, more expensive machines whose older and presumably more sophisticated buyers might be more receptive to the idea of an urban-planning simulation. Thus was Maxis Software born.

Wright ported the heart of the simulation from Commodore 64 assembler to platform-independent C while a few other programmers Braun had found developed user interfaces and graphics for the Macintosh and Amiga. The simulation grew somewhat more complex on the bigger machines, but not as much as you might think. “It got more elaborate, more layers were added, and there was higher resolution on the map,” says Wright, “but it had the same basic structure for the simulation and the same basic sets of tools.”

SimCity on the Macintosh

While Wright and the other programmers were finishing up the new versions of SimCity, Braun scared up a very surprising partner for their tiny company. He visited Brøderbund again with the latest versions, and found them much more receptive to Wright’s project this time around, a switch that Wright attributes to the generally “more impressive” new versions and the fact that by this point “the market was getting into much more interesting games.” Still somewhat concerned about how gamers would perceive Wright’s non-game, Brøderbund did convince Maxis to add a set of optional “scenarios” to the sandbox simulation, time-limited challenges the player could either meet or fail to meet, thus definitively winning or losing. The eight scenarios, some historical (the San Francisco earthquake of 1906, the fire-bombing of Hamburg in 1944), some hypothetical (a nuclear meltdown in Boston in 2010, the flooding of Rio de Janeiro in 2047 thanks to global warming), and some unabashedly fanciful (a monster attack on Tokyo in 1957), were all ultimately less compelling than they initially sounded, being all too clearly shoehorned into an engine that had never been designed for this mode of play. Still, Brøderbund’s perceived need to be able to honestly call SimCity a game was met, and that was the most important thing. Brøderbund happily agreed to become little Maxis’s distributor, a desperately needed big brother to look after them in a cutthroat industry.

SimCity

SimCity shipped for the Macintosh in February of 1989, for the Commodore 64 in April, and for the Amiga in May. Some people immediately sat up and took notice of this clearly new thing; sales were, all things considered, quite strong right out of the gate. In an online conference hosted on June 19, 1989, Wright said that they had already sold 11,000 copies of the Macintosh version and 8000 of the Amiga, big numbers in a short span of time for those relatively small American gaming markets. Presaging the real explosion of interest still to come, he noted that Maxis had had “many inquiries from universities and planning departments.” And indeed, already in August of 1989 the first academic paper on SimCity would be presented at an urban-planning conference. Realizing all too well himself how non-rigorous an exercise in urban planning SimCity really was, Wright sounded almost sheepish in contemplating “a more serious version” for the future.

SimCity for MS-DOS

SimCity would begin to sell in really big numbers that September, when the all-important MS-DOS version appeared. Ports to virtually every commercially viable or semi-viable computer in the world appeared over the next couple of years, culminating in a version for the Super Nintendo Entertainment System in August of 1991.

SimCity for Super Nintendo

It’s at this point that our history of SimCity the private passion project must inevitably become the history of SimCity the public sensation. For, make no mistake, a public sensation SimCity most definitely became. It sold and sold and sold, and then sold some more, for years on end. In 1991, the year it celebrated its second anniversary on the market, it still managed to top the charts as the year’s best-selling single computer game. Even five years after its release, with Wright’s belated “more serious” — or at least more complicated — version about to ship as SimCity 2000, the original was still selling so well that Maxis decided to rename it SimCity Classic and to continue to offer it alongside its more advanced variant. In that form it continued to sell for yet several more years. Shelf lives like this were all but unheard of in the fickle world of entertainment software.

In all, the original SimCity sold at least 500,000 copies on personal computers, while the Super Nintendo version alone sold another 500,000 to console gamers. Spin-offs, sequels, and derivatives added millions and millions more to those numbers in the years that followed the original’s long heyday; at no point between 1989 and today has there not been at least one SimCity title available for purchase. And, believe me, people have continued to purchase. SimCity 2000 (1994) and SimCity 3000 (1999) both became the best-selling single computer games of their respective release years, while post-millennial iterations have sold in the millions as a matter of routine.

But almost more important than the quantities in which the original SimCity sold and the veritable cottage industry it spawned are the people to whom it was selling. By the time they signed Maxis to a distribution contract, Brøderbund had long since demonstrated their knack for getting past the nerdy hardcore of computer users, for bypassing Dungeons & Dragons and military simulations and all the rest to reach the great unwashed masses of Middle America. Brøderbund’s The Print Shop and their Carmen Sandiego series in particular remain icons of ordinary American life during the 1980s. SimCity must be added to that list for the 1990s. Beginning with a June 15, 1989, piece in no less august a journal than The New York Times, seemingly every newspaper and news magazine in the country wrote about SimCity. For a mainstream media that has never known quite what to make of computer games, this was the rare game that, like Carmen Sandiego, was clearly good for you and your kids.

SimCity even penetrated into the political sphere. With a mayoral election pending in 1990, The Providence Journal set up a contest for the five candidates for the post, letting each have his way with a simulated version of Providence, Rhode Island. The winner of that contest also wound up winning the election. More amusing was the experiment conducted by Detroit News columnist Chuck Moss. He sent Godzilla rampaging through a simulated Detroit, then compared the result with the carnage wrought by Coleman Young during his two-decade real-world reign as mayor. His conclusion? Godzilla had nothing on Mayor Young.

If the interest SimCity prompted in the mainstream media wasn’t unusual enough, academia’s eagerness to jump on the bandwagon in these years long before “game studies” became an accepted area of interest is even more astonishing. Articles and anecdotes about Will Wright’s creation were almost as prevalent in the pages of psychology and urban-planning journals as they were in newspapers. Plenty of the papers in the latter journals, written though they were by professionals in their field who really should have known better, credited Wright’s experiment with an authority out of all proportion to the fairly simplistic reality of the simulation, in spite of candid admissions of its limitations from the people who knew the program best. “I wouldn’t want to predict a real city with it,” Wright said. Bruce Joffe, the urban planner who had set Wright down the road to SimCity, responded with one word when asked if he would use the program to simulate any aspect of a city he was designing in the real world: “No.” And yet SimCity came to offer perhaps the most compelling demonstration of the Eliza Effect since Joseph Weizenbaum’s simple chatbot that had given the phenomenon its name. The world, SimCity proved once again, is full of Fox Mulders. We all want to believe.

In that spirit, SimCity also found a home in a reported 10,000 elementary-, middle-, and high-school classrooms across the country, prompting Maxis to offer a new pedagogical version of the manual, focused on techniques for using the simulation as a teaching tool. And SimCity started showing up on university syllabi as well; the construction of your own simulated city became a requirement in many sociology and economics classes.

Back in May of 1989, Computer Gaming World had concluded their superlative review of SimCity — one of the first to appear anywhere in print — by asking their readers to “buy this game. We want them to make lots of money so they’ll develop SimCounty, SimState, SimNation, SimPlanet, SimUniverse… billions and billions of games!” The hyperbole proved prescient; Maxis spent the 1990s flooding the market with new Sim titles.

SimEarth on MS-DOS

Jay Wright Forrester’s follow-up to his book Urban Dynamics had been Global Dynamics, an inquiry into the possibility of simulating the entire world as a dynamic system. Wright’s own next game, then, was 1990’s SimEarth, which attempted to do just that, putting you in charge of a planet through 10 billion years of geological and biological evolution. SimEarth became a huge success in its day, one almost comparable to SimCity. The same year-end chart that shows SimCity as the best-selling single title of 1991 has SimEarth at number two — quite a coup for Maxis. Yet, like virtually all of the later Sim efforts, SimEarth is far less fondly remembered today than is its predecessor. The ambitious planet simulator just wasn’t all that much fun to play, as even Wright himself admits today.

But then, one could make the same complaint about many of Maxis’s later efforts, which simulated everything from ant colonies to office towers, healthcare systems (!) to rain forests. New Sim games began to feel not just like failed experiments but downright uninspired, iterating and reiterating endlessly over the same concept of the open-ended “software toy” even as other designers found ways to build SimCity‘s innovations into warmer and more compelling game designs. Relying heavily as always on his readings of the latest scientific literature, Wright could perhaps have stood to put away the academic journals from time to time and crack open a good novel; he struggled to find the human dimension in his simulations. The result was a slow but steady decline in commercial returns as the decade wore on, a trend from which only the evergreen SimCity and its sequels were excepted. Not until 2000 would Maxis finally enjoy a new breakthrough title, one that would dwarf even the success of SimCity… but that is most definitely a story for another time.

Given its storied history and the passion it once inspired in so many players, playing the original SimCity for the first time today is all but guaranteed to be a somewhat underwhelming experience. Even allowing for what now feels like a crude, slow user interface and absurdly low-resolution graphics, everything just feels so needlessly obscure, leaving you with the supreme frustration of losing again and again without being able to figure out why you’re losing. Not for nothing was this game among the first to spawn a book-length strategy guide — in fact, two of them. You need inside information just to understand what’s going on much of the time. There are games that are of their time and games that are for all time. In my perhaps controversial opinion, the original SimCity largely falls into the former category.

But, far from negating SimCity‘s claim to our attention, this judgment only means that we, as dutiful students of history, need to try even harder to understand what it was that so many people first saw in what may strike us today as a perversely frustrating simulation. Those who played the original SimCity for the first time, like those who played the original Adventure, Defender of the Crown, and a bare handful of other landmark games in the history of the hobby, felt the full shock of a genuinely new experience that was destined to change the very nature of gaming. It’s a shock we can try to appreciate today but can never fully replicate.

You can see traces of SimCity in many if not most of the games we play today, from casual social games to hardcore CRPG and strategy titles. Sid Meier, when asked in 2008 to name the three most important innovations in the history of electronic gaming, listed the invention of the IBM PC, the Nintendo Seal of Quality… and, yes, SimCity. “SimCity was a revelation to most of us game designers,” says Meier. “The idea that players enjoyed a game that was open-ended, non-combative, and emphasized construction over destruction opened up many new avenues and possibilities for game concepts.” Many years before Meier’s statement, Russell Sipe, the respected founder of Computer Gaming World, said simply that “SimCity has changed the face of computer-entertainment software.” He was and is absolutely correct. Its influence really has been that immense.

(Sources: Magazines include Amazing Computing of October 1989; Game Developer of April 2006; MacWorld of April 1990; Computer Gaming World of May 1989; Compute! of January 1992; The New Yorker of November 6, 2006. Newspapers include The San Francisco Chronicle of November 3, 2003; The New York Times of June 15, 1989; The Los Angeles Times of October 2, 1992. Books include The Cyberiad by Stanislaw Lem; The SimCity Planning Commission Handbook by Johnny L. Wilson; Game Design Theory and Practice by Richard Rouse III; The City of Tomorrow and Its Planning by Le Corbusier; The Second Self by Sherry Turkle. Current and archived online sources include John Cutter’s blog; Game Research; articles about Will Wright and Sid Meier on Wired; The Next American City; Reform; GameSpot; a 1989 talk given by Jay Wright Forrester, which is hosted at MIT; First Monday; Taylor & Francis Online. And finally, there’s the collection of Brøderbund archives I went through during my visit to the Strong Museum of Play.

Beginning with SimCity 2000, the more playable later iterations of the franchise are all available for purchase in various places online. For those of an historical bent who’d like to experience the original, I offer a zip that includes the first three versions — for the Macintosh, Commodore 64, and Amiga.)

Footnotes
1 The more canonical example in American textbooks, the ENIAC, could only be “programmed” by physically rewiring its internals. It’s probably better understood as an elaborate calculating machine than a true computer; its original purpose was to calculate static artillery firing tables. As in so many things, politics plays a role in ENIAC’s anointment. The first computer programmable entirely in software, pre-dating even Whirlwind, was EDSAC-1, built at Cambridge University in Britain. That such a feat was first managed abroad seems to be just a bit more than some Americans in Silicon Valley and elsewhere can bring themselves to accept.
 


1988 Ebook Now Available

The tenth and latest of the ebooks compiling my articles for this site is now freely available from the ebook download page, thanks as always to the beneficence of Richard Lindner. Please note that the division of the ebooks into historical years is necessarily only very rough, with many articles and themes crisscrossing backward and forward through time. So, if your favorite game from 1988 hasn’t yet appeared, don’t entirely despair.

That said… enjoy!

If you recently discovered this site, perhaps via one of the recent articles that blew up a bit in places like Hacker News and Twitter, and you’ve since become a regular reader, I’d firstly like to welcome you to this little journey I and some of my most longtime readers have been on for over five years now. (But don’t worry about having missed out on too much; we’re just getting started.) Secondly — you knew it couldn’t be that easy, right? — please do think about clicking one of the buttons over to the right to add your support via Patreon or PayPal. I can’t continue to do this work — and hopefully find ways to do it even better — without people just like you.

A recent survey that Patreon did with my backers revealed that you’d most like to know more about what’s coming up in the near future on the blog. I have indeed not done a great job with that, partly because I know there are some of you who like to be completely surprised by each new article. So, if you’re in that group, please take this as a spoiler warning and don’t read the next paragraph.

Our next big theme will be the advent of the so-called “god game” and the two designers who are still regarded today as perhaps the genre’s most dedicated practitioners. Interspersed between the two will be a long-delayed history of British computing after everything went sideways for Acorn and Sinclair in the mid-1980s. Then we’ll be following Infocom to the bitter end, finishing their story at last — but I’m personally more excited about digging into the pre-IF-Renaissance American amateur scene that was springing up on the online services at the same time that Infocom was dying. The AGT era of text adventuring is little studied and little understood today, and I’m hugely looking forward to trying to correct that; there’s gold in them there hills. After that, we’ll be chronicling the history of the Macintosh after Steve Jobs left Apple. In the late 1980s, the Mac brought us a whole series of hugely important innovations, including HyperCard, Storyspace, and the first consumer-grade CD-ROMs, all of which we’ll be covering. But never fear, we’ll give equal time to the IBM clones as well — with the PS/2, OS/2, VGA graphics, and the first sound cards, there’s plenty to talk about there. We’ll stop in again at Sierra, always a bellwether for the PC-gaming industry as a whole; while there, I’ll get to write about a Sierra adventure game I actually like. (Doesn’t happen often enough, I know.) Throw in another visit with Cinemaware, who began doing some fascinating experiments of their own with CD-ROM during this period, and that about sums up the next few months. It should be a fun ride.

Thanks so much to all of you who help me to do this important work, whether through public comments, private help with my writing and research, or financial support. You all are, to use the single most overused word in the English language appropriately for once, awesome.