We’ve seen how the pundits had already started speculating like crazy long before the actual release of IBM’s TopView, imagining it to be the key to some Machiavellian master plan for seizing complete control of the personal-computer market. But said pundits were giving IBM a bit more credit than perhaps was due. The company nicknamed Big Blue was indeed a big, notoriously bureaucratic place, and that reality tended to interfere with their ability to carry out any scheme, Machiavellian or otherwise, with the single-minded focus of a smaller company. There doubtless were voices inside IBM who could imagine using TopView as a way of shoving Microsoft aside, and had the product been a roaring success those voices would surely have been amplified. Yet thanks to the sheer multiplicity of voices IBM contained, the organization always seemed to be pulling in multiple directions at once. Thus even before TopView hit the market and promptly fizzled, a serious debate was taking place inside IBM about the long-term future direction of their personal computers’ system software. This particular debate didn’t focus on extensions to MS-DOS — not even on an extension like TopView which might eventually be decoupled from the unimpressive operating system underneath it. The question at hand was rather what should be done about creating a truly holistic replacement for MS-DOS. The release of a new model of IBM personal computer in August of 1984 had given that question new urgency.
The PC/AT, the first really dramatic technical advance on the original IBM PC, used the new Intel 80286 processor in lieu of the older machine’s 8088. The 80286 could function in two modes. In “real” mode, grudgingly implemented by Intel’s engineers in the name of backward compatibility, it essentially was an 8088, with the important difference that it happened to run much faster. Otherwise, though, it shared most of the older chip’s limitations, most notably the ability to address only 1 MB of memory — the source, after the space reserved for system ROMs and other specialized functions was subtracted, of the original IBM PC’s limitation to 640 K of RAM. It was only in the 80286’s “protected” mode that the new chip’s full capabilities were revealed. In this mode, it could address up to 16 MB of memory, and implemented hardware memory protection ideal for the sort of modern multitasking operating system that MS-DOS so conspicuously was not.
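To make the arithmetic behind those limits concrete, here’s a toy C sketch of how real-mode addresses are formed. The constants are the well-known ones from the chips themselves; the code is purely illustrative, not anything drawn from MS-DOS or IBM’s software.

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode address translation as on the 8088: a 16-bit segment is
   shifted left four bits and added to a 16-bit offset, producing a
   20-bit physical address. 2^20 bytes is the 1 MB ceiling that
   MS-DOS inherited. */
static uint32_t real_mode_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + (uint32_t)offset;
}

int main(void)
{
    /* The very last byte of the 640 K of conventional RAM: */
    printf("0x9FFF:0x000F -> physical 0x%05X\n",
           (unsigned)real_mode_address(0x9FFF, 0x000F)); /* 0x9FFFF */

    /* The whole 20-bit address space versus the 80286's 24-bit one: */
    printf("8088/real-mode ceiling:       %4u KB\n", (1u << 20) >> 10);
    printf("80286 protected-mode ceiling: %4u MB\n", (1u << 24) >> 20);
    return 0;
}
```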
The crux of IBM’s dilemma was that MS-DOS, being written for the 8088, could run on the 80286 only in real mode, leaving most of the new chip’s capabilities unused. Memory beyond 640 K could thus still be utilized only via inefficient and ugly hacks, even on a machine with a processor that, given a less clueless operating system, was ready and willing to address up to 16 MB. IBM therefore decided that sooner or later — and preferably sooner — MS-DOS simply had to go.
This much was obvious. What was less obvious was where this new-from-the-ground-up IBM operating system should come from. Over months of debate, IBM’s management broke down into three camps.
One camp advocated buying or licensing Unix, a tremendously sophisticated and flexible operating system born at AT&T’s Bell Labs. Unix was beloved by hackers everywhere, but remained under the thumb of AT&T, who licensed it to third parties with the wherewithal to pay for it. Ironically, Microsoft had had a Unix license for years, using it to create a product they called Xenix, by far the most widely used version of Unix on microcomputers during the early 1980s. Indeed, their version of Xenix for the 80286 had of late become the best way for ambitious users not willing to settle for MS-DOS to take full advantage of the PC/AT’s capabilities. Being an already extant operating system which Microsoft among others had been running on high-end microcomputers for years, a version of Unix for the business-microcomputing masses could presumably be put together fairly quickly, whether by Microsoft or by IBM themselves. Yet Unix, having been developed with bigger institutional computers in mind, was one heck of a complicated beast. IBM feared abandoning MS-DOS, with its user-unfriendliness born of primitiveness, only to run afoul of Unix’s user-unfriendliness born of its sheer sophistication. Further, Unix, having been developed for text-only computers, wasn’t much good for graphics — and thus not much good for GUIs.[1] The conventional wisdom held it to be an operating system better suited to system administrators and power users than secretaries and executives.
A second alternative was for IBM to make a new operating system completely in-house for their personal computers, just as they always had for their mainframes. They certainly had their share of programmers with experience in modern system software, along with various projects which might become the basis of a new microcomputer operating system. In particular, the debaters returned over and over again to one somewhat obscure model in their existing personal-computer lineup. Released in late 1983, the 3270 PC came equipped with a suite of additional hardware and software that let it act as a dumb terminal for a mainframe, while also running — simultaneously with multiple mainframe sessions, if the user wished — ordinary MS-DOS software. To accomplish that feat, IBM’s programmers had made a simple windowing environment that could run MS-DOS in one window and mainframe sessions in others. They had continued to develop the same software after the 3270 PC’s release, yielding a proto-operating system with the tentative name of Mermaid. The programmers who created Mermaid would claim in later years that it was far more impressive than either TopView or the first release of Microsoft Windows; “basically, in 1984 or so,” says one, “we had Windows 3.0.” But there was a big difference between Mermaid and even the latter, relatively advanced incarnation of Windows: rather than Mermaid running under MS-DOS, as Windows did, MS-DOS could run under Mermaid. MS-DOS ran, in other words, as just one of many potential tasks within the more advanced operating system, providing the backward compatibility with old software that was considered such a necessary bridge to any post-MS-DOS future. And then, on top of all these advantages, Mermaid already came equipped with a workable GUI. It seemed like the most promising of beginnings.
By contrast, IBM’s third and last alternative for the long-term future initially seemed the most unappetizing by far: to go back to Microsoft, tell them they needed a new operating system to replace MS-DOS, and ask them to develop it with them, alongside their own programmers. There seemed little to recommend such an approach, given how unhappy IBM was already becoming over their dependency on Microsoft — not to mention the way the latter bore direct responsibility for the thriving and increasingly worrisome clone market, thanks to their willingness to license MS-DOS to anyone who asked for it. And yet, incredibly, this was the approach IBM would ultimately choose.
Why on earth would IBM choose such a path? One factor might have been the dismal failure of TopView, their first attempt at making and marketing a piece of microcomputer system software single-handedly, in the spring of 1985. Perhaps this really did unnerve them. Still, one has to suspect that there was more than a crisis of confidence behind IBM’s decision to go from actively distancing themselves from Microsoft to pulling the embrace yet tighter in a matter of months. In that light, it’s been reported that Bill Gates, getting wind of IBM’s possible plans to go it alone, threatened to jerk their existing MS-DOS license if they went ahead with work on a new operating system without him. Certainly IBM’s technical rank and file, who were quite confident in their own ability to create IBM’s operating system of the future and none too happy about Microsoft’s return to the scene, widely accepted this story at the time. “The bitterness was unbelievable,” remembers one. “People were really upset. Gates was raping IBM. It’s incomprehensible.”
Nevertheless, on August 22, 1985, Bill Gates and Bill Lowe, the latter being the president of IBM’s so-called “Entry Systems Division” that was responsible for their personal computers, signed a long-term “Joint Development Agreement” in which they promised to continue to support MS-DOS on IBM’s existing personal computers and to develop a new, better operating system for their future ones. All those who had feared that TopView represented the opening gambit in a bid by IBM to take complete control of the business-microcomputing market could breathe a sigh of relief. “We are committed to the open-architecture concept,” said Lowe, “and recognize the importance of this to our customers.” The new deal between the two companies was in fact far more ironclad and more equal than the one that had been signed before the release of the original IBM PC. “For Microsoft,” wrote the New York Times’s business page that day, “the agreement elevates it from a mere supplier to IBM, with the risk that it could one day be cut off, into more of a partner.” True equal partner with the company that in the eyes of many still was computing in the abstract… Microsoft was moving up in the world.
The public was shown only the first page or two of the new agreement, full of vague reassurances and mission statements. Yet it went on for many more pages after that, getting deep into the weeds of an all-new operating system to be called CP-DOS. (Curiously, the exact meaning of the acronym has never surfaced to my knowledge. “Concurrent Processing” would be my best guess, given the project’s priorities.) CP-DOS was to incorporate all of the sophistication that was missing from MS-DOS, including preemptive multitasking, virtual memory, the ability to address up to 16 MB of physical memory, and a system of device drivers to insulate applications from the hardware and insulate application programmers from the need to manually code up support for every new printer or video card to hit the market. So far, so good.
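That last idea is easiest to see in miniature. The hypothetical C sketch below shows the general pattern of driver-based insulation: the application is written once against a stable interface, while each device supplies its own implementation behind it. It’s a toy model only; OS/2’s actual driver API was a far more elaborate affair, and none of these names come from it.

```c
#include <stdio.h>

/* A hypothetical driver interface: applications see only this table
   of function pointers; each device supplies its own implementation. */
typedef struct {
    const char *name;
    void (*write_line)(const char *text);
} printer_driver;

/* Two hardware-specific "drivers" (stand-ins for real port I/O). */
static void epson_write(const char *text) { printf("[epson]    %s\n", text); }
static void laser_write(const char *text) { printf("[laserjet] %s\n", text); }

static printer_driver drivers[] = {
    { "epson",    epson_write },
    { "laserjet", laser_write },
};

/* The application is written once, against the interface alone. */
static void print_report(printer_driver *p)
{
    p->write_line("Quarterly report, page 1");
}

int main(void)
{
    /* Supporting new hardware means adding a driver, not rewriting the app. */
    print_report(&drivers[0]);
    print_report(&drivers[1]);
    return 0;
}
```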
But this latest stage of an unlikely partnership would prove a very different experience for Microsoft than developing the system software for the original IBM PC had been. Back in 1980 and 1981, IBM, pressured for time, had happily left the software side of things entirely to Microsoft. Now, IBM truly expected to develop CP-DOS in partnership with Microsoft, expecting not only to write the specifications for the new operating system themselves but to handle some of the coding themselves as well. Two radically different corporate cultures clashed from the start. IBM, accustomed to carrying out even the most mundane tasks in bureaucratic triplicate, was appalled at the lone-hacker model of software development that still largely held sway at Microsoft, while the latter’s programmers held their counterparts in contempt, judging them to be a bunch of useless drones who never had an original thought in their lives. “There were good people” at IBM, admits one former Microsoft employee. But then, “there were a lot of not-so-good people also. That’s not Microsoft’s model. Microsoft’s model is only good people. If you’re not good, you don’t stick around.” Neal Friedman, a programmer on the CP-DOS team at Microsoft:
The project was extremely frustrating for people at Microsoft and for people at IBM too. It was a clash of two companies at opposite ends of the spectrum. At IBM, things got done very formally. Nobody did anything on their own. You went high enough to find somebody who could make a decision. You couldn’t change things without getting approval from the joint design-review committee. It took weeks even to fix a tiny little bug, to get approval for anything.
IBM measured their programmers’ productivity in the number of lines of code they could write per day. As Bill Gates and plenty of other people from Microsoft tried to point out, this metric said nothing about the quality of the code they wrote. In fact, it provided an active incentive for programmers to write bloated, inefficient code. Gates compared the project to trying to build the world’s heaviest airplane.
A joke memo circulated inside Microsoft, telling the story of an IBM rowing team that had lost a race. IBM, as was their wont, appointed a “task force” to analyze the failure. The bureaucrats assigned thereto discovered that the IBM team had had eight people steering and one rowing, while the other team had had eight people rowing and one steering. Their recommendation? Why, the eight steerers should simply get the one rower to row harder, of course. Microsoft took to calling IBM’s software-development methodology the “masses-of-asses” approach.
But, as only gradually became apparent to Microsoft’s programmers, Bill Gates had ceded the final authority on what CP-DOS should be and how it should be implemented to those selfsame masses of asses. Scott MacGregor, the Windows project manager during 1984, shares an interesting observation that apparently still applied to the Gates of 1985 and 1986:
Bill sort of had two modes. For all the other [hardware manufacturers], he would be very confident and very self-assured, and feel very comfortable telling them what the right thing to do was. But when he worked with IBM, he was always much more reserved and quiet and humble. It was really funny because this was the only company he would be that way with. In meetings with IBM, this change in Bill was amazing.
In charming or coercing IBM into signing the Joint Development Agreement, Gates had been able to perpetuate the partnership which had served Microsoft so well, but the terms turned out to be perhaps not quite so equal as they first appeared: he had indeed given IBM final authority over the new operating system, as well as agreeing that the end result would belong to Big Blue, not (as with MS-DOS) to Microsoft. As work on CP-DOS began in earnest in early 1986, a series of technical squabbles erupted, all of which Microsoft was bound to lose.
One vexing debate was over the nature of the eventual CP-DOS user interface. Rather than combining the plumbing of the new operating system and the user interface into one inseparable whole, IBM wanted to separate the two. In itself, this was a perfectly defensible choice; successful examples of this approach abound in computing history, from the X Window System on Unix and Linux to OS X on the modern Macintosh. And of course this was an approach which Microsoft and many others had already taken in building GUI environments to run on top of MS-DOS. So, fair enough. The real disagreements started only when IBM and Microsoft started to discuss exactly what form CP-DOS’s preferred user interface should take. Unsurprisingly, Microsoft wanted to adapt Windows, that project in which they had invested so much of their money and reputation for so little reward, to run on top of CP-DOS instead of MS-DOS. But IBM had other plans.
IBM informed Microsoft that the official CP-DOS user interface at launch time was to be… wait for it… TopView. The sheer audacity of the demand was staggering. After developing TopView alone and in secret, cutting out their once and future partners, IBM now wanted Microsoft to port it to the new operating system the two companies were developing jointly. (Had they been privy to it, the pundits would doubtless have taken this demand as confirmation of their suspicion that at least some inside IBM had intended TopView to have an existence outside of its MS-DOS host all along.)
“TopView is hopeless,” pleaded Bill Gates. “Just let it die. A modern operating system needs a modern GUI to be taken seriously!” But IBM was having none of it. When they had released TopView, they had told their customers that it was here to stay, a fundamental part of the future of IBM personal computing. They couldn’t just abandon those people who had bought it; that would be contrary to the longstanding IBM ethic of being the safe choice in computing, the company you invested in when you needed stability and continuity above all else. “But almost nobody bought TopView in the first place!” howled Gates. “Why not just give them their money back if it’s that important to you?” IBM remained unmoved. “Do a good job with a CP-DOS TopView,” they said, “and we can talk some more about a CP-DOS Windows with our official blessing.”
Ever helpful, IBM referred Microsoft to a group of six programmers in Berkeley, California, who called themselves Dynamical Systems Research and who had recently come to them with a portable re-implementation of TopView which was supposedly one-quarter the size and ran four to ten times faster. (That such efficiency gains over the original version were even possible confirmed every one of Microsoft’s prejudices against IBM’s programmers.) In June of 1986, Steve Ballmer duly bought a plane ticket for Berkeley, and two weeks later Microsoft bought Dynamical for $1.5 million. And then, a month later, IBM summoned Gates and Ballmer to their offices and told them that they had changed their minds; there would now be no need for a TopView interface in CP-DOS. IBM’s infuriating about-face seemingly meant that Microsoft had just thrown away $1.5 million. (Luckily for them, in the end they would get more than their money’s worth out of the programming expertise they purchased when they bought Dynamical, despite never doing anything with the alternative TopView technology; more on that in a future article.)
The one good aspect of this infuriating news was that IBM had at last decided that they and Microsoft should write a proper GUI for CP-DOS. Even this news wasn’t, however, as good as Microsoft could have wished: said GUI wasn’t to be Windows, but rather a new environment known as the Presentation Manager, which was in turn to be a guinea pig for a new bureaucratic monstrosity known as the Systems Application Architecture. SAA had been born of the way that IBM had diversified since the time when the big System/360 mainframes had been by far the most important part of their business. They still had those hulking monsters, but they had their personal computers now as well, along with plenty of machines in between the two extremes, such as the popular System/38 range of minicomputers. All of these machines had radically different operating systems and operating paradigms, such that one would never guess that they all came from the same company. This, IBM had decided, was a real problem in terms of technological efficiency and marketing alike, one which only SAA could remedy. They described the latter as “a set of software interfaces, conventions, and protocols — a framework for productively designing and developing applications with cross-system dependency.” Implemented across IBM’s whole range of computers, it would let programmers transition easily from one platform to another thanks to a consistent API, and the software produced with it would all have a consistent, distinctively “IBM” look and feel, conforming to a set-in-stone group of interface guidelines called Common User Access.
SAA and CUA might seem a visionary scheme from the vantage point of our own era of widespread interoperability among computing platforms. In 1986, however, the devil was very much in the details. The machines which SAA and CUA covered were so radically different in terms of technology and user expectations that a one-size-fits-all model couldn’t possibly be made to seem anything but hopelessly compromised on any single one of them. CUA in particular was a pedant’s wet dream, full of stuff like a requirement that every single dialog box had to have buttons which said “OK = Enter” and “ESC = Cancel,” instead of just “OK” and “Cancel.” “Surely we can expect people to figure that out without beating them over the head with it every single time!” begged Microsoft.
For a time, such pleas fell on deaf ears. Then, as more and more elements descended from IBM’s big computers proved clearly, obviously confusing in the personal-computing paradigm, Microsoft got permission to replace them with elements drawn from their own Windows. The thing just kept on getting messier and messier, a hopeless mishmash of two design philosophies. “In general, Windows and Presentation Manager are very similar,” noted one programmer. “They only differ in every single application detail.” The combination of superficial similarity with granular dissimilarity could only prove infuriating to users who went in with the reasonable expectation that one Microsoft-branded GUI ought to work pretty much the same as another.
Yet the bureaucratic boondoggle that was SAA and CUA wasn’t even the biggest bone of contention between IBM and Microsoft. That rather took the form of one of the most basic issues of all: what CPU the new operating system should target. Everyone agreed that the old 8088 should be left in the past along with the 640 K barrier it had spawned, but from there opinions diverged. IBM wanted to target the 80286, thus finally providing all those PC/ATs they had already sold with an operating system worthy of their hardware. Microsoft, on the other hand, wanted to skip the 80286 and target Intel’s very latest and greatest chip, the 80386.
The real source of the dispute was that same old wellspring of pain for anyone hoping to improve upon MS-DOS: the need to make sure that the new-and-improved operating system could run old MS-DOS software. Doing so, Bill Gates pointed out, would be far more complicated from the programmer’s perspective and far less satisfactory from the user’s perspective with the 80286 than it would with the 80386. To understand why, we need to look briefly at the historical and technical circumstances behind each of the chips.
It generally takes a new microprocessor some time to go from being available for purchase on its own to being integrated into a commercial computer. Thus the 80286, which first reached the mass market with the PC/AT in August of 1984, had entered Intel’s product catalog back in February of 1982. It had largely been designed, in other words, before the computing ecosystem spawned by the IBM PC existed. Its designers had understood that compatibility with the 8088 might be a good thing to have to go along with the capabilities of the chip’s new protected mode, but had seen the two things as an either/or proposition. You would either boot the machine in real mode to run a legacy 8088-friendly operating system and its associated software, or you’d boot it in protected mode to run a more advanced operating system. Getting from protected mode back to real mode required resetting the chip — a rather slow process that Intel had anticipated happening only when the whole machine in which it lived was rebooted. The usage scenario which Intel had most obviously never envisioned was the very one which IBM and Microsoft were now proposing for CP-DOS: an operating system that constantly switched on the fly between protected mode, which would be used for running the operating system itself and native applications written for it, and real mode, which would be used for running MS-DOS applications.
But the 80386, which entered Intel’s product catalog in September of 1985, was a very different beast, its designers having had the chance to learn from the many Intel-based personal computers which had been sold by that time. Indeed, Intel had toured the industry before finalizing their plans for their latest chip, asking many of its movers and shakers — a group which prominently included Microsoft — what they wanted and needed from a third-generation CPU. The end result offered a 32-bit architecture to replace the 16 bits of the 80286, with the capacity to address up to 4 GB of memory in protected mode to replace the 16 MB address space of the older chip. But hidden underneath the obvious leap in performance were some more subtle features that were if anything even more welcome to programmers in Microsoft’s boat. For one thing, the new chip could be switched between real mode and protected mode quickly and easily, with no need for a reset. And for another, Intel added a third mode, a sort of compromise position in between real and protected mode that was perfect for addressing exactly the problems of MS-DOS compatibility with which CP-DOS was doomed to struggle. In the new “virtual 8086” mode, the 80386 could fool software into believing it was running on an old 8088-based machine, complete with its own virtual 1 MB memory map, which the 80386 automatically translated into the real machine’s far more expansive memory map.
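A toy model in C can illustrate the two-step translation at work in virtual mode. The page size below is the 80386’s real one, but the flat 2 MB relocation base is an arbitrary stand-in for what the chip’s page tables actually did on a page-by-page basis.

```c
#include <stdio.h>
#include <stdint.h>

#define PAGE_SIZE 4096u  /* the 80386's actual page size */

/* Step 1: inside a virtual-8086 task, addresses are formed exactly as
   on an 8088: segment * 16 + offset, within a 1 MB window. */
static uint32_t v86_linear(uint16_t seg, uint16_t off)
{
    return ((uint32_t)seg << 4) + (uint32_t)off;
}

/* Step 2: paging then relocates each 4 KB page of that 1 MB window
   anywhere in physical memory. This toy page map simply parks the
   whole window at a made-up 2 MB base; real page tables could scatter
   the pages individually. */
static uint32_t page_translate(uint32_t linear)
{
    uint32_t page_base = 0x200000u + (linear / PAGE_SIZE) * PAGE_SIZE;
    return page_base + (linear % PAGE_SIZE);
}

int main(void)
{
    /* An MS-DOS program writing to text-video memory at B800:0000... */
    uint32_t linear   = v86_linear(0xB800, 0x0000);
    uint32_t physical = page_translate(linear);
    printf("virtual 0xB800:0x0000 -> linear 0x%05X -> physical 0x%06X\n",
           (unsigned)linear, (unsigned)physical);
    /* ...never knows it actually landed above the 1 MB line. */
    return 0;
}
```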
The power of the 80386 in comparison to the 8088 was such that a single physical 80386-based machine should be able to run a dozen or more virtual MS-DOS machines in parallel, should the need arise — all inside a more modern, sophisticated operating system like the planned CP-DOS. The 80386’s virtual mode really was perfect for Microsoft’s current needs — as it ought to have been, given that Microsoft themselves were largely responsible for its existence. It offered them a chance that doesn’t come along very often in software engineering: the chance to build a modern new operating system while maintaining seamless compatibility with the old one.
Some reports have it that Bill Gates, already aware that the 80386 was coming, had tried to convince IBM not to build the 80286-based PC/AT at all back in 1984, had told them they should just stay with the status quo until the 80386 was ready. But even in 1986, the 80386 was, according to IBM, just too far off in the future as a real force in personal computing to become the minimum requirement for CP-DOS. They anticipated taking a leisurely two-and-a-half years or so, as they had with the 80286, to package the 80386 into a new model. Said model thus likely wouldn’t appear until 1988, and its sales might not reach critical mass until a year or two after that. The 80386 was, IBM said, simply a bridge too far for an operating system they wanted to release by 1987. Besides, in light of the IBM Safeness Doctrine, they couldn’t just abandon those people who had already spent a lot of money on PC/ATs under the assumption that it was the IBM personal computer of the future.
“Screw the people with ATs,” was Gates’s undiplomatic response. “Let’s just make it for the 386, and they can upgrade.” He gnashed his teeth and raged, but IBM was implacable. Instead of being able to run multiple MS-DOS applications in parallel on CP-DOS, almost as if they had been designed for it from the start, Microsoft would be forced to fall back on using their new operating system as little more than a launcher for the old whenever the user wished to run an MS-DOS application. And it would, needless to say, be possible to run only one such application at a time. None of that really mattered, said IBM; once people saw how much better CP-DOS was, developers would port their MS-DOS applications over to it and the whole problem of compatibility would blow away like so much smoke. Bill Gates was far less sanguine that Microsoft and IBM could so easily kill their cockroach of an operating system. But in this as in all things, IBM’s decision was ultimately the law.
Microsoft’s frustration only grew when IBM’s stately timetable for the 80386 was jumped by the increasingly self-confident clone makers. In September of 1986, Compaq, the most successful and respected clone maker of all, shipped the DeskPro 386, the very first MS-DOS-compatible machine to use the chip. Before the end of the year, several other clone makers had announced 80386-based models of their own in response. It was a watershed moment in the slow transformation of business-oriented microcomputing from an ecosystem where IBM blazed the trails and a group of clone makers copied their innovations to one where many equal players all competed and innovated within an established standard for software and hardware which existed independently of all of them. A swaggering Compaq challenged IBM to match the DeskPro 386 within six months “or be supplanted as the market’s standard-setter.” Michael Swarely, Compaq’s marketing guru, was already re-framing the conversation in ways whose full import would only gradually become clear over the years to come:
We believe that an industry standard that has been established for software for the business marketplace is clearly in place. What we’ve done with the DeskPro 386 is innovate within that existing standard, as opposed to trying to go outside the standard and do something different. IBM may or may not enter the [80386] marketplace at some point in the future. The market will judge what IBM brings in the same way that it judges any other manufacturer’s new products. They have to live within the market’s realities. And the reality is that American business has made an enormous investment in an industry standard.
More than ever before, IBM was feeling real pressure from the clone makers. Their response would give the lie to all of their earlier talk of an open architecture and their commitment thereto.
IBM had already been planning a whole new range of machines for 1987, to be called the PS/2 line. Those plans had originally not included an 80386-based machine, but one was hastily added to the lineup now. Yet the appearance of that machine was only one of the ways in which the PS/2 line showed plainly that clone makers like Compaq were making IBM nervous with their talk of a “standard” that now had an existence independent from the company that had spawned it. IBM planned to introduce with the PS/2 line a new type of system bus for hardware add-ons, known as the Micro Channel Architecture. Whatever its technical merits, which could and soon would be hotly debated, MCA was clearly designed to cut the clone makers off at the knees. Breaking with precedent, IBM wrapped MCA up tight inside a legal labyrinth of patents, meaning that anyone wishing to make a PS/2 clone or even just an MCA-compatible expansion card would have to negotiate a license and pay for the privilege. If IBM got their way, the curiously idealistic open architecture of the original IBM PC would soon be no more.
In a testimony to how guarded the relationship between the two supposed fast partners really was, IBM didn’t even tell Microsoft about their plans for the PS/2 line until just a few months before the public announcement. Joint Development Agreement or no, the former now suspected the latter’s loyalty more strongly than ever — and for, it must be admitted, pretty good reason: a smiling Bill Gates had recently appeared alongside two executives from Compaq and their DeskPro 386 on the front page of InfoWorld. Clearly he was still playing both sides of the fence.
Now, Bill Gates got the news that CP-DOS was to be renamed OS/2, and would join PS/2 as the software half of a business-microcomputing future that would once again revolve entirely around IBM. For some time, he wasn’t even able to get a clear answer to the question of whether IBM intended to allow OS/2 to run at all on non-PS/2 hardware — whether they intended to abandon their old PC/AT customers after all, writing them off as collateral damage in their war against the clonesters and making MCA along with an 80286 a minimum requirement of OS/2.
On April 2, 1987, IBM officially announced the PS/2 hardware line and the OS/2 operating system, sending shock waves through their industry. Would this day mark the beginning of the end of the clone makers?
Any among that scrappy bunch who happened to be observing closely might have been reassured by some clear evidence that this was a far more jittery version of IBM than anyone had ever seen before, as exemplified by the splashy but rather chaotic rollout schedule for the new world order. Three PS/2 machines were to ship immediately: one of them based around an Intel 8086 chip very similar to the 8088 in the original IBM PC, the other two based around the 80286. But the 80386-based machine they were scrambling to get together in response to the Compaq DeskPro 386 — not that IBM phrased things in those terms! — wouldn’t come out until the summer. Meanwhile OS/2, which was still far from complete, likely wouldn’t appear until 1988. It was a far cry from the unified, confident rollout of the System/360 mainframe line more than two decades earlier, the seismic computing event IBM now seemed to be consciously trying to duplicate with their PS/2 line. As it was, the 80286- and 80386-based PS/2 machines would be left in the same boat as the older PC/AT for months to come, hobbled by that monument to inadequacy that was MS-DOS. And even once OS/2 did come out, the 80386-based PS/2 Model 80 would still remain somewhat crippled for the foreseeable future by IBM’s insistence that OS/2 run on the 80286 as well.
The first copies of the newly rechristened OS/2 to leave IBM and Microsoft’s offices did so on May 29, 1987, when selected developers who had paid $3000 for the privilege saw a three-foot-long, thirty-pound box labelled “OS/2 Software Development Kit,” containing nine disks and an astonishing 3100 pages worth of documentation, thump onto their porches two months before Microsoft had told them it would arrive. It was thus the first Microsoft product ever to ship early; less positively, it was also the first time they had ever asked anyone to pay to be beta testers. Microsoft, it seemed, was feeling their oats as IBM’s equal partners.
The first retail release of OS/2 also beat its announced date, shipping in December of 1987 instead of the first quarter of 1988. Thankfully, IBM listened to Microsoft’s advice enough to quell the very worst of their instincts: they allowed OS/2 to run on any 80286-or-better machine, not restricting it to the PS/2 line. Yet, at least from the ordinary user’s perspective, OS/2 1.0 was a weirdly underwhelming experience after all the hype of the previous spring. The Presentation Manager, OS/2’s planned GUI, had fallen so far behind amidst all the bureaucratic squabbling that IBM had finally elected to ship the first version of their new operating system without it; this was the main reason they had been able to release the remaining pieces earlier than planned. In the absence of the Presentation Manager, what the user got was the plumbing of a sophisticated modern operating system coupled to a command-line interface that made it seem all but identical to hoary old MS-DOS. I’ve already described in earlier articles how a GUI fits naturally with advanced features like multitasking and inter-application data sharing. These and the many other non-surface improvements which MS-DOS so sorely needed were there in OS/2, hidden away, but in the absence of a GUI only the programmer or the true power user could make much use of them. The rest of the world was left to ask why they had just paid $200 for a slightly less compatible, slightly slower version of the MS-DOS that had come free with their computers. IBM themselves didn’t quite seem to know why they were releasing OS/2 now, in this state. “No one will really use OS/2 1.0,” said Bill Lowe. “I view it as a tool for large-account customers or software developers who want to begin writing OS/2 applications.” With a sales pitch like that, who could resist? Just about everybody, as it happened.
OS/2 1.1, the first version to include the Presentation Manager — i.e., the first real version of the operating system in the eyes of most people — didn’t ship until the rather astonishingly late date of October 31, 1988. After such a long wait, the press coverage was lukewarm and anticlimactic. The GUI worked well enough, wrote the reviewers, but the whole package was certainly memory-hungry; the minimum requirement for running OS/2 was 2.5 MB, the recommended amount 5 MB or more, both huge numbers for an everyday desktop computer circa 1988. Meanwhile a lack of drivers for even many of the most common printers and other peripherals rendered them useless. And OS/2 application software was largely nonexistent as well. The chicken-or-the-egg conundrum had struck again. With so little software or driver support, no one was in a hurry to upgrade to OS/2, and with so little user uptake, developers weren’t in a hurry to deliver software for it. “The broad market will turn its back on OS/2,” predicted Jeffrey Tarter, expressing the emerging conventional wisdom in the widely read insider newsletter Softletter. Philippe Kahn of Borland, an executive who was never at a loss for words, started a meme when he dubbed the new operating system “BS/2.” In the last two months of 1988, 4 percent of 80286 owners and 16 percent of 80386 owners took the OS/2 plunge. Yet even those middling figures gave a rosier view of OS/2’s prospects than was perhaps warranted. By 1990, OS/2 would still account for just 1 percent of the total installed base of personal-computer operating systems in the United States, while the unkillable MS-DOS still owned a 66-percent share.
[Screenshot gallery: A Quick Tour of the OS/2 1.1 Presentation Manager]
Amidst all of the hoopla over the introduction of the PS/2 and OS/2 back in the spring of 1987, Byte’s editor-in-chief Philip Lemmons had sounded a cautionary note to IBM that reads as prescient today:
With the introduction of the PS/2 machines, IBM has begun to compete in the personal-computer arena on the basis of technology. This development is welcome because the previous limitations of the de-facto IBM standard were painfully obvious, especially in systems software. The new PS/2 “standard” offers numerous improvements: the Micro Channel is a better bus than the PC and AT buses, and it provides a full standard for 32-bit buses. The VGA graphics standard improves on EGA. The IBM monitors for the PS/2 series take a new approach that will ultimately deliver superior performance at lower prices. IBM is using 3.5-inch floppy disks that offer more convenience, capacity, and reliability than 5.25-inch floppy disks. And OS/2, the new system software jointly developed by Microsoft and IBM, will offer advances such as true multitasking and a graphic user interface.
Yet a cloud hangs over all this outstanding new technology. Like other companies that have invested in the development of new technology, IBM is asserting proprietary rights in its work. When most companies do this in most product areas, we expect and accept it. When one company has a special role of setting the de-facto standard, however, the aggressive assertion of proprietary rights prevents the widespread adoption of the new standard and delays the broad distribution of new technology.
The personal-computer industry has waited for years for IBM to advance the standard, and now, depending on IBM’s moves, may be unable to participate in that advancement. If so, the rest of the industry and the broad population of computer users still need another standard for which to build and buy products — a standard at least as good as the one embodied in the PS/2 series.
Lemmons’s cautions were wise ones; his only mistake was in not stating his concerns even more forcefully. For the verdict of history is clear: PS/2 and OS/2 are the twin disasters which mark the end of the era of IBM’s total domination of business-oriented microcomputing. The PS/2 line brought with it a whole range of new hardware standards, some of which, like new mouse and keyboard ports and a new graphics standard known as VGA, would remain with us for decades to come. But these would mark the very last technical legacies of IBM’s role as the prime mover in mainstream microcomputing. Other parts of the PS/2 line, most notably the much-feared proprietary MCA bus, did more to point out the limits of IBM’s power than the opposite. Instead of dutifully going out of business or queuing up to buy licenses, third-party hardware makers simply ignored MCA. They would eventually form committees to negotiate new, open bus architectures of their own — just as Philip Lemmons predicted in the extract above.
OS/2 as well only served to separate IBM’s fate from that of the computing standard they had birthed. It arrived late and bloated, and went largely ignored by users who stuck with MS-DOS — an operating system that was now coming to be seen not as IBM’s system-software standard but as Microsoft’s. IBM’s bold bid to cement their grip on the computer industry only caused it to slip through their fingers.
All of which placed Microsoft in the decidedly strange position of becoming the prime beneficiary of the downfall of an operating system which they had done well over half the work of creating. Given the way that Bill Gates’s reputation as the computer industry’s foremost Machiavelli precedes him, some have claimed that he planned it all this way from the beginning. In their otherwise sober-minded book Computer Wars: The Fall of IBM and the Future of Global Technology, Charles H. Ferguson and Charles R. Morris indulge in some elaborate conspiracy theorizing that’s all too typical of the whole Gates-as-Evil-Genius genre. Gates made certain that his programmers wrote OS/2 in 80286 assembly language rather than a higher-level language, the authors claim, to make sure that IBM couldn’t easily adapt it to take advantage of the more advanced capabilities of chips like the 80386 after his long-planned split with them finally occurred. In the meantime, Microsoft could use the OS/2 project to experiment with operating-system design on IBM’s dime, paving the way for their own eventual MS-DOS replacement.
If Gates expected ultimately to break with IBM, he has every interest in ensuring OS/2’s failure. In that light, tying the project tightly to 286 assembler was a masterstroke. Microsoft would have acquired three years’ worth of experience writing an advanced, very sophisticated operating system at IBM’s elbow, applying all the latest development tools. After the divorce, IBM would still own OS/2. But since it was written in 286 assembler, it would be almost utterly useless.
In reality, the sheer amount of effort Microsoft put into making OS/2 work over a period of several years — far more effort than they put into their own Windows over much of this period — argues against such conspiracy-mongering. Bill Gates was unquestionably trying to keep one foot in IBM’s camp and one foot in the clone makers’, much to the frustration of both, who equally craved his undivided loyalty. But he had no crystal ball, and he wasn’t playing three-dimensional chess. He was just responding, rather masterfully, to events on the ground as they happened, and always — always — hedging all of his bets.
So, even as OS/2 was getting all the press, Windows remained a going concern, Gates’s foremost hedge against the possibility that the vaunted new operating system might indeed prove a failure and MS-DOS might remain the standard it had always been. “Microsoft has a religious approach to the graphical user interface,” said the GUI-skeptic Pete Peterson, vice president of WordPerfect Corporation, around this time. “If Microsoft could choose between improved earnings and growth and bringing the graphical user interface to the world, they’d choose the graphical user interface.” In his own way, Peterson was misreading Gates as badly here as the more overheated conspiracy theorists have tended to do. Gates was never one to sacrifice profits to any ideal. It was just that he saw the GUI itself — somebody’s GUI — as such an inevitability. And thus he was determined to ensure that the inevitability had a Microsoft logo on the box when it became an actuality. If the breakthrough product wasn’t to be the OS/2 Presentation Manager, it would just have to be Microsoft Windows.
(Sources: the books The Making of Microsoft: How Bill Gates and His Team Created the World’s Most Successful Software Company by Daniel Ichbiah and Susan L. Knepper, Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace and Jim Erickson, Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in America by Stephen Manes and Paul Andrews, and Computer Wars: The Fall of IBM and the Future of Global Technology by Charles H. Ferguson and Charles R. Morris; InfoWorld of October 7 1985, July 7 1986, August 12 1991, and October 21 1991; PC Magazine of November 12 1985, April 12 1988, December 27 1988, and September 12 1989; New York Times of August 22 1985; Byte of June 1987, September 1987, October 1987, April 1988, and the special issue of Fall 1987; the episode of the Computer Chronicles television program called “Intel 386 — The Fast Lane.” Finally, I owe a lot to Nathan Lineback for the histories, insights, comparisons, and images found at his wonderful online “GUI Gallery.”)
Footnotes
1. Admittedly, this was already beginning to change as IBM was having this debate: the X Window project was born at MIT in 1984.
Sniffnoy
July 13, 2018 at 4:07 pm
Huh, those two aren’t parallel… is that really how they insisted on it?
Jimmy Maher
July 13, 2018 at 4:17 pm
According to the book Gates, yes. Weird, right?
Sniffnoy
July 14, 2018 at 4:30 pm
Yeah, indeed…
Nate
July 15, 2018 at 12:45 am
I wonder if that’s perhaps a misreading: what the CUA standard actually required, if I remember correctly, was not so much the display text of “OK = Enter”, but that pressing Enter would always be the keyboard shortcut for a button marked OK, and that pressing Esc would always be the keyboard shortcut for a button marked Cancel.
Even if that specific text WAS required, it would have been for the very good reason that the user did not yet understand what keys might map to what GUI elements. It was mostly CUA that gave us the ‘standard’ Windows keys like Alt for Menu, Tab for select element, etc.
Remember, before CUA, it was NOT at all consistent what keys like ‘Esc’ and ‘Enter’ might do. Each application interpreted them differently.
Lotus 1-2-3 used ‘/’ as its menu key, Wordperfect 5 used Esc not as ‘back out of a menu’ but ‘repeat character’. Wordstar used Ctrl, etc. It was a nightmare.
CUA brought some sanity to it all and we shouldn’t forget that.
Jimmy Maher
July 15, 2018 at 12:56 pm
CUA definitely had its heart in the right place, but I think you’re overselling its pioneering status a bit here. Well before CUA, Apple was placing huge emphasis on standardized user interfaces for the Lisa and Macintosh across applications, and Microsoft did the same from the time of Windows 1. Indeed, as I noted in the first article in this series, this sort of desperately-needed user-interface standardization was an important advantage of GUIs in general. Virtually every publisher who made a GUI environment published a standard style guide for applications to accompany it.
Orion
July 16, 2018 at 3:45 pm
Screenshots of the 1987 CUA samples disk (at http://toastytech.com/guis/cua1987.html ) show that they actually used “Enter” and “Esc=Cancel”. The original 1987 document seems to have been lost, but the 1989 revision is at http://publibz.boulder.ibm.com/cgi-bin/bookmgr/BOOKS/f29bdg00/CONTENTS?SHELF=&DT=19921204095534 and doesn’t make mention of “OK = Enter” (see, e.g., Section 2.7.2, Figure 45, which shows a button simply labeled “OK”).
Maybe there was a draft version that had “OK = Enter”, though.
Nate
July 17, 2018 at 10:28 am
Looking at that Toastytech page reminds me, yeah. The ‘KEY=Function’ display syntax was standard for showing what function keys were mapped to (including Esc and Enter). Specifically on the ‘Function key area’ bar, in these screenshots. And I think you can’t really argue that that was a bad thing; function keys were notoriously terrible if you didn’t know the bindings, and Key=Function on a dedicated function key screen line was a pretty good standard way of displaying it.
Note also the underlining for menu shortcut letters. That found its way to Windows all through Windows 3, and I think 95 too. I’m sure Windows 1 did have its own style guide, but I’m pretty darn sure that *CUA was the source of the Windows style guidelines*, because Windows was built to be compliant with CUA.
Really, CUA was pretty impressive. It gave non-Windows MS-DOS applications a standard ‘Windows-like look and feel’. An IBM feel, yes, but that’s kind of the point.
https://en.wikipedia.org/wiki/IBM_Common_User_Access
“CUA not only covered DOS applications, but was also the basis for the Windows Consistent User Interface standard (CUI), as well as that for OS/2 applications ”
One thing that didn’t make it though is Shift-Del for cut and Shift-Ins for Paste; Ctrl-C and Ctrl-X came through instead.
But CUA never actually went away. If you’re running Windows, chances are you’re still using parts of it right now. Though with Windows 8 and 10, Microsoft is finally trashing the CUA guidelines; I’m not convinced that’s a good thing.
Alex Smith
July 13, 2018 at 4:27 pm
“Unix was beloved by hackers everywhere, but had wound up in the hands of AT&T”
I guess I am not quite sure what this phrase is supposed to intimate. Unix was created at Bell Labs and was thus always a product of AT&T. Stating it “wound up” at AT&T implies it had been created elsewhere.
Jimmy Maher
July 13, 2018 at 4:36 pm
True. Thanks!
whomever
July 14, 2018 at 4:49 pm
The nuance is that Unix had gone from being open-ish and available to any academic to AT&T attempting to take control; in fact the comparison between AT&T and Unix and IBM and the PC is quite parallel (and the corporate cultures of the two not so different, which may explain that). By this point it had become massively forked (the “Open” BSD Unix, from UC Berkeley, vs the “Closed” Sys V, with a long list of differences between them). The complex history of Unix is something that Jimmy could write a bazillion posts about (though it may be outside his remit, not being a home or game platform at the time). IBM BTW also started getting into Unix at this time but their Unix (AIX) was always…weird in a very IBM way. The old joke was “AIX is to Unix like if Martians learned English from old I Love Lucy episodes they saw broadcast”.
Alex Smith
July 17, 2018 at 6:18 am
Well, AT&T did not suddenly realize that it liked being in control in the mid 1980s. The company licensed UNIX under extremely generous terms in the early days because it had entered into a consent decree with the United States government in 1956 in which it promised to restrict commercial activities to its national telephone system to avoid being broken up as a monopoly. Absent that consent decree, the operating system would have no doubt been fully commercialized immediately. A second consent decree entered into in 1982 and effective January 1, 1984, dismantled the old AT&T in favor of a series of regional companies and a much reduced AT&T that held onto the long distance, manufacturing, and R&D operations. This new entity was no longer bound by the 1956 decree and could therefore attempt to commercialize (and monopolize) UNIX without fear of government reprisal.
tedder
July 13, 2018 at 5:40 pm
I always thought, obviously incorrectly, that CP-DOS had something to do with CP/M. I also didn’t know of the OS/2-PS/2 naming connection.
“Gates compared the project to trying to build the world’s heaviest airplane.”
.. roughly, measuring airplane development by weight. Worth noting (down here in the comments) that the greater Seattle area was home to Microsoft *and* Boeing.
I wish there were screenshots and pictures in here :/ I’d especially love to see the OS/2 SDK box and contents. I did find the documentation’s contents, though: http://www.os2museum.com/wp/os2-history/os2-library/os2-1-x-programming/
minor fodder for the copy editors’ department:
“useless drones who had never had an original thought”
-> “who never had an”; style/duplication fix
“hobbled by that monument to inadequateness that was MS-DOS”
monument to inadequacy?
Jimmy Maher
July 13, 2018 at 5:46 pm
Thanks!
Gnoman
July 13, 2018 at 9:45 pm
This article is particularly interesting given the current state of Windows. One of the more solid criticisms of Windows 8 and 10 is how badly they implemented the “universal operating system” paradigm. The Windows 8 desktop interface was widely bashed (with reason) as being far too heavily influenced by the mobile paradigm, and being noticeably worse to operate with traditional keyboard and mouse for various reasons. Microsoft fired back at these critics with their insistence that the default way to interface with Windows on all platforms should be as close to identical as possible, and that the only people who objected were those too set in their ways to recognize the obvious future.
The parallels to the IBM “Common User Access” notions are pretty obvious, I think.
Jimmy Maher
July 14, 2018 at 10:46 am
That is indeed an interesting perspective that hadn’t occurred to me…
whomever
July 14, 2018 at 4:57 pm
This is getting into the weeds, but in fact CUA was IBM wide (all products). I never much used OS/2 but very much ran into CUA in the context of AIX (which as I mentioned in another comment is technically a fine OS but…weird). Part of the weirdness was CUA, which felt…not Unixy. Maybe if I’d been a mainframe person I’d have taken to it (I guess it felt more natural on the mainframes and AS/400). Just as an example, the default for error messages in AIX is (was? Haven’t used it for years) to print a number with the message. This was so IBM could translate it into a lot of languages and even if they didn’t spend the money localizing to, say, Basque, they could at least publish a book in Basque saying “1234 means xxx”. But to a Unix guy this just felt bureaucratic. The one thing I will say that shows that IBM wasn’t completely humorless was that the GUI tool for doing stuff on AIX (which was called SMIT, and there was stuff you could only do with it, versus just editing a file, which is the Unix way and the way every other Unix was administered) had this animation of a running person when you did something. If the action succeeded, the person cheered; if it failed, it fell down. BTW the RS6000, IBM’s RISC Unix workstation, based on what later became Power and eventually was in every 2000-era Mac and every PS/3, so arguably a success, also used MCA. This was their second attempt; the first (the PC/RT), which actually has a huge fan club but didn’t sell well, was ISA based.
Nate
July 15, 2018 at 12:51 am
> “1234 means xxx”. But to a Unix guy this just felt bureaucratic.
The funny thing is this is exactly what the very Unix-centric SMTP protocol of 1982 did, and which HTTP followed, by defining standard numeric ‘result codes’ followed by text strings. It became ‘the Internet way’, and the Internet generally meant ‘Unix machines’.
And Microsoft went on to use numeric error codes everywhere.
It turns out that giving all your result/error conditions standard numbers is very useful to avoid miscommunication and even hackers adopted it.
Laertes
July 15, 2018 at 12:07 pm
@whomever Curious choice of language (I am basque).
whomever
July 15, 2018 at 6:42 pm
Laertes: Off topic, but I picked Basque because, no offense, it’s a weird language and almost no one speaks it, not to mention every second letter is an X. Even Hungarian is at least vaguely related to Finnish and Estonian and we sort of think we can guess where it came from; Basque is space alien. I say this as one who would easily pick San Sebastian/Donostia as one of the greatest food cities I’ve ever been to (seriously, I was lucky enough to eat at Arzak, which was amazing).
Nate: You aren’t wrong; I’m not saying this was a rational response, it was just different from the way literally everyone else did it (you could also turn it off by setting your LC_LOCALE env variable if I remember). Also, there are a bunch of PDP-10 hackers about to yell at you for calling the early Internet protocols “Unix Centric”. The early Internet was if anything DEC centric. HTTP fair enough, which of course was developed on NeXT (which to this day has morphed into the most common computing platform in the world in the form of the iPhone).
Laertes
July 16, 2018 at 6:19 pm
@whomever: none taken. In fact you are right. You may be interested in this video:
https://www.youtube.com/watch?v=S1l9oDiSiEQ
Apologies for the off-topic.
Pedro Timóteo
July 16, 2018 at 3:28 pm
Is that still a problem? Windows 8 did indeed want to look like a smartphone/tablet, but 8.1 changed the defaults back to the old taskbar-plus-icons-on-the-desktop paradigm, and I’ve never seen any touchscreen stuff in 10.
This isn’t the first time I’ve seen people saying, or at least implying, things like “Windows has had a touchscreen interface since 8,” when in reality it was only a single release (8) which was soon replaced by (and could upgrade to, for free) 8.1. I sometimes suspect a lot of people claiming this hated 8 (understandable), went back to 7, and have since refused to upgrade…
Nate
July 17, 2018 at 10:49 am
“I’ve never seen any touchscreen stuff in 10.”
You’ve been very lucky, then.
If you’ve ever accidentally double-clicked an image rather than right-clicking it, you’ve very likely brought up one of the hideous touchscreen-style apps for displaying images. The app framework formerly called “Metro”… is it still called “Modern”? Both are very bad names.
That’s the touchscreen stuff. It’s there.
Lefty
August 12, 2021 at 3:47 pm
So they ruined a quality Vista/Win7 interface not in an effort to provide a better, more intuitive 3D interface for clueless users, but to chase a four-inch-screen mobile millennial pipe dream, which failed miserably, and now we suffer the consequences for eternity.
Ian Silvester
July 14, 2018 at 1:04 am
Thanks for this series. You’re adding real detail and nuance to a story I thought I knew, and those details make all the difference!
Aula
July 14, 2018 at 12:09 pm
There’s at least one place where CP-DOS is spelled as “CP-DPS”.
One thing about MCA that isn’t mentioned (either here or in the earlier PS/2 article) is that the BIOS of any MCA motherboard had a list of supported extension cards. This was presumably meant to ensure that even in countries where MCA wasn’t covered by IBM’s patents, card manufacturers would still have to pay IBM. It also meant that users were always in danger of having to upgrade their computer’s BIOS every time they installed a new card. And in the mid-’80s, upgrading the BIOS meant physically replacing the ROM chip (always a fun exercise, even for those who knew how to do it properly).
Since OS/2 separates the GUI from the rest of the operating system, it’s possible for native OS/2 programs to be non-graphical (they had to be in the first, GUI-less version). If such a program doesn’t use other OS/2-specific features, the same executable file can be run in protected mode under OS/2 *and* in real mode under MS-DOS. How this works is clever: every file in the New Executable format used by OS/2 (as well as Windows 2 and Windows 3.x) starts with an MS-DOS executable, which is ignored under OS/2 but run under MS-DOS. Normally it just prints something like “This program cannot be run in MS-DOS mode” and exits, but it can also be a more complicated MS-DOS program, like a very limited OS/2 emulator that is just barely capable of running one text-mode program. The Microsoft C Compiler for MS-DOS is an example of this kind of dual-mode program.
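(For the curious, here’s a rough sketch of how one can tell the two kinds of file apart from outside, assuming a well-formed executable; the file name is hypothetical:)

    import struct

    def executable_kind(path: str) -> str:
        with open(path, "rb") as f:
            header = f.read(0x40)
        # Every such file starts with an old-style MZ header; that stub
        # program is all that MS-DOS ever looks at and runs.
        if len(header) < 0x40 or header[:2] != b"MZ":
            return "not an MS-DOS executable"
        # At offset 0x3C the MZ header stores the file offset of the
        # "new" header; OS/2 and Windows jump there instead.
        (new_offset,) = struct.unpack_from("<I", header, 0x3C)
        with open(path, "rb") as f:
            f.seek(new_offset)
            if f.read(2) == b"NE":
                return "New Executable (OS/2, Windows 2.x/3.x)"
        return "plain MS-DOS executable"

    print(executable_kind("example.exe"))  # hypothetical file name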
Tomber
July 14, 2018 at 2:17 pm
I think the BIOS-update issue was mostly addressed by the use of updated floppy disks that added the required info to the stored config.
Protected-mode console apps that ran under DOS are related to the DOS-extender concept; DOS extenders often borrowed the executable formats, APIs, and tools of Windows, OS/2, and NT.
Jimmy Maher
July 14, 2018 at 3:06 pm
Thanks!
Lars
July 15, 2018 at 2:30 am
Re the MCA BIOS thing: I’m not familiar with the details of MCA, but the reason might be something similar to the feature in PCI and USB, both of which require you to embed vendor and product codes in your hardware. Of course, getting a vendor code requires annual membership fees.
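(On a modern Linux machine you can actually see those embedded PCI IDs through sysfs; a quick sketch, assuming a Linux system with /sys mounted:)

    import glob, pathlib

    # Every PCI device exposes its hard-wired vendor and device IDs.
    # Vendor codes are assigned by the PCI-SIG (0x8086 is Intel).
    for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
        p = pathlib.Path(dev)
        vendor = (p / "vendor").read_text().strip()
        device = (p / "device").read_text().strip()
        print(p.name, vendor, device)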
flowmotion
July 14, 2018 at 3:10 pm
Great stuff. This will probably be covered in the next article, but I remember reading the weekly trade tabloids like InfoWorld and the very loud PR war between IBM and Microsoft which led up to their “divorce.”
(Random google to give folks a window into that era: https://archive.org/details/Infoworld-1991-03-04 )
The enthusiast community was of course viscerally against Micro Channel and everything associated with it, and OS/2 received a negative reception partly because of that.
Also this an excuse to post one of my favorite computer book covers:
https://imgur.com/a/zCFrhrS
whomever
July 14, 2018 at 5:10 pm
One thing that Jimmy alludes to, but which should be made clear: the PS/2 line was, bluntly, crap.
I mean, they were well engineered, but otherwise they were super overpriced, super slow, super proprietary, etc. Buying a PS/2 over a Compaq Deskpro would have been a stupid choice for anyone who had their own money at stake (in fact the analogy to, say, the US and UK car industries in the ’70s is instructive). They weren’t even totally consistent within the line; for example, the Models 25 and 30 were not VGA but MCGA (a subset that supported 320×200 in 256 colors). They even had an ISA PS/2, the PS/2 E (released somewhat later than the period this article covers).
We are also starting to get to the era where, if you wanted a computer, you wouldn’t even go to Compaq; you’d buy a copy of Computer Shopper, order from whatever dodgy fly-by-night Taiwanese outfit had the cheap ads at the back, and assemble it yourself.
The other thing that Jimmy may mention in a future article is that the gravy train was over for IBM around now; they were starting to lose shedloads of money, which, given IBM’s notorious internal politics, must have made things complicated. (The mainframe salespeople were notorious for basically saying, “Well, buy one of our competitors’ mainframes instead of one of our non-mainframes.”)
whomever
July 14, 2018 at 5:23 pm
Also, really getting into the weeds re footnote one: aside from X there were a bunch of others; by this point there was an active market of Unix workstations with GUIs. Sun (yes, it’s an acronym: Stanford University Network) had SunView, SGI had something I can’t remember, etc. There were also proprietary things like the PERQ and the various Lisp machines that had them, and of course Xerox had finally commercialized their stuff. But your broader point still stands, as these were all $20K+ specialized systems that sold in the thousands, and you’d never buy one for home unless you were stupid rich, so they are totally irrelevant to this article. Also, the joke about Unix/Linux on the desktop is that it’s like nuclear fusion: always ten years away.
Jimmy Maher
July 14, 2018 at 6:05 pm
I’ve always heard that joke applied to strong AI…
Laertes
July 15, 2018 at 12:16 pm
And yet here I am, reading this article on my desktop PC with Linux, and I have been using it since 2003. The only thing still in there is the Intel processor. IBM, Microsoft… long gone.
I have shown several techie and non-techie people what a live USB loaded with Linux (Manjaro is a great example) can do, and, well, they were all amazed.
hcs
July 14, 2018 at 8:04 pm
Very interesting coverage of the history; fascinating how much the OS/2 Presentation Manager looked like the Windows 3.1 I grew up on!
“Applications can even open “modal” windows, meaning windows that live within other windows”
Is this the correct usage of “modal”? I’m more familiar with it indicating that the child window takes over all interaction until it goes away, i.e. it introduces a “mode” of interaction (see Wikipedia “Modal Window”). It might be that they were called modal in OS/2, I guess?
Jimmy Maher
July 14, 2018 at 9:56 pm
I thought that was the usage I recalled from my programming days, but it’s been a long time since I’ve done any GUI programming. You’re probably right. Thanks!
Nate
July 15, 2018 at 12:52 am
Yeah, I don’t quite think ‘modal’ is the word; the standard Windows term for such nested windows would be ‘MDI’, for ‘Multiple Document Interface’.
https://en.wikipedia.org/wiki/Multiple_document_interface
Xavier
July 15, 2018 at 7:52 am
Super interesting story, as always, but Windows takes an S only when it’s from Microsoft. The one from Unix is called X Window, singular.
Jimmy Maher
July 15, 2018 at 12:59 pm
Thanks!
whomever
July 15, 2018 at 6:31 pm
That’s one of those pedantic things. Yes, technically the name is “The X Window System,” but in practice almost everyone calls it X Windows…
Casey Muratori
July 16, 2018 at 4:04 am
I had trouble with “that reality tended to interfere with their ability to see through any scheme, Machiavellian or otherwise, with the single-minded focus of a smaller company.” The phrase “see through” reads to me at first like definition 1 of “see through” (https://www.google.com/search?q=define+see+through&oq=define+see+through&aqs=chrome..69i57.3282j0j7&sourceid=chrome&ie=UTF-8), as in “to detect the true nature of.” But then when I finish the sentence I realize you mean it as in the phrase “see it through”. Perhaps it could be changed to a less ambiguous verb, like “carry out”, “perpetrate”, “promulgate”?
Sorry to be such a nitpicker, but I always find your writing to be very smooth, so I feel like it’s worth pointing out the one or two places where I stumble when reading your articles, in case it leads them to become even more luxuriously polished :)
– Casey
Jimmy Maher
July 16, 2018 at 7:15 am
Sure. Thanks!
Alex Freeman
July 16, 2018 at 6:07 am
Another great article, Jimmy! The more I read about Bill Gates’s early history, the more it seems he was miles ahead of everyone else. The obvious course of action for IBM would have been to make Windows the optional GUI for OS/2 on the 386, discontinue failed products, and reimburse customers, so as to preserve IBM’s reputation as the safe choice. Even though Gates was a shark you trusted at your own risk, I feel less contempt for him seeing as almost all the others were practically begging him to take advantage of them.
As a side note, I don’t know how much you’ve covered of Gary Kildall and CP/M, but you may find this video interesting:
https://www.youtube.com/watch?v=Tdj8gh9GPc4
Tomber
July 16, 2018 at 11:36 pm
I think Bill Gates was a quick study in the 1970s. He learned about licensing to multiple OEMs, controlling a standard (Microsoft BASIC), differentiating it so that it couldn’t be easily replaced, etc. Microsoft supposedly used the motto “We set the standard.” in the 1970s. When it became apparent that IBM was not going to be able to dictate the direction of the PC marketplace, he was ready.
I don’t think IBM’s mistake was not supporting Windows; the mistake was not producing it themselves, not being aggressive on the 386, and not following the piecemeal approach to migration (away from DOS) that the 386 made possible, and that Microsoft followed perfectly with DOS, EMM386, DPMI, and Windows. OS/2 1.x and the 286 may be the mistake that couldn’t be undone quickly enough. OS/2 2.0 was their attempt, but it was too late. Microsoft had already “set the standard.”
Yuhong Bao
July 17, 2018 at 5:27 am
The OS/2 2.0 debacle is so bad that it is one of my favorite topics, BTW.
matthew
July 19, 2018 at 8:52 am
A small complaint: the first use of the term “Presentation Manager” is in a quote far preceding its definition as “OS/2’s planned GUI.” The context is probably sufficient to guess its meaning, but the surrounding discussion of SAA and CUA does make it seem as though maybe one of those would be the proper name of OS/2’s GUI. In particular, the passage “… said GUI wasn’t to be Windows. CP-DOS was rather to become a guinea pig for a new bureaucratic monstrosity known as the Systems Application Architecture” makes it sound as though SAA is the GUI and the GUI is SAA.
Jimmy Maher
July 19, 2018 at 9:14 am
Good catch. I hope it reads more clearly now. Thanks!
Erik Twice
July 20, 2018 at 11:12 am
These articles are great, thanks!
I was wondering, is there any book you would recommend on the subject, especially for someone looking at this from a gaming perspective? :)
Jimmy Maher
July 20, 2018 at 12:44 pm
That’s a tough one. I would say that the definitive book on these subjects has yet to be written. I’m not really sure what a “gaming perspective” could even entail here; virtually every company mentioned in these articles showed — at least publicly — no interest whatsoever in gaming during this period. It was only after Windows 3.0 in 1990 that Microsoft began to turn back to consumer computing.
All I can say is to have a look at my sources at the bottom of the article. Hard Drive and Gates — especially the latter, which is still in print on paper and on Kindle — provide the most detail of the books I’ve found. Unfortunately, they both share similar faults. Both approach this as a breezy business story of the “written for executives on the go!” stripe, ignoring the greater cultural implications; both weave together a lot of threads in a somewhat random-feeling fashion; both are somewhat gossipy (the book Gates in particular evinces as much interest in whom Bill Gates is sleeping with at any given time as in how he’s running his company); neither book’s author seems to have much grasp of the actual technology. The technological developments are, as usual, probably best followed through magazines like Byte and modern Internet resources like Nathan Lineback’s GUI Gallery.
cliffton
July 24, 2018 at 9:02 pm
“For some time, he *WOULDN’T* even be able to get a clear answer”
or
“For some time, he wasn’t even able to get a clear answer”
Jimmy Maher
July 25, 2018 at 8:15 am
Thanks!
Jake R
July 25, 2018 at 6:33 am
could breathe a sign of relief.
Should that be “sigh” ?
Jimmy Maher
July 25, 2018 at 8:14 am
Indeed. Thanks!
Skiphipdata
June 30, 2024 at 5:00 pm
“from Unix and Linux’s X Windows”
Technically, it’s called the “X Window System” or simply “X”. Few people probably care, but “X Windows” is generally considered to be a misnomer AFAIK.