In the latest issue of Custom PC magazine, Stuart Andrews looks at the story behind the stopgap CPU that helped to start a revolution.
Amazingly, the most important chip in Intel’s history – the chip that began the whole x86 line – was only intended to be a stopgap. In 1976, there were no personal computers as we know them today, only a range of specialist scientific tools, industrial devices, mainframes and microcomputers, which were slowly trickling down from big corporations and research institutions into the small business market.
At this point, Intel was just one of several major players in the emerging processor market. Its 8-bit 8080 microprocessor had made the jump from calculators and cash registers to computer terminals and microcomputers, and was the core platform for the big OS of the era, CP/M.
However, Intel had some serious competition. Zilog’s Z80 was effectively an enhanced 8080, running a broadly compatible machine code with some added functions for processing strings. The Motorola 6800 had similar capabilities to Intel’s chip, and was beginning to find favour with Altair and other manufacturers of early kit microcomputers. To make matters worse, both Motorola and Zilog were working on new 16-bit designs, the 68000 and the Z8000 respectively.
Intel was working on its own revolutionary 32-bit architecture, dubbed iAPX 432, but had run into some complex challenges. Development of iAPX 432 was taking Intel’s team of engineers much longer than expected. The architecture spanned three chips, and the process technology required to manufacture them was still years away.
What’s more, the instruction set was focused on running new high-level, object-oriented programming languages that weren’t yet mainstream; Intel’s own operating system for it was being coded entirely in Ada. Concerned that Zilog and Motorola could eat its market share before iAPX 432 even launched, Intel needed a new and exciting chip to hold its competitors at bay.
Intel had followed up the 8080 with its own enhanced version, the 8085, but this was little more than an 8080 with a couple of extra instructions to reduce its reliance on external support circuitry. Knowing it needed more, Intel looked to Stephen Morse, a young software engineer who had just written a critical report on the iAPX 432 processor design, and asked him to design the instruction set for a new Intel processor. It had to be 8080-compatible and able to address at least 128KB of memory – double the maximum 64KB supported by the 8080.
Designing the 8086
Morse worked with a small team, including project manager Bill Pohlman and logic designer Jim McKevitt, to design and implement the new architecture. He typed specs and documentation through a text editor on a terminal connected to an Intel mainframe, even creating diagrams from ASCII characters to illustrate how the new CPU would work.
Meanwhile, Jim McKevitt worked on how data would be passed between the CPU and the supporting chips and the circuitry across the system bus. After the first couple of revisions, Intel brought in a second software engineer, Bruce Ravenal, to help Morse refine the specs. As the project neared completion, the team grew to develop the hardware design, and were able to simulate and test what would become the core of the x86 instruction set.
Left pretty much alone by Intel management to work on an architecture the company saw as having a limited lifespan, Morse and his team had the freedom to innovate. The 8086 design had some clear advantages over the old 8080. It could address not merely double the RAM, but a full 1MB, and where the 8080 handled 8-bit operations, the 8086 could manage 16-bit ones.
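That 1MB reach came from the 8086’s segmented addressing: a 16-bit segment register is shifted left four bits and added to a 16-bit offset, producing a 20-bit physical address. A minimal Python sketch of that calculation (the function name here is illustrative, not from any Intel documentation):

```python
def physical_address(segment: int, offset: int) -> int:
    """Combine a 16-bit segment and a 16-bit offset into a 20-bit address.

    The segment is shifted left four bits (multiplied by 16) and the
    offset added; the result wraps at 1MB, as on the original 8086,
    which had no address line beyond A19.
    """
    return ((segment << 4) + offset) & 0xFFFFF

# Segment 0x1234 with offset 0x0005 lands at physical address 0x12345.
print(hex(physical_address(0x1234, 0x0005)))
```

Because many different segment:offset pairs map to the same physical address, programmers of the era had to keep careful track of which segments their code and data lived in.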
It also had a selection of new instructions and features, including some designed to handle strings more efficiently and others designed to support high-level programming languages. And it made up for one of the 8080’s biggest shortfalls with hardware support for multiply and divide operations. The 8086 design made a coder’s life easier.
However, Morse also took a new approach to processor design. As he told PC World in a 2008 interview, ‘up until that time, hardware people did all the architectural design, and they would put in whatever features they had space for on the chip. It didn’t matter whether the feature was useful or not’.
Instead, Morse looked at what features could be added – and how they could be designed – in order to make software run more efficiently. The 8086 made existing 8080 software run better, and was designed to work more effectively with the software and programming languages that were emerging, unlike both iAPX 432 and Zilog’s Z8000.
What’s more, the 8086 was designed to work with additional processors, co-processors and other components as the heart of a computer system. In the foreword to his book, The 8086 Primer, Morse talks of how ‘the thrust of the 8086 has always been to help users get their products to market faster using compatible software, peripheral components and system support’. It was designed as the heart of ‘a microprocessing family’ with multiple processors supporting the CPU. This, in turn, made the 8086 a good fit as microcomputers evolved into PCs.
A star is born
The 8086 was hardly an overnight success. The Z80 continued to cause Intel nightmares, appearing in more affordable business computers running CP/M. The most exciting processor in development in the late 1970s wasn’t one of Intel’s, but Motorola’s 68000, which went on to dominate the workstation market and power a new generation of home and business computers. Comparatively, the 8086 made it into some new microcomputers and terminals, plus a few more specialist devices, but it wasn’t the market leader. In March 1979, frustrated by Intel’s working culture, Stephen Morse left the company.
Then, a few weeks after Morse left, Intel launched a revamp of the 8086: the 8088. The new chip was actually weaker; half its data pins were removed, so it could only send or receive 8 bits of data at a time. However, this meant it could work with existing 8-bit support chips designed for the 8080, and it didn’t need as much complex circuitry to run. Given that the 8086 architecture was backwards compatible, and that Intel was investing heavily in support, building new versions of existing computers and devices around the 8088 became a no-brainer. At this point, Intel found out about a new project underway at IBM.
In 1980, an Intel field sales engineer, Earl Whetstone, began talking to an IBM engineer, Donald Estridge, about a top-secret project to develop a new kind of computer. Schedules were tight, meaning IBM needed to work with an existing processor, and Estridge faced a choice between the Zilog Z8000, the Motorola 68000, and Intel’s 8086 and 8088.
Whetstone sold Estridge on the technical merits of the 8086 architecture and the lengths Intel would go to support it, while IBM’s bean counters realised that the 8088 could help them keep costs down. IBM was already familiar with Intel, and was working with a version of Microsoft Basic that ran on the 8080.
Meanwhile, a version of 86-DOS, developed by Seattle Computer Products and sold as part of one of its kits, would become the basis of Microsoft’s MS-DOS. The project became the IBM PC 5150, or the original IBM PC, and because it used off-the-shelf hardware and Microsoft’s software, it became the inspiration for an army of PC clones.
As they say, the rest is history. Had Intel focused solely on iAPX 432 and had IBM chosen Motorola, computing might have gone in a different direction. Intel was smart enough to double down on x86, moving on to the 286 and 386 CPUs and – eventually – abandoning what was supposed to be its flagship line. What’s more, while the x86 architecture has now evolved beyond all recognition, even today’s mightiest Core CPUs can still run code written for the lowly 8086. It started as a stopgap, but the 8086 has stood the test of time.
Get Custom PC #228 NOW!
And if you’d like a handy digital version of the magazine, you can also download issue 228 for free in PDF format.