Intel’s revolutionary 8086 chip | Custom PC #228

In the latest issue of Custom PC magazine, Stuart Andrews looks at the story behind the stopgap CPU that helped to start a revolution.

Amazingly, the most important chip in Intel’s history – the chip that began the whole x86 line – was only intended to be a stopgap. In 1976, there were no personal computers as we know them today, only a range of specialist scientific tools, industrial devices, mainframes and microcomputers, which were slowly trickling down from big corporations and research institutions into the small business market. 

A 1978 photo shows an Intel 8086 in its package form. Reaching market in June 1978, the 8086 was the first 16-bit processor. (Credit: Intel Corporation)

At this point, Intel was just one of several major players in the emerging processor market. Its 8-bit 8080 microprocessor had made the jump from calculators and cash registers to computer terminals and microcomputers, and was the core platform for the big OS of the era, CP/M.

However, Intel had some serious competition. Zilog’s Z80 was effectively an enhanced 8080, running broadly compatible machine code with some added functions for processing strings. The Motorola 6800 had similar capabilities to Intel’s chip, and was beginning to find favour with MITS, makers of the Altair, and other manufacturers of early kit microcomputers. To make matters worse, both Motorola and Zilog were working on new 16-bit designs, the 68000 and the Z8000 respectively.

Intel was working on its own revolutionary 32-bit architecture, dubbed iAPX 432, but had run into some serious challenges. Development was taking Intel’s team of engineers much longer than expected; the architecture spanned three chips, and the process technology required to manufacture them was still years away.

What’s more, the instruction set was focused on running new high-level, object-oriented programming languages that weren’t yet mainstream; Intel’s own operating system for it was being coded entirely in Ada. Concerned that Zilog and Motorola could eat into its market share before iAPX 432 even launched, Intel needed a new and exciting chip to hold its competitors at bay.

The die of the original 8086. It was one of the most complex processors of its time, with over 29,000 NMOS transistors. (Credit: Intel Corporation)

Intel had followed up the 8080 with its own enhanced version, the 8085, but this was little more than an 8080 with a couple of extra instructions to reduce its reliance on external support circuitry. Knowing it needed more, Intel looked to Steven Morse, a young software engineer who had just written a critical report on the iAPX 432 processor design, and asked him to design the instruction set for a new Intel processor. It had to be 8080-compatible and able to address at least 128KB of memory – double the maximum 64KB supported by the 8080. 

Designing the 8086

Morse worked with a small team, including project manager Bill Pohlman and logic designer Jim McKevitt, to design and implement the new architecture. He typed specs and documentation in a text editor on a terminal connected to an Intel mainframe, even building diagrams out of ASCII characters to illustrate how the new CPU would work.

Meanwhile, Jim McKevitt worked out how data would be passed across the system bus between the CPU, its supporting chips and the rest of the circuitry. After the first couple of revisions, Intel brought in a second software engineer, Bruce Ravenel, to help Morse refine the specs. As the project neared completion, the team grew to develop the hardware design, and was able to simulate and test what would become the core of the x86 instruction set.

Left largely alone by Intel management to work on an architecture the company saw as having a limited lifespan, Morse and his team had the freedom to innovate. The 8086 design had some clear advantages over the old 8080. It could address not merely double the RAM, but a full 1MB, and rather than being limited to 8-bit operations, it could handle 16-bit ones.
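The extra reach came from the 8086’s segmented addressing: a 16-bit segment value is shifted left by four bits and added to a 16-bit offset, giving a 20-bit physical address and hence a 1MB address space. A minimal C sketch of that calculation (illustrative only, not a description of the silicon):

```c
#include <stdint.h>
#include <stdio.h>

/* 8086 physical address = (segment << 4) + offset, a 20-bit result. */
static uint32_t phys_addr(uint16_t segment, uint16_t offset)
{
    return (((uint32_t)segment << 4) + offset) & 0xFFFFFu; /* wrap at 1MB */
}

int main(void)
{
    /* Two 16-bit values reach the very top of the 1MB space... */
    printf("F000:FFFF -> %05X\n", (unsigned)phys_addr(0xF000, 0xFFFF)); /* FFFFF */
    /* ...and different segment:offset pairs can alias the same byte. */
    printf("0040:0000 -> %05X\n", (unsigned)phys_addr(0x0040, 0x0000)); /* 00400 */
    printf("0000:0400 -> %05X\n", (unsigned)phys_addr(0x0000, 0x0400)); /* 00400 */
    return 0;
}
```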

It had a selection of new instructions and features, including some designed to handle strings more efficiently and others to support high-level programming languages. It also made up for one of the 8080’s biggest shortfalls by adding hardware support for multiply and divide operations. The 8086 design made a coder’s life easier.
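To make the multiply point concrete, here is a rough C sketch contrasting the shift-and-add routine an 8080 programmer had to hand-code with the single operation the 8086’s MUL instruction provides (the function names are mine, purely for illustration):

```c
#include <stdint.h>

/* Shift-and-add multiply: the sort of routine an 8080 programmer had to
 * hand-code, because the CPU had no multiply instruction. */
static uint32_t soft_mul16(uint16_t a, uint16_t b)
{
    uint32_t product = 0;
    uint32_t addend  = a;
    while (b) {
        if (b & 1)        /* low bit set: add the shifted multiplicand */
            product += addend;
        addend <<= 1;     /* shift the multiplicand left */
        b >>= 1;          /* examine the next bit of the multiplier */
    }
    return product;
}

/* On the 8086, the same 16x16 -> 32-bit result is a single MUL instruction;
 * in C it is simply a widening multiply. */
static uint32_t hard_mul16(uint16_t a, uint16_t b)
{
    return (uint32_t)a * b;
}
```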

However, Morse also took a new approach to processor design. As he told PC World in a 2008 interview, ‘up until that time, hardware people did all the architectural design, and they would put in whatever features they had space for on the chip. It didn’t matter whether the feature was useful or not’. 

Instead, Morse looked at what features could be added – and how they could be designed – in order to make software run more efficiently. Existing 8080 software could be carried across to the 8086 and run faster, and unlike both iAPX 432 and Zilog’s Z8000, the new chip was designed to work well with the software and programming languages that were emerging.

What’s more, the 8086 was designed to sit at the heart of a computer system alongside additional processors, co-processors and other components. In the foreword to his book, The 8086 Primer, Morse talks of how ‘the thrust of the 8086 has always been to help users get their products to market faster using compatible software, peripheral components and system support’. It was conceived as the centre of ‘a microprocessing family’, with multiple processors supporting the CPU. This, in turn, made the 8086 a good fit as microcomputers evolved into PCs.

A star is born

The 8086 was hardly an overnight success. The Z80 continued to cause Intel nightmares, appearing in more affordable business computers running CP/M. The most exciting processor in development in the late 1970s wasn’t one of Intel’s, but Motorola’s 68000, which went on to dominate the workstation market and power a new generation of home and business computers. By comparison, the 8086 made it into some new microcomputers and terminals, plus a few more specialist devices, but it wasn’t the market leader. In March 1979, frustrated by Intel’s working culture, Steven Morse left the company.

Then, shortly after Morse left, Intel launched a revamp of the 8086: the 8088. The new chip was actually weaker; half its data pins were removed, so it could only send or receive 8 bits of data at a time. However, this meant it could work with existing 8-bit support chips designed for the 8080, and it needed less complex circuitry around it. Given that the 8086 architecture was backwards compatible, and that Intel was investing heavily in support, building new versions of existing computers and devices around the 8088 became a no-brainer. At this point, Intel found out about a new project underway at IBM.
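Before picking up the IBM story, it’s worth picturing what that narrower bus means in practice: every 16-bit word has to be moved as two 8-bit transfers, which is exactly why cheap 8-bit support chips could be reused. A hypothetical C sketch (the bus_read8 callback is an invented stand-in for memory or a peripheral interface):

```c
#include <stdint.h>

/* Hypothetical 8-bit bus interface, of the kind an 8088-based design
 * would wire up to existing 8080-era support chips. */
typedef uint8_t (*bus_read8_fn)(uint32_t addr);

/* An aligned 16-bit read is one bus cycle on the 8086, but two on the
 * 8088: the word is fetched a byte at a time and assembled little-endian. */
static uint16_t read16_over_8bit_bus(bus_read8_fn bus_read8, uint32_t addr)
{
    uint16_t lo = bus_read8(addr);
    uint16_t hi = bus_read8(addr + 1);
    return (uint16_t)(lo | (hi << 8));
}
```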

In 1980, an Intel field sales engineer, Earl Whetstone, began talking to an IBM engineer, Donald Estridge, about a top-secret project to develop a new kind of computer. Schedules were tight, meaning IBM needed to work with an existing processor, and Estridge faced a choice between the Zilog Z8000, the Motorola 68000, and Intel’s 8086 and 8088. 

Whetstone sold Estridge on the technical merits of the 8086 architecture and the lengths Intel would go to support it, while IBM’s bean counters realised that the 8800 could help them keep costs down. IBM was already familiar with Intel, and was working with a version of Microsoft Basic that ran on the 8080. 

IBM chose the 8088, and a version of 86-DOS, originally sold as part of a Seattle Computer Products kit, would become the basis of Microsoft’s MS-DOS. The project became the IBM PC 5150, or the original IBM PC, and because it used off-the-shelf hardware and Microsoft’s software, it inspired an army of PC clones.

As they say, the rest is history. Had Intel focused on iAPX 432 and had IBM chosen Motorola, computing might have gone in a different direction. Intel was smart enough to double down on x86, moving on to the 286 and 386 CPUs and – eventually – abandoning what was supposed to be its flagship line. What’s more, while the x86 architecture has now evolved beyond all recognition, even today’s mightiest Core CPUs can still run code written for the lowly 8086. It started as a stopgap, but the 8086 has stood the test of time.

Get Custom PC #228 NOW!

You can read more features like this one in Custom PC #228, available directly from Raspberry Pi Press — we deliver worldwide. Each issue also features a custom-made reader’s drive.

And if you’d like a handy digital version of the magazine, you can also download issue 228 for free in PDF format.

15 comments

supra

Back in 1989, I had my stuff in the basement.

Anders

I used to use these 40-pin DIL package CPUs to make my own little microcontrollers on Vero stripboard with insulated tinned copper wire connections. 6800 CPU, 2716 EPROM, 6116 RAM, some discrete TTL logic like 74LS374 latches for keyboard input and holding 7-segment display hex numbers. I am dragging the depths of my memory.

Michaeal Clark

>…while IBM’s bean counters realised that the 8800 could…
Should that be 8086/8088? Too many 8s and 0s in these naming conventions.

Leon Heller

That was around the time Radio Shack developed the TRS-80 Model I and II computers, using the Z80. I developed a small system based on the MC68000, with software developed on the RS Model II using a cross-assembler.

Paul Milligan

I ran a production line for computer printers on TRS-80s, loading software off audio cassettes. It was an exciting era.

Paul Milligan

IBM didn’t have a major plan for the first PC; you could buy it with 16KB of memory and load and store to audio cassette. The success was probably down to the technical reference manual, which had a full schematic and source code listing of the BIOS. This allowed the rapid development of clones and an entire add-on industry.

doseas

Dr. Morse’s contributions to genealogy are even more revolutionary than the 8086 processor: https://stevemorse.org/

Rodney

I got my copy of “The 8086 primer” signed by Stephen Morse – at a genealogy conference in 2013.

imm

I’m not sure how robust the claim that the 8086 was the first 16-bit micro really is – I’m pretty old and was working in that field back then, and we had systems using the Ferranti F100 way before we ever managed to actually get any 8086 chips…
In the end we switched to the 8086 (price, volume), but the F100 was definitely available sooner.

Stephen Jordan

The computing world has always been much bigger than the US, and it was in those days, too. There would have been other significant and similar developments in Britain, Germany and Japan, for example, but that history usually doesn’t rate if the narrator is from the US, unfortunately.

MogensB

I think many stories on the Intel 8008/8080, leading to the 8086/8088, are missing credit to CTC and the Datapoint 2200. Victor Poor and Harry Pyle did the instruction set architecture, and ordered the 8008 by specification (called the 1201 before marketing) as an IC from Intel and Texas Instruments. Both Intel and TI failed to deliver. Intel got the rights to the 8008, and CTC wasn’t charged for the development costs. Federico Faggin enhanced the design to become the 8080 (and later the Z80 at his own company, Zilog).

Alex Rodenbach

I remember reading about when Intel invented the 4004. They chose the number because in the Bible, Torah or some other source I cannot think of, this is the year the world came to be. Shortly after that, they came out with the 8008 by doubling 4004, and the rest is history. I probably got some details wrong. I am also from the old days of running programs from tape etc.

Stephen Morse

I usually don’t add comments to articles, but I did want to correct a few details.

1) As someone pointed out, the author’s reference to 8800 was confusing. The 8800 was Intel’s internal name for the high-end processor, and that name was changed to iAPX 432 by the time it was announced.

2) Someone else pointed out that the 8086 was not the first 16-bit microprocessor. Intel never claimed that it was. The claim is that it was the first commercially successful 16-bit microprocessor.

3) Final correction: My name is not Steven Morse. It is Stephen Morse.

Clement Khoo

It’s great to have your comments to correct some errors. Perhaps you would care to expand on what the article touched on. I’m sure that the article didn’t cover everything that you are privy to, so your added commentary would complete the history, especially since you left Intel before the 8086 became successful. Like many of the readers, I’m also old school and started with punch cards, but I believe that the 8086 owes much of its success to the pirates who cloned the IBM PCs.

ringtone

It’s hard to believe that such an old processor can still run well. I usually use the 8086 to run light bulb flickering effects on billboards, flashing the bulbs in time with the frequency of the music.
