The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution
By Walter Isaacson
Simon & Schuster, 560 pages
The microprocessor, that dirt-cheap computer on a chip, first came on the market in 1971. It is the most fundamentally important new technology since the steam engine and perhaps since agriculture. Had someone pushed a button in 1970 to make all the world’s computers stop working, the average man in the street wouldn’t have noticed until his bank statement failed to come in at the end of the month. Push that button today and civilization collapses in seconds. Cars and household appliances wouldn’t run. Phones wouldn’t work. Planes would be unable to take off or land. The lights would go out. The reason is simple: Everything more complicated than a pencil these days has one or more microprocessors in it.
Walter Isaacson, most recently the biographer of Steve Jobs, tells the story in The Innovators. Technologies such as moveable type, the steam engine, and the microprocessor remake the world because they radically reduce the cost of a fundamental input to the economy. Transforming the economy, they soon transform society and thus politics as well.
In 1450, there were about 50,000 books in all of Europe, almost all of them held in monasteries and universities under the control of the Church. And then moveable type made books cheap. By 1500, there were 10 million books in Europe, almost none of them under Church control—and the Church soon faced the Protestant Reformation.
The Industrial Revolution that began in the mid-18th century was powered by the steam engine and the factory system of production. For the first time in human history, energy that accomplished work became both cheap and available in enormous quantities. Mounted on rails, the steam engine made overland transportation cheap and much faster. The new factories could now sell their products in continent-wide markets with enormous economies of scale. The aristocratic, land-based society that had dominated Europe for a millennium was transformed into the middle-class-dominated and increasingly democratic society of the 19th century.
The digital revolution is no different. By many orders of magnitude it has reduced the cost and hugely increased the speed of another fundamental input to the economy: the storage, retrieval, and manipulation of data. This has resulted in a cascade of new technologies—email, GPS, smartphones, ebooks, CDs and DVDs, tablets, search engines, digital photography, streaming movies, word processing, spreadsheets, video games, and so on.
In 1940, the word computer meant a person (usually a woman; women were considered better at it, making fewer mistakes) who did the tedious calculations needed to solve differential equations and determine, say, the trajectories of artillery shells. Wartime need produced ENIAC, which stood for Electronic Numerical Integrator And Computer. It went into operation in November 1945 as the world’s first general-purpose, programmable electronic digital computer, capable of performing 5,000 additions and subtractions per second. It could solve differential equations in moments, not weeks.
But ENIAC was huge. It was 100 feet long and 8 feet tall, weighed nearly 30 tons, and had 17,468 vacuum tubes that sucked up an enormous amount of power and burned out constantly. Converting ENIAC from one application to another meant rewiring it by hand, a job that took hours if not days, and for all that it had less computing power than an iPhone.
The journey from ENIAC to iPhone in a mere 60 years is the heart of Isaacson’s tale. But he begins with a remarkable woman: Ada, Countess of Lovelace, who lived from 1815 to 1852. Ada was the daughter of Lord Byron; her mother, fearful that the girl had inherited her father’s wild genes, steered her away from poetry and toward mathematics, a subject not then taught to girls beyond simple arithmetic. Ada, it turned out, had a remarkable aptitude for it.
She met Charles Babbage at a dinner party. Babbage had designed a “difference engine” (essentially an adding machine) and was thinking about making an “analytical engine”—which, had it been built, would have been the world’s first computer. In 1840, Babbage lectured in Italy on the analytical engine, and an article about the talk was translated by Lady Lovelace. She sent Babbage a copy, and he suggested she add notes to it. The notes turned out to be three times as long as the article and are a fundamental document in the history of computing.
Lovelace wrote that a computer could be programmable, i.e., that it would not be limited to one function, such as adding and subtracting, but could use underlying commonalities of mathematics to do a limitless number of different tasks. And not just mathematical tasks, but anything that could be expressed in symbols, such as words, logic, and music. She wrote the first computer program (to calculate Bernoulli numbers, an important and recondite set). Finally she wondered if machines could be made able to think for themselves and not just be idiot savants. She thought not, but the question rages to this day.
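Lovelace’s program was written for a machine that was never built, so any modern rendering is illustrative rather than a transcription of her method. Still, a minimal sketch in Python of the same computation, using the textbook recurrence that defines each Bernoulli number in terms of its predecessors (not her notation, and with names of my own choosing), shows how compact the idea is:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions, using the standard recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1 (not Lovelace's own method)."""
    B = [Fraction(1)]                                   # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))
    return B

print(*bernoulli(6))   # 1 -1/2 1/6 0 -1/30 0 1/42
```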
Once ENIAC was up and running, a century after Lovelace’s notes, a dazzling array of innovations quickly came along to reduce the cost and increase the speed of computing. Isaacson’s book is primarily a compilation of biographies of the men and women who, mostly in collaboration, pushed the digital world forward. Some are household names—such as Bill Gates, Steve Jobs, and Gordon Moore—while others are more obscure, such as Grace Hopper and Betty Snyder, pioneers in programming. It was Betty Snyder who, at the last minute, solved a glitch and made ENIAC’s public debut a success. (She wasn’t invited to the celebratory dinner afterward, an all-male affair.)
Isaacson points out repeatedly how collaborative the history of the digital age has been, with different people bringing different skills and attributes to bear on a problem in order to produce the onrush of innovation of the past 70 years. The 19th-century inventor, working alone in his garret, was always somewhat mythical, but the image is wholly inapplicable to today’s complex world.
For instance, one of the giants of the digital age, Intel, was formed in 1968 by three men: Robert Noyce, Gordon Moore, and Andy Grove. They made an ideal team. As Isaacson explains:
Noyce was great at strategic vision and seeing the big picture; Moore understood the details, particularly of the technology and engineering. So they were perfect partners, except in one way: with their shared aversion to hierarchy and unwillingness to be bossy, neither was a decisive manager…That’s where Andy Grove came in.
The first major innovation after ENIAC was the transistor, developed at Bell Labs in 1947. The transistor was much smaller than the vacuum tube it replaced. It was also much cheaper and more easily manufactured, less fragile, and far less power-hungry. It is the foundation of modern electronics. The transistor quickly and dramatically shrank the size and cost of computers, which by 1952 were a commercial product. But there was a big problem. The power of a computer depends not only on the number of transistors, but also on the number of connections between them. Two transistors require only one connection. Three require three, four require six, five require ten, and so on. This “tyranny of numbers” made large, powerful computers very expensive, as those connections had to be wired or soldered by hand.
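The sequence the review counts off is simply the number of pairs: connect each of n transistors to every other and you need n(n-1)/2 links, so the hand-wiring burden grows with the square of the component count. A minimal sketch of that arithmetic (the function name is mine, for illustration only):

```python
def pairwise_connections(n):
    """Connections needed to link each of n components to every other:
    n choose 2, i.e. n * (n - 1) / 2."""
    return n * (n - 1) // 2

for n in (2, 3, 4, 5, 6, 100):
    print(n, "->", pairwise_connections(n))
# 2 -> 1, 3 -> 3, 4 -> 6, 5 -> 10, 6 -> 15, 100 -> 4950
```

By the time a machine holds a few thousand components, soldering every connection by hand becomes hopeless.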
The solution was the integrated circuit, the precursor to the microprocessor. The first was developed by Jack Kilby of Texas Instruments. It had a complete circuit etched into a chip of germanium, no wiring or soldering required. But it was Robert Noyce, then of Fairchild Semiconductor, who hit on the idea of using much cheaper silicon as the substrate. Kilby’s first integrated circuit had only four transistors. By the early 1970s, the number of transistors on an integrated circuit was approaching 4,000. The Intel 4004 microprocessor—a complete computer on a chip—was introduced in 1971. It had 2,300 transistors gathered in an area the size of a fingernail, with computing power equal to ENIAC.
Gordon Moore famously predicted that the number of transistors on a chip would keep doubling at a steady pace, roughly every two years, for the foreseeable future. Moore’s Law explains why computer costs have fallen so dramatically and why every 10-year-old has more computing power in his backpack than the Pentagon could have afforded 50 years ago. The Apple A8X microprocessor, introduced this year, has 3 billion transistors.
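The figures quoted here make the arithmetic easy to check. A back-of-the-envelope sketch (taking the 2,300 and 3 billion transistor counts at face value, and reading “this year” as 2014, when the A8X shipped) works out to roughly twenty doublings between the 4004 and the A8X, about one every two years:

```python
from math import log2

t_4004, year_4004 = 2_300, 1971            # Intel 4004 (figure quoted above)
t_a8x,  year_a8x  = 3_000_000_000, 2014    # Apple A8X, assuming "this year" = 2014

doublings = log2(t_a8x / t_4004)           # about 20.3 doublings
years = year_a8x - year_4004               # 43 years
print(f"{doublings:.1f} doublings in {years} years, "
      f"one every {years / doublings:.1f} years")
# 20.3 doublings in 43 years, one every 2.1 years
```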
Once personal computers were developed in the late 1970s, along with the Internet, the digital world came into being. It is one of the great stories in human history, and Walter Isaacson does it justice.