The reality is that Ada’s contribution was both profound and inspirational. More than Babbage or any other person of her era, she was able to glimpse a future in which machines would become partners of the human imagination, together weaving tapestries as beautiful as those from Jacquard’s loom. Her appreciation for poetical science led her to celebrate a proposed calculating machine that was dismissed by the scientific establishment of her day, and she perceived how the processing power of such a device could be used on any form of information. Thus did Ada, Countess of Lovelace, help sow the seeds for a digital age that would blossom a hundred years later.

I. It was in a review of this book that one of Babbage’s friends, William Whewell, coined the term scientist to suggest the connection among these disciplines.

II. Specifically, he wanted to use the method of divided differences to closely approximate logarithmic and trigonometric functions.

III. Named after the seventeenth-century Swiss mathematician Jacob Bernoulli, who studied the sums of powers of consecutive integers, they play an intriguing role in number theory, mathematical analysis, and differential topology.

IV. Ada’s example involved tabulating polynomials using difference techniques as a subfunction, which required a nested loop structure with a varying range for the inner loop.
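The tabulation technique mentioned in notes II and IV can be sketched in modern terms. What follows is an illustrative reconstruction only, not Ada's actual table of operations or Babbage's mechanism; the example polynomial and function names are invented. Once the initial differences of a polynomial are known, every further value needs nothing but additions, which is exactly what made the approach suited to a machine of gears and wheels:

```python
# Illustrative sketch of tabulating a polynomial by finite differences,
# the principle behind Babbage's Difference Engine. The polynomial and
# names here are hypothetical examples, not from the original text.

def poly(x):
    # Example polynomial (invented for illustration): 2x^2 + 3x + 1
    return 2 * x * x + 3 * x + 1

def initial_differences(values):
    # Forward-difference column at x = 0: [f(0), Δf(0), Δ²f(0), ...]
    col = []
    row = list(values)
    while row:
        col.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return col

def tabulate(count, degree=2):
    # Seed the difference table from the first degree+1 values,
    # then produce every later value using only additions.
    d = initial_differences([poly(x) for x in range(degree + 1)])
    out = []
    for _ in range(count):           # outer loop: one table entry per pass
        out.append(d[0])
        for i in range(degree):      # inner loop: roll each difference forward
            d[i] += d[i + 1]
    return out
```

The nested structure, an outer loop per tabulated value and an inner loop over the difference column, echoes the varying-range inner loop that note IV describes.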


Vannevar Bush (1890–1974), with his Differential Analyzer at MIT.


Alan Turing (1912–54), at the Sherborne School in 1928.


Claude Shannon (1916–2001) in 1951.

CHAPTER TWO

THE COMPUTER

Sometimes innovation is a matter of timing. A big idea comes along at just the moment when the technology exists to implement it. For example, the idea of sending a man to the moon was proposed right when the progress of microchips made it possible to put computer guidance systems into the nose cone of a rocket. There are other cases, however, when the timing is out of kilter. Charles Babbage published his paper about a sophisticated computer in 1837, but it took a hundred years to achieve the scores of technological advances needed to build one.

Some of those advances seem almost trivial, but progress comes not only in great leaps but also from hundreds of small steps. Take for example punch cards, like those Babbage saw on Jacquard’s looms and proposed incorporating into his Analytical Engine. Perfecting the use of punch cards for computers came about because Herman Hollerith, an employee of the U.S. Census Bureau, was appalled that it took close to eight years to manually tabulate the 1880 census. He resolved to automate the 1890 count.

Drawing on the way that railway conductors punched holes in various places on a ticket in order to indicate the traits of each passenger (gender, approximate height, age, hair color), Hollerith devised punch cards with twelve rows and twenty-four columns that recorded the salient facts about each person in the census. The cards were then slipped between a grid of mercury cups and a set of spring-loaded pins, which created an electric circuit wherever there was a hole. The machine could tabulate not only the raw totals but also combinations of traits, such as the number of married males or foreign-born females. Using Hollerith’s tabulators, the 1890 census was completed in one year rather than eight. It was the first major use of electrical circuits to process information, and the company that Hollerith founded became in 1924, after a series of mergers and acquisitions, the International Business Machines Corporation, or IBM.
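The logic of Hollerith's combination counts is simple to sketch in modern code. This is a toy model only: the trait positions below are invented for illustration and do not reproduce the real 1890 card layout, and the "circuit" is reduced to a membership test:

```python
# Toy model of Hollerith-style tabulation. A card is the set of punched
# (row, column) positions; a hole lets a pin reach the mercury cup and
# close a circuit, modeled here as set membership. Positions are
# hypothetical, not the historical card layout.

MALE = (0, 0)
MARRIED = (1, 0)
FOREIGN_BORN = (2, 0)

cards = [
    {MALE, MARRIED},             # a married male
    {MARRIED, FOREIGN_BORN},     # a married foreign-born female
    {MALE},                      # a single male
]

def count(cards, *traits):
    # Tabulate cards whose punches include every requested trait,
    # the way Hollerith's counters accumulated combination totals.
    return sum(all(t in card for t in traits) for card in cards)

married_males = count(cards, MARRIED, MALE)  # -> 1
```

Counting a combination such as "married males" is just counting the cards on which both circuits close at once.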

One way to look at innovation is as the accumulation of hundreds of small advances, such as counters and punch-card readers. At places like IBM, which specialize in daily improvements made by teams of engineers, this is the preferred way to understand how innovation really happens. Some of the most important technologies of our era, such as the fracking techniques developed over the past six decades for extracting natural gas, came about because of countless small innovations as well as a few breakthrough leaps.

In the case of computers, there were many such incremental advances made by faceless engineers at places like IBM. But that was not enough. Although the machines that IBM produced in the early twentieth century could compile data, they were not what we would call computers. They weren’t even particularly adroit calculators. They were lame. In addition to those hundreds of minor advances, the birth of the computer age required some larger imaginative leaps from creative visionaries.

DIGITAL BEATS ANALOG

The machines devised by Hollerith and Babbage were digital, meaning they calculated using digits: discrete and distinct integers such as 0, 1, 2, 3. In their machines, the integers were added and subtracted using cogs and wheels that clicked one digit at a time, like counters. Another approach to computing was to build devices that could mimic or model a physical phenomenon and then make measurements on the analogous model to calculate the relevant results. These were known as analog computers because they worked by analogy. Analog computers do not rely on discrete integers to make their calculations; instead, they use continuous functions. In analog computers, a variable quantity such as electrical voltage, the position of a rope on a pulley, hydraulic pressure, or a measurement of distance is employed as an analog for the corresponding quantities of the problem to be solved. A slide rule is analog; an abacus is digital. Clocks with sweeping hands are analog, and those with displayed numerals are digital.

Around the time that Hollerith was building his digital tabulator, Lord Kelvin and his brother James Thomson, two of England’s most distinguished scientists, were creating an analog machine. It was designed to handle the tedious task of solving differential equations, which would help in the creation of tide charts and of tables showing the firing angles that would generate different trajectories of artillery shells. Beginning in the 1870s, the brothers devised a system that was based on a planimeter, an instrument that can measure the area of a two-dimensional shape, such as the space under a curved line on a piece of paper. The user would trace the outline of the curve with the device, which would calculate the area by using a small sphere that was slowly pushed across the surface of a large rotating disk. By calculating the area under the curve, it could thus solve equations by integration—in other words, it could perform a basic task of calculus. Kelvin and his brother were able to use this method to create a “harmonic synthesizer” that could churn out an annual tide chart in four hours. But they were never able to conquer the mechanical difficulties of linking together many of these devices in order to solve equations with a lot of variables.

That challenge of linking together multiple integrators was not mastered until 1931, when an MIT engineering professor, Vannevar (rhymes with beaver) Bush—remember his name, for he is a key character in this book—was able to build the world’s first analog electrical-mechanical computer. He dubbed his machine a Differential Analyzer. It consisted of six wheel-and-disk integrators, not all that different from Lord Kelvin’s, that were connected by an array of gears, pulleys, and shafts rotated by electric motors. It helped that Bush was at MIT; there were a lot of people around who could assemble and calibrate complex contraptions. The final machine, which was the size of a small bedroom, could solve equations with as many as eighteen independent variables. Over the next decade, versions of Bush’s Differential Analyzer were replicated at the U.S. Army’s Aberdeen Proving Ground in Maryland, the Moore School of Electrical Engineering at the University of Pennsylvania, and Manchester and Cambridge universities in England. They proved particularly useful in churning out artillery firing tables—and in training and inspiring the next generation of computer pioneers.
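The principle behind linking integrators can be sketched numerically. This is only a loose modern analogue of the idea, not Bush's design: each integrator accumulates the running sum of its input, so chaining two of them solves a second-order equation, here y″ = −y, by repeated addition:

```python
# Two chained "integrators," sketched numerically. Feeding y'' = -y
# through one integrator yields y', and through a second yields y.
# With y(0) = 1 and y'(0) = 0, the output traces cos(t). The step
# size and function name are choices made for this illustration.

def analyze(t_end, dt=1e-4):
    y, dy = 1.0, 0.0            # initial "shaft positions"
    t = 0.0
    while t < t_end:
        ddy = -y                # the equation being modeled
        dy += ddy * dt          # first integrator: y'' accumulates into y'
        y += dy * dt            # second integrator: y' accumulates into y
        t += dt
    return y
```

Each `+=` plays the role of the ball creeping across the rotating disk; the mechanical challenge Kelvin could not overcome, and Bush did, was making the output of one such accumulator drive the input of the next.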