This was done, she wrote, by “the introduction into it of the principle which Jacquard devised for regulating, by means of punched cards, the most complicated patterns in the fabrication of brocaded stuffs.” Even more than Babbage, Ada realized the significance of this. It meant that the machine could be like the type of computer we now take for granted: one that does not merely do a specific arithmetic task but can be a general-purpose machine. She explained:

The bounds of arithmetic were outstepped the moment the idea of applying cards had occurred. The Analytical Engine does not occupy common ground with mere “calculating machines.” It holds a position wholly its own. In enabling a mechanism to combine together general symbols, in successions of unlimited variety and extent, a uniting link is established between the operations of matter and the abstract mental processes.37

Those sentences are somewhat clotted, but they are worth reading carefully. They describe the essence of modern computers. And Ada enlivened the concept with poetic flourishes. “The Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves,” she wrote. When Babbage read “Note A,” he was thrilled and made no changes. “Pray do not alter it,” he said.38

Ada’s second noteworthy concept sprang from this description of a general-purpose machine. Its operations, she realized, did not need to be limited to math and numbers. Drawing on De Morgan’s extension of algebra into a formal logic, she noted that a machine such as the Analytical Engine could store, manipulate, process, and act upon anything that could be expressed in symbols: words and logic and music and anything else we might use symbols to convey.

To explain this idea, she carefully defined what a computer operation was: “It may be desirable to explain that by the word ‘operation,’ we mean any process which alters the mutual relation of two or more things, be this relation of what kind it may.” A computer operation, she noted, could alter the relationship not just between numbers but between any symbols that were logically related. “It might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations.” The Analytical Engine could, in theory, even perform operations on musical notations: “Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity.” It was the ultimate Ada-like “poetical science” concept: an elaborate and scientific piece of music composed by a machine! Her father would have shuddered.
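To see in modern terms what Ada was driving at, consider a minimal sketch in C++ (the encoding and function here are illustrative inventions, not anything from her Notes): once pitches are represented as numbers, a musical transformation such as transposition becomes exactly the kind of symbolic "operation" she described.

```cpp
#include <iostream>
#include <vector>

// A minimal modern illustration (not from Ada's Notes): musical pitches
// encoded as integers (semitones above middle C), so that an "operation"
// on symbols -- here, transposition -- alters their mutual relations.
std::vector<int> transpose(const std::vector<int>& melody, int interval) {
    std::vector<int> result;
    result.reserve(melody.size());
    for (int pitch : melody) {
        result.push_back(pitch + interval);  // shift every note by the interval
    }
    return result;
}

int main() {
    // C, E, G, C' as semitone offsets from middle C
    std::vector<int> melody = {0, 4, 7, 12};
    for (int pitch : transpose(melody, 5)) {  // up a perfect fourth
        std::cout << pitch << ' ';
    }
    std::cout << '\n';  // prints: 5 9 12 17
    return 0;
}
```

The machine never "knows" it is handling music; it simply alters the mutual relations of symbols, which is precisely Ada's point.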

This insight would become the core concept of the digital age: any piece of content, data, or information—music, text, pictures, numbers, symbols, sounds, video—could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. But Ada realized that the digits on the cogs could represent things other than mathematical quantities. Thus did she make the conceptual leap from machines that were mere calculators to ones that we now call computers. Doron Swade, a computer historian who specializes in studying Babbage’s engines, has declared this one of Ada’s historic legacies. “If we are looking and sifting history for that transition, then that transition was made explicitly by Ada in that 1843 paper,” he said.39

Ada’s third contribution, in her final “Note G,” was to figure out in step-by-step detail the workings of what we now call a computer program or algorithm. The example she used was a program to compute Bernoulli numbers,III an exceedingly complex infinite sequence that in various guises plays a role in number theory.
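In modern notation, which is ours rather than Ada's (her Note G indexes and signs the "Numbers of Bernoulli" by an older convention), the Bernoulli numbers are defined by a generating function, and each one can be computed from all of its predecessors:

```latex
\[
\frac{x}{e^x - 1} = \sum_{n=0}^{\infty} B_n \frac{x^n}{n!},
\qquad
B_0 = 1,\quad B_1 = -\tfrac{1}{2},\quad B_2 = \tfrac{1}{6},\quad
B_3 = 0,\quad B_4 = -\tfrac{1}{30},\ \dots
\]
\[
\sum_{k=0}^{n} \binom{n+1}{k} B_k = 0 \quad (n \geq 1),
\qquad \text{so each } B_n \text{ follows from all the earlier ones.}
\]
```

It is this recurrent character, each new number demanding every earlier one, that made the calculation a natural showcase for loops and reusable sequences of cards.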

To show how the Analytical Engine could generate Bernoulli numbers, Ada described a sequence of operations and then made a chart showing how each would be coded into the machine. Along the way, she helped to devise the concepts of subroutines (a sequence of instructions that performs a specific task, such as computing a cosine or calculating compound interest, and can be dropped into larger programs as needed) and a recursive loop (a sequence of instructions that repeats itself).IV These were made possible by the punch-card mechanism. Seventy-five cards were needed to generate each number, she explained, and then the process became iterative as that number was fed back into the process to generate the next one. “It will be obvious that the very same seventy-five variable cards may be repeated for the computation of every succeeding number,” she wrote. She envisioned a library of commonly used subroutines, something that her intellectual heirs, including women such as Grace Hopper at Harvard and Kay McNulty and Jean Jennings at the University of Pennsylvania, would create a century later. In addition, because Babbage’s engine made it possible to jump back and forth within the sequence of instruction cards based on the interim results it had calculated, it laid the foundation for what we now call conditional branching, changing to a different path of instructions if certain conditions are met.
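A minimal modern sketch can make these ideas concrete. The following C++ fragment is not Ada's card sequence (her notation and conventions differ); it simply shows a reusable subroutine, driven by a loop that feeds each result back in to produce the next, using the standard recurrence for the Bernoulli numbers:

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// A modern sketch of the ideas in Note G -- not Ada's actual
// seventy-five-card sequence. nextBernoulli() plays the role of her
// reusable block of cards: the same "subroutine" is invoked again and
// again, each time consuming every previously computed number.
double nextBernoulli(const std::vector<double>& known) {
    // Recurrence: sum_{k=0}^{n} C(n+1, k) * B_k = 0, solved for B_n,
    // where known holds B_0 .. B_{n-1}.
    std::size_t n = known.size();
    double sum = 0.0;
    double binom = 1.0;  // C(n+1, 0)
    for (std::size_t k = 0; k < n; ++k) {
        sum += binom * known[k];
        binom = binom * (n + 1 - k) / (k + 1);  // C(n+1,k) -> C(n+1,k+1)
    }
    return -sum / binom;  // binom is now C(n+1, n) = n + 1
}

int main() {
    std::vector<double> B = {1.0};      // B_0
    for (int i = 0; i < 8; ++i) {
        B.push_back(nextBernoulli(B));  // feed each result back in
    }
    for (double b : B) std::cout << b << ' ';
    // B_1 = -0.5, B_2 ~ 0.1667, B_4 ~ -0.0333; the odd-indexed
    // numbers past B_1 print as (numerically) zero.
    std::cout << '\n';
    return 0;
}
```

The single function invoked over and over stands in for her "very same seventy-five variable cards"; the loop in main is her iteration.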

Babbage helped Ada with the Bernoulli calculations, but the letters show her deeply immersed in the details. “I am doggedly attacking and sifting to the very bottom all the ways of deducing the Bernoulli numbers,” she wrote in July, just weeks before her translation and notes were due at the printers. “I am in much dismay at having got into so amazing a quagmire and botheration with these Numbers that I cannot possibly get the thing done today. . . . I am in a charming state of confusion.”40

Once the calculations were worked out, she added a contribution that was primarily her own: a table and diagram showing exactly how the algorithm would be fed into the computer, step by step, including two recursive loops. It was a numbered list of coding instructions that included destination registers, operations, and commentary—something that would be familiar to any C++ coder today. “I have worked incessantly and most successfully all day,” she wrote Babbage. “You will admire the Table and Diagram extremely. They have been made out with extreme care.” From all of the letters it is clear that she did the table herself; the only help came from her husband, who did not understand the math but was willing to methodically trace in ink what she had done in pencil. “Lord L is at this moment kindly inking it all over for me,” she wrote Babbage. “I had to do it in pencil.”41
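As a loose analogy (the steps and values below are illustrative, not transcribed from her table), each row of such a table, with its numbered operation, destination register, and commentary, maps naturally onto a single modern statement:

```cpp
#include <iostream>

int main() {
    // A loose modern analogy to the format of Ada's table -- numbered
    // operation, destination "register," and commentary -- not a copy
    // of her actual steps. V1..V5 stand in for the engine's storage.
    double V1 = 1.0;   // constant 1
    double V2 = 2.0;   // constant 2
    double V3 = 4.0;   // n (here n = 4, purely for illustration)
    double V4 = 0.0, V5 = 0.0;

    V4 = V2 * V3;      // Op 1: multiply,  result to V4   = 2n
    V5 = V4 + V1;      // Op 2: add,       result to V5   = 2n + 1
    V4 = V4 - V1;      // Op 3: subtract,  result to V4   = 2n - 1
    V4 = V4 / V5;      // Op 4: divide,    result to V4   = (2n-1)/(2n+1)

    std::cout << V4 << '\n';  // prints 0.777778 for n = 4
    return 0;
}
```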

It was mainly on the basis of this diagram, which accompanied the complex processes for generating Bernoulli numbers, that Ada has been accorded by her fans the accolade of “the world’s first computer programmer.” That is a bit hard to defend. Babbage had already devised, at least in theory, more than twenty explanations of processes that the machine might eventually perform. But none of these was published, and there was no clear description of the way to sequence the operations. Therefore, it is fair to say that the algorithm and detailed programming description for the generation of Bernoulli numbers was the first computer program ever to be published. And the initials at the end were those of Ada Lovelace.

There was one other significant concept that she introduced in her “Notes,” which harked back to the Frankenstein story produced by Mary Shelley after that weekend with Lord Byron. It raised what is still the most fascinating metaphysical topic involving computers, that of artificial intelligence: Can machines think?

Ada believed not. A machine such as Babbage’s could perform operations as instructed, she asserted, but it could not come up with ideas or intentions of its own. “The Analytical Engine has no pretensions whatever to originate anything,” she wrote in her “Notes.” “It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.” A century later this assertion would be dubbed “Lady Lovelace’s Objection” by the computer pioneer Alan Turing (see chapter 3).