
This was one of the oddest situations in the entire scientific enterprise, but the situation was tolerated for years. Most US civilian cryptographers felt, through patriotic conviction, that it was in the best interests of the United States if the NSA remained far ahead of the curve in cryptographic science. After all, were some other national government's electronic spies to become more advanced than those of the NSA, then American government and military transmissions would be cracked and penetrated. World War II had proven that the consequences of a defeat in the cryptographic arms race could be very dire indeed for the loser.

So the "voluntary restraint" measures worked well for over a decade. Few mathematicians were so enamored of the doctrine of academic freedom that they were prepared to fight the National Security Agency over their supposed right to invent codes that could baffle the US government. In any case, the mathematical cryptography community was a small group without much real political clout, while the NSA was a vast, powerful, well-financed agency unaccountable to the American public, and reputed to possess many deeply shadowed avenues of influence in the corridors of power.

However, as the years rolled on, the electronic exchange of information became a commonplace, and users of computer data became intensely aware of their need for electronic security for their transmissions and data. One answer was physical security -- protect the wiring, keep the physical computers behind a physical lock and key. But as personal computers spread and computer networking grew ever more sophisticated, widespread and complex, this bar-the-door technique became unworkable.

The volume and importance of information transferred over the Internet was increasing by orders of magnitude. But the Internet was a notoriously leaky channel of information -- its packet-switching technology meant that packets of vital information might be dumped into the machines of unknown parties at almost any time. If the Internet itself could not be locked up and made leakproof -- and this was impossible by the nature of the system -- then the only secure solution was to encrypt the message itself, to make that message unusable and unreadable, even if it sometimes fell into improper hands.

Computers outside the Internet were also at risk. Corporate computers faced the threat of computer-intrusion hacking, from bored and reckless teenagers, or from professional snoops and unethical business rivals both inside and outside the company. Electronic espionage, especially industrial espionage, was intensifying. The French secret services were especially bold in this regard, as American computer and aircraft executives found to their dismay as their laptops went missing during Paris air and trade shows. Transatlantic commercial phone calls were routinely tapped by French government spooks seeking commercial advantage for French companies in the computer industry, aviation, and the arms trade. And the French were far from alone when it came to government-supported industrial espionage.

Protection of private civilian data from foreign government spies required that seriously powerful encryption techniques be placed into private hands. Unfortunately, an ability to baffle French spies also meant an ability to baffle American spies. This was not good news for the NSA.

By 1993, encryption had become big business. There were one and a half million copies of legal encryption software publicly available, including widely known and commonly used personal computer products such as Norton Utilities, Lotus Notes, StuffIt, and several Microsoft products. People all over the world, in every walk of life, were using computer encryption as a matter of course. They were securing hard disks from spies or thieves, protecting certain sections of the family computer from sticky-fingered children, or rendering entire laptops and portables into a solid mess of powerfully encrypted Sanskrit, so that no stranger could walk off with those accidental but highly personal life histories that are stored in almost every PowerBook.

People were no longer afraid of encryption. Encryption was no longer secret, obscure, and arcane; encryption was a business tool. Computer users wanted more encryption, faster, sleeker, more advanced, and better.

The real wild-card in the mix, however, was the new cryptography. A new technique arose in the 1970s: public-key cryptography. This was an element the codemasters of World War II and the Cold War had never foreseen.

Public-key cryptography was invented by American civilian researchers Whitfield Diffie and Martin Hellman, who first published their results in 1976.

Conventional classical cryptographic systems, from the Caesar cipher to the Nazi Enigma machine defeated by Alan Turing, require a single key. The sender of the message uses that key to turn his plain text message into cyphertext gibberish. He shares the key secretly with the recipients of the message, who use that same key to turn the cyphertext back into readable plain text.
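To make the single-key idea concrete, here is a minimal sketch in Python, using a toy Caesar-style shift. The function names and the three-letter shift are purely illustrative, and a cipher this simple is, of course, trivially breakable:

    # Single-key ("classical") cryptography in miniature: the SAME secret
    # key both encrypts and decrypts, so sender and recipient must share it.
    def caesar_encrypt(plaintext: str, key: int) -> str:
        """Shift each letter forward by `key` positions in the alphabet."""
        out = []
        for ch in plaintext:
            if ch.isalpha():
                base = ord('A') if ch.isupper() else ord('a')
                out.append(chr((ord(ch) - base + key) % 26 + base))
            else:
                out.append(ch)
        return ''.join(out)

    def caesar_decrypt(cyphertext: str, key: int) -> str:
        """Decryption is just encryption run backwards with the same key."""
        return caesar_encrypt(cyphertext, -key)

    secret_key = 3                                # must be shared -- and kept hidden
    scrambled = caesar_encrypt("Attack at dawn", secret_key)
    print(scrambled)                              # Dwwdfn dw gdzq
    print(caesar_decrypt(scrambled, secret_key))  # Attack at dawn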

This is a simple scheme; but if the key is lost to unfriendly forces such as the ingenious Alan Turing, then all is lost. The key must therefore always remain hidden, and it must always be fiercely protected from enemy cryptanalysts. Unfortunately, the more widely that key is distributed, the more likely it is that some user in on the secret will crack or fink. As an additional burden, the key cannot be sent by the same channel as the communications are sent, since the key itself might be picked up by eavesdroppers.

In the new public-key cryptography, however, there are two keys. The first is a key for writing secret text, the second the key for reading that text. The keys are related to one another through a complex mathematical dependency; they determine one another, but it is mathematically extremely difficult to deduce one key from the other.

The user simply gives away the first key, the "public key," to all and sundry. The public key can even be printed on a business card, or given away in mail or in a public electronic message. Now anyone in the public, any random personage who has the proper (not secret, easily available) cryptographic software, can use that public key to send the user a cyphertext message. However, that message can only be read by using the second key -- the private key, which the user always keeps safely in his own possession.

Obviously, if the private key is lost, all is lost. But only one person knows that private key. That private key is generated in the user's home computer, and is never revealed to anyone but the very person who created it.

To reply to a message, one has to use the public key of the other party. This means that a conversation between two people requires four keys. Before computers, all this key-juggling would have been rather unwieldy, but with computers, the chips and software do all the necessary drudgework and number-crunching.
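To make the two-key arrangement concrete, here is a toy sketch in Python in the style of textbook RSA, one well-known system built on the public-key idea. The tiny primes and every specific number here are illustrative assumptions only; real keys run to hundreds of digits:

    # Public-key cryptography in miniature (textbook-RSA style, Python 3.8+).
    p, q = 61, 53                  # two secret primes, chosen by the key's owner
    n = p * q                      # 3233 -- published as part of the PUBLIC key
    phi = (p - 1) * (q - 1)        # 3120 -- kept secret
    e = 17                         # public exponent: the pair (n, e) is the PUBLIC key
    d = pow(e, -1, phi)            # 2753 -- the PRIVATE key, never revealed

    # Anyone holding the public key can encrypt a message (here, one small number)...
    message = 65
    cyphertext = pow(message, e, n)        # 2790
    # ...but only the holder of the private key can turn it back into plain text.
    assert pow(cyphertext, d, n) == message

Deducing the private key from the public one would require factoring n back into its secret primes -- easy for 3233, effectively impossible for numbers hundreds of digits long.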

The public/private dual keys have an interesting alternate application. Instead of the public key, one can use one's private key to encrypt a message. That message can then be read by anyone with the public key, i.e., pretty much everybody, so it is no longer a "secret" message at all. However, that message, even though it is no longer secret, now has a very valuable property: it is authentic. Only the individual holder of the private key could have sent that message.
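Continuing the toy RSA-style sketch above (again, purely illustrative numbers), authentication runs the same arithmetic in the opposite direction: the private key produces the signature, and anyone with the public key can check it:

    # Signing with the PRIVATE key, verifying with the PUBLIC key.
    n, e = 3233, 17            # public key, known to everybody
    d = 2753                   # private key, known only to the signer

    message = 65
    signature = pow(message, d, n)            # only the private key's holder can compute this
    assert pow(signature, e, n) == message    # anyone can verify -- the message is authentic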

This authentication power is a crucial aspect of the new cryptography, and may prove to be more socially important than secrecy. Authenticity means that electronic promises can be made, electronic proofs can be established, electronic contracts can be signed, electronic documents can be made tamperproof. Electronic impostors and fraudsters can be foiled and defeated -- and it is possible for someone you have never seen, and will never see, to prove his bona fides through entirely electronic means.