
The Internet facilitated collaboration not only within teams but also among crowds of people who didn’t know each other. This is the advance that is closest to being revolutionary. Networks for collaboration have existed ever since the Persians and Assyrians invented postal systems. But never before has it been easy to solicit and collate contributions from thousands or millions of unknown collaborators. This led to innovative systems—Google page ranks, Wikipedia entries, the Firefox browser, the GNU/Linux software—based on the collective wisdom of crowds.
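To make the "wisdom of crowds" idea concrete: PageRank treats every hyperlink as a vote, so a page's importance emerges from the aggregate choices of many independent authors. Below is a minimal sketch of that voting logic, a simplified power iteration on a toy graph; it illustrates the concept only and is not Google's actual implementation. The graph, function name, and parameters are hypothetical.

```python
# Sketch of the idea behind PageRank: each page's score is the "vote"
# of the crowd of pages linking to it, refined iteratively.
# Simplified power iteration; not Google's actual implementation.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores

    for _ in range(iterations):
        # Every page keeps a small baseline score (the "teleport" term).
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Split this page's rank equally among the pages it cites.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Tiny hypothetical web: most pages "vote" (link) for page "a".
graph = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a", "b"]}
print(pagerank(graph))  # "a" ends up with the highest score
```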

There were three ways that teams were put together in the digital age. The first was through government funding and coordination. That’s how the groups that built the original computers (Colossus, ENIAC) and networks (ARPANET) were organized. This reflected the consensus, which was stronger back in the 1950s under President Eisenhower, that the government should undertake projects, such as the space program and interstate highway system, that benefited the common good. It often did so in collaboration with universities and private contractors as part of a government-academic-industrial triangle that Vannevar Bush and others fostered. Talented federal bureaucrats (not always an oxymoron), such as Licklider, Taylor, and Roberts, oversaw the programs and allocated public funds.

Private enterprise was another way that collaborative teams were formed. This happened at the research centers of big companies, such as Bell Labs and Xerox PARC, and at entrepreneurial new companies, such as Texas Instruments and Intel, Atari and Google, Microsoft and Apple. A key driver was profits, both as a reward for the players and as a way to attract investors. That required a proprietary attitude to innovation that led to patents and intellectual property protections. Digital theorists and hackers often disparaged this approach, but a private enterprise system that financially rewarded invention was a component of a system that led to breathtaking innovation in transistors, chips, computers, phones, devices, and Web services.

Throughout history, there has been a third way, in addition to government and private enterprises, that collaborative creativity has been organized: through peers freely sharing ideas and making contributions as part of a voluntary common endeavor. Many of the advances that created the Internet and its services occurred in this fashion, which the Harvard scholar Yochai Benkler has labeled “commons-based peer production.”32 The Internet allowed this form of collaboration to be practiced on a much larger scale than before. The building of Wikipedia and the Web was a good example, along with the creation of free and open-source software such as Linux and GNU, OpenOffice and Firefox. As the technology journalist Steven Johnson has noted, “their open architecture allows others to build more easily on top of existing ideas, just as Berners-Lee built the Web on top of the Internet.”33 This commons-based production by peer networks was driven not by financial incentives but by other forms of reward and satisfaction.

The values of commons-based sharing and of private enterprise often conflict, most notably over the extent to which innovations should be patent-protected. The commons crowd had its roots in the hacker ethic that emanated from the MIT Tech Model Railroad Club and the Homebrew Computer Club. Steve Wozniak was an exemplar. He went to Homebrew meetings to show off the computer circuit he built, and he freely handed out the schematics so that others could use and improve it. But his neighborhood pal Steve Jobs, who began accompanying him to the meetings, convinced him that they should quit sharing the invention and instead build and sell it. Thus Apple was born, and for the subsequent forty years it has been at the forefront of aggressively patenting and profiting from its innovations. The instincts of both Steves were useful in creating the digital age. Innovation is most vibrant in the realms where open-source systems compete with proprietary ones.

Sometimes people advocate one of these modes of production over the others based on ideological sentiments. They prefer a greater government role, or exalt private enterprise, or romanticize peer sharing. In the 2012 election, President Barack Obama stirred up controversy by saying to people who owned businesses, “You didn’t build that.” His critics saw it as a denigration of the role of private enterprise. Obama’s point was that any business benefits from government and peer-based community support: “If you were successful, somebody along the line gave you some help. There was a great teacher somewhere in your life. Somebody helped to create this unbelievable American system that we have that allowed you to thrive. Somebody invested in roads and bridges.” It was not the most elegant way for him to dispel the fantasy that he was a closet socialist, but it did point to a lesson of modern economics that applies to digital-age innovation: that a combination of all of these ways of organizing production—governmental, market, and peer sharing—is stronger than favoring any one of them.

None of this is new. Babbage got most of his funding from the British government, which was generous in financing research that could strengthen its economy and empire. He adopted ideas from private industry, most notably the punch cards that had been developed by the textile firms for automated looms. He and his friends were founders of a handful of new peer-network clubs, including the British Association for the Advancement of Science, and though it may seem a stretch to view that august group as a fancy-dress forerunner to the Homebrew Computer Club, both existed to facilitate commons-based peer collaboration and the sharing of ideas.

The most successful endeavors in the digital age were those run by leaders who fostered collaboration while also providing a clear vision. Too often these are seen as conflicting traits: a leader is either very inclusive or a passionate visionary. But the best leaders could be both. Robert Noyce was a good example. He and Gordon Moore drove Intel forward based on a sharp vision of where semiconductor technology was heading, and they both were collegial and nonauthoritarian to a fault. Even Steve Jobs and Bill Gates, with all of their prickly intensity, knew how to build strong teams around them and inspire loyalty.

Brilliant individuals who could not collaborate tended to fail. Shockley Semiconductor disintegrated. Similarly, collaborative groups that lacked passionate and willful visionaries also failed. After inventing the transistor, Bell Labs went adrift. So did Apple after Jobs was ousted in 1985.

Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off,” Jobs said. Larry Page felt the same: “The best leaders are those with the deepest understanding of the engineering and product design.”34

Another lesson of the digital age is as old as Aristotle: “Man is a social animal.” What else could explain CB and ham radios or their successors, such as WhatsApp and Twitter? Almost every digital tool, whether designed for it or not, was commandeered by humans for a social purpose: to create communities, facilitate communication, collaborate on projects, and enable social networking. Even the personal computer, which was originally embraced as a tool for individual creativity, inevitably led to the rise of modems, online services, and eventually Facebook, Flickr, and Foursquare.