In Buddhist philosophy, life is often compared to an ever-changing river. There’s a sense that everything, and every individual, is ceaselessly in the process of becoming. In this view of the world, achieving perfection is also a continuous process, and a goal that can never be fully attained. That’s a vision that would come to suit Steve’s exacting nature. Looking ahead to the unmade product, to whatever was around the next corner, and the two or three after that one, came naturally to him. He would never see a limit to possibilities, a perfect endpoint at which his work would be done. And while Steve would eschew almost all self-analysis, the same was true of his own life: despite the fact that he could be almost unfathomably stubborn and opinionated at times, the man himself was constantly adapting, following his nose, learning, trying out new directions. He was constantly in the act of becoming.

None of this was readily apparent to the outside world, and Steve’s Buddhism could befuddle even his closest friends and colleagues. “There was always this spiritual side,” says Mike Slade, a marketing executive who worked with Steve later in his career, “which really didn’t seem to fit with anything else he was doing.” He meditated regularly until he and Laurene became parents, when the demands on his time grew in a way he hadn’t anticipated. He reread Suzuki’s Zen Mind, Beginner’s Mind several times, and made the intersection of elements of Asian spiritualism and his business and commercial life a regular subject of the conversations he and Brilliant enjoyed throughout his life. For years, he arranged for a Buddhist monk by the name of Kobun Chino Otogawa to meet with him once a week at his office to counsel him on how to balance his spiritual sense with his business goals. While nobody who knew him well during his later years would have called Steve a “devout” Buddhist, the spiritual discipline informed his life in both subtle and profound ways.

WHEN STEVE RETURNED to America in the fall of 1974, he landed back at Atari, mainly doing hardware-related troubleshooting tasks for Nolan Bushnell’s pioneering—and poorly managed—company. Atari was such a loose, strange organization that Jobs could still comfortably disappear for a couple of weeks to pick apples at Robert Friedland’s orchard and not get fired, or even be missed, really. Meanwhile, Woz was working at Hewlett-Packard, in a safe, well-paying, but not particularly challenging gig. Nothing about Jobs’s life at this time would have suggested that he would achieve extraordinary success in business, computer technology, or anything else, for that matter. But unbeknownst even to himself, Steve was about to begin the real work of creating his life. In the next three years, he would morph from a scruffy, drifting nineteen-year-old into the cofounder and leader of a revolutionary new American business.

Steve was blessed to live at a moment that was ripe and ready for someone with his talents. It was an era of change on many fronts, and especially in the world of information technology. In the 1970s, big machines called mainframes defined computing. Mainframes were enormous, room-sized computing systems sold to customers like airlines, banks, insurers, and large universities. The programming required to get a result (say, to calculate a mortgage payment) was beyond cumbersome. At least it seemed that way to anyone studying computer science in college, which is where most of us had our introduction to making a mainframe actually do something. After settling on the problem you wanted the machine to solve, you would painstakingly write down, in a programming language like COBOL or Fortran, a series of line-by-line, step-by-step instructions spelling out the exact, logical process of the calculation or analytical chore. Then, at a noisy mechanical console, you would type each individual line of the handwritten program onto its own rectangular "punch card," perforated in such a way that a computer could "read" it. After meticulously making sure the typed cards were arranged in the right order (a simple program might need a few dozen cards held together with a rubber band, while an elaborate one could require reams stacked carefully in a special cardboard box), you would hand the bundle to a computer "operator," who would put your deck in the queue behind dozens of others to be fed into the mainframe. Eventually, the machine would spit out your results on broad sheets of green-and-white-striped, accordion-folded paper. More often than not, you would have to tweak your program three, four, or even dozens of times to get the results you were looking for.
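To give a sense of what such a line-by-line program looked like, here is a minimal sketch of the mortgage-payment calculation mentioned above. It is written in modern Python rather than the COBOL or Fortran of the era, and the standard amortization formula, the function name, and the sample loan figures are all illustrative assumptions rather than anything drawn from the book.

    # A sketch of the kind of step-by-step calculation a 1970s programmer
    # might have punched onto cards, one statement per card. Uses the
    # standard amortization formula M = P * r * (1 + r)**n / ((1 + r)**n - 1);
    # the sample figures below are made up for illustration.

    def monthly_mortgage_payment(principal: float,
                                 annual_rate: float,
                                 years: int) -> float:
        """Return the fixed monthly payment on an amortized loan."""
        r = annual_rate / 12          # monthly interest rate
        n = years * 12                # total number of payments
        if r == 0:                    # zero-interest edge case
            return principal / n
        growth = (1 + r) ** n         # compound growth factor over the term
        return principal * r * growth / (growth - 1)

    # Example: a $50,000 loan at 9 percent over 30 years.
    payment = monthly_mortgage_payment(50_000, 0.09, 30)
    print(f"Monthly payment: ${payment:,.2f}")   # roughly $402.31

Each of those twenty-odd lines would have occupied its own punch card, and a single mistyped or misordered card would have sent the programmer back to the keypunch console for another pass through the queue.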

In other words, computing in 1975 was anything but personal. Writing software was a laborious and slow process. The big, expensive, high-maintenance computers were manufactured and sold, appropriately enough, by a handful of big, bureaucratic technology companies. As it had been since the 1950s, the computer industry in 1975 was dominated by International Business Machines (IBM), which sold more mainframes than all of its competitors put together. In the 1960s, those also-rans were called "the Seven Dwarfs," but during the 1970s both General Electric and RCA gave up, leaving a stubborn group of manufacturers referred to as the "BUNCH," an acronym for Burroughs, Univac, NCR, Control Data Corporation, and Honeywell. Digital Equipment Corporation (DEC) dominated an emerging segment of somewhat cheaper and less powerful "minicomputers" used by smaller businesses and by departments within larger corporations. There was one outlier at each end of the cost spectrum. At the high end, Cray Research, founded in 1972, sold so-called supercomputers used primarily for scientific research and mathematical modeling. These were the most expensive computers of all, costing well north of $3 million. On the cheap end of the scale was Wang Laboratories, which in the early 1970s introduced a task-specific machine known as a "word processor." It was the closest thing to a "personal" computer that existed, since it was designed for a single person to use in the preparation of written reports and correspondence. The computer industry then was primarily an eastern establishment. IBM was headquartered in the bucolic suburbs north of New York City; DEC and Wang were based near Boston. Burroughs was headquartered in Detroit, Univac in Philadelphia, NCR in Dayton, Ohio, and Cray, Honeywell, and Control Data all hailed from Minneapolis. The only notable early computer maker in Silicon Valley was Hewlett-Packard, but its primary business was making scientific test and measurement instruments and calculators.

This industry bore little resemblance to today’s entrepreneurial, innovative, and rapidly iterative tech world. It was a stodgy enterprise most similar to the capital equipment business. Its universe of potential customers could be counted in the hundreds, and these were companies with deep pockets whose demands focused more on performance and reliability than on price. No surprise, then, that the industry had become cloistered and a little complacent.

Out in California a significant number of the people who would have a hand in flipping that industry on its head started meeting regularly as a hobbyist group called the Homebrew Computer Club. Their first get-together occurred shortly after the publication of the January 1975 issue of Popular Electronics, which featured a cover story about the Altair 8800 “microcomputer.” Gordon French, a Silicon Valley engineer, hosted the gathering in his garage to show off an Altair unit that French and a buddy had assembled from the $495 kit sold by Micro Instrumentation and Telemetry Systems (MITS). It was an inscrutable-looking device, about the size of a stereo component amplifier, its face sporting two horizontal arrays of toggle switches and a lot of blinking red lights. The clunky thing couldn’t do too much, but it demonstrated the feasibility of having a computer to yourself, one that you could program twenty-four hours a day if you wanted to, without having to wait in line or punch any cards. Bill Gates read the article, and shortly thereafter famously dropped out of Harvard to start a little outfit called Micro-soft to design software programming languages for the Altair.