Recognizing faces sounds like a very specific process, but Schooler has shown that the implications of verbal overshadowing carry over to the way we solve much broader problems. Consider the following puzzle:

A man and his son are in a serious car accident. The father is killed, and the son is rushed to the emergency room. Upon arrival, the attending doctor looks at the child and gasps, “This child is my son!” Who is the doctor?

This is an insight puzzle. It’s not like a math or a logic problem that can be worked out systematically with pencil and paper. The only way you can get the answer is if it comes to you suddenly in the blink of an eye. You need to make a leap beyond the automatic assumption that doctors are always men. They aren’t always, of course. The doctor is the boy’s mother! Here’s another insight puzzle:

A giant inverted steel pyramid is perfectly balanced on its point. Any movement of the pyramid will cause it to topple over. Underneath the pyramid is a $100 bill. How do you remove the bill without disturbing the pyramid?

Think about this problem for a few moments. Then, after a minute or so, write down, in as much detail as you can, everything you can remember about how you were trying to solve the problem—your strategy, your approach, or any solutions you’ve thought of. When Schooler did this experiment with a whole sheet of insight puzzles, he found that people who were asked to explain themselves ended up solving 30 percent fewer problems than those who weren’t. In short, when you write down your thoughts, your chances of having the flash of insight you need in order to come up with a solution are significantly impaired—just as describing the face of your waitress made you unable to pick her out of a police lineup. (The solution to the pyramid problem, by the way, is to destroy the bill in some way—tear it or burn it.)

With a logic problem, asking people to explain themselves doesn’t impair their ability to come up with the answers. In some cases, in fact, it may help. But problems that require a flash of insight operate by different rules. “It’s the same kind of paralysis through analysis you find in sports contexts,” Schooler says. “When you start becoming reflective about the process, it undermines your ability. You lose the flow. There are certain kinds of fluid, intuitive, nonverbal kinds of experience that are vulnerable to this process.” As human beings, we are capable of extraordinary leaps of insight and instinct. We can hold a face in memory, and we can solve a puzzle in a flash. But what Schooler is saying is that all these abilities are incredibly fragile. Insight is not a lightbulb that goes off inside our heads. It is a flickering candle that can easily be snuffed out.

Gary Klein, the decision-making expert, once did an interview with a fire department commander in Cleveland as part of a project to get professionals to talk about times when they had to make tough, split-second decisions. The story the fireman told was about a seemingly routine call he had taken years before, when he was a lieutenant. The fire was in the back of a one-story house in a residential neighborhood, in the kitchen. The lieutenant and his men broke down the front door, laid down their hose, and then, as firemen say, “charged the line,” dousing the flames in the kitchen with water. Something should have happened at that point: the fire should have abated. But it didn’t. So the men sprayed again. Still, it didn’t seem to make much difference. The firemen retreated back through the archway into the living room, and there, suddenly, the lieutenant thought to himself, There’s something wrong. He turned to his men. “Let’s get out, now!” he said, and moments after they did, the floor on which they had been standing collapsed. The fire, it turned out, had been in the basement.

“He didn’t know why he had ordered everyone out,” Klein remembers. “He believed it was ESP. He was serious. He thought he had ESP, and he felt that because of that ESP, he’d been protected throughout his career.”

Klein is a decision researcher with a Ph.D., a deeply intelligent and thoughtful man, and he wasn’t about to accept that as an answer. Instead, for the next two hours, again and again he led the firefighter back over the events of that day in an attempt to document precisely what the lieutenant did and didn’t know. “The first thing was that the fire didn’t behave the way it was supposed to,” Klein says. Kitchen fires should respond to water. This one didn’t. “Then they moved back into the living room,” Klein went on. “He told me that he always keeps his earflaps up because he wants to get a sense of how hot the fire is, and he was surprised at how hot this one was. A kitchen fire shouldn’t have been that hot. I asked him, ‘What else?’ Often a sign of expertise is noticing what doesn’t happen, and the other thing that surprised him was that the fire wasn’t noisy. It was quiet, and that didn’t make sense given how much heat there was.”

In retrospect all those anomalies make perfect sense. The fire didn’t respond to being sprayed in the kitchen because it wasn’t centered in the kitchen. It was quiet because it was muffled by the floor. The living room was hot because the fire was underneath the living room, and heat rises. At the time, though, the lieutenant made none of those connections consciously. All of his thinking was going on behind the locked door of his unconscious. This is a beautiful example of thin-slicing in action. The fireman’s internal computer effortlessly and instantly found a pattern in the chaos. But surely the most striking fact about that day is how close it all came to disaster. Had the lieutenant stopped and discussed the situation with his men, had he said to them, let’s talk this over and try to figure out what’s going on, had he done, in other words, what we often think leaders are supposed to do to solve difficult problems, he might have destroyed his ability to jump to the insight that saved their lives.

In Millennium Challenge, this is exactly the mistake that Blue Team made. They had a system in place that forced their commanders to stop and talk things over and figure out what was going on. That would have been fine if the problem in front of them demanded logic. But instead, Van Riper presented them with something different. Blue Team thought they could listen to Van Riper’s communications. But he started sending messages by couriers on motorcycles. They thought he couldn’t launch his planes. But he borrowed a forgotten technique from World War II and used lighting systems. They thought he couldn’t track their ships. But he flooded the Gulf with little PT boats. And then, on the spur of the moment, Van Riper’s field commanders attacked, and all of a sudden what Blue Team thought was a routine “kitchen fire” was something they could not factor into their equations at all. They needed to solve an insight problem, but their powers of insight had been extinguished.

“What I heard is that Blue Team had all these long discussions,” Van Riper says. “They were trying to decide what the political situation was like. They had charts with up arrows and down arrows. I remember thinking, Wait a minute. You were doing that while you were fighting? They had all these acronyms. The elements of national power were diplomatic, informational, military, and economic. That gives you DIME. They would always talk about the Blue DIME. Then there were the political, military, economic, social, infrastructure, and information instruments, PMESI. So they’d have these terrible conversations where it would be our DIME versus their PMESI. I wanted to gag. What are you talking about? You know, you get caught up in forms, in matrixes, in computer programs, and it just draws you in. They were so focused on the mechanics and the process that they never looked at the problem holistically. In the act of tearing something apart, you lose its meaning.”