• • •
In 2005 the lawyers representing Juan Rivera applied for a DNA test. At the time, he had been in jail for almost thirteen years. Rivera was excited at the prospect of a method that could finally establish the truth about what had happened on that warm night in Waukegan, Illinois, more than a decade earlier.
On May 24 the results came back. They showed that Rivera was not the source of the semen found inside the corpse of Holly Staker. He was, at first, overwhelmed. He couldn’t quite take in the fact that people would finally see that he was innocent of such a horrendous crime. He told his lawyers that it felt like he was “walking on air.” He celebrated that night in his cell.
But this wasn’t the end of the story. In fact, it wasn’t even the beginning of the end. Rivera would spend another six years in jail. Why? Think back to the police. Were they going to accept their mistake? Were the prosecutors going to hold up their hands and admit they had gotten it wrong? Was the wider system going to accept what the DNA evidence was revealing about its defects?
Perhaps the most fascinating thing about the DNA exonerations is not how they opened the cell doors for wrongly convicted prisoners, but how excruciatingly difficult they were to push through; about how the system fought back, in ways both subtle and profound, against the very evidence that indicated that it was getting things wrong.
How did this happen? How does failure-denial become so deeply entrenched in human minds and systems? To find out we will take a detour into the work of Leon Festinger, arguably the most influential social psychologist of the last half-century. It was his study into a small religious cult in Chicago that first revealed the remarkable truth about closed-loop behavior.
III
In the autumn of 1954, Festinger, who at the time was a researcher at the University of Minnesota, came across an unusual headline in his local newspaper. “Prophecy from Planet Clarion Call to City: Flee That Flood,” it read. The story was about a housewife named Marian Keech* who claimed to be in psychic contact with a godlike figure from another planet, who had told her that the world would end before dawn on December 21, 1954.
Keech had warned her friends about the impending disaster and some left their jobs and homes, despite resistance from their families, to move in with the woman who had, by now, become their spiritual leader. They were told that true believers would be saved from the apocalypse by a spaceship that would swoop down from the heavens at midnight and pick them up from the garden of Keech’s small house in suburban Chicago.
Festinger, an ambitious scientist, glimpsed a rare opportunity. If he could get close to the cult, perhaps even infiltrate it by claiming to be a believer, he would be able to observe how the group behaved as the apocalyptic deadline approached. In particular, he was fascinated by how they would react after the prophecy had failed.
Now, this may seem like a rather obvious question. Surely the group would go back to their former lives. They would conclude that Keech was a fraud who hadn’t been in touch with any godlike figure at all. What other conclusion could they possibly reach if the prophecy wasn’t fulfilled? It is difficult to think of a more graphic failure, both for Keech and for those who had put their trust in her.
But Festinger predicted a different response. He suspected that the group, far from disavowing Keech, would keep their belief in her intact. Indeed, he believed they would become more committed to the cult than ever before.
In early November, Festinger and his colleagues contacted Keech by phone and went about trying to gain her confidence. One of them invented a story about having had a supernatural experience while traveling in Mexico; another pretended to be a businessman who had become intrigued by the newspaper story. By late November they had been granted access to Keech’s cult and were ensconced in her house, observing a small coterie of people who believed that the end of the world was imminent.
Sure enough, as the deadline for the apocalypse passed without any sign of a spaceship (still less a flood), Festinger and his colleagues watched the group in the living room (Keech’s husband, who was a nonbeliever, had gone to his bedroom and slept through the whole thing). At first the cult members kept checking outside to see if the spaceship had landed. Then, as the clock ticked past midnight, they became sullen and bemused.
Ultimately, however, they became defiant. Just as Festinger had predicted, the faith of hard-core members was unaffected by what should have been a crushing disappointment. In fact, for some of them, their faith seemed to strengthen.
How is this possible? After all, this was an unambiguous failure. Keech had said the world would end, and that a spaceship would save true believers. Neither had happened. The cult members could have responded by altering their beliefs about the supernatural insights of Keech. Instead, they altered the “evidence.”
As Festinger recounts in his classic book When Prophecy Fails,14 they simply redefined the failure. “The godlike figure is so impressed with our faith that he has decided to give the planet a second chance,” they proclaimed (I am paraphrasing only a little). “We saved the world!” Far from abandoning the cult, core members went out on a recruitment drive. As Festinger put it: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” They were “jubilant.”
Now, this is important not because of what it tells us about cults, but because of what it reveals about all of us. Festinger showed that this behavior, while extreme, provides an insight into psychological mechanisms that are universal. When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.
Let us move away from religious cults for a moment and take a look at something as everyday as politics. Specifically, let’s take the Iraq War. In the buildup to the conflict, much of the justification centered on Iraq’s alleged possession of weapons of mass destruction (WMD). The idea that WMD had been stockpiled by Saddam Hussein was used by leaders on both sides of the Atlantic as a core part of the case for action. The problem was that, as early as 2003, it was clear that there were no WMD in Iraq.
This was not an easy thing for those who had endorsed the policy to accept. It implied a failure of judgment. Many had spent months arguing for the intervention and backing the leaders who had pushed it through. They strongly believed that military action was the right course. The lack of WMD didn’t show that the intervention was necessarily a mistake, but it did, at the very least, weaken its legitimacy, given that it had been a central plank of the original justification.
What is important for our purposes, however, is not whether the Iraq intervention was right or wrong, but how different people responded to the new evidence. The results were startling. According to a Knowledge Networks poll published in October 2003,15 more than half of the Republicans who had voted for George W. Bush simply ignored it. They said they believed that weapons had been found.
As the survey’s director put it: “For some Americans, their desire to support the war may be leading them to screen out information that weapons of mass destruction have not been found. Given the intensive news coverage and high levels of public attention to the topic, this level of misinformation [is remarkable].”
Think about that for a moment. The evidence of the lack of WMD had simply vanished from their minds. These people had watched the news, seen the stories about the absence of WMD, but then managed to forget all about it. Democrats, on the other hand, were perfectly aware of the lack of WMD. Many of those who opposed the war had it seared on their memories. But more than half of Republicans? Nope, they couldn’t remember it at all.
“Cognitive dissonance” is the term Festinger coined to describe the inner tension we feel when, among other things, our beliefs are challenged by evidence. Most of us like to think of ourselves as rational and smart. We reckon we are pretty good at reaching sound judgments. We don’t like to think of ourselves as dupes. That is why when we mess up, particularly on big issues, our self-esteem is threatened. We feel uncomfortable, twitchy.
In these circumstances we have two choices. The first is to accept that our original judgments may have been at fault. We question whether it was quite such a good idea to put our faith in a cult leader whose prophecies didn’t even materialize. We pause to reflect on whether the Iraq War was quite such a good idea given that Saddam didn’t pose the threat we imagined.
The difficulty with this option is simple: it is threatening. It requires us to accept that we are not as smart as we like to think. It forces us to acknowledge that we can sometimes be wrong, even on issues on which we have staked a great deal.
So, here’s the second option: denial. We reframe the evidence. We filter it, we spin it, or we ignore it altogether. That way, we can carry on under the comforting assumption that we were right all along. We were right on the money! We didn’t get duped! What evidence is there that we messed up?
The cult members had a lot riding on Keech. They had left their jobs and risked the anger of their families. They had been ridiculed by their neighbors, too. To admit they were wrong was not like admitting they had taken a wrong turn on the way to the supermarket. Their credibility was on the line. They were highly motivated to believe that Keech was the guru she claimed to be.
Think how shaming it would have been to walk out of that house. Think how excruciating it would have been to admit they had put their trust in a crank. Doesn’t it make sense that they were desperate to reinterpret the failure as a success in disguise (a very good disguise!), just as it was easier for many Republicans to edit out the lack of WMD than to confront the facts full-on? Both mechanisms helped to smooth out the feelings of dissonance and to retain the reassuring sense of being smart, rational people.
In one experiment by the leading psychologist Elliot Aronson and his colleague Judson Mills, students were invited to join a group that would be discussing the psychology of sex.16 Before joining the group the students were asked to undergo an initiation procedure. For some students this was highly embarrassing (reciting explicit sexual passages from racy novels) while for others it was only mildly embarrassing (reading sexual words from a dictionary). The students were then played a tape of a discussion taking place between members of the group they had just joined.
Aronson had staged the discussion so that it was totally boring. So boring, in fact, that any unbiased person would have been forced to conclude that it was a mistake to join up. The members discussed the secondary sexual characteristics of birds: their plumage, coloring, etc. They droned on and on. Many didn’t even know their material, kept hesitating, and failed to reach the end of their sentences. It was utterly tedious.
At the end of the tape the students were asked to rate how interesting they found the discussion. Those who had undergone the mild initiation found it boring. Of course they did. They could see the discussion for what it was. They were irritated by a member who admitted that he hadn’t done the reading on the mating rituals of a breed of rare bird. “What an irresponsible idiot!” they said. “He didn’t even do the basic reading! He let the group down! Who’d want to be in a group with him!”17
But what about those who had undergone the highly embarrassing initiation? For them, everything changed. As Aronson put it in his fascinating book (co-authored with Carol Tavris) Mistakes Were Made (but Not by Me): “. . . they rated the discussion as interesting and exciting and the group members as attractive and sharp. They forgave the irresponsible idiot. His candor was refreshing! Who wouldn’t want to be in a group with such an honest guy? It was hard to believe they were listening to the same recording.”
What was going on? Think about it in terms of cognitive dissonance. If I have put up with a lot to become a member of a group, if I have voluntarily subjected myself to acute embarrassment, I would have to be pretty stupid if the group turned out to be anything less than wonderful. To protect my self-esteem I will want to convince myself that the group is pretty damn good. Hence the necessity to talk it up, to reframe my perceptions in a positive direction.
None of this applies, of course, if the initiation is simple. If the group turns out to be a waste of time, one can say to oneself honestly, and without any threat to one’s self-esteem, “This place is not worth bothering with.” It is only when we have staked our ego that our mistakes of judgment become threatening. That is when we build defensive walls and deploy cognitive filters.
In a similar experiment led by the psychologist Charles Lord, volunteers were recruited who were either adamantly in favor of capital punishment or adamantly against it.18 Those in favor of capital punishment were the kind of people who shout at the TV when liberals argue for clemency, and who regale their friends with the deterrent effects of capital punishment. Those against it were the kind of people who are horrified by “state-sanctioned murder,” and who worry about how it brutalizes society.
Lord gave these two groups two research projects to read. He made sure that both research projects were impressive. Both seemed to marshal well-researched evidence about the issue. The reports were robust and weighty. But here’s the thing: one report collated all evidence that called into question the legitimacy of capital punishment while the other articulated evidence that supported it.
Now, at the very least, you might have expected this contradictory evidence to have shown that capital punishment has arguments on both sides. You might have expected people on either side of the divide, reading all this, to have shifted a little closer together in their views. In fact, the opposite happened. The two groups became more polarized. Those in favor were more convinced of the logic of their position; ditto those against.
When asked about their attitudes afterward, those in favor of capital punishment said that they were deeply impressed with the dossier citing evidence in line with their views. The research, they said, was rigorous. It was extensive. It was robust. But the other dossier? Well, it was full of holes, shoddy, weak points everywhere. How could any self-respecting academic publish such rubbish?
Precisely the opposite conclusions were drawn by those who were against capital punishment. It was not just that they disagreed with the conclusions. They also found the (neutral) statistics and methodology unimpressive. From reading exactly the same material, the two groups moved even further apart in their views. They had each reframed the evidence to fit in with their preexisting beliefs.
Festinger’s great achievement was to show that cognitive dissonance is a deeply ingrained human trait. The more we have riding on our judgments, the more we are likely to manipulate any new evidence that calls them into question.
Now let us take these insights back to the subject with which we started this chapter. For it turns out that cognitive dissonance has had huge and often astonishing effects on the workings of the criminal justice system.
IV
On March 20, 1987, a young girl was attacked in her home in Billings, Montana. The Innocence Project, the nonprofit organization set up by two New York lawyers, Barry Scheck and Peter Neufeld, to help prisoners obtain DNA tests, describes the crime as follows:
The young girl was attacked by an intruder who had broken in through a window. She was raped . . . The perpetrator fled after stealing a purse and jacket. The victim was examined the same day. Police collected her underwear and the bed sheets upon which the crime was committed. Semen was identified on the underwear and several hairs were collected from the bed sheets.19
The police produced a composite sketch of the intruder based upon the description given by the victim and this led an officer to interview Jimmy Ray Bromgard, an eighteen-year-old who lived in the area and who resembled the sketch. Bromgard eventually agreed to participate in a line-up. He was picked out by the victim, but not with any real confidence. She said she was “60, 65 percent sure.”
When the case came to trial, most of the prosecution case was based on forensic evidence related to hair found at the crime scene. This evidence (it was later established) was largely concocted by the “expert” called by the prosecution. There were no fingerprints, and no physical evidence beyond the flawed hair testimony. Bromgard, who said he was at home asleep at the time of the crime, was found guilty and sentenced to forty years in prison.
The Innocence Project took up the case in 2000. A DNA test excluded Bromgard as the source of the semen found in the victim’s underwear. This represented powerful evidence that he was not the perpetrator. “The original case was flimsy and the new evidence invalidated the conviction,” Barry Scheck told me. “The prosecutors could have dropped the case. They could have put their hands up and admitted they got the wrong man. But they didn’t.”
Or perhaps they just couldn’t.
Michael McGrath, the state prosecutor, responded to the new evidence by coming up with an interpretation that, in many ways, is even more novel than the explanation given by the cult for the failure of the Keech prophecy. As Kathryn Schulz explains in her excellent book Being Wrong, McGrath claimed that Bromgard might be a “chimera.”20 This is where a single person carries two distinct sets of DNA, and potentially two different blood types, due to the death of a twin in the womb, so the DNA in his semen might not match the DNA elsewhere in his body. It has only been reported around thirty times in history. It represented a reframing of the evidence of a quite breathtaking kind.
Sadly, for McGrath at least, further testing proved that Bromgard was not a chimera, but the prosecutor wasn’t finished yet. When Bromgard sued the state of Montana for wrongful conviction, Peter Neufeld from the Innocence Project came face-to-face with McGrath during the deposition. McGrath was still adamant that Bromgard was the prime suspect. Nothing seemed to pry him from that belief: no amount of persuasion, no amount of testimony, no amount of evidence.