Why We Ignore the Obvious at Our Peril
“With deft prose and page after page of keen insights, Heffernan shows why we close our eyes to facts that threaten our families, our livelihood, and our self-image – and, even better, she points the way out of the darkness.”
-Daniel H. Pink
Many of the most horrific crimes against humanity were committed out in the open in front of lots of ordinary citizens. What factor in human nature allows people to live in denial, ignoring problems within their own homes, communities, and religious and governmental institutions? How could executives at Enron, W.R. Grace, Countrywide and BP take actions that caused people to lose their lives or life savings?
“When we are willfully blind, it is in the presence of information that we could know and should know, but don’t know because it makes us feel better not to know.”
Many forces stop people from seeing the truth, asking questions and taking action. Sometimes, a little denial doesn’t hurt. Yet when the knowledge that people could have prevented a tragedy compounds the tragedy itself, a blind eye is often the culprit. Neuroscience explains some of the brain mechanisms behind “willful blindness”; the foibles of human nature account for the rest.
Humans often fall in love with people who are like themselves. The online dating site eHarmony developed a questionnaire that identifies applicants’ traits in order to match them with similar people. The Internet radio site Pandora registers what kind of music you like and gives you more of it. The site’s founder Tim Westergren acknowledges, “Pandora is about broadening your selection – but narrowing your taste.”
“Individuals, singly and in groups, are equally susceptible to willful blindness; what makes organizations different is the scale of damage they can cause.”
People gravitate toward hiring and working with those who make them feel comfortable. Teams of people with varied experiences and viewpoints are more innovative and productive than homogenous groups. Despite decades of corporate diversity programs and equality legislation, institutional sameness prevails. Research shows that most Americans settle in “communities of like-mindedness,” living among people who share their interests, politics, values and tastes. Even in socially diverse communities, humans tend to stay with their own.
“We believe in ourselves, at least in part, because others believe in us and we depend mightily on their belief.”
Users select websites that reinforce their opinions, and that pattern holds for the media they watch and read. When people read conflicting data, they tend to reject it or pay less attention to it. Rather than broadening their experience, people cluster in like-minded silos that strengthen their existing viewpoints.
“All of us want to bury our heads in the sand when taxes are due, when we have bad habits we know we should change or when the car starts to make that strange sound.”
Congregating in person or online with “affinity groups” offers some advantages. For example, if you’re a cycling enthusiast, you value the opinions of other cyclists. You don’t want to ride every bike or research every trail to make a decision. These shortcuts become dangerous only when you abdicate your thinking to others. Bernie Madoff fooled investors for years because they trusted their investment community and never double-checked the information they received.
Love’s Blind Spot
Love tints people’s perceptions with a rosy hue. You see the people you love as smarter, funnier and better looking than, perhaps, they really are. Reciprocally, the unconditional love they feel for you boosts your confidence and self-belief. Love becomes harmful when the relationship deteriorates and you hang on, making excuses and downplaying your worries. Consider Carmela Soprano from the television mob series The Sopranos. Her husband Tony is a homicidal philanderer; acknowledging his true nature would threaten Carmela’s security and identity. Her denial exemplifies why infidelity so often takes people by surprise.
“We mostly admit the information that makes us feel great about ourselves, while conveniently filtering out whatever unsettles our fragile egos and most vital beliefs.”
The effort to turn a blind eye to a home’s harsh realities is even more insidious in families damaged by child abuse. One or both parents are responsible in 78% of child-abuse cases. Protecting the happy-family illusion comes at an extremely high cost, one that escalates when fortified by institutional or community support. In some communities, congregants’ devotion to their church led them to turn a blind eye to priests’ routine abuse of children. The church hierarchy enabled the abuse by keeping silent and moving abusive priests from parish to parish.
Mind Does Matter
Once belief systems become firmly entrenched, people protect them even if it means turning a blind eye to contrary evidence. Neuroscience experiments show that human brains don’t like conflict. They weed out distressing input or information that contradicts ingrained beliefs and, instead, accept corroborating data. Trying to process conflicting views provokes “cognitive dissonance” – that is, very uncomfortable mental upheaval.
“The downside to the cozy feeling of togetherness is that everyone’s less vigilant and more vulnerable to bad and dangerous decisions.”
In the 1950s, Alice Stewart conducted research demonstrating a clear correlation between X-rays of pregnant women and incidences of leukemia in their offspring. Yet doctors continued to X-ray pregnant mothers for another 25 years. When Stewart first published her research, the medical community, which had invested heavily in X-ray equipment, was loath to change. Dedicated, intelligent doctors continued to use X-rays, ignoring Stewart’s research in order to protect an entrenched ideology.
The Gorilla in the Room
A well-known experiment conducted by Daniel Simons and Christopher Chabris at Harvard asked subjects to count the number of times a basketball was passed among players in a video. About half of the participants were so intent on counting the passes that they missed a student in a gorilla suit walking across the court. Humans can process only a limited amount of information at a time. People see what they expect to see and disregard the unanticipated. This explains the danger of texting or talking on the phone while driving: your mental resources are limited. Multitasking is a fallacy; no one performs any single task well while attempting several at once.
“It’s a truism that love is blind; what’s less obvious is just how much evidence it can ignore.”
Since the early 20th century, researchers have shown that optimal productivity occurs during a 40-hour workweek. When people work more hours, they become less focused and more prone to errors. They spend extra work hours correcting mistakes they might have avoided. A tired mind is less able to process information, think through problems or raise questions. Sleep deprivation creates a condition called “resource depletion.”
“Social support makes it easier to do things or believe in ideas that would feel a lot more uncomfortable if we were on our own.”
Employees at BP’s Texas City site were overworked and understaffed when a tower overflowed and exploded, killing 15 people in “one of the worst industrial accidents in American history.” Investigators found that, on the day of the incident, operators had worked 37 days in a row and hadn’t slept more than five hours at a stretch in weeks.
See What You Want to See
It is easier to remain oblivious, avoid conflict and ignore problems than to face reality. The status quo is much more comfortable than thinking through a dicey situation and transforming your approach to it. Yet such selective ignorance, deep-rooted in the human psyche, can be life-threatening. For example, people still use electric tanning beds despite incontrovertible evidence of their dangers. In Libby, Montana, townspeople ostracized activist Gayla Benefield when she tried to hold W.R. Grace accountable for suppressing information showing that mineworkers suffered from asbestos toxicity. Even as their family members and neighbors died of lung disease, many Libby inhabitants refused to accept the truth in the face of irrefutable proof.
The Dark Side of Obedience
Eminent psychologist Stanley Milgram conducted several studies on obedience. He sought to understand why people obey orders even when asked to do something morally objectionable. He found that people abandon their ideals in order to fit into a group, prioritizing the group’s expectations over evaluating the task at hand. Thus, soldiers don’t question whether to drop a bomb; they try to drop it to the best of their abilities. Bowing to authority is human nature, but blindly following orders relieves individuals of responsibility for their actions.
“It is so much easier to be blind to the consequences of your actions when you don’t have to see them play out.”
Organizational culture exerts a similar pressure to conform to your peers’ conduct, language and behaviors. When a company’s culture is corrupt, individuals within the organization find it difficult to buck the tide. That’s what Walt Pavlo experienced at MCI Telecommunications, where the highly competitive culture blinded him to immoral practices – such as laying people off for a few months to avoid paying salary and benefits – and illegal practices, such as counting customer-signed promissory notes as assets, knowing customers would never pay. Pavlo was caught, tried and convicted as a white-collar criminal. He wonders to this day how he could have committed illegal acts when he was a moral person who knew better. He succumbed to the powerful human urge to conform and belong.
See No Evil
A famous series of experiments by psychologists John Darley and Bibb Latané demonstrated that individuals are more likely to respond to an emergency when they are alone than when they are with other people. This is the “bystander effect.” In a group, people’s moral selves conflict with their social selves, causing paralyzing indecision. In organizations, the bystander effect diffuses responsibility: people may know that a process is wrong or that their co-workers are acting dangerously or illegally, but they want someone else to do something about it.
“That willful blindness is so pervasive does not mean that it is inevitable.”
After a while, neutrality and inaction become collusion – an implied endorsement of wrong behavior. Wide-scale evil, as seen during the Holocaust, and repeated destructive actions by corporations such as Enron or the mortgage lender Countrywide require the participation of thousands of people who stay “willfully blind” to the harmful effects of their actions or inactions.
“Nations, institutions, individuals can all be blinded by love, by the need to believe themselves good and worthy and valued.”
Corporate mechanisms that can enable willful blindness include outsourcing, dividing labor into departments and isolating powerful people. Being physically distant makes it easier to sidestep accountability. BP executives in St. James’s Square in London were a continent and an ocean away from the men working “close to the valve” at BP in Texas City. Not seeing the consequences of your actions makes you more able to turn a blind eye.
“You cannot fix a problem that you refuse to acknowledge.”
Money is a great motivator; that’s why companies can energize employees with bonuses, overtime and other monetary incentives. But money also induces negative social behaviors and produces unintended consequences. Letting money reduce people to commodities is dangerous because human beings become expendable in the face of corporate blindness, as when automaker Ford weighed the cost of reinforcing the subcompact Pinto’s rear end against the cost of lost lives, or when W.R. Grace chose bankruptcy to avoid compensating the people of Libby.
Being the person who speaks out, asks unwelcome questions or challenges prevailing norms isn’t easy. Notable whistleblowers include Daniel Ellsberg, who leaked the Pentagon Papers, and Sheila Bair, who shut down Fremont Investment and Loan for its mortgage-lending abuses. Take these steps to protect against dangerous blind spots:
- Recognize the homogeneity in your life and seek the company of people with different backgrounds and viewpoints.
- Become aware of your biases, both conscious and subconscious, and adjust them.
- Understand your cognitive limits and avoid situations that add stress, such as missing sleep or working long hours.
- Recognize that multitasking undermines your productivity and can be dangerous in some cases, such as texting while driving.
- Question the “big ideas” – ideologies that seem too entrenched to challenge.
- Promote debate, reward those who challenge you and welcome differences of opinion.
- Seek objective third opinions from people outside your company.
- Listen to the little voice in your head telling you that something is not right.