The Blind Men and an Elephant is a parable from India about a group of blind men who are asked to describe an elephant by touching it, but each man is allowed to touch only one part (the trunk, a leg, an ear). Because each describes the elephant from his partial experience, their descriptions differ vastly (the man who touches a leg thinks the object is a mid-sized tree) and seem to conflict. In some versions of the story the men suspect one another of dishonesty and come to blows. The moral of the parable is that humans tend to project their partial experiences as the whole truth and to discount other people’s partial experiences.
Indeed, sometimes it’s hard to see all aspects of a complicated situation.
For instance, during the coronavirus pandemic:
- A friend of mine, a physician who works in the ICU, was on the front lines of the healthcare challenges. Every day he witnessed the pain, suffering, and death caused by the virus. Unsurprisingly, he was cautious about reopening society too quickly.
- Another friend, who owns a small business, told me, “If the economy doesn’t open up in the next several weeks, my business will go under, my employees will lose their jobs, and I will lose everything.” He favored a quicker resumption of our normal lifestyle.
- A man in my church spent his entire career as an engineer at a major HVAC manufacturing company and holds several patents in the industry. He advised us not to open the church for corporate worship (or any building with a controlled air system) until the pandemic was completely over, because the circulation system would keep the virus suspended in the air and the air filters wouldn’t capture the viral particles.
Who’s right?
Most issues, situations, and challenges are multifactorial, which renders simple, one-dimensional explanations incomplete. Additionally, our personal background, experiences, and circumstances influence how we perceive a situation and shape our opinions.
On January 28, 1986, the Space Shuttle Challenger broke apart 73 seconds into its flight, killing all seven crew members aboard. The Rogers Commission, a special commission appointed to investigate the accident, reported that the failure of an O-ring seal had caused the disaster. But there was more to the story. The commission found that NASA’s organizational culture and decision-making processes had contributed to the accident, with the agency violating its own safety rules. Since 1977, NASA managers had known that the design of the solid rocket boosters contained a potentially catastrophic flaw in the O-rings, but they had failed to properly address the problem. NASA managers also disregarded warnings from engineers about the dangers of launching in the low temperatures of that morning and failed to adequately report these technical concerns to their superiors.
The seemingly straightforward explanation—a faulty O-ring caused the Space Shuttle Challenger to break apart—was part of the truth but not all of the truth.
It takes deliberate effort to counterbalance our narrow perspectives. Here are some suggestions:
- Be skeptical of simple, unidimensional explanations.
- Seek out multiple perspectives and approaches.
- Respect the opinions and observations of others until they are shown to be false.
- Temper your confidence in your own observations and conclusions.
Philosopher and economist John Stuart Mill said, “He who knows only his own side of the case knows little of that.” Proactively pursue the myriad and often seemingly contradictory aspects of a multidimensional situation.