Truth and Perception: Reconsidering What We Know For Sure

Tyler Sack
June 24, 2019

“I believe you understand what you think I said, but I’m not sure you realize that what you heard is not what I meant.” – Unknown, found on a chalkboard at the Weagamow Lake Reserve school.

This quote captures the reality that in any misunderstanding, the difference between what was said and what was heard is not a matter of fact versus fiction, but of recognizing that our perceptions, and therefore our versions of the truth, can differ.

I have been considering whether truth is subjective since I first heard the Indian parable of the Blind Men and the Elephant. The story tells of a group of blind men who encounter an elephant for the first time. Each examines only one part of the animal and describes aloud what he observes. One man feels the trunk and describes a snake, another holds the ear and describes a fan, the one who grabs the tusk describes a spear, another holding the tail compares it to a rope, and the one pushing on the elephant's side concludes it is like a wall. There are two endings to the fable: one in which the men resort to fighting over their disagreement, and another in which they listen to one another and switch positions, collaborating to observe the elephant in its entirety.

Like the blind men in the fable, we tend to believe that our own experiences, observations, and ideas are correct, and that those who disagree with us are somehow misinformed or misguided. Even when others challenge our assumptions, we aren't open to their perspective, evidence, or reasoning, because we don't consider the possibility that what we know for sure might be wrong.

The reality is that the facts we believe might be only partially true, shaped by subjective experience; we rarely consider that we might be holding onto outright false beliefs. A few theories collectively help explain how this happens. First, consider how we determine truth: if something sounds believable, we tend to accept it without scrutiny. Information that is easier to understand is almost always viewed positively, and is more easily recalled and repeated. This is what is meant by an idea's fluency: the more we are exposed to an idea, the more likely we are to believe it is true.

Second, once we form our beliefs, we are likely to overvalue our own perspectives because of a form of bias known as the IKEA effect. The theory suggests that we value something more when we have had a role in its creation. In these cases we not only overlook flaws that others might see, but are also willing to pay more for an item because of the effort we put into it. We care about the things we create because they communicate to others that we are competent and capable. This bias might also suggest that narcissism is less a disorder than a personality trait on a spectrum, meaning that we are all fixated on our own public image to some degree.

Finally, we organize ourselves into social groups, and though we might not be conscious of it, we tend to associate with people who are like ourselves. In our interactions we signal to our peer group which values we find acceptable and unacceptable. What is often overlooked, though, is that when we share a high level of trust with the people we work and socialize with, there is a risk that they will assume we are right as often as we do. That isn't to say we should tell them to trust us less, but that they should apply the same critical thinking to our shared beliefs as they would to the beliefs of those outside the group.

The ways we acquire and hold onto false beliefs resemble the resocialization techniques used in total institutions like the military or prison systems: programs and behaviours that foster the unlearning and relearning of values, beliefs, and norms for a specific outlook or purpose. How we are socialized determines our beliefs, as well as how we acquire and share information. Repeating information that is easy to comprehend and remember is key to making it believable, and you are more likely to accept a belief as fact when the peer group you trust validates or agrees with your perspective.

In closing, we need to keep in mind that although we might be certain something is true, there is a chance it is actually false or only partially true. Learning other perspectives is key to knowing better, and knowing better is what leads to better actions.

“If your ambition is to maximize short-term gain without regard to the long-term cost, you are better off not knowing the cost. If you want to preserve your personal immunity to the hard problems, it’s better never to really understand those problems. There is an upside to ignorance, and a downside to knowledge. Knowledge makes life messier. It makes it a bit more difficult for a person who wishes to shrink the world to a worldview.” – Michael Lewis, The Fifth Risk.