( Personal Reflection )

The Paradox of Ignorance: What Do You Believe That May Not Be True?

Chris Cundari
March 21, 2017
11 min read
“The difficulty lies, not in the new ideas, but in escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds.” // John Maynard Keynes

Last August, I was taking a cab in Berlin when I struck up a conversation with my driver, who grew up in East Berlin. Our conversation went all over the place but eventually settled on me asking, “How was life before the fall of the Berlin Wall?” Growing up in a democratic society, I had the impression that everyone would believe the Wall that divided Germany was a terrible thing and that its demolition was a terrific event. But I was wrong. He explained how life was so much better under communism – he didn’t have to compete for business, basic necessities were provided to him, and he was generally content with his life. Now there was too much competition and he could barely make rent.

I had never met someone who was pro-communism until that day. That is to say, I had never met someone whose fundamental beliefs differed so much from my own.

When we examine our own beliefs, everything seems to make sense internally. This holds true for everything from the religion you worship to the political party you support to the diet you believe best optimizes health. For the most part, we imagine we have a firm grip on reality.

But externally, everything is a mess. People believed (and some still do) that the Earth is flat, that UFO sightings are real, and that pizza is a vegetable. We think that if those with opinions contrary to our own had the same knowledge we have, they would change their beliefs. Yet when they do not, we label them as “crazy” or “stupid”.

The people who hold these beliefs have the same biology as you and me. Whatever is at fault for their delusions is therefore at work in our minds as well. If this is the case, what makes us the exception whose beliefs are all true? Clearly, reasoning and understanding are much more of an enigma than we have come to presume.

Herein lies the key question: what do you believe that may not be true?

To tackle this imposing inquiry, we must first dive into the fallibility of something we believe we understand well and use constantly to observe the reality around us – our eyesight.

What You See May Not Be True

Optical illusions teach us about the limitations of our visual perception. Take a look at the classic “Adelson’s Checker Shadow Illusion” below.

[Image: Adelson’s checker shadow illusion]

Ask yourself: Which square is brighter, the one denoted by “A” or “B”? If you’re like me and the rest of the world, “B” looks brighter. Now look at the image below, where I blacked out every square except A and B.

[Image: the same illusion with every square except A and B blacked out]

Notice that the two squares have identical luminance. In the first image, our eyes take the environmental context into account, making the squares look different. What is most interesting is that even when you know they are the same colour, you can never “see” that they are the same in the original image. Awareness of the answer does not change your perception.
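
If you’re skeptical, you can verify this numerically rather than trust your eyes. Below is a minimal Python sketch (not from the original post) that samples the pixel values of the two squares; the filename and coordinates are placeholder assumptions you would adjust for your own copy of the image.

```python
# A minimal sketch for checking the illusion numerically.
# Assumptions: Pillow is installed, and "illusion_before.png" plus the
# coordinates below are hypothetical placeholders for your copy of the image.
from PIL import Image

img = Image.open("illusion_before.png").convert("L")  # "L" = 8-bit grayscale luminance

a_xy = (120, 150)  # approximate centre of square A (assumed coordinates)
b_xy = (170, 100)  # approximate centre of square B (assumed coordinates)

print("Square A luminance:", img.getpixel(a_xy))
print("Square B luminance:", img.getpixel(b_xy))
# In Adelson's image, both values come out identical,
# even though B "looks" brighter in context.
```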

It is important to note that this is a shared illusion, something we all experience because we all possess the same visual flaws.

Like your eyes, your mind is at the mercy of what it is fed and is shaped through the lenses of your perceptions. If your eyes are this fallible, what makes you think your thinking is any different?

In how we think, we all possess shared falsehoods, or shared illusions. The three we will touch upon are confirmation bias, the social value of beliefs, and the illusion of explanatory depth.

Illusion 1: Confirmation Bias

“The first principle is that you must not fool yourself – and you are the easiest person to fool.” // Richard Feynman

Within the brain, there are thousands of lower-level processes acting towards one goal – not truth or understanding, but survival and energy conservation. To achieve this, the brain uses heuristics to process its environment with less effort. Consequently, we land on a lot of wrong answers and stay there, not because they are correct but because it is physically easier; we hold incorrect beliefs out of laziness.

Let’s explore confirmation bias, a major heuristic, to see our laziness in action.

If you and your peers think President Trump is “crazy” or “delusional,” you may be right in your assumptions. You may feel that the actions he takes or the way he deceives the public is by no means acceptable. But stop to think about how you formed these opinions. Most likely you at some point watched a Democratic-leaning news station, read articles posted by members of your community who are anti-Trump, and discussed how idiotic he is with your friends in a social setting. Have you ever tried to dig deep into the other side of the argument? If he was elected and you think he is “crazy” or “stupid,” does that mean that all the people who voted for him – 63 million people – are also “crazy” or “stupid”? Doesn’t this sound awfully similar to the earlier point that the same delusions at work in their heads are at work in ours as well?

Confirmation bias causes us to rarely seek disconfirming evidence out of the desire to feel like we are right.

Why?

Firstly, being right means we don’t have to think in order to change our minds, so the brain can conserve more energy. Secondly, you experience genuine pleasure – a rush of dopamine – when processing information that supports your beliefs. This causes us to give special weight to information that lets us reach the conclusion we want to reach.

One study gathered a group of students, half pro-capital punishment and half against it. Each group was (unknowingly) given fabricated data that equally supported and questioned their beliefs. At the end of the experiment, the students were asked once again about their views. Those who had started out pro-capital punishment rated the data that favored their opinion as more credible and were now even more in favor of it; those who had opposed it were even more hostile. Each side chose to absorb only the data that confirmed its beliefs, despite the opposing evidence.

Similar to the checker shadow illusion, awareness of the data did not alter their perceptions.

The sheer scale of the internet allows anyone to find evidence, credible or not, to cherry-pick support for whatever opinion they want to hold. Confirmation bias plays an even greater role when we consider the social value of beliefs.

Illusion 2: Social Value of Beliefs

Think of beliefs like clothing. Clothes are both functional and social. Functionally, clothes keep us warm, help us carry things, and maintain privacy. But since they are visible to others, clothes also have social value. They allow us to identify with various groups, exhibit creativity, and signal our profession and social status.

Beliefs are also both functional and social. Functionally, beliefs help us navigate our daily lives – for example, the functional belief that your meeting at work starts at 10:00 am. But many of our beliefs are also social, in that others see and react to them – for example, the religion you associate with. Like our clothes, these social beliefs allow us to identify with groups, exhibit creativity, and signal our profession and social status.

A functional belief is not necessarily true, and a social belief is not necessarily false. Where they differ is in the incentives for holding each.

The incentive behind a functional belief is to be correct. If you thought a movie played at 8:00 pm and your spouse informs you it actually starts at 7:00 pm, you will be happy to be proven wrong because you want to know the correct movie time.

For social beliefs, the incentive is not to be correct but to be accepted. For example, there is more social clout in holding a political opinion than there is pragmatic value, since the party you attach yourself to alters how others perceive you. There is little incentive to prove that your opinion is true, and thus no reason to seek the truth – only to settle on a truth that is socially acceptable. This leads to the formation of cognitive tribalism, where your opinions are not unique and informed but shared and socially supported.

If confirmation bias causes us to seek information that confirms our past beliefs, and socially valued beliefs carry no incentive to be correct, it is safe to assume that you and I hold beliefs that are not true.

But which ones?

No longer is the chaos only external; it is internal as well. We will see this in action with our final illusion: the illusion of explanatory depth.

Illusion 3: Illusion of Explanatory Depth

“One of the great challenges in life is knowing enough to think you’re right but not enough to know you’re wrong.” // Neil deGrasse Tyson

Imagine you were asked to draw a picture of an airplane. To get its basic shape, you might draw an edgeless rectangle with a tail and a hoop for a wing. As the details are added – a cockpit, passenger windows, the engine – the picture of the plane becomes higher in resolution. In this exercise, you will still end up with a low-resolution drawing, yet anyone can comprehend that it’s an airplane. If it were built, you would in no way expect it to fly.

If you were asked to build a real model of an airplane, you would have to focus and concentrate on every single element of it. This takes a large amount of cognitive effort, and our brains are lazy.

The beliefs we hold are a lot like drawing an airplane. You must spend a tremendous amount of time and effort to build your model of reality into a high level of resolution. Since this is too much of a cognitive load, you default to one-bit answers to feel knowledgeable, because not knowing makes you feel like you’re in chaos.

Knowledge derived from one-bit answers leads to the illusion of explanatory depth.

Now, this illusion does not mean that your opinion is invalid. What it tells us is that you should let your opinions be more malleable, given how much you most likely do not know. As with drawing an airplane, unless you have dedicated your life to a certain topic, it is highly probable that there is still more to learn.

Avoiding the Paradox of Ignorance

The three illusions above lay the foundation for the paradox of ignorance: being ignorant means lacking knowledge, but the only way to gain knowledge is to be aware of your ignorance.

In writing this, I have come to realize how vulnerable we all are to our cognitive limitations. Though these illusions can never be removed, I can offer three takeaways for becoming more aware of them.

Firstly, to avoid confirmation bias and the negative effects of socially held beliefs, seek out the opinions contrary to your own. When Charles Darwin was conducting his lifelong research for “On the Origin of Species”, anytime something went against what he thought, he didn’t ignore it but instead paid extra attention to it. In doing so, he was able to develop one of the most important discoveries in science: evolution by natural selection. Take a page from Mr. Darwin and seek disconfirming evidence.

Secondly, look to restructure how you educate yourself. Our educational institutions have raised us to learn facts, not how to think. It is no wonder that our cognitive limitations have such a grip over our beliefs. One way to do this is to build critical thinking skills and construct your ideas of reality through the scientific method, a system designed to correct people’s natural inclinations and biases. This method will aid in turning your facts into knowledge, knowledge into understanding, and understanding into insight.

Finally, act only within your circle of competence and be more intellectually humble. Avoid repeating falsehoods for the sake of pretending to know. This will make you less afraid of being wrong, which will in turn make you more insightful.

The Island of Knowledge – Why Reality Will Remain a Mystery

Since the instruments through which we view the world are constantly evolving, ultimate truth is elusive. An excerpt from Marcelo Gleiser’s “The Island of Knowledge” states:

“As the Island of Knowledge grows, so do the shores of our ignorance—the boundary between the known and unknown. Learning more about the world doesn’t lead to a point closer to a final destination—whose existence is nothing but a hopeful assumption anyways—but to more questions and mysteries. The more we know, the more exposed we are to our ignorance, and the more we know to ask.”

The more we know, the more we realize we do not know.

There is so much complexity in nature. As we strive toward knowledge, we must understand that we are, and will remain, surrounded by mystery. We cannot pursue knowing everything; rather, we should pursue being less wrong about the things we do not know. In the face of our cognitive illusions, be a wave of curiosity in the sea of ignorance, suppress your ego, and explore the natural mystery of the world.
