What do you know?: Cognitive Biases and the Nature of Reality

By Patrick Metzger

We feel as though we understand the world around us, but what do we really know to be true? Our own brains often get in the way of seeing the world clearly, but we can work to make ourselves better critical thinkers if we’re open to changing our minds.

Born Into a Strange World

As best we can understand it, we are born into the world perceiving a whirlwind of raw, undifferentiated reality.

Babies younger than 2 months don’t make clear distinctions between objects that are not moving. And because objects don’t exist for them, there is no real sense of cause and effect—everything causes everything; what is there to affect something else if it’s all one thing?

Our experience as babies essentially involves a field of color and sound that comes at us at various speeds with pleasant or unpleasant sensations—sometimes accompanied by our parent’s face, which we recognize before other distinct objects in the world. Babies, in fact, don’t make much distinction between themselves and the world around them until around 6 months, and don’t develop fully fledged self-consciousness until around 2 years. Philippe Rochat suggests an intriguing five-stage process for the development of self-awareness out of Confusion (Level 0), with infants moving from Differentiation to Situation to Identification to Permanence and finally to Self-consciousness (or “meta” self-awareness). So for infants in the early stages of this development, the swirling, whirling color field that surrounds them in many ways is them.

Gradually, as our parents point to things and reinforce the reward centers of our brain with delicious milk, it becomes clear that identifying things as separate objects is incredibly useful. Pretending that things are separate gives you wizard powers. You use your hands to say milk in sign language, and voila! Milk appears and your problem of hunger is temporarily solved. You say “da” or “ma” and your caretakers light up with joy and swaddle you with love and the entire universe is bliss.

As we grow, we internalize more and more complex distinctions between things. We are given heuristics that help us predict the behavior of the world around us. This gives us a sense of security. If you jump out of a swing, you’ll fall back down to earth because of a thing called gravity. Sad is a feeling you have inside you. Pavement is a thing outside you that scrapes your knee. The sun will rise again tomorrow morning in the east. Oh, and by the way—when we said “rise,” we meant that the earth is round, so it turns and the mountains gradually get out of the way of the sun, which is 93 million miles away, and we call that “morning.”

Our Brains Try to Make Sense of Chaos

These lessons go on for a while, and eventually we’re internalizing mathematical and social concepts and developing sophisticated notions about a world outside of us and a world inside of us. The sense that certain things are “important” develops over time, and the “important things” are highly relative. Different individuals, families, and cultures prioritize different important things, and, by the time we grow up, most people—as “fish in the water” of our own culture—do not question the assumptions underlying our conception of reality.

Biases arise when we look around us and assume that our interpretation of the world based on past experience is an exact match with how the world really is in a given moment. It’s a mode of thinking that Edmund Husserl called the “natural attitude,” and it is very often successful. Our intuition that a red stove coil will be hot before we touch it really helps us to not get burned. But certain other ways of thinking are substantially less connected with what’s really going on in the world. Did you know, for instance, that when we talk to someone while drinking warm coffee instead of iced, we are more likely to perceive them as generous and caring? Or take this video, for instance:

[Embedded video: a demonstration of selective perception]

It seems silly to think of ourselves as capable of making such foolish mistakes, but we all make these and other comparable miscalculations many times within a single day. The “warm coffee / warm person” effect is an artifact of embodied cognition, and the video above is an example of selective perception.

Psychologists have been systematically researching and cataloguing these brain miscalculations, known as cognitive biases, since the work of Daniel Kahneman and Amos Tversky in the 1970s. In a way, though, Aristotle and other Greek philosophers were discovering similar things when, starting around the fourth century B.C., they identified lists of logical fallacies that arise when performing formal logic. For instance, “Post hoc ergo propter hoc” was an important discovery decoupling correlation and causation: Just because thing B happened after thing A doesn’t mean that A caused B. But cognitive biases are fallacies on a more embedded level of consciousness. We often don’t know they’re happening and sometimes can’t see past them even when we’re confronted with them in ourselves.

We all fall prey to cognitive biases. There’s no amount of training or unlearning that can totally wash them away from our brains. After all, as discussed above, it’s often very useful to see the world in simplified terms rather than trying to tackle the impossible problem of analyzing every moment down to the level of quantum details like wave-particle duality. But it’s worth becoming more aware of our tendency to misjudge the world, so that we can identify biases in ourselves and others and be open to changing our minds about things that might feel obviously true due to a false conception of reality.

There are innumerable cognitive biases—not least because we’re still discovering new ones. When you look at a visual map categorizing many of them, the terrain of our minds shows itself to be more vast than we had ever previously imagined. A discussion of all the known cognitive biases could fill its own compendium, but looking at two rough categories of cognitive bias should suffice to illustrate how fundamentally they influence our lives.

1. We think we’re right and that we have all the information we need

Many people have heard of confirmation bias. The basic idea is that we tend to seek out information that confirms our pre-existing beliefs.

For instance, a while back somebody on Reddit proposed a hypothesis:

I swear, every other guy I see at EDM festivals is a jacked, tattoo-d, model-looking motherfucker.

The poster asked if anyone else had noticed this phenomenon and immediately got a wave of responses from bodybuilders who attend EDM festivals, seeming to confirm his hypothesis. But, of course, there are always going to be some people who are bodybuilders at an EDM festival—so asking the question in this way just means he’s going to draw people out who agree that there are lots of buff guys at these festivals. This confirms what he already believes, but doesn’t give him any new data about the actual statistics of festival attendance.

Confirmation bias was first demonstrated by Peter Wason in the 1960s, in a series of experiments in which subjects were presented with a sequence of numbers (e.g. 2, 4, 6) and asked to guess what rule those numbers followed. They were allowed to learn new information about the rule by proposing a new sequence of three numbers and asking the experimenter whether that sequence also followed the rule. Most people, when faced with this challenge, went into a long series of guesses involving even numbers and never searched any further, thinking they had discovered the rule. But the rule was actually simpler than that: any ascending sequence fits. The participants could have guessed “1, 2, 3” or “25, 89, 497” and they would have gotten a “Yes” from the experimenter. But they didn’t. They only guessed even numbers. What they should have done was try to disprove their hypothesis: What sequence of numbers doesn’t fit the rule? That gives you much more information about the rule.
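
To make the asymmetry concrete, here’s a minimal sketch of the task in Python (the `hidden_rule` and `my_hypothesis` functions are our own illustrative stand-ins, not Wason’s actual materials). Confirming tests always come back “yes” and teach us nothing, while a single test that the hypothesis says should fail is enough to falsify it:

```python
# A toy version of Wason's 2-4-6 task. The experimenter's hidden rule is
# simply "strictly increasing" -- far broader than what most people guess.

def hidden_rule(a, b, c):
    """The experimenter's actual rule: any strictly ascending triple."""
    return a < b < c

def my_hypothesis(a, b, c):
    """A typical participant's guess after seeing (2, 4, 6):
    even numbers, each 2 more than the last."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirmation-seeking: test only triples the hypothesis already predicts
# will pass. Every answer is "yes", so we learn nothing new.
for triple in [(2, 4, 6), (8, 10, 12), (100, 102, 104)]:
    print(triple, "->", hidden_rule(*triple))  # True every time

# Disconfirmation-seeking: test triples the hypothesis predicts will FAIL.
# A "yes" on any of these proves the hypothesis wrong.
for triple in [(1, 2, 3), (25, 89, 497), (3, 17, 200)]:
    if hidden_rule(*triple) and not my_hypothesis(*triple):
        print(triple, "passes the rule -- the hypothesis is falsified")
```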

Connected with confirmation bias is the availability heuristic: we use the examples that come to mind first to try to figure out what is true. This leads us to think that our first idea is the right one, which tends to bias us towards things we’ve learned more recently or towards ideas that have been dug in since earlier in our lives—more entrenched beliefs.

The availability heuristic is related to the frequency illusion, or the Baader-Meinhof phenomenon, in which a word, number, or idea that we’ve learned about recently suddenly seems to appear everywhere around us. If you’ve ever seen the movie The Number 23, you’ve watched this play out: Jim Carrey’s character is plagued by the number 23. He sees it all around him. What’s actually happening to his character (and to anyone who feels haunted by a newly learned concept) is that we’re surrounded all the time by billions and billions of data points, most of which we deprioritize and filter out. Then we learn a new pattern and, with our brains geared up for pattern recognition, we suddenly notice it everywhere it was all along. This can further amplify the feeling that we’re right about an idea, because it feels like everyone is coming out of the woodwork to talk about it, digging it deeper into our brains as “true.”
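
A toy simulation makes the mechanism concrete (the stream of numbers here is made up purely for illustration): the base rate of the pattern never changes; only our attention filter does.

```python
import random

random.seed(23)

# A stand-in for the flood of data points we move through every day:
# 10,000 random numbers between 1 and 100.
stream = [random.randint(1, 100) for _ in range(10_000)]

# The number 23 appears at a steady base rate the whole time.
print(f"base rate of 23: {stream.count(23) / len(stream):.1%}")  # roughly 1%

def noticed(values, primed_for=None):
    """Unprimed, our filter drops everything; once primed, every match pops out."""
    if primed_for is None:
        return []
    return [v for v in values if v == primed_for]

print(len(noticed(stream)))                 # 0 -- the 23s were always there, unnoticed
print(len(noticed(stream, primed_for=23)))  # ~100 -- now they seem to be everywhere
```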

[Image: Wilhelm von Osten and Clever Hans]

Scientists are not immune to a kind of confirmation bias. One famous example is Wilhelm von Osten—a mathematics teacher and phrenologist who thought he had taught a horse to do math. The horse, “Clever Hans,” was asked complex math problems by individual members of an audience, and Hans would stomp out answers with his hooves. It wasn’t until psychologist Oskar Pfungst ran a series of experiments that the trick was uncovered: Clever Hans was actually responding to subtle nonverbal cues from questioners who already knew the answer. The questioners would lean over to look at the answers on the ground and ever-so-slightly tilt their heads or eyebrows upwards when Hans reached the right number of stomps. Von Osten was so swayed by the belief that he had taught a horse to do math that he wouldn’t allow himself to see any other explanation. But in reality, the knowledge of the questioners was influencing Clever Hans’ behavior.

This is called the observer-expectancy effect—where the hope or expectation of a certain result helps to produce that result, even in a controlled experiment. This can impact scientific conclusions, especially in the social sciences where correlations abound and causations are harder to come by.

All of these examples tie into this first broad category of biases that make us tend to think that we are infallible smartypants. The lesson here is that it’s good to seek out disconfirming evidence—things that go against our assumptions or that are outside of our everyday experience. Looking for different perspectives is only going to make us better critical thinkers and more informed citizens.

2. We think the world should be predictable

Change is scary. In the extreme, sudden change could be a plague of locusts that wipes out all of a farmer’s crops, or a hurricane that decimates an entire island. In our everyday environments, it could be delays on our commuter train line, or our favorite coffee shop closing, or a new boring paperwork process at work. Regardless of the scale, we are understandably wary of change. Not only that, but we’ll often fight to keep things the way they are, even if that’s not in our best interest.

A happenstance “real-life experiment” provides us with an example: In the early 1990s, New Jersey and Pennsylvania separately offered citizens a choice regarding car insurance. In both states, people were offered two options—one that was more expensive but preserved the full right to sue another driver who was at fault, and one that was cheaper but limited that right to sue. In both states, there was a default option, so you had to actively decide to switch if you wanted the other kind of car insurance. The default in New Jersey was the affordable option. The default in Pennsylvania gave the full right to sue. Can you guess what people chose? In both states, they decided not to decide: over 70% of citizens stuck with whatever default they were given. The only difference was that Pennsylvanians cumulatively spent $200 million more on car insurance.

In cognitive psychology this is called status quo bias. In political science it’s called system justification. In both cases it has to do with the fact that it takes a lot of effort to change things, and as a species we’re often trying to minimize our effort in the world. It’s easier to follow the well-trodden path than to create a new one. There’s also a certain amount of risk associated with being a trailblazer. We tend to say, “It works well enough now. Don’t rock the boat. It’s dangerous to try something new.” We often refuse to plan for events that have never happened before—a phenomenon called normalcy bias. This, in turn, is related to our desire to believe that those events can’t happen in the first place—the ostrich effect.

However, anyone in the technology world knows that disruption, innovation, and risk-taking are all traits that are applauded and associated with success. This is the strategy that startups and entrepreneurs use on a regular basis: they take calculated risks and intentionally try to disrupt the status quo. When done right, this can be refreshing to people who have frustrations with the way things are. We can imagine that certain other political, financial, and media systems—systems that support racial or gender discrimination, for instance—really should be disrupted too. Practices from meditation to contingency planning can help us stay more open to change, because change is always coming whether we like it or not.

Do we know anything at all?

It’s rational to ask ourselves at this point, “Given all the mistakes my mind makes, do I really know anything for certain at all?” Certainly our brains are imperfect. Our memories are fuzzy. Our perceptions are prone to trickery from illusions, optical and otherwise. Our past experiences color our every interaction. But by seeking out other perspectives, comparing our experiences to those of other people, and generally getting outside of our own small world, we can slowly get a more accurate picture of the reality in which we find ourselves.

At the end of the day, whether there’s such a thing as “ground truth” or any kind of objectively “real world” is a question of a certain sort of faith. We could very easily be living in a Matrix-style simulation, but there will never be satisfying proof of this until the stars spontaneously rearrange into a message from our programmers. While it’s fun to consider these far-out possibilities, we have little to no evidence for them and they don’t change our predictions about the world we live in. And as long as we continue to be able to make predictions about the world, science is the best tool we have for trying to wrap our heads around the whole shebang.

Outside-the-box thinking needs to happen at every scale of society. We need to change our own individual behavior, but we especially need to demand rigorous research methods and sound science when making decisions about our health and safety on national and global scales. Validating hypotheses by gathering evidence, getting a group of other scientists to review our methods for any flaws, then getting yet more scientists to replicate those results—this process helps us to minimize the bias that any single mind brings to the table.

Science is a slow process, and it doesn’t always happen so neatly as we would like. But without it, we are surely blind.

Ready to change your mind? I highly recommend the work of Julia Galef. You can start by listening to her interview with Ezra Klein: How to argue better and change your mind more.

3 thoughts on “What do you know?: Cognitive Biases and the Nature of Reality”

  1. This is a fantastic overview of how our brains develop into dualistic and confirmation-biased tools, which are imperfect. Science is certainly one way to resolve some of these issues, in terms of our shared beliefs about a consensual reality, but as we know, our expectations of the way things are influence what those things appear to be at an empirical level. I am sure you know about the experiment where they were investigating whether light was a wave or a particle. They found that the evidence differed based on the expectations of the observer. This seems to demonstrate that our beliefs actually create the world at a fundamental physical level. What are your thoughts about that?

    1. Great question! Again, these considerations are at the boundary of science, which brings us close to matters of faith. I will say that the observer effect you’re describing (often discussed alongside Heisenberg’s Uncertainty Principle) was, itself, discovered by science. So there’s no need to throw out science in the further pursuit of this question (Do our thoughts influence reality?). It does appear that we influence our reality with every interaction, even simply by making observations (i.e. reflecting photons). Whether our thoughts influence reality at a subatomic level is, I think, an open question.

      But let’s think about some of the weird science happening here while we’re at it: Our brains themselves are made up of quarks and electrons that are subject to the uncertainty principle. Each particle in our brains is also capable of quantum entanglement. In 2017, scientists used a satellite to distribute entangled photons to ground stations more than 700 miles apart, so that measurements on one photon were correlated with measurements on the other.

      With all these factors stacked one upon the other, who knows what might be possible? At the very least, there’s clear evidence that our mindset impacts how we see the world and stress can impact health outcomes. So our thoughts are incredibly powerful, regardless!

      1. Thanks for these additions, specific terms and links that expand the previous thoughts you shared. I certainly do not discard science, and I agree that it is one way we have been presented with many kinds of enlightening information, like Heisenberg’s Uncertainty Principle. I am just wondering if I am completely limited, while either truly being in this world and body or just believing I am, from an ability to have a sustained awareness of total oneness. I have experienced instants of that, and a couple of times in my life a week or so of that knowing. I want that! Maybe I am shooting ahead, and not engaging in what is right now, by wanting what is not yet or was before, however briefly?
