Confirmation Bias

As I begin a series on heuristics and biases, it’s probably useful to mention that these two terms are often misunderstood or used loosely. Especially the term bias. When we hear the word we tend to think of “biased people”, folks with an ax to grind against some minority group. In fact, the English word bias arrived in the sixteenth century, borrowed from an Old French and Provençal word dating back to the 13th century that originally referred to an oblique line. So a more complete understanding of bias in thinking might be a tendency toward one direction of thought, like a tilt in a lawn-bowling green that shifts the ball toward one side (the term was first used this way in the game of bowls in 1560). More recently, statistics has defined bias as “the difference between the expectation of a sample estimator and the true population value, which reduces the representativeness of the estimator by systematically distorting it” (Wiktionary).
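For the statistically inclined, that definition has a standard textbook form (my shorthand, not part of the Wiktionary quote): the bias of an estimator $\hat{\theta}$ of a true value $\theta$ is the gap between what the estimator gives on average and the truth,

$$\operatorname{bias}(\hat{\theta}) = \mathbb{E}[\hat{\theta}] - \theta$$

An unbiased estimator averages out to the true value; a biased one systematically lands to one side of it, like a ball on a tilted green.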

In each case, the issue is a deviation from “truth”, or at least from an expected path. The ideal bowling green is flat, so a tilt will deflect balls from their “true” course. It’s worth noting that this “truth” is a desired ideal rather than a measured reality. The same distinction probably applies to the “biased people” sense I mentioned earlier: we have an ideal in mind of how we’d like to see people behave toward each other. Something that tilts that behavior, a bias, is unwelcome, often even when it operates in a positive direction, because it breaks an expected symmetry. A subtle bias can be even more of a problem than a blatant ax to grind against certain people, because it’s harder to see and adjust for.

However, the unevenness of the bowling green, if it is the same for everyone, might be something we can work with, once we become aware of it. That’s part of the point of this series. Another part is that biases and heuristics aren’t inherently bad. We don’t have them because we’re evil. We have them because they speed up our thought processes, and we evolved in a dangerous world where there was a survival advantage in thinking quickly, even if that sometimes meant making mistakes. The consequences of the two sides of a single decision are often very different. If we jump and run, and it turns out there was no tiger in the tall grass, we can laugh at ourselves (or at worst be laughed at by others). If we slow down to verify that it’s really a tiger, that verification could come in the form of being eaten.

If we look at general biases of thought, one of the most discussed today is confirmation bias. This is our tendency to search for, interpret, remember, and believe information that lines up with our existing knowledge and beliefs. Again, it’s easy to imagine the evolutionary value of fitting new information into our already-formed picture of the world rather than breaking and rebuilding the whole paradigm every time we see something new. On the other hand, this can make us overconfident in our worldview if we deliberately seek confirming evidence and avoid difficult, disconfirming “exceptions to the rule”. Inductive reasoning, which builds knowledge from observation, has a dangerous blind spot if we ignore, or simply can’t see, the data that doesn’t “fit” the pattern we believe we’ve found.

In Thinking, Fast and Slow, Kahneman connects confirmation bias with memory. Associative memory is a process that searches for links between two unrelated ideas (or between a new idea and existing ones) to help us make sense of new information and integrate it into our knowledge. He observes that this process is already biased, and says this problem with “believing and unbelieving” goes all the way back to the 17th-century philosopher Spinoza. If we are asked, “Is Sam friendly?”, Kahneman says, associative memory will immediately provide us with remembered instances of Sam being friendly. If we are asked, “Is Sam unfriendly?”, we’ll tend to remember instances that confirm that description. This may seem like a trivial issue, but it has been linked to “belief perseverance”, where we tend to hang onto our beliefs even after they have been proven false, and the “primacy effect”, where we tend to believe and rely more heavily on information we obtain earlier in a process rather than later.

In today’s world, confirmation bias plays a big part in the creation of filter bubbles in social media. And in this case, the technology reinforces the most problematic part of the bias. We not only more readily accept information that already “fits” our worldview, but social media and even search algorithms have been programmed to record and catalog that worldview, and then provide us with information that conforms to it and reinforces our beliefs. The algorithms aren’t doing this to be evil; they’re doing it to maximize the time people spend looking at content, because that’s how their owners get paid. And there are billions of dollars in play.
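To make that mechanism concrete, here’s a deliberately toy sketch (invented data and names; no real platform’s ranking system is anywhere near this simple) of a feed ranker that scores posts by how well they match what a user has already clicked:

```python
# Toy sketch of an engagement-driven feed ranker (illustrative only).

from collections import Counter

def rank_feed(posts, click_history):
    """Rank posts so the ones most similar to past clicks come first."""
    # "Catalog the worldview": count how often each topic was clicked.
    topic_weights = Counter(
        topic for post in click_history for topic in post["topics"]
    )

    def engagement_score(post):
        # A post scores higher the more it matches topics the user
        # already engages with -- confirmation, amplified.
        return sum(topic_weights[topic] for topic in post["topics"])

    return sorted(posts, key=engagement_score, reverse=True)

# Example: a user who has only ever clicked posts about cats.
history = [{"topics": ["cats"]}, {"topics": ["cats", "memes"]}]
feed = [{"id": 1, "topics": ["politics"]},
        {"id": 2, "topics": ["cats"]},
        {"id": 3, "topics": ["cats", "memes"]}]
print([p["id"] for p in rank_feed(feed, history)])  # -> [3, 2, 1]
```

Every new click strengthens the topic weights, so the feed narrows over time. The bubble isn’t a design goal; it falls out of optimizing for engagement.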

Since psychology suggests that confirmation bias is already strongest “for desired outcomes” regarding “emotionally charged issues”, this is especially troubling in politically oriented social media. Over the past four years especially, a frequent refrain in the media has been, “You’re entitled to your own interpretation, but not to your own facts.” This has typically been the response of pundits who considered themselves to have a leg up on the ignorant mobs they perceived their opponents to be. Increasingly, though, each opponent draws on a different set of facts, cherry-picked with the help of algorithmically enhanced confirmation bias, resulting in each side believing the other is either completely detached from “reality”, or evil, or both.

The ancient Greek historian Thucydides (c. 460-395 BCE) wrote that “it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.” Our easy acceptance of confirming information, and our resistance to disconfirming counter-evidence, may be the best argument for freedom of speech and an open forum for debate. If we’re all hanging on to our own beliefs, even a little, then we might benefit a whole lot from talking to other folks who are hanging onto their own, different worldviews.

There’s also a video version of this post at https://youtu.be/NpjKsbRT07o
