At the end of my Modern World History course, we get very close to the present. Among other things, we cover the rise of computers and the internet, which leads to a discussion of filter bubbles and impediments to understanding.
As I’ve covered this material over the past couple of semesters, I’ve become increasingly interested in incorporating more critical-thinking exercises into my course. The discipline of History bills itself as one of the humanities that foster skills like close reading and critical analysis. While this is true, that result doesn’t happen automatically: to build these skills, students need to practice them. So I’m going to spend some of my time this summer building more opportunities for practice into my course.
I’m also going to build a sequence of readings (which I’ll also produce as videos) about the barriers to clear thinking. I’ve been listening to and reading Daniel Kahneman’s 2011 book, Thinking, Fast and Slow, so I’m going to distill some of Kahneman’s findings and make them available to my students.
Daniel Kahneman (1937-) is an emeritus professor of psychology at Princeton University. He won the 2002 Nobel Memorial Prize in Economic Sciences for Prospect Theory, which he developed in the 1970s with Amos Tversky (who died in 1996). In Prospect Theory, Kahneman and Tversky argued that their empirical evidence contradicted the assumption, made in nearly all economic theory, that humans make rational choices (the so-called Homo economicus).
One of the early episodes Kahneman describes in the book is the famous “invisible gorilla” study, which shows that not only can we suffer from inattentional blindness, but we aren’t even aware of it.
Kahneman describes two “systems” in the human mind, the automatic, instinctual “System 1” and the thoughtful, deliberate “System 2”. The second system, he says, is our conscious identity and thinks it’s in charge. But it’s not.
He points out that we live in a world of limited information, so we’re constantly creating “a coherent story” that includes causality when correlation is all we saw (or maybe not even that). And he suggests that it’s easier to create a plausible narrative the less we actually know. Kahneman says we have an “almost unlimited ability to ignore our ignorance”.
He goes on to say that the confidence people have in their intuitions is not a guide to their validity. It is easier to have intuitions that reflect reality in some circumstances than in others: if the environment is regular enough to be predictable, and if the person has had plenty of opportunity to build up experience, their intuitions are likely to be valid. In chaotic, unpredictable environments, not so much. That’s why a chess master can intuitively “see” a series of moves and predict the number of moves to a checkmate (sometimes at a glance), while a stock-picker can never reliably make an accurate prediction, though he feels he can.
Kahneman says “we can be blind to the obvious, and we are also blind to our blindness.” This fits with something Nassim Nicholas Taleb says about how, as a species, we are optimized for survival, not for self-awareness. It is also vaguely reminiscent of the theme of Peter Watts’s science fiction novel Blindsight.
Kahneman lists and describes a number of heuristics throughout the book. The term, derived from the ancient Greek heuriskō (“I find, discover”), refers to a process that helps people make rapid decisions or judgments. A positive description might call them “rules of thumb”, while a negative one would label them “biases”. Many of the heuristics psychologists such as Kahneman and Tversky have studied seem to provide an evolutionary advantage: we begin running before our conscious mind has identified the tiger preparing to pounce. However, since our world has changed significantly, in a very short period (from an evolutionary perspective), from the one we evolved to survive in, these shortcuts of thought may no longer be optimal. Many still provide us with “quick and dirty” approximations, but we can often improve on the automatic responses of “System 1” (to use Kahneman’s description) by applying the deliberate, rational thought processes of “System 2”.
Some of the main heuristics and biases Kahneman describes are listed below (I will choose fifteen of them to discuss with my students in the fall):
- Affect Heuristic
- Associative Coherence
- Availability Heuristic
- Certainty Effect
- Confirmation Bias
- Correlation-Causation Confusion
- Endowment Effect
- Halo Effect
- Hindsight Bias
- Ideomotor Effect
- Intensity Matching
- Law of Small Numbers
- Mere Exposure Effect
- Narrative Fallacy
- Probability-Randomness Misunderstandings
- Regression to the Mean
- Theory-Induced Blindness
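A few of these (the Law of Small Numbers, probability-randomness misunderstandings) lend themselves to quick classroom demonstrations. As a minimal sketch (the function name, sample sizes, and threshold here are my own choices for illustration, not anything from Kahneman’s book), a few lines of Python show why small samples produce “extreme” results far more often than large ones:

```python
import random

random.seed(42)  # fixed seed so the demonstration is repeatable

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of trials in which a fair coin yields at least
    `threshold` heads in a sample of `sample_size` flips."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            extreme += 1
    return extreme / trials

small = extreme_rate(10)     # small samples: "70% heads" is common
large = extreme_rate(1000)   # large samples: "70% heads" essentially never happens

print(f"P(>=70% heads) with n=10:   {small:.3f}")
print(f"P(>=70% heads) with n=1000: {large:.4f}")
```

With a fair coin, getting at least 70% heads happens roughly 17% of the time when n=10, but essentially never when n=1000. Small samples make extreme outcomes routine, which is exactly the intuition the Law of Small Numbers describes us as lacking: we treat a striking result from a tiny sample as if it were as trustworthy as one from a large sample.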
I’ll also want to talk (following Taleb) about the relationship between (low) probability and (high) consequence (the Black Swan).
Also available as a video: https://youtu.be/qgvbwAFT46c