In the history of mentalism, there is perhaps no greater act that has imprinted itself onto the soul of its performers than the oracle act. In the act, the oracle on stage answers thought-of questions from the audience about their futures. One moment a man is an illusionist, or a mind-reader, or a mentalist; the next, he's unveiling the great and hidden secrets of the universe right before your eyes. But it’s not just magicians who can see into the future.
In a very real way, we all can predict the future. We do it every day, from anticipating the trajectory of a baseball tossed in our general direction to what will happen if we go swimming in Florida (eaten by a shark) to what will happen if we forget to take the trash out again (eaten by a fiancée).
This idea of the brain as an expectation machine, using stories to create models of the world, is, to put it simply, the natural state of learning. As we live, we shrink the gap between what we expect to happen and what actually happens. And if we're doing it right, we're surprised less and less as the years go on by what we encounter out there in the scary world.
But we are incompetent oracles; our mental models of the world are limited by what most easily comes to mind, even though what most easily comes to mind isn't necessarily representative of the full range of possibilities for what will happen. This selective representation is the delusion I want to talk about today, a cognitive bias known as the availability heuristic.
Originally proposed in 1974 by Tversky and Kahneman, the availability heuristic suggests that “there are situations in which people assess…the probability of an event by the ease with which instances or occurrences can be brought to mind.”
More simply, we make predictions about what will happen based on our memories of what happened. Typically, this is a really effective way to go about modeling reality, but it can be skewed by how we remember things, what we pay attention to, and the stories we hear and tell.
Consider this. What’s scarier, getting into an Uber or getting onto an airplane? Sharks or mosquitos? Getting sick from a vaccine or from the virus?
In one of the original studies on the availability heuristic, Lichtenstein, Slovic, Fischhoff, Layman, and Combs (1978) found that people drastically overestimate the frequency of rare causes of death and drastically underestimate the frequency of common ones.
Where do we get these misunderstandings from? Is this the news' fault? My guess is yes, especially once we add the internet, social media, and memes into the equation. These are the places where we hear the stories that tell us what's going on in the world and what's important. But these story systems are built to overemphasize the surprising and unlikely and suppress the everyday and the normal. "If it bleeds, it leads." Our brains are also at fault—it's well established that we remember surprising things better than common ones. The movie Jaws and news reports of shark sightings are likely why we're far more afraid of sharks, which are responsible for around six deaths per year worldwide, than mosquitos, which are responsible for millions of deaths per year. Combs and Slovic (1979) back this idea up: comparing how deaths were reported in two newspapers with how those papers' readers estimated the probabilities of various causes of death in their communities, they found that the differences in reporting correlated strongly with readers' estimates.
Now, when I read about this stuff, I think: there have got to be reasons other than the availability heuristic for my rising heart rate when I'm swimming 100 meters offshore in the murky darkness of hazy ocean waters, or when I feel the shake of turbulence on an airplane. But the heuristic is still an apt description of how we look at the world. We live our lives in a constant battle between knowledge and emotion, and emotion almost always wins.
The availability heuristic teaches us to be a little less sure of ourselves and what we believe about the world. Is it actually all going to shit, like it feels like it is? Are we on an unstoppable collision course with doom and destruction? Should I forgo a vaccine because I heard that 1 person died after getting it? (Forgetting that 450,000 people died after not getting it?)
We would be wise to recognize that memory does not always serve as an accurate map for predicting the probabilities of what course the future will take. We are all incompetent oracles, and recognizing where our incompetence lies can allow us to spend more time worrying about how to swim safely in the stormy seas of life and less time worrying about sea monsters that might not exist.
Thanks for reading this week’s Delusion. Please consider subscribing if you’re not, or sharing with a friend. Agree? Disagree? Please leave a comment below.
Ep 3: Incompetent Oracles