A while ago, a commenter emailed me to ask if I could recommend any books on human cognitive bias, and now that I’ve finished Thinking, Fast and Slow by Daniel Kahneman, I can, with great enthusiasm. When we study flaws in human reasoning, we usually start with the glaring ones, only to find that they’re just the most obvious examples of a broader problem (and that the subtler errors are the more pernicious ones). In the book, Kahneman has a really interesting riff on the Müller-Lyer illusion.
All the lines are the same length, but the different orientations of the arrows trick you into thinking the middle one is longest. Kahneman writes:
Now that you have measured the lines, you — your System 2, the conscious being you call “I” — have a new belief: you know that the lines are equally long. But you still see the bottom line as longer… To resist the illusion, there is only one thing you can do: you must learn to mistrust your impressions of the length of lines when fins are attached to them. To implement that rule, you must be able to recognize the illusory pattern and recall what you know about it. If you can do this, you will never again be fooled by the Müller-Lyer illusion. But you will still see one line as longer than the other.
Learning that we err isn’t enough to fix our flaws. It’s a constant struggle against a part of our nature not to get fooled by our heuristics. And, as Kahneman points out, sometimes we’re never going to beat them; the best we can do is remember that we’re wrong in time not to act on them.
I have mild prosopagnosia (face-blindness), which means I have a lot of trouble recognizing people. My junior year of college, I kept confusing my roommate (GirlOne) with a different girl who held a leadership position in the debate group I was running (GirlTwo). This meant that, about once a week, I’d come back to the suite, or go from my room into the common room, and be convinced that GirlTwo was in my dorm — and since there was no reason she’d be there casually, this presumably meant the debate group was having some kind of political crisis, and I’d start feeling panicky.
It was never the case that GirlTwo was lying in wait for me in the common room — it was always just my roommate, GirlOne. I couldn’t stop making the visual error, but I got a lot better at remembering that my intuition was pretty much always wrong, so I felt less jumpy. I had to learn to stop privileging my flawed reactions and actively practice overriding my senses.
In the introduction to his sequence on quantum mechanics, Eliezer Yudkowsky highlights a different sphere where we have to strive against our intuitions. He writes:
I am not going to tell you that quantum mechanics is weird, bizarre, confusing, or alien. QM is counterintuitive, but that is a problem with your intuitions, not a problem with quantum mechanics. Quantum mechanics has been around for billions of years before the Sun coalesced from interstellar hydrogen. Quantum mechanics was here before you were, and if you have a problem with that, you are the one who needs to change. QM sure won’t. There are no surprising facts, only models that are surprised by facts; and if a model is surprised by the facts, it is no credit to that model…
In the coming sequence on quantum mechanics, I am going to consistently speak as if quantum mechanics is perfectly normal; and when human intuitions depart from quantum mechanics, I am going to make fun of the intuitions for being weird and unusual. This may seem odd, but the point is to swing your mind around to a native quantum point of view.
The trouble is that, in a lot of cases, it’s not as obvious that our intuitions are wrong as it is with the optical illusion, or my face-blindness, or quantum mechanics. The challenge is figuring out which intuitions need to be subverted, and how confident we need to be before we override them, because fighting intuitions can sound a lot like brainwashing. In my experience with my roommate, I was literally trying to unsee what my eyes were telling me I did see.
Heuristics and reflexes aren’t bad in themselves, so how do we decide when the errors don’t outweigh the convenience, when we want to subvert them in particular circumstances, or when we want to burn them out entirely? This kind of problem is going to come up in a more specific way in tomorrow’s post for the Patheos Book Club, so I’d be interested in your general principles (and intuitions, if you trust them!) today.