Effectiveness beats accuracy
We believe stuff because it benefits us to believe it, not necessarily because it is true. Phrased that way, it seems like an obvious point—of course evolution made us like that, what else could it have done? But this has surprising explanatory power.
Go talk with your roommate about who washes more of the dishes. Almost certainly, you’ll both think you do more of them than the other person thinks you do. Why? Probably because having a distorted view makes you a more convincing negotiator and gets you a better bargain in the future.
Say you and some friends would like to start a business in some line of work where everyone has lots of opportunities to steal. If all the employees happen to believe there’s an omnipotent power that rewards good behavior after you die, and everyone is part of a close-knit community where betrayals would have huge social costs, then you could all trust each other. Without the need to spend resources policing each other, you will outcompete all the other non-devout groups. (See: The diamond trade in Antwerp, long dominated by devout Orthodox Jews and now dominated by devout Indian Jains.)
Say that you look around and sense it would be beneficial to align yourself with a group of people who hold certain political views. Now think about the game theory: your vote has a tiny influence on what policies actually take place, but your stated views have a big influence on your social position and status. And the best way to look like you believe something is to genuinely believe it. You’ll soon find it easy to convince yourself that the group’s positions are correct, that anyone who disagrees is an evil idiot, and that it’s a betrayal to even listen to or try to understand the other side. (See: Basically the entire modern world? See also: Why understanding can be traitorous.)
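The asymmetry here can be made concrete with a back-of-the-envelope expected-value comparison. Every number below is invented purely for illustration; the point is only the orders of magnitude.

```python
# Toy expected-value comparison (all numbers hypothetical).

# Your vote: tiny chance of being pivotal, times the policy's value to you.
p_pivotal = 1e-7          # chance your single vote decides the outcome
policy_value = 10_000     # what the policy outcome is worth to you
vote_payoff = p_pivotal * policy_value  # 0.001

# Your stated views: near-certain effect on your standing in the group.
p_noticed = 0.9           # chance your peers register your stance
status_value = 500        # what fitting in is worth to you
signal_payoff = p_noticed * status_value  # 450

print(vote_payoff, signal_payoff)  # the social signal dwarfs the vote
```

With numbers anywhere in this vicinity, the incentive to hold the socially convenient belief outweighs the incentive to hold the accurate one by several orders of magnitude.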
Maybe right and wrong don’t “really” exist. But it’s tough out there and you’re worried that people will hurt you. So you loudly advertise that if anyone defects against you, you’ll go out of your way to punish them—even “irrationally” hurting yourself if that’s necessary to get revenge. To make this threat maximally credible, you adopt as core beliefs that right and wrong do exist and that defectors are wrong.
Even better, you and your neighbors agree to collectively punish defectors. You even punish people for failing to punish, by calling them cowards and denying them respect and status. (See: We all believe in right and wrong. See also: How humans lived in groups for thousands of years before we invented the leviathan, and a huge part of how we live in groups today.)
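The logic of collective punishment can be sketched with a few invented payoffs. The numbers below are purely illustrative; what matters is the two inequalities they produce.

```python
# A back-of-the-envelope sketch (all payoffs hypothetical) of why
# collective punishment sustains honesty, and why punishing the
# non-punishers keeps the enforcement itself going.

gain_from_defecting = 5   # one-off benefit of betraying the group
punishment_cost = 8       # what the assembled group inflicts on a defector
cost_of_punishing = 1     # effort each member spends on retaliation
cowardice_penalty = 3     # status lost by a member who fails to punish

# Once punishment is reliable, defection is a net loss.
net_defection = gain_from_defecting - punishment_cost  # 5 - 8 = -3

# And punishing stays individually rational, because shirking costs
# more status than retaliating costs effort.
net_shirking = cost_of_punishing - cowardice_penalty   # 1 - 3 = -2

print(net_defection, net_shirking)  # both negative: the norm is stable
```

As long as both quantities stay negative, nobody defects and nobody shirks the punishing, so the whole arrangement holds together without any central enforcer.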
Are you in a social group where it’s beneficial for you to look down on someone? You’ll soon find it easy to furnish yourself with reasons why this person sucks. If it’s beneficial to admire someone, you’ll soon find all sorts of reasons why that person is amazing. (See: Teenagers.)
For most of us, what we do with our lives isn’t cosmically significant. But if you can delude yourself a little bit that your Important Projects are going to Change the World, this will probably make you better at what you do, and ultimately help you collect more grubby resources and status and so on. (See: Everyone is working on Important Projects.)
Most of us overestimate how good-looking we are and how much people like us and so on. (Except for the clinically depressed?) But if these delusions make us more confident and charismatic, then they’re probably beneficial. (See: “Our results show proof for a strikingly simple observation: that individuals perceive their own beauty to be greater than that expressed in the opinions of others (p < 0.001)”.)
At least in recent history, people on both sides of wars seem to believe they are fighting for the side of good. Obviously, they can’t all be right, and in principle the mere fact that they disagree should be cause for them to sit down and work through Aumann’s agreement dynamics. But since the French Revolution, we’ve known that ideologically committed armies are vastly more effective, so everyone finds a way to believe. (See: Hahahahaha, you think soldiers in wars will follow Aumann’s agreement dynamics? No one has ever followed Aumann’s agreement dynamics.)
Successfully raising a baby takes a huge amount of resources. In principle, this might lead to a tragedy of the commons where each parent has an incentive to invest less than the other one, leading to an overall under-investment in the child. But if each parent could pre-commit to high investment, that’s better for both of them. For this reason, perhaps, people in couples are wired to mutually fixate on each other as uniquely amazing and flawless, meaning both parties can invest heavily.
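The parental under-investment problem has the shape of a prisoner’s dilemma, which a tiny payoff table makes explicit. All the numbers here are invented for illustration: each parent picks high or low investment, both share the child’s outcome, but each pays only their own cost.

```python
# A toy parental-investment game (all payoffs hypothetical).

COST = {"high": 4, "low": 1}
CHILD = {2: 8, 1: 6, 0: 4}  # child's benefit by number of high investors

def payoff(mine, theirs):
    """One parent's payoff: shared child benefit minus own cost."""
    n_high = [mine, theirs].count("high")
    return CHILD[n_high] - COST[mine]

# Whatever the other parent does, investing low pays more...
assert payoff("low", "high") > payoff("high", "high")   # 5 > 4
assert payoff("low", "low") > payoff("high", "low")     # 3 > 2

# ...yet mutual high investment beats the low/low equilibrium, which is
# why a credible pre-commitment (mutual romantic fixation) helps both.
assert payoff("high", "high") > payoff("low", "low")    # 4 > 3
```

Low investment strictly dominates for each parent individually, so rational self-interest lands the couple at the worst joint outcome; a pre-commitment device that takes defection off the table moves them to the better one.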
The point is this: Effectiveness often happens to align with truth, but that’s really sort of a coincidence. Any time there’s a conflict between the two, we evolved to throw truth out the window.
The term “cognitive biases” is arguably misleading in that it suggests that believing truth would be a kind of default. Arguably, it’s amazing that we manage to believe the truth at all. If you want your beliefs to be accurate, you’re constantly swimming against your own biology and instincts.
Broadly speaking, I think we can identify three reasons that people are wrong about stuff.
Sometimes there isn’t enough information available to be right. I don’t think any of us can fault Socrates for not knowing about wave-particle duality.
Sometimes information is available, but it’s hard to sort through the information and figure out what’s true. For example, I think this is why many people believe aspartame is bad for you.
Finally, there are cases like the ones we’ve examined here, where it’s straight-up beneficial to have a distorted view of the world.
We can deal with the first by gathering more information, and the second by curating that information. But the last one is a real challenge.