Effectiveness beats accuracy
We believe stuff because it benefits us to believe it, not necessarily because it is true. Phrased that way, it seems like an obvious point—of course evolution made us like that, what else could it have done? But this has surprising explanatory power.
Go talk with your roommate about who washes more of the dishes. Almost certainly, you’ll both think you do more of them than the other person thinks you do. Why? Probably because having a distorted view makes you a more convincing negotiator and gets you a better bargain in the future.
Say you and some friends would like to start a business in some line of work where everyone has lots of opportunities to steal. If all the employees happen to believe there’s an omnipotent power that rewards good behavior after death, and everyone is part of a close-knit community where betrayals would carry huge social costs, then you can all trust each other. Without the need to spend resources policing each other, you will outcompete all the other non-devout groups. (See: The diamond trade in Antwerp, long dominated by devout Orthodox Jews and now dominated by devout Indian Jains.)
Say that you look around and you sense that it would be beneficial to align yourself with a group of people who have certain political views. Now think about the game theory: Your vote has a tiny influence on what policies actually take place, but your stated views have a big influence on your social position and status. And the best way to look like you believe something is to genuinely believe it. You’ll soon find it easy to convince yourself that the group’s positions are correct and anyone who disagrees is an evil idiot, and you might feel it’s a betrayal to even listen to or try to understand the other side. (See: Basically the entire modern world? See also: Why understanding can be traitorous.)
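To make the incentive gap concrete, here’s a minimal back-of-the-envelope sketch. Every number in it is invented for illustration (the pivotal-vote probability, the policy value, and the social value are all assumptions, not figures from anywhere):

```python
# Toy expected-payoff comparison. All numbers are made up; the point is the
# shape of the comparison, not the exact values.

p_pivotal = 1e-7           # hypothetical chance your single vote decides the outcome
policy_value = 1_000_000   # hypothetical personal value to you if the better policy wins
social_value = 5_000       # hypothetical value of the status gained by professing group views

expected_policy_payoff = p_pivotal * policy_value  # paid only if your vote is pivotal
expected_social_payoff = social_value              # paid regardless of the election outcome

print(f"Expected payoff of your vote mattering:   {expected_policy_payoff:.2f}")
print(f"Payoff of sincerely professing the views: {expected_social_payoff:.2f}")
# Under these made-up numbers, adopting the group's views wholesale is the
# individually "effective" move, whether or not the views are true.
```

The exact figures don’t matter; as long as the social payoff is vastly more certain than the policy payoff, sincere belief is the individually effective strategy.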
Maybe right and wrong don’t “really” exist. But it’s tough out there and you’re worried that people will hurt you. So you loudly advertise that if anyone defects against you, you’ll go out of your way to punish them—even “irrationally” hurting yourself if that’s necessary to get revenge. To make this threat maximally credible, you adopt as core beliefs that right and wrong do exist and that defectors are wrong.
Even better, you and your neighbors agree to collectively punish defectors. You even punish people for failing to punish, by calling them cowards and denying them respect and status. (See: We all believe in right and wrong. See also: How humans lived in groups for thousands of years before we invented the leviathan, and a huge part of how we live in groups today.)
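Here’s a hedged sketch of that commitment logic as a tiny two-stage game. The payoffs V, C, and D are made up for illustration; the point is that a victim genuinely committed to revenge, even at a personal cost, deters the theft and so never actually has to carry the revenge out:

```python
# A toy deterrence game with made-up numbers: a would-be defector can steal V
# from you; afterwards you can retaliate, which costs you C and inflicts D on them.

V, C, D = 2, 1, 3

def victim_retaliates(wired_for_revenge: bool) -> bool:
    # A coolly "rational" victim never retaliates after the fact (revenge only
    # costs them C). A victim who truly believes defection is *wrong* always does.
    return wired_for_revenge

def defector_steals(wired_for_revenge: bool) -> bool:
    # The defector reasons backward: steal only if it still pays after the response.
    gain = V - (D if victim_retaliates(wired_for_revenge) else 0)
    return gain > 0

for wired in (False, True):
    stolen = defector_steals(wired)
    victim_payoff = -(V + (C if victim_retaliates(wired) else 0)) if stolen else 0
    print(f"victim committed to revenge={wired}: stolen={stolen}, victim payoff={victim_payoff}")
# The "irrational" commitment to punish never has to be carried out here, yet
# it is exactly what keeps the victim from being exploited.
```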
Are you in a social group where it’s beneficial for you to look down on someone? You’ll soon find it easy to furnish yourself with reasons why this person sucks. If it’s beneficial to admire someone, you’ll soon find all sorts of reasons why that person is amazing. (See: Teenagers.)
For most of us, what we do with our lives isn’t cosmically significant. But if you can delude yourself a little bit that your Important Projects are going to Change the World, this will probably make you better at what you do, and ultimately help you collect more grubby resources and status and so on. (See: Everyone is working on Important Projects.)
Most of us overestimate how good-looking we are and how much people like us and so on. (Except for the clinically depressed?) But if these delusions make us more confident and charismatic, then they’re probably beneficial. (See: “Our results show proof for a strikingly simple observation: that individuals perceive their own beauty to be greater than that expressed in the opinions of others (p < 0.001)”.)
At least in recent history, people on both sides of wars seem to believe they are fighting for the side of good. Obviously, they can’t both be right, and in a sense, two such parties fighting should be cause for them to sit down and work through Aumann’s agreement dynamics. But since the French Revolution, we’ve known that ideologically committed armies are vastly more effective, so everyone finds a way to believe. (See: Hahahahaha, you think soldiers in wars will follow Aumann’s agreement dynamics? No one has ever followed Aumann’s agreement dynamics.)
Successfully raising a baby takes a huge amount of resources. In principle, this might lead to a tragedy of the commons where each parent has an incentive to invest less than the other one, leading to an overall under-investment in the child. But if each parent could pre-commit to high investment, that’s better for both of them. For this reason, perhaps, people in couples are wired to mutually fixate on each other as uniquely amazing and flawless, meaning both parties can invest heavily.
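A minimal sketch of that under-investment logic, assuming an invented payoff function (the shared 3 * sqrt(total investment) benefit and the 0-4 investment levels are illustrative choices, not anything from the literature):

```python
import itertools
import math

# A toy parental-investment game with made-up numbers. Each parent picks an
# investment level; both share the benefit of the child's outcome, but each
# bears only their own cost: payoff_i = 3 * sqrt(x_i + x_j) - x_i.

LEVELS = range(5)  # possible investment levels: 0..4

def payoff(my_x, their_x):
    return 3 * math.sqrt(my_x + their_x) - my_x

def best_responses(their_x):
    best = max(payoff(x, their_x) for x in LEVELS)
    return {x for x in LEVELS if math.isclose(payoff(x, their_x), best)}

# Nash equilibria: each parent's investment is a best response to the other's.
nash = [(a, b) for a, b in itertools.product(LEVELS, repeat=2)
        if a in best_responses(b) and b in best_responses(a)]

# Joint optimum: the investment pair maximizing the parents' combined payoff.
joint = max(itertools.product(LEVELS, repeat=2),
            key=lambda pair: payoff(pair[0], pair[1]) + payoff(pair[1], pair[0]))

print("Self-interested equilibria:", nash)   # e.g. (1, 1): total investment 2
print("Jointly optimal commitment:", joint)  # (4, 4): total investment 8
```

Under these made-up numbers, self-interested best responses stall at a total investment of 2, while mutual pre-commitment supports a total of 8; the “my partner is uniquely amazing and flawless” belief is what makes that commitment credible.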
The point is this: Effectiveness often happens to align with truth, but that’s really sort of a coincidence. Any time there’s a conflict between the two, we evolved to throw truth out the window.
The term “cognitive biases” is arguably misleading in that it suggests that believing truth would be a kind of default. Arguably, it’s amazing that we manage to believe the truth at all. If you want your beliefs to be accurate, you’re constantly swimming against your own biology and instincts.
Broadly speaking, I think we can identify three reasons that people are wrong about stuff.
Sometimes there isn’t enough information available to be right. I don’t think any of us can fault Socrates for not knowing about wave-particle duality.
Sometimes information is available, but it’s hard to sort through the information and figure out what’s true. For example, I think this is why many people believe aspartame is bad for you.
Finally, there are cases like the ones we’ve examined here, where it’s straight-up beneficial to have a distorted view of the world.
We can deal with the first by gathering more information, and the second by curating that information. But the last one is a real challenge.
> Maybe right and wrong don’t “really” exist.
In my peer group (white-collar professionals), this is basically taken as a given nowadays.
What would it look like if right and wrong really _did_ exist, if 'right' were effectively 'traverse the steepest gradient of the valence manifold' (i.e., make yourself feel good over long periods of time), but both were more or less impossible to compute directly?
I'd think the result would be that groups which believed in right and wrong, of some sort, could effectively outcompete other groups in numerous situations. Nobody would agree on what precisely was right and wrong, lots of groups would claim to have 'the answer', each answer would lead to really weird consequences in some situations, and lots of people would swear up and down the whole thing was meaningless.
> there are cases like the ones we’ve examined here, where it’s straight-up beneficial to have a distorted view of the world.
This would only be true if the world were stable over long periods of time. If your environment doesn't really change, then the ideal worldview is essentially a map of the rewards available in that environment. But if you're in a constantly changing environment with unknown risks and rewards, whatever illusions you have that work well in one situation are going to eventually fail elsewhere.
So if you want to continue prospering even in a deeply unstable world, you really _do_ want your beliefs to line up with reality. And a good way to get there: continually seek out new experiences, set goals, fail at them, and then learn from the failures. If your experiences are broad enough to cover so many domains that no false heuristic reliably works, then the only thing left that works is the actual truth.
> Are you in a social group where it’s beneficial for you to look down on someone? You’ll soon find it easy to furnish yourself with reasons why this person sucks. If it’s beneficial to admire someone, you’ll soon find all sorts of reasons why that person is amazing. (See: Teenagers.)
This example works in two ways. Teenagers (or perhaps, more specifically, modern high school students) are famously concerned with popularity and reputation. Also, teenagers are a socially acceptable target for hostile generalizations; it can be socially beneficial to look down on them to show off one's superiority and maturity.