Just on the beauty point, I find Berkson's paradox especially interesting in the phenomenon of models etc. thinking they're not beautiful enough. It applies elsewhere too.
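Berkson's paradox is easy to simulate. In this toy model (the trait names, scores, and cutoff are all invented for illustration), "talent" and "beauty" are independent across the population, but only people whose combined score clears a bar get selected; within the selected group, the traits come out negatively correlated:

```python
import random

random.seed(0)
# independent, uniformly distributed "talent" and "beauty" scores (toy model)
pop = [(random.random(), random.random()) for _ in range(100_000)]
# selection effect: only high combined scores make the cut
selected = [(t, b) for t, b in pop if t + b > 1.2]

def corr(xs, ys):
    """Pearson correlation, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

print(corr(*zip(*pop)))       # near zero: the traits really are independent
print(corr(*zip(*selected)))  # negative: conditioning on selection induces correlation
```

Among the selected, someone low on one trait must be high on the other to have made the cut, which is exactly the mechanism behind "everyone here is less beautiful than they are talented, or vice versa."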


The default human response to challenge is to defend or justify. It is interesting that we have fields that do not exhibit this behavior - math and science. Except, of course, that people working in math and science DO exhibit this behavior; it is just somewhat mediated.


> Are you in a social group where it’s beneficial for you to look down on someone? You’ll soon find it easy to furnish yourself with reasons why this person sucks. If it’s beneficial to admire someone, you’ll soon find all sorts of reasons why that person is amazing. (See: Teenagers.)

This example works in two ways. Teenagers (or perhaps, more specifically, modern high school students) are famously concerned with popularity and reputation. Also, teenagers are a socially acceptable target for hostile generalizations; it can be socially beneficial to look down on them to show off one's superiority and maturity.

Sep 8, 2022·edited Sep 8, 2022

> Maybe right and wrong don’t “really” exist.

In my peer group - white collar professionals - this is basically taken as a given nowadays.

What would it look like if right and wrong really _did_ exist, 'right' was effectively 'traverse the steepest gradient of the valence manifold' (i.e. make yourself feel good over long periods of time), but these were more or less impossible to compute directly?
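Taken literally, "traverse the steepest gradient of the valence manifold" is just gradient ascent. A minimal sketch, where the `valence` function is a made-up stand-in (any real one would be unknown, which is the point about it being impossible to compute directly):

```python
def valence(x, y):
    # hypothetical stand-in for long-run well-being as a function of two
    # life choices; the real function, if it exists, is not known
    return -(x - 3) ** 2 - (y + 1) ** 2

def grad_ascent(f, x, y, lr=0.1, steps=200, h=1e-5):
    """Climb f by repeatedly stepping along its numerical gradient."""
    for _ in range(steps):
        gx = (f(x + h, y) - f(x - h, y)) / (2 * h)
        gy = (f(x, y + h) - f(x, y - h)) / (2 * h)
        x, y = x + lr * gx, y + lr * gy
    return x, y

x, y = grad_ascent(valence, 0.0, 0.0)  # converges near the optimum at (3, -1)
```

With an unknown or shifting `valence`, all anyone can do is approximate this procedure locally, which is consistent with the prediction that no group would agree on the exact answer.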

I'd think the result would be that groups which believed in right and wrong, of some sort, could effectively outcompete other groups in numerous situations. Nobody would agree on what precisely was right and wrong, lots of groups would claim to have 'the answer', each answer would lead to really weird consequences in some situations, and lots of people would swear up and down the whole thing was meaningless.

> There’s cases like the ones we’ve examined here, where it’s straight-up beneficial to have a distorted view of the world.

This would only be true if the world were stable over long periods of time. If your environment doesn't really change, then the ideal worldview is essentially a map of the rewards available in that environment. But if you're in a constantly changing environment with unknown risks and rewards, whatever illusions you have that work well in one situation are going to eventually fail elsewhere.

So if you want to continue prospering even in a deeply unstable world, you really _do_ want your beliefs to line up with reality. And a good way to get there: continually seek out new experiences, set goals, fail at them, and then learn from the failures. If your experiences are broad enough that no false heuristic works reliably across all of them, the only strategy left standing is the actual truth.
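A crude way to see this: an agent whose beliefs are tuned to one environment keeps winning only while that environment lasts. The sketch below (payoffs, strategy names, and update rule all invented for illustration) compares an agent that never re-tests its beliefs against one that periodically checks every option against reality:

```python
# toy model: two strategies whose payoffs differ across environments
environments = [
    {"flatter": 0.9, "honest": 0.4},  # the setting the agent's prior was tuned to
    {"flatter": 0.1, "honest": 0.8},  # a new setting where the old trick fails
]

def lifetime_payoff(agent_beliefs, explore):
    beliefs = dict(agent_beliefs)
    total = 0.0
    for env in environments:
        for step in range(100):
            # the explorer periodically tries every option and
            # updates its beliefs toward the observed payoffs
            if explore and step % 10 == 0:
                for k, v in env.items():
                    beliefs[k] += 0.5 * (v - beliefs[k])
            choice = max(beliefs, key=beliefs.get)  # act on current beliefs
            total += env[choice]
    return total

prior = {"flatter": 0.9, "honest": 0.1}  # an "illusion" tuned to the first environment
print(lifetime_payoff(prior, explore=False))  # stuck with stale beliefs after the shift
print(lifetime_payoff(prior, explore=True))   # adapts, so it does better overall
```

The non-explorer's distorted map pays off exactly as long as the first environment lasts; once the world changes, the agent that keeps testing its beliefs against outcomes pulls ahead.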
