It’s frustrating to propose an idea and have people dismiss it just because it sounds weird. You’ve surely seen people ridicule ideas like the suggestion that maybe we should worry about wild animal suffering, or about computers becoming sentient, or about comets crashing into the planet. I’ve encountered some of this for claiming that aspartame is likely harmless but ultrasonic humidifiers might not be.
The thing is, dismissing weird ideas is not wrong.
I have a relative who got the J&J vaccine for Covid, so while some people were getting their third shots, she still had only one. I claimed that it would be fine for her to go ahead and get a second shot of an mRNA vaccine, since this was sure to be approved soon (and was already approved in some countries). She gently responded, “I will get another shot when my doctor tells me to.”
Was she wrong? In a narrow sense, yes. Mixing-and-matching of vaccines was approved soon after, and I maintain that this was knowable in advance. But more broadly, she was following a good strategy: For most people, “just do what your doctor says” will give better results than, “take unsolicited medical advice from uppity relatives.”
From a Bayesian standpoint, it would arguably have been a mistake if she had listened to me. Skepticism of weird ideas is a kind of “immune system” that prevents us from believing nonsense.
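To make that concrete, here’s a minimal sketch in Python. The numbers are made-up assumptions for illustration only; the point is just to apply Bayes’ rule and ask how much a confident-sounding relative should move you, if you assume most unsolicited advice that contradicts “wait for your doctor” turns out to be wrong.

```python
# Toy Bayes' rule calculation. All probabilities below are hypothetical
# numbers chosen for illustration, not estimates of anything real.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(claim is true | evidence), via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

prior = 0.10                # assumed: 10% of such advice is actually right
p_confident_if_right = 0.9  # assumed: relatives sound confident when right...
p_confident_if_wrong = 0.7  # ...and nearly as confident when wrong

print(posterior(prior, p_confident_if_right, p_confident_if_wrong))
# ~0.125 -- confidence alone barely moves a low prior, so
# "just do what my doctor says" remains the better policy.
```

The specific numbers don’t matter; the point is that when the prior is low and the observable signal (how confident someone sounds) barely distinguishes right from wrong, rational updating leaves you close to where you started.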
The problem, of course, is that weird ideas are sometimes right. For 200 years, most Western people thought that tomatoes were poisonous. Imagine you were one of the initial contrarians going around saying, “Well actually, tomatoes are fine!” and demonstrating that you could eat them. I bet you’d have had a rough time.
Especially because if you convinced someone and they went home and cooked some tomatoes, their cookware probably contained lead, which the acidity of the tomatoes would leach out, leading to lead poisoning. Your follow-up campaign of “really, tomatoes are OK, we just need to switch to non-leaded cookware!” would bomb even harder.
I’m glad people persevered so we aren’t covering our pizzas with mayonnaise. But how are we supposed to resolve this tension in general? Here are eight proposed rules.
1. We need to work at the population level
If you think about it, almost everything you know comes from other people. Even when you “check the facts” what that usually means is “see what other people say”. If you trace back your knowledge to observations in the world, it’s a huge graph of you trusting people who trust other people.
Understanding the world is a social process. This is important because I don’t think the tension of weird ideas can be resolved at an individual level. You’ve got finite time to investigate crackpot theories. But fortunately, you don’t need to resolve all questions yourself. You just need to follow norms that lead to us collectively identifying good ideas and discarding bad ones.
2. Don’t expect most people to take your weird idea seriously
For one thing, this is just being realistic: most people aren’t very friendly to weird ideas. But more seriously, it would be unreasonable to expect people to follow a strategy that is bad for them.
We are all assaulted by bad ideas all the time. If every person who heard the claim that vaccines cause autism looked at the evidence with an open mind, well, we’d have a lot more people who think that vaccines cause autism.
Investigating every random claim would also take up way too much time. The complexity of the world greatly exceeds the capacity of any individual person. So the only way to live is to rely on the social process through which we get information from people we trust. We need to deal with that.
3. Don’t feel bad about dismissing weird ideas
Remember, it’s the correct prior, and it’s correct game theory, given that we have tiny, error-prone brains and only so much time to look into things.
Yet somehow, I think a lot of people feel like they aren’t supposed to do this? The problem isn’t that people don’t dismiss weird ideas—most of us do that instinctually. The problem is that we aren’t honest about why we are dismissing them, either to others or even to ourselves. Speaking of which…
4. Be honest about why you reject weird ideas
There are lots of reasons you might do so.
1. Pure prior: The idea sounds stupid and you haven’t looked at the argument.
2. You’ve looked at the argument, but you think it’s wrong.
3. You looked at the argument, but then realized you don’t have the background to understand it, so you went back to your prior.
4. You looked at the argument, you do understand it, and you think it’s right. But you still reject the idea because your prior is so strong.
5. You looked at the argument, you understand it, it seems flawless, and on an intellectual level it overcomes your prior. But somehow you just aren’t able to get emotionally invested in the conclusion. (Sometimes I feel this way about AI risk.)
These are all valid! But it’s very important to be clear about which one you’re using. Because here’s something that I see a lot:
There’s a weird idea.
Lots of people reject it just because it’s weird (#1) or because they don’t understand the argument (#3).
But they feel like they aren’t “supposed” to reject it for those reasons, so they give a misleading impression that they reject the argument in detail.
This creates a false appearance of consensus that the argument is actually wrong, screwing up the social process that’s supposed to eventually lead to truth.
5. Beware shifting goal posts
Here’s a common pattern:
A: Here’s a weird idea.
B: That can’t be true because of X.
A: [Evidence that X is false.]
B: Oh, OK. But your idea is still wrong because of Y.
A: [Evidence that Y is false.]
B: Fine, but your idea is still wrong because of Z.
For example, with aspartame, people often claim it’s carcinogenic. When that’s shown to be false, they retreat to saying it’s genotoxic (it isn’t), that it causes an insulin spike (it doesn’t), that it’s metabolized into formaldehyde (that’s normal), that it causes obesity (only in correlational studies), and then something about the microbiome.
Now, it’s fine to oppose an idea because of reasons X, Y, and Z. And it’s good (admirable!) to abandon reasons when they are shown to be false. But still, this pattern is a warning sign.
Most obviously, in disagreements it’s always best to start with your central point. If I say I disagree with you because of X, then showing that X is false should change my mind—otherwise, I haven’t been fully candid about my reasons.
But this pattern has particular relevance for weird ideas. What’s happening in each person’s brain during the conversation?
A, of course, feels frustrated because it seems like there is no evidence that would convince B, and so concludes that B is arguing in bad faith.
But B’s perspective is different. They decided that the idea is too weird to be considered (which is reasonable!). Then, they applied basic logic: If you know that aspartame is harmful, and you’re shown that it isn’t carcinogenic, then it is correct to infer that there must be some other mechanism of harm.
I think it’s human nature to play the role of B in this conversation. When we dismiss weird ideas, it often “feels” like we have reasons.
What’s the solution? I think B needs to be more self-reflective and more straightforward. It’s OK to just decide you aren’t going to consider an idea and aren’t going to be convinced by any evidence to the contrary. We all do this frequently. But when doing it, it’s better to say so explicitly. A fear of looking closed-minded can cause you to throw up a series of Potemkin arguments that only present an illusion of engaging on the merits.
6. Consider a fraction of weird ideas
It’s probably good to look into a certain percentage of crazy ideas. This is mostly an act of altruism, something that we should do to make the social truth process work better.
You probably do this already. For topics that are particularly important to you, or that you particularly enjoy reading about, you probably have more patience to indulge in outlandish concepts.
Another criterion is expertise. We should probably leave the rebuttals of perpetual motion machines to physicists.
But I don’t think we want to be too single-minded in leaving truth to the experts. The issue is that expertise is often concentrated in tiny little bubbles of society. When we have high-trust channels from the experts to the public, that’s fine—our current system for communicating when an earthquake has happened works very well.
But other times the experts are siloed and most of the population is several low-trust links away from them. (Or maybe the experts aren’t that reliable, or there just aren’t any experts on this particular topic.) In these cases, we need more participants to give the true weird ideas a chance of escaping.
7. Or on second thought maybe don’t
Public health authorities are now seen as less reliable than they were a few years ago. To my mind, that was a “correct” update: They were always OK but not infallible, so the current view is closer to reality.
But what has the effect of that been?
It’s not clear it was positive. Some people have certainly found alternative sources of information and learned the limits of what public figures can say. But lots of other people also seem to be enraptured with nonsense conspiracy theories.
This worries me and I’m not sure what to do about it. It’s tempting to say that you should only look into things if you can do so successfully. But perhaps your ability to evaluate the details is correlated with your ability to judge your own ability?
8. Accept weird ideas hesitantly
You don’t have to update all the way. Probably you should almost never do that! In most cases, the right conclusion would be, “important if true” and maybe “I don’t see an obvious flaw”. This is enough to make the social process work and avoids the personal risks of acting on crazy ideas.
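As a toy illustration of a partial update (again with assumed, made-up numbers): even an argument that looks airtight should only move you partway, as long as you grant some chance that an airtight-looking argument could exist even if the idea were false.

```python
# Toy sketch of "not updating all the way". All numbers are hypothetical.

prior = 0.05               # assumed prior that the weird idea is true
p_airtight_if_true = 1.0   # assumed: if true, the argument would look airtight
p_airtight_if_false = 0.3  # assumed: hidden flaws, blind spots, and motivated
                           # authors can make a false idea look airtight too

# Bayes' rule: P(idea is true | the argument looks airtight)
posterior = (p_airtight_if_true * prior) / (
    p_airtight_if_true * prior + p_airtight_if_false * (1 - prior)
)
print(round(posterior, 3))  # ~0.149 -- "important if true", not "true"
```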
Here's a weird idea: Obesity is a disease of hunger and hunger management, not a disease of food and exercise management.
People will get angry about this because it clashes with their morality. They will say "diet and exercise worked for me, and it should work for you too, you just need to work harder".
Hunger is very hard to study. There is no unit of hunger that is comparable between individuals. How would you even "prove" this idea in a medical context? Nonetheless, I believe it.
----
Here's another way to deal with weird ideas: they come from someone's personal experience. You don't have to accept the idea as true; you just have to hear their experience. Someone once told me, after a divorce, "You should never fall in love too much." What does this even mean? It sounds so weird. But it was their experience. I take it to mean "Don't become emotionally dependent on another person."
This is the same as holding unpopular opinions. Following social consensus is a safe way to navigate most of the time, but it will allow only incremental developments - https://binaryho.me/opinion/