For pretty much all my life, I have struggled with how I react to people being wrong. “Wrong” meaning, here, “wrong according to my beliefs/knowledge”. The frontier between beliefs and knowledge is murky, and we would all fancy our beliefs to be knowledge, but in some cases we can more or less agree on what is belief and what is knowledge.
I have a really hard time with people who are wrong. Wrong, of course, as described above. I have my beliefs and values, and do my best to accept that not everybody shares them. I don’t believe in any god, some people do. That’s fine, as long as beliefs are not construed as facts or knowledge.
When debating, I have very little tolerance for the “well, it’s my point of view/opinion” argument – systematically offered as a justification for something that was initially presented as fact. You can have opinions and beliefs, but if you present them as facts in a debate, prepare for them to be challenged. But as I said above, the frontier is sometimes murky, particularly as seen “from the inside”, and that is where trouble lies.
Take vaccines. I’m taking that example because it’s easy. I believe things about them. I consider those beliefs pretty rational as they are, to the best of my efforts, based on science. So I know they are safe, I know they work, I know they do not generally offer 100% protection, I know there can be a tiny risk of a bad reaction, I know they have helped eradicate some illnesses and control others so they do not rip through society like Covid-19, and I have a decent understanding of how a vaccine is built and how and why it works. So, of course, I think that people who believe different things about vaccines, like that they are harmful or even dangerous, are wrong. The problem is that in their “web of belief” (read the book, it’s wonderful), their beliefs are perfectly rational and therefore knowledge.
We could say that each side of the argument here sees their belief as knowledge, and the other’s as belief.
Faced with somebody who believes something that contradicts something I know, my initial impulse is to explain to them that they are wrong. Because who doesn’t want to be right? I bet you can see how that strategy doesn’t really work out well.
So, over time, I have learned to bite my tongue, accept that what people believe (including myself, though I hate the idea) is never going to be completely objectively rational, and remember that nobody (first of all me) likes being told they are wrong. The tongue-biting is more or less successful, depending on the topic in question, my mental state, and who is facing me.
The current pandemic has given me a golden opportunity to work not only on my tongue-biting, but also on my acceptance of differing viewpoints. Accepting that people see things differently doesn’t mean I believe every point of view is equivalent. Quite the contrary. It’s more about accepting that people will believe what they believe, that they aren’t rational (me neither, though I try my best to be), and that it is normal and OK.
I do my best to share accurate information. I’m not perfect or blameless, but I try to exercise critical judgment and be a reliable source of information for those who choose to dip into my brain, through Facebook, this blog, or conversation. I also try to correct erroneous information, and that is where things get slippery. It goes from setting the record straight when people share obvious hoaxes or urban legends (generally by instant messenger), to providing critical sources when others share scientifically wobbly information (hydroxychloroquine) or scare themselves needlessly (what use masks really serve, disinfecting groceries). And I’ve had to learn to back off. To keep the peace, to preserve relationships that I otherwise value, and also to preserve my sanity and inner peace.
One milestone was when I realised it was useless trying to tell people who were convinced a certain French scientist had found the miracle cure for Covid-19 that the scientific evidence for it was flaky at best, dishonest at worst. People are scared and will believe what helps them. We tend to want to ignore the emotional dimension of our beliefs, but it’s there, and much more powerful than our rational brain (as anybody who has ever tried to reason through emotions knows).
Some people are more comfortable dealing with uncertainty than others. Some people understand logical fallacies and cognitive biases better than others, and are more or less able to apply that knowledge to the construction of their beliefs (critical distance). But we all have emotions, and they colour what we are likely to accept as fact, whether we like it or not.
So I tried to drop the subject of hydroxychloroquine. That meant I had to accept that (according to my knowledge) false information was going to do the rounds in my social circle, that people were going to have false hopes and spread misinformation, and that I wasn’t going to do anything about it. Not an easy thing to let go of. I feel like I’m skirting responsibility. My therapist would certainly tell me that fixing other people’s beliefs is not my responsibility…
I’ve been doing the same thing for some time now with people who believe vaccines aren’t safe or effective. I know facts don’t change people’s minds. Worse, debate reinforces beliefs. I know! But I don’t really believe it, and keep on wanting to try. So I bite my tongue, remember that for the person facing me their belief is perfectly rational, remind myself that telling them they are wrong or debating them will not change their belief, and try and get on with my life. But for vaccines in particular, I seethe, because these beliefs have an impact on actual lives and public health. And I have to say I dread the moment when we finally have a vaccine against SARS-CoV-2 and people refuse to use it. It’s going to be a tough exercise in emotion management for me.
Anyway, I’ve reached a point now where I try to provide the information I feel is the best for those who want it, and I’m getting better at feeling OK that somebody I value or appreciate believes something I think is plain wrong – without trying to change their mind about it. I’m getting better at identifying the point where a discussion stops being an exchange of ideas in search of truth or the satisfaction of genuine curiosity, and starts being a standoff between two people with firm beliefs, each trying to shove theirs onto the other.