Scientists are increasingly aware that they need to improve their public engagement skills in order to combat scientific misinformation more effectively than they have to date. This is, on the whole, a welcome development: the American public’s scientific illiteracy is boundless, has contributed to substantial unnecessary loss of life in the COVID-19 era, and promises to do the same on a range of issues in the future. It is becoming equally clear, however, that the unexamined assumptions of the scientists hoping to keep us properly informed may lead us as far astray as the self-interested propaganda put out by pseudo-scientific grifters.
Consider the following passage from a recent book on how to reason with “science deniers.”
“Content rebuttal and technique rebuttal can be effective tools.”
We already know, based on the work of Schmid and Betsch, that it is possible to present science deniers with information that might change their minds. In the case of COVID-19 denial, what information might that be? With content rebuttal, we might share the studies showing that masks are effective. With technique rebuttal, we might point out the problem with conspiracy reasoning, either just before or just after they have heard scientific misinformation. Or — if that doesn’t work — perhaps we might exploit their predilection for conspiracist thought. Imagine a conversation in which you are talking to a COVID-19 denier (which means that the person is already predisposed to believe in conspiracy theories), and you share the fact that Russia and China are engaged in a massive disinformation campaign on social media in support of the idea that the coronavirus is a ‘hoax’ and we need to ‘liberate’ ourselves from more lockdown restrictions. This is not a conspiracy theory, it is a real live conspiracy! Might this not appeal to them? I’ve cited some sources earlier in this chapter that you can print out and hand to them. Some even include graphs and charts. You might encourage your interlocutor to think about who is benefiting from all the polarization and division in the U.S. Doesn’t that make them even a little bit suspicious? If all else fails, suggest that they do their own research. It is perverse, but it just might work!
— Lee McIntyre, How To Talk To A Science Denier (MIT Press, 2021), p. 174
We might cite studies showing that masks are effective.
True, we might, but that won’t necessarily have the desired effect, as there are also studies showing that masks are ineffective. McIntyre’s book identifies cherry-picking data as a key problem in “science denial,” so how can he justify overlooking the studies that run counter to his preferred conclusion? See, for example, Dr. Dean Edell’s Eat, Drink, & Be Merry (HarperCollins, 1999), p. 246, which cites a study of 1,500 Swedish surgeons showing that working maskless did not result in a higher rate of infections, and, more recently, “The scientific case against face masks” (Unherd TV, January 13, 2023), which cites a range of mask studies claiming they are minimally effective or ineffective. Those who feel that the kind of mask is all-important can consult the This Week in Virology podcast, which recently cited a study of medical workers concluding that surgical masks and N95 masks were about equally effective in preventing disease transmission.
We might point out the problem with conspiracy reasoning.
Again, we might, but if we do we also have to point out that McIntyre’s own sources for the claim of “apparently” massive Russian and Chinese disinformation campaigns targeting Americans over COVID-19 and other issues are anonymous U.S. government officials, whose intelligence services have a record of lying, distortion, and propaganda that makes Mao and Stalin look like hopeless amateurs. In other words, he’s as guilty of conspiracy thinking as the “science deniers” he calls out in his book. It’s difficult to see how hypocrisy is the road to rational consensus. ((McIntyre’s dubious sources on alleged Russian and Chinese disinformation include “The Coronavirus Gives Russia and China Another Opportunity to Spread Their Disinformation,” Washington Post, March 29, 2020, and “Chinese Agents Spread Messages That Sowed Virus Panic in U.S., Officials Say,” New York Times, April 22, 2020.))
I’ve cited some sources earlier in the chapter that you can print out and hand to them.
Which is about as condescending as it gets. Jehovah’s Witnesses use this approach. Should that convince us their worldview is rational?
In any case, why would this be any better than the common practice of sharing links to one’s favorite online sources? There is no evidence that link-sharing leads to broader and deeper rational consensus. On the contrary, it leads to wholesale denunciation of those who fail to accept the presumably superior insight of one’s favored sources.
Some even include graphs and charts.
Contrary to what McIntyre seems to believe, graphs and charts are not inherently more credible sources of information than plain text. Trying to convince someone by means of the form of an argument rather than its substance is part of the problem. In short, graphs and charts are only as good as the person who makes them, and vaccine skeptics and climate change skeptics can and do use graphs and charts to support irrational or misleading conclusions.
You might encourage your interlocutor to think about who is benefiting from all the polarization and division in the U.S.
This is good advice, but the answer is not “Russia and China,” as McIntyre thoughtlessly assumes, but the U.S. national security state and the banks and other corporations that dominate it. The power of the propaganda apparatus they have developed to mislead the public mind is unprecedented in human history, with the possible exception of the medieval Church. The Russian and Chinese contribution to false American beliefs is about as significant as your toaster’s contribution to global warming.
Do your own research!
Uh, no. Do your research in collaboration with others. That is how science works, and how democracy works best. Doing one’s own research can become an endless ratification of confirmation bias, that is, a selective search for information that reinforces what one already believes. Only when we interact with others do we have a chance of discovering our errors. That is less likely to happen online, where information is curated for your eyes only, producing ideological silos in which underlying assumptions remain unseen and unchallenged. So do your research with others, and include people who hold different initial assumptions than you do, though hopefully not too radically different. And be respectful.
Obviously, science has a lot to teach us about what the relevant facts are and how they are best accounted for in areas of specialized knowledge. But there are no grounds for believing that, when scientists venture into social and political thought, their conclusions are any better than those of the next guy you meet at the bus stop.