randombio.com | commentary | Saturday, June 08, 2024
'Misinformation' can be a threat--but not in the way they want you to think (v.2)
Misinformation doesn't mean something you think is incorrect. You have to prove your case, not censor opposing interpretations.
As of today there are 1,598 articles listed in PubMed containing the word ‘misinformation’ in the title. Of these, 27 are in Nature magazine. The most recent one is titled “Post-January 6th deplatforming reduced the reach of misinformation on Twitter.” It's paywalled, so it doesn't get a link, but in the abstract the authors write:
Here we evaluate the effect of the decision by Twitter to suddenly deplatform 70,000 misinformation traffickers in response to the violence at the US Capitol on 6 January 2021 (a series of events commonly known as and referred to here as 'January 6th') . . . . The results inform the historical record surrounding the insurrection, a momentous event in US history, and indicate the capacity of social media platforms to control the circulation of misinformation, and more generally to regulate public discourse.
When political opinions like this turn up in a science journal, it convinces readers that the editors are losing their dedication to objective facts. It's not a question of bias. Misinformation is not a scientifically meaningful term at all. Neither is “insurrection.” If this article passed peer review, then science is in big trouble.
If you're behind a paywall, that means you don't want to be part of the conversation. Yet they still want to “regulate public discourse.”
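The PubMed counts above are easy to check. Here is a minimal sketch, not part of the original post, that queries NCBI's public E-utilities ESearch endpoint for the same title searches; the exact counts will of course drift as new papers are indexed.

```python
# Minimal sketch (assumptions: NCBI E-utilities ESearch endpoint and the
# standard PubMed field tags [Title] and [Journal]); counts change over time.
import json
import urllib.parse
import urllib.request

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str) -> int:
    """Return the number of PubMed records matching a search term."""
    query = urllib.parse.urlencode({
        "db": "pubmed",     # search the PubMed database
        "term": term,       # query string using PubMed field tags
        "retmax": 0,        # we only need the count, not the ID list
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{ESEARCH}?{query}") as resp:
        data = json.load(resp)
    return int(data["esearchresult"]["count"])

if __name__ == "__main__":
    print("'misinformation' in title:",
          pubmed_count("misinformation[Title]"))
    print("...of those, in Nature:",
          pubmed_count('misinformation[Title] AND "Nature"[Journal]'))
```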
One article in Sci Adv. [1] concludes that censorship reduced the number of posts in “antivaccine venues,” but did not change engagement with antivaccine content. In fact, “misinformation patterns” increased.
So, what is ‘misinformation’? Maybe, you might say, it means misinterpreting what someone else wrote or inventing false information. Indeed, I have seen people making statements that inaccurately represent the source material. But in our politicized world, misinformation has come to mean “whatever the other side believes.”
What do we do when a scientific article misrepresents its own data? Or when journals selectively publish articles on the basis of whether the findings fit their political beliefs? Should we then censor the NYT, the directors of NIAID and CDC, and Nature?
Whether on Alzheimer's disease or nutrition or global warming, researchers frequently misinterpret their own results. That can be a gold mine for other scientists. The authors of one paper (in Nature, I think it was) misinterpreted their results, portraying them as a dead end when in fact they showed something important. I followed up on the real finding, which the authors had missed, and received a million-dollar grant to study it.
But the average reader or newspaper reporter who accepts an abstract at face value doesn't have that opportunity. They will only be deceived. For example, one paper claimed that 99.85% of climate papers endorse the climate crisis hypothesis, but their data actually showed that the true number was closer to 31.09%. Yet people repeated the incorrect number, and those who noticed the error are too busy to complain.
Another Nature commentary titled either “Misinformation poses a bigger threat to democracy than you might think” or “Misinformation remains a threat to democracy” [2] (depending on where it's indexed—yes, errors are everywhere) illustrates the problem:
The threat to democratic integrity posed by misinformation and disinformation looms large. . . . The Holocaust did happen. COVID-19 vaccines have saved millions of lives. There was no widespread fraud in the 2020 US presidential election. The evidence for each of these facts has been established beyond reasonable doubt, but false beliefs on each of these topics remain widespread.
This is a classic example of the composition fallacy: throw in the things you want people to accept as false along with something horrible, in the hope that the rubes will think they all belong in the same category and reject them all.
In the 2020 election, we all know how one party changed the election rules to favor their candidate—they boasted about it in an infamous article in Time magazine—and why the other doesn't challenge them. An argument could be made that the Jan 6 rioters should have stormed RNC headquarters for doing so little to prevent it, even though everyone knew it was going to happen. As for vaccines, see Pandemrix.
Clearly, the authors' idea of a “threat to democratic integrity” is not when these things happen, but when somebody reports them and interprets them the “wrong” way. They're entitled to their opinions, but opinions don't belong in science. And logical fallacies don't belong anywhere.
When people get things wrong, it's often because they don't have access to higher-quality sources of information. With so much science behind paywalls, it's harder than ever to get essential facts. That allows falsehoods to spread; a skeptic might say that's why it's done. Other times it's because they don't have enough background in the hard sciences. A few get clicks by deliberately misinterpreting things.
During Covid, scientific journals tried an experiment: make everything about Covid publicly available. It produced the biggest explosion of scientific literacy in a century. Tens of thousands of people read the papers, studied the textbooks so they could understand them, and were able to rebut the misinformation created by the press, the government and international organizations.
After the Covid Experiment, we started to see laymen talking intelligently for the first time in history about furin cleavage sites, DNA sequences, prion peptides, messenger RNA, and placebo-controlled clinical trials. It was almost like living in the Italian Renaissance.
This terrifies the authors. They would prefer to return to the old feudalistic model, where only they have access to the information and only they can influence the dialogue. It's legitimate to ask: why would someone be so insistent on being the only voice?
Misinformation, or more accurately false information, is caused by political groupthink that pushes people into all-or-none thinking. The argument about masks was an example. Even at the beginning, we knew Covid was spread by aerosols, not fomites, so sanitizing everything in sight and wearing masks would have little effect. But almost no one else recognized that. Many jumped onto their approved political positions, saying either “masks work” or “masks don't work.” And each accused the other of peddling “misinformation.”
Much of this comes from the media, the science establishment and, increasingly, politicized science magazines. Regardless of the source, the biggest threat to any democracy is not misinformation, but censorship. A society stops being free when it becomes impossible to express dissenting interpretations. That's what is really meant when people say ‘misinformation.’
The solution is simple. If you want people to know the truth, stop hiding it from them. If you want people to believe what you say, stop lying.
[1] Broniatowski DA, Simons JR, Gu J, Jamison AM, Abroms LC. The efficacy of Facebook's vaccine misinformation policies and architecture during the COVID-19 pandemic. Sci Adv. 2023 Sep 15;9(37):eadh2132. doi: 10.1126/sciadv.adh2132. PMID: 37713497; PMCID: PMC11044214. open access.
[2] Ecker U, Roozenbeek J, van der Linden S, Tay LQ, Cook J, Oreskes N, Lewandowsky S. Misinformation poses a bigger threat to democracy than you might think. Nature. 2024 Jun;630(8015):29-32. doi: 10.1038/d41586-024-01587-3. PMID: 38840026. https://www.nature.com/articles/d41586-024-01587-3 open access comment
jun 08 2024, 7:16 am. revised jun 10 2024, 4:46 am. minor updates jun 11 2024. original version