Why fact-checking alone often fails us on health care topics
Fact-checking has become one of the buzziest buzzwords in journalism. There are more than 100 fact-checking projects around the world. It’s been trotted out to counter alleged “fake” news and to monitor the accuracy of political leaders who stretch the boundaries of believability. April 2 has even been proclaimed International Fact-Checking Day.
Indeed, genuine fact-checking may play an important role in political news coverage. FactCheck.org has been working in this space for 15 years. The current PolitiFact-Kaiser Health News project to “truth-squad” claims in the run-up to the 2020 elections is another noteworthy example. However, partisan criticism of alleged partisan fact-checking has sprung up all over the web. Just do a search on the term “partisan fact-checking” to start your head spinning.
But the genre faces steeper challenges when applied to health care news, which is often based on the results of studies published in medical or scientific journals.
Factual but unhelpful and misleading
When communicating about biomedical research, you can be 100% factually correct while being 100% unhelpful to your audience. I offer examples later. In this arena, fact-checking by itself is often too limited in scope to serve news consumers and health care consumers. Fact-checking alone often fails to capture the questions of nuance and context that arise – or should arise – whenever medical evidence is evaluated.
Fourteen years ago, John Ioannidis, M.D., wrote the important paper, “Why Most Published Research Findings Are False.” More recently, he wrote about what he called evidence-based hearsay:
Evidence is possible to subvert at all stages of its generation and dissemination, while fierce marketing, rumors, and beliefs become more powerful than data.
The work of researchers like Ioannidis reminds us how murky the claims about evidence, data and facts can be in the current science communication environment.
David vs. Goliath: A one-man band vs. Facebook
Technology giants like Facebook and Google have promoted their fact-checking projects. But the devil is in the details. Last month, the Columbia Journalism Review published a piece headlined: “Facebook’s fact-checking program falls short.”
And Facebook’s health-related fact-checking got another black eye last week when the Popular Information website published, “Facebook giving massive distribution to dangerous misinformation about diabetes.” Excerpt:
Facebook is giving a page featuring incendiary right-wing memes and dangerous misinformation about diabetes massive distribution — reach that exceeds some of the nation’s largest news outlets.
The Rowdy Republican page, which has over 780,000 followers, is run by an affiliate marketer with a history of legal problems and deceptive practices. He is seeking to drive people to a site about ‘The Big Diabetes Lie,’ which tries to convince people to purchase a $55 paperback book.
You need to read the article in its entirety to appreciate the depth of the concerns it raises.
The “Big Diabetes Lie” website uses familiar language: “scientifically proven … achieve the impossible … dramatic results.” One excerpt:
It doesn’t matter if you follow your doctors (sic) recommendations and dosages exactly as prescribed. This isn’t a question of IF, but WHEN. Your health will get worse. The drugs you take will fail. The insulin injections you take will also fail.
STOP
If you have diabetes, you simply cannot continue this way – sooner rather than later you WILL die.
But Popular Information followed up, reporting:
“Less than 24 hours later, all the links to the diabetes scam have been removed from the Rowdy Republican page. Facebook is finally enforcing its own rules.”
Questions remain. Why did it take the efforts of Popular Information, a one-man-band website, to reveal this ugly episode?
Nice try, but ...
Some efforts to help consumers identify reliable health care news are presumably well-intentioned. But some have clear shortcomings.
Healthline says its stories are “fact checked by our panel of experts.” But earlier on HealthNewsReview.org, my colleague blogged about one example that demonstrates how meaningless and useless such alleged health care news fact-checking can be. The Healthline story was about the promise of a new drug against a form of multiple sclerosis. An excerpt:
The story didn’t include key details like side effects, and it used a quote lifted directly from the drug company news release, among other problems. These red flags raise an important question: Who is controlling the ‘facts’ on this story? Healthline’s fact-checkers or the drug company that funded the study?
On a broader scale, NewsGuard — a plugin that calls itself “the internet trust tool” — gives green checkmarks to websites that are “trying to do legitimate journalism.” Its criteria include not repeatedly publishing false content and not publishing deceptive headlines.
NewsGuard has rated several news organizations that recently and blatantly misled readers about a study showing a statistical association between soft drink consumption and premature death. Not cause and effect, but a statistical association.
The stories’ lead paragraphs disregarded the important caveat that the study couldn’t prove cause and effect. Reuters reported that soft drinks “may raise the risk of premature death.” CNN said it was time to “consider ditching your favorite soda.” Even drinkers of diet soda should “beware,” warned the Atlanta Journal-Constitution.
But NewsGuard gave the sites that published these stories its green checkmark of legitimacy.
These news organizations may be “trying to get it right,” in the wording of NewsGuard’s rating system, but their hyperbolic coverage failed readers miserably with this story, raising questions about the value of that green check of legitimacy.
Laudably, The New York Times followed up with a story that pointed out the soda study’s limitations, but consumers can’t rely on a helpful post-hoc critique of every health care claim that gets reported.
A void in the health news ecosystem
When HealthNewsReview.org lost its funding at the end of 2018, some people speculated that newly launched online projects would fill the void. But that hasn’t happened.
The site’s review criteria focused on the misleading elements of much health care reporting that a pure fact-checking approach may miss. Our reviewers helped readers understand the use of statistics that, while not factually incorrect, frame research results in the most favorable light possible – in ways that mislead and arguably misinform readers.
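To make that concrete, here is a minimal sketch with made-up numbers (not drawn from any real study) of how the same result can be framed two ways. A headline trumpeting a relative risk reduction can be factually correct while omitting the absolute risk reduction and the number needed to treat, which is what readers would need to judge whether a treatment matters to them.

```python
# Hypothetical numbers, not from any real trial: the same result framed as
# "cuts risk in half" or as "helps 1 person in 100."
control_event_rate = 0.02    # 2% of untreated patients have the bad outcome
treated_event_rate = 0.01    # 1% of treated patients have the bad outcome

relative_risk_reduction = (control_event_rate - treated_event_rate) / control_event_rate
absolute_risk_reduction = control_event_rate - treated_event_rate
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"Relative risk reduction: {relative_risk_reduction:.0%}")  # 50% -- the headline number
print(f"Absolute risk reduction: {absolute_risk_reduction:.1%}")  # 1.0% -- the fuller picture
print(f"Number needed to treat:  {number_needed_to_treat:.0f}")   # 100 treated per outcome avoided
```

Both framings describe the same data; only one of them helps a reader decide whether the benefit is worth the cost and the side effects.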
So, while it may be factually correct to report the associations that researchers find in observational studies, I’ve long tried to teach readers that such stories deliver a woefully incomplete message if they don’t include the limitations of observational research – and are simply wrong if they make cause-and-effect statements about what was merely a statistical association. A news story or news release may be judged factually correct when describing the results of a study whose outcomes were surrogate endpoints or markers. But it’s critical to educate news consumers about the limitations of such surrogates and what those findings may not mean.
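A toy simulation makes the association-versus-causation point concrete. In the sketch below, which uses invented probabilities rather than anything from the soda study, soft drinks have zero causal effect on early death, yet an unmeasured lifestyle factor still produces the kind of statistical association that tempted those headlines.

```python
# Toy simulation with assumed numbers (not the soda study): a hidden confounder
# creates a strong statistical association even though soda has no causal effect.
import random

random.seed(0)
rows = []
for _ in range(100_000):
    unhealthy_lifestyle = random.random() < 0.5                              # hidden confounder
    drinks_soda = random.random() < (0.7 if unhealthy_lifestyle else 0.2)    # lifestyle drives soda habit
    dies_early = random.random() < (0.30 if unhealthy_lifestyle else 0.05)   # lifestyle drives mortality; soda does not
    rows.append((drinks_soda, dies_early))

def death_rate(soda_status):
    group = [died for soda, died in rows if soda == soda_status]
    return sum(group) / len(group)

print(f"Early-death rate, soda drinkers:     {death_rate(True):.1%}")   # roughly 24%
print(f"Early-death rate, non-soda drinkers: {death_rate(False):.1%}")  # roughly 12%
```

An observational study sees only the two rates at the bottom; the confounder that produced them stays invisible unless it is measured and adjusted for.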
An analysis based only on “Are the facts correct?” may make a message look more informative than it was.
Metafact is not the solution
The Poynter Institute, a journalism training center, pointed to Metafact as one possible “solution to health misinformation.” The Metafact solution is “to empower anyone to directly ask experts to verify a claim they have read and for a consensus score to quickly aggregate and spread, allowing people to make better judgments on questions important in their lives.” One recent example demonstrates the platform’s limitations.
“Does CBD (Cannabidiol oil) help with anxiety?” was the title of a recent Metafact entry. Five experts responded with their opinions. No systematic criteria were used. All five simply responded with whatever opinion came to mind. Two of the five acknowledged clear financial conflicts of interest; one worked for a commercial cannabinoid-based drug company and one worked for the International Cannabis and Cannabinoids Institute, which serves clients engaged in cannabis commerce. Two of the five open-ended responses were 25 words or less.
This approach may help inform some readers. But I’m not a fan of publishing opinions from conflicted sources and then touting, as Metafact does, “We don’t take sides – ever. The only thing we don’t compromise are the facts.” I don’t think this approach ensures that readers get the facts. So, in this form, I don’t see it as a solution to health misinformation.
You’re entitled to your own opinions, but not your own facts
Fact-checking projects may be colored by the people who publish them. Indeed, such efforts come in many forms with varying degrees of quality and usefulness. Careful consumers will need to hone their critical thinking and analytical skills to glean the most from any fact-checking project.
As the Facebook example above demonstrates, consumers can’t rely on technology behemoths to police health care claims. In this crazy environment, even fact-checkers need to be fact-checked.
Health care news covers an industry fraught with conflicts of interest and financial pressures that can compromise the integrity of research and clinical care, so patients and consumers need to be especially wary of what they read. That includes what they read in health care fact-checking articles. If fact-checkers don’t disclose who does the fact-checking, how they do it, and whether any potential conflicts of interest are involved, look elsewhere.
A version of this post was originally published by HealthNewsReview.org.