As the world grapples with the devastation of the coronavirus, one thing is clear: The United States simply wasn’t prepared. Despite repeated warnings from infectious disease experts over the years, we lacked essential beds, equipment, and medication; public health advice was confusing, and our leadership offered no clear direction while sidelining credible health professionals and institutions. Infectious disease experts agree that it’s only a matter of time before the next pandemic hits, and that could be even more deadly. So how do we fix what COVID has shown was broken? In this Mother Jones series, we’re asking experts from a wide range of disciplines one question: What are the most important steps we can take to make sure we’re better prepared next time around?
Back in March, as epidemiologists and front-line nurses scrambled to track and treat the spread of COVID-19, health expert Timothy Caulfield was already sounding the alarm about its sinister twin: the infodemic, or the spread of misinformation about the virus. “The tsunami of misleading noise flowing from this ‘infodemic’ has resulted in deaths, financial loss, property damage, and heightened stigma and discrimination,” he later wrote. “It has also facilitated an erosion of trust in key institutions and added to the already chaotic information environment.” Caulfield is a professor of law at the University of Alberta’s School of Public Health and the research director of its Health Law Institute, where he has launched multiple studies on the intersection of misinformation, the internet, and public health, and he has helped craft guides on battling the spread of bad information. He is also the host of the show A User’s Guide to Cheating Death, in which he debunks popular health trends based on pseudoscience, and the author of several books, including Is Gwyneth Paltrow Wrong About Everything? I asked him what we can do to address COVID’s parallel plague.
On how misinformation is spreading during the pandemic: I’ve been following the spread of misinformation as part of my career for decades, and I haven’t seen anything like this before. It really is incredible. There’s misinformation spreading about every aspect of the pandemic. About its source: people are saying it’s a bioweapon (not true), or that it’s caused by 5G technology (not true). About crazy cures: the idea that you can cure it with cow urine, or with bleach (not true). And the idea that you can “boost” your immune system, which has become a huge industry.
In the early days the problem was just misinformation. Now every topic, whether it is masks, hydroxychloroquine, or physical distancing, has become a polarized issue. That’s made it more difficult to battle the misinformation. And layered on top of that are all of the controversies associated with the science being done. For example, the recently retracted hydroxychloroquine study in The Lancet. It was a study that got a lot of play in the media, and it had an almost immediate impact on clinical trials. And then it gets retracted. That’s problematic, not only because it’s just bad for science, but because it’s also really, really bad for public trust.
It’s incredible the degree to which social media has played a role here. That’s always been the case—we’ve done studies around, for example, vaccination hesitancy and how vaccination misinformation is spread. But here, it really has dominated the spread of misinformation. And of course, that means that’s also where we need to battle it. We have a study that’s out for peer review right now, where we looked at over 200 websites, pretending we were someone searching on Google or Bing or wherever for “immune boosting.” On 85 percent of those websites, immune boosting is portrayed as if it’s an effective way to fight COVID-19. Given that the whole concept of immune boosting is scientifically questionable, it’s really grim when you see those kinds of numbers. And only 10 percent of the websites had any kind of critique of immune boosting at all. On Instagram, it’s pictures, it’s images, it’s a lot of influencers with lifestyle brands. And there’s very little room for scientifically nuanced portrayal. So things are presented as if efficacy is taken for granted. I personally looked at hundreds of postings about immune boosting, and I didn’t see a single scientifically accurate portrayal of the concept.
And so we’re looking at all those generators of misinformation. And then we’re also working with other researchers doing empirical research on how people respond to that misinformation, and why people spread misinformation. And then, of course, what we want to do is develop some strategies. We’ve already done some of that and provided recommendations.
On the individual power to stop bad information from spreading: It’s one of those social problems that is going to require us to come at it from every angle, right? We’re going to need governments to take action. We’re going to need regulators like the FDA and the FTC to step up and do more. We’re going to need health professional organizations to make sure their members are not spreading misinformation. We’re going to need stronger truth-in-advertising laws to make sure that people aren’t leveraging the fear of the pandemic to sell therapies. We’re also going to have to—and this is a big one—figure out ways to get the social media platforms to take action.
But perhaps the most important thing, and evidence backs this up: We need individuals to take action. There’s really interesting research that suggests the spread of misinformation is largely a bottom-up phenomenon. These are people sharing this information on Facebook and Twitter. On Instagram. We have to develop strategies and encourage people not to do that. If you can just nudge people to think about accuracy, to embrace accuracy, before they share, we can have an impact on the spread of misinformation. It sounds ridiculously simple. But there are a couple of studies to back that up. And then we need to counter misinformation when we see it, with good information. So if you see misinformation, respond with trusted sources of information that use nice, authentic language (I know it’s hard not to be snarky, I’m snarky online a lot). And then use a creative communication strategy; people respond to stories, art, humor. And lastly, make sure that the general public—not the hardcore denier—is the audience. You’re never going to change the minds of those hardcore deniers. Always aim for the general public.
On how science literacy is key to preventing pandemics, and in fostering trust: I think it’s something that should be taught as early as—and I know that sounds ridiculous—but as early as kindergarten, and it should be taught throughout middle school, through high school, and all the way through university, regardless of discipline. It allows people to be more critical consumers of the news and of social media. And the other important aspect is that teaching science literacy also allows people to understand the scientific process better. It allows people to understand that science is not a list of facts. You know, science is not a person. Science is not an institution. Science is not an industry. Science is a process. It evolves, and public health officials do their best with the evolving science to make a decision. And I’m hopeful that if more people understood how science operates, they might be more forgiving of how science policy is made, and more critical of what they’re seeing on social media, in the news media, and in popular culture more broadly.
Having said all that, I think it’s important to recognize that the other thing we want to do with misinformation is listen and learn: Why are people attracted to misinformation? What are their concerns? It’s not just trying to get people to be more scientifically literate; it’s also about trying to understand what’s attracting people to this information. Part of that is a breakdown in trust. I get a lot of hate mail. And regardless of the topic, whether it’s homeopathy, GMOs, alternative medicine, or COVID, it’s almost always the same. They start with telling me what an idiot I am, probably with a few swear words in there, but the very next paragraph is about trust: “I can’t trust the science. I can’t trust the system.” So I think that understanding what it takes to be a trustworthy institution, to create trustworthy science, and to communicate in a trustworthy manner is going to be incredibly important. I hope that’s one of the lessons we get from this current crisis.
The United States is in a really tough spot, because in many countries, Canada included, citizens have a higher level of trust in their government, their health care system, and their public health authorities. That makes it easier in a public health crisis. There are institutions, including the pharmaceutical industry and the biomedical research world, that have had bad actors and have created situations that people can point to and legitimately say, “There are reasons I shouldn’t trust you.” And so I think we need to remember that when we’re trying to fight misinformation.
And I think that we need to make sure that the scientific house is in order. We’ve seen science rushed. Recently we’ve seen important science retracted from prominent journals. Without good science, we’re never going to win the fight against misinformation.