Medical experts warn that the internet could be making more of us sick, pointing to research on how online platforms boost vaccination opponents and their baseless theories, fueling infectious disease outbreaks across the world.
Two researchers at the University of Stirling in Scotland released a study last year concluding, based on data from January 2011 to August 2017, that there were “statistically significant positive correlations” between the use of anti-vaccination search terms and falling rates of immunizations, suggesting that online anti-vaxxer activity is having real and potentially deadly consequences.
Amaryllis Mavragani, one of the study’s authors, told Mother Jones about observing anti-vaccination groups on Facebook, saying she grew frustrated at how easily incorrect information is passed around. She believes internet platforms like Facebook are contributing to the decrease in immunizations targeting diseases like measles and enabling new outbreaks.
In 2005, Dr. Richard K. Zimmerman, a medical doctor and associate professor at the University of Pittsburgh’s Department of Family Medicine, wrote an eerily prescient paper for the Journal of Medical Internet Research warning that online disinformation could lead to more people getting sick. “With the burgeoning of the Internet as a health information source, an undiscerning or incompletely educated public may accept [claims critical of vaccines] and refuse vaccination of their children,” he wrote. “As this occurs, the incidence of vaccine-preventable diseases can be expected to rise.”
Looking back on the paper, written when social media and video streaming were still in their nascent stages, Zimmerman says today that such misinformation spread even faster than he’d imagined. In an interview with Mother Jones last week, Zimmerman pointed out that since his article was published, vaccination rates have indeed fallen, while cases of measles, a highly infectious disease whose spread is prevented by a robustly immunized population, have risen.
Before the internet, misinformation about vaccines generally required direct communication between individuals. “The internet allows people to create groups without being confined to a physical community, unlike before. I certainly think social media like Facebook can even accelerate that,” Zimmerman says. “There are other ways that these messages are communicated, but I think a lot of it is on the internet. I don’t think that’s the only way, but even when it’s not, someone on the playground might say, ‘Hey, I saw this thing about vaccines on Facebook,’ and things can spread that way too.”
While epidemiologists and other researchers have so far struggled to definitively establish a causal link between the proliferation of anti-vaxxer internet communities and real world outbreaks, many experts have little doubt that online misinformation is fueling outbreaks.
Dr. Wendy Sue Swanson, a spokeswoman for the American Academy of Pediatrics, agrees that the spread of misinformation about vaccines has been worsened by Facebook and the anti-vaccination groups it hosts. But she added that slowing down misinformation about immunization is difficult and stressed that she doesn’t fault the company. “I think that Facebook cares about this,” she said. “There is not an easy answer, though.”
Citing free-speech concerns, Swanson expressed trepidation at cracking down on these types of groups, instead suggesting that the platform could work to offer counterinformation from qualified doctors by elevating correct information within its algorithm.
Whether counterinformation is a successful antidote to misinformation has been debated in many contexts. (Facebook’s efforts to debunk “fake news” by boosting trusted publishers have already fallen short.) But further complicating such an approach on the topic of vaccines is the paltry amount of content that users produce making the case that vaccines are safe and useful.
“Across all platforms, the dominant form of vaccine-related content is anti-vaxx,” said Renee Diresta, a social-media researcher at internet security firm New Knowledge who has explored the flow of such content across the internet. “There’s an asymmetry of passion around this topic. Most people aren’t producing pro-vaccine counternarratives…They just get their kids vaccinated and go on with their day.”
In addition to being a hub for false information about immunizations, Facebook has been used by anti-vaxxers to target groups vulnerable to misinformation. Facebook lets advertisers target 900,000 users interested in “vaccine controversies,” according to a recent Guardian report. Last week, the Daily Beast reported that the platform lets advertisers specifically target women “interested in pregnancy,” and published an analysis of Facebook advertisements purchased by several anti-vaccination pages showing they “overwhelmingly targeted” women over the age of 25—a demographic stocked with new and likely mothers.
On Thursday, Facebook said it was exploring potential solutions to limit the spread of anti-vaccination information on its platform following reports in the Guardian detailing pressure from the medical community and a related letter from Rep. Adam Schiff (D-Calif.).
“We’ve taken steps to reduce the distribution of health-related misinformation on Facebook, but we know we have more to do. We’re currently working on additional changes that we’ll be announcing soon,” a Facebook spokesperson told Mother Jones in an email. The company said it is considering removing anti-vaccination content and groups from its algorithmically driven recommendations.
YouTube recently announced it has changed its algorithm to recommend videos containing conspiracies and misinformation less often; a company spokesperson told Mother Jones the effort would include at least some vaccine videos. The spokesperson said the company is working on prioritizing content from “credible news sources” and on providing more context from outside references like Wikipedia and Encyclopedia Britannica around videos featuring subjects that are often targets for misinformation, including immunizations.
While most platforms have so far been hesitant to remove anti-vaccination content outright, others, like Pinterest and Medium, have decided to do just that.
Medium confirmed to Mother Jones that while it doesn’t have a specific policy addressing anti-vaccination information, it has taken down certain related posts. Pinterest takes a more aggressive approach.
A spokesperson explained a Pinterest policy forbidding “advice when it has immediate and detrimental effects on a Pinner’s health or on public safety,” specifically including the “promotion of false cures for terminal or chronic illnesses and anti-vaccination advice.”
“We recognize the important role vaccines play in personal and public health, which is why it is our policy to remove anti-vaccination advice and other health misinformation from our platform,” the spokesperson wrote. “We want Pinterest to be an inspiring place for people. We know that doesn’t happen on its own, which is why we continue to work on keeping harmful content off our platform.”