After years of allowing anti-vaccine groups and pages to rack up followers on its social network, Facebook announced last month that it wants to lead the world’s largest Covid-19 inoculation information campaign and encourage its users to get vaccinated. It’s also banned users from sharing general forms of vaccine misinformation, like the idea that vaccinations cause autism.
Facebook’s big push is meant to help bring an end to a pandemic that has killed more than 2.5 million people around the world. But for some of the people who have for years been sounding the alarm about the dangers of anti-vaccine groups and pages on Facebook and Instagram, the announcement — even if it’s a step forward — feels like too little, too late.
“No matter what the commitment is or the ideas made, at the end of the day … I can clearly see their priority was their job and the reputation of Facebook, versus the lives of Americans,” said vaccine advocate Ethan Lindenberger, who says Facebook groups helped convince his mother not to vaccinate him as a child against illnesses like measles.
Over the past decade, Instagram and Facebook users have created communities on these platforms to organize against vaccines, mixing with and adopting the labels of “vaccine safety” advocates, parenting communities, and “health freedom” groups, among others. In Facebook groups, people have promoted the anti-vaccine movement by posting everything from personal anecdotes claiming vaccines have injured their children to far-out conspiracy theories, including the idea that inoculations are disingenuous money-making schemes.
Now, Facebook says it wants to change course. It’s directing users to local authorities for information about where and how to sign up for shots. It’s also providing ad credits for public health agencies to boost vaccine content, and working with Johns Hopkins University to elevate vaccine information for groups with lower access to vaccines, including Native American, Black, and Latinx communities. The company is also working with the Biden administration to tackle Covid-19 misinformation on its site.
But pro-vaccine advocates have serious doubts about whether the company fully understands the problem, and if it’s adequately prepared to address the anti-vaccine communities that flourish on its platforms.
Facebook’s vaccine campaign comes after a decade of limited action
Long before the Covid-19 pandemic, pro-vaccine advocates were trying to raise awareness about anti-vaccine content on Facebook and Instagram. Facebook groups, in particular, lured people into closed-off online spaces that provided a sense of community but spread medical misinformation. “I got this community of women who — like my midwife — were these supportive, intelligent, educated women who all loved their children,” recounts Maranda Dynda, a mom who joined anti-vaccine Facebook groups in 2012 before leaving these online communities about two years later. She now supports vaccines.
The problem was big enough that it even affected Facebook CEO Mark Zuckerberg personally: In 2016, his profile was swarmed by anti-vaccine comments after he posted a routine picture of his young daughter getting vaccinated at a doctor’s visit.
While Facebook started a general fact-checking program in 2016, the company only took significant action on vaccine misinformation three years later in response to growing political pressure. Lawmakers were increasingly concerned about the surge in measles cases in the United States, and had begun paying more attention to how anti-vaccine misinformation and content discouraging vaccination had started to proliferate on social media sites like Facebook.
In March of that year, Facebook announced new changes: It would reduce the distribution of anti-vaccine groups and pages in recommendations and search. The company promised that advertisers would no longer be able to “target” audiences of people interested in vaccine-related “controversies,” and said it would reject ads containing vaccine misinformation. In September 2019, after consulting with the World Health Organization (WHO), the company also launched an alert that pointed people searching for vaccine-related terms to the global public health authority.
Still, Facebook remained a safe place for anti-vaccine content. Groups were no longer supposed to surface in recommendations or search, but they were still allowed on the platform and still adding new members. And despite its ban on ads with vaccine misinformation, Facebook still allowed ads that pushed “opposition” to vaccines.
Unsurprisingly, anti-vaccine groups and pages continued to proliferate — and even crowdfund — on Facebook well into 2020.
Between May 2019 and May 2020, the nonprofit organization Avaaz found a whopping 3.8 billion views of health misinformation on Facebook, including false claims about vaccines that went unflagged by the platform’s fact-checkers. The Center for Countering Digital Hate (CCDH), which has researched the anti-vaccine industry on social media, told Recode that between March 2019 and December 2020, membership in the anti-vaccine social media groups it tracked grew from just over 650,000 to just under 800,000, an increase of more than 20 percent. Between the end of 2019 and the end of 2020, the group also found that followers of the anti-vaccine accounts it tracked grew by at least 1 million people on Facebook and 4 million on Instagram.
The Covid-19 pandemic only raised the stakes for an already significant problem. As the world got closer to approving and rolling out Covid-19 vaccines, staunch anti-vaccine groups and pages on Facebook began targeting a new audience of potential recruits: the millions of people nervous about the new coronavirus vaccines.
Robert F. Kennedy Jr., a longstanding anti-vaccine figure, used Instagram to elevate vaccine misinformation during the pandemic, including the false Bill Gates vaccine microchip theory. And in May 2020, “Plandemic,” a conspiracy video that accused Anthony Fauci of hiding research about supposed deleterious effects of vaccinations, went viral and drew millions of interactions on Facebook. Anti-vaccine groups and pages across the platform promoted unfounded theories as they emerged, including false claims that the Covid-19 vaccine might alter your DNA, or that the entire pandemic was a conspiracy. (While the Pfizer/BioNTech and Moderna vaccines are designed with mRNA, they do not alter DNA; Covid-19 is not a hoax.)
Karen Ernst, who leads a group called Voices for Vaccines, told Recode that against the backdrop of the pandemic, anti-vaccine messaging appears to have expanded into broader communities, like those organizing against mask-wearing and lockdown measures. “I hold Facebook hugely responsible,” she said of the proliferation of anti-vaccine communities online. “They have been really derelict.”
Lindenberger said that to this day, Facebook remains the place where his mom gets “almost 99 percent” of the vaccine misinformation she brings up, and she remains an active member of anti-vaccine groups on the platform. He says he struggles to trust any of Facebook’s pledges.
Facebook’s new rules may not be enough to combat deep-seated vaccine hesitancy
As part of its global vaccine effort — and in response to growing concern that misinformation could make people hesitant to receive the Covid-19 vaccine — Facebook has banned common strains of general vaccine misinformation. It consulted with groups like the WHO to come up with a list of specific vaccine claims users are not allowed to make, like the idea that measles can’t kill people or that vaccines cause autism. (In October 2020, Facebook had made a more limited pledge to ban ads that discourage getting vaccinated.)
Facebook said it has consulted with more than 60 health authorities and experts to design its policies. “For years, we’ve been working to stop misinformation on Facebook by reaching as many people as possible with accurate information about vaccines, removing content that breaks our rules, and reducing the distribution of false information,” Dani Lever, a spokesperson for Facebook, told Recode.
Between March and October of last year, the company removed 12 million instances of content that violated its ban on Covid-19 misinformation that could lead to imminent physical harm. Since its new policies were announced in February, the company has removed another 2 million false claims about Covid-19 and vaccines.
But experts told Recode that Facebook’s stronger rules against misinformation won’t be enough to fully address its anti-vaccine problem.
Facebook isn’t solely responsible for anti-vaccine ideology, which is as old as vaccination itself. And Facebook doesn’t have the unilateral power to stop anti-vaccine content. But the experts Recode spoke to said Facebook should have taken — or still can take — several key steps to make the conversation surrounding vaccines on Facebook healthier.
“They’ve already got rules in place — they just don’t enforce them,” Imran Ahmed, the CEO of the Center for Countering Digital Hate, told Recode. A study by his organization found that of more than 500 anti-vaccine posts reported to Facebook by volunteers as misinformation, less than 6 percent were eventually removed or flagged by the platform. He says that a more effective action Facebook could take is deplatforming anti-vaccine entrepreneurs and influencers who remain on the site.
“People are really great at creating workarounds,” Kolina Koltai, a vaccine misinformation-focused postdoctoral scholar at the University of Washington, told Recode, noting that she’s still able to find misinformation and anti-vaccine Facebook groups. Koltai pointed out that although Facebook in February finally removed the Instagram page of Robert F. Kennedy Jr., his Facebook profile — where he has more than 200,000 followers — remains up. Facebook says that just because someone’s account is disabled on one of its platforms doesn’t automatically mean that their account on another service is disabled, since these accounts may post different content.
“Saying that, ‘well, we’re removing false claims and that’s going to solve the problem’ — it’s barely scratching the surface of what’s going on here,” said David Broniatowski, a professor at George Washington University who has studied anti-vaccination communities online. He notes that often, anti-vaccine groups form communities by organizing around “health freedom” and against certain government policies encouraging or requiring vaccination, and don’t necessarily focus on scientific claims about vaccines’ efficacy.
Two days after Facebook announced its global inoculation campaign, Ernst, of Voices for Vaccines, says she found, in a well-known anti-vaccine group, a six-hour-old post from a woman asking whether she should purposely expose her children to chickenpox, an illness with potentially severe complications. Ernst says Facebook needs a better way to report people who might be putting their children in danger.
But more broadly, experts told Recode that focusing on a particular set of claims that Facebook has deemed false and worthy of removal won’t truly combat vaccine hesitancy. Facebook does not operate in a vacuum, they argue, and spreaders of anti-vaccine content can use Facebook to find an audience and then direct people to even worse content off the platform.
At the same time, pro-vaccine advocates who do outreach to vaccine-hesitant people now fear that Facebook’s takedown algorithms won’t be sensitive enough to understand the nuances of how people talk about vaccines, and might sometimes remove good vaccine information shared on the site. It has happened before: In 2019, for instance, as Facebook tried to better monitor vaccine content on its site, its algorithms appeared to erroneously remove pro-vaccine content from the Minnesota Hospital Association.
Facebook has acknowledged the murky questions its vaccine policies can raise. “[I]f your neighbor shares their personal experience of feeling sick beyond the normal side effects after getting a vaccine, is their Facebook post supposed to be removed?” wrote the company’s head of health in an op-ed this month. “There’s no perfect solution for this.”
Still, Facebook’s recent efforts are a good thing, even if effectively beating back anti-vaccine narratives on social media remains very difficult. Wendy Sue Swanson, a pediatrician who has been writing about vaccines online since 2009 and who has met with Facebook, says the company’s campaign appears to be “a multi-pronged approach” that tackles not only vaccine misinformation but also access and distribution.
“We should be applauding and helping in these efforts, not only criticizing these efforts,” she says. She emphasized that people who know the benefits of vaccines must amplify accurate pro-vaccine information in order to truly counteract vaccine hesitancy.
“[It’s] really easy for me to just share something that’s not true,” Swanson told Recode. “[It’s] really hard for me to dissect that, and then re-prove to you an opposite opinion using facts, particularly because facts aren’t emotional, and typically lives are.”