Some social networks have already put policies in place specifically against Covid-19 vaccine misinformation; others are still deciding on the best approach or are leaning on existing policies for Covid-19 and vaccine-related content. But making a policy is the easy part — enforcing it consistently is where platforms often fall short.
Facebook, Twitter and other platforms have their work cut out for them: The coronavirus and pending vaccines have already been the subject of numerous conspiracy theories, which platforms have taken action on or created policies about. These include false claims about the effectiveness of masks and baseless assertions that microchips will be implanted in people who get the vaccine.
Earlier this month, Facebook booted a large private group dedicated to anti-vaccine content. But many groups dedicated to railing against vaccines remain. A cursory search by CNN Business found at least a dozen Facebook groups advocating against vaccines, with membership ranging from a few hundred to tens of thousands of users. At least one group was centered specifically on opposition to a Covid-19 vaccine.
Brooke McKeever, an associate professor of communications at the University of South Carolina who has studied vaccine misinformation and social media, expects a rise in anti-vaccine content, which she called a “big problem.”
“The speed at which [these vaccines] were developed is a concern for some people, and the fact that we don’t have a history with this vaccine, people are going to be scared and uncertain about it,” she said. “They might be more likely or prone to believing misinformation because of that.”
That has real-world consequences. McKeever’s fear is that people won’t get the vaccine and Covid-19 will continue to spread.
The report said social media platforms have done the “absolute minimum.”
Here’s where the platforms stand on combating Covid-19 vaccine misinformation so far.
Facebook and Instagram
“We allow content that discusses Covid-19 related studies and vaccine trials, but we will remove claims that there is a safe and effective vaccine for Covid-19 until global health authorities approve such a vaccine,” a Facebook spokesperson said. “We’re also rejecting ads that discourage people from getting vaccinated.”
Twitter
A Twitter spokesperson said the company is still working through its policy and product plans ahead of “a viable and medically-approved vaccine” becoming available.
Since 2018, the company has used a prompt that directs users to a public health resource when they search for vaccine-related terms. In the US, it points people to vaccines.gov.
YouTube
A YouTube spokesperson said the company will continue to monitor the situation and update its policies as needed.
TikTok
TikTok said it removes misinformation related to Covid-19 and vaccines, including anti-vaccine content, both proactively and in response to user reports.
TikTok also works with fact-checkers including PolitiFact, Lead Stories, SciVerify and AFP to help assess the accuracy of content.
On videos related to the pandemic, whether or not they are misleading, TikTok applies a label that says “Learn the facts about Covid-19,” which links to a hub with information from sources such as the World Health Organization.