YouTube will remove videos with COVID-19 vaccine misinformation

It’s an expansion of the platform’s COVID-19 Medical Misinformation Policy

Illustration by Alex Castro / The Verge

Videos containing COVID-19 vaccine misinformation will be removed from YouTube, the platform announced today. Content about a vaccine that contradicts information from health experts or the World Health Organization will not be permitted.

“A COVID-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a COVID-19 vaccine,” Farshad Shadloo, a YouTube spokesman, said in an email. That could include false claims that vaccines implant microchips in people’s bodies, for example, or that they cause infertility. Both rumors are untrue.

The new guidelines are an expansion of YouTube’s existing COVID-19 Medical Misinformation Policy, which doesn’t allow videos that falsely suggest the coronavirus doesn’t exist, that discourage mainstream medical care for the disease, or that say the virus is not contagious. The highly contagious virus does exist, and alternative, unproven remedies can be dangerous.

YouTube demonetized videos that promoted anti-vaccination information in 2019.

On Tuesday, Facebook announced its own crackdown on anti-vaccination content: it’s not allowing ads that discourage vaccination. “We don’t want these ads on our platform,” the company said. Ads are as far as the policy goes, though, and organic posts from anti-vaccine groups will still be permitted.

The platforms’ policies come as clinical trials of COVID-19 vaccines inch closer to completion. Public trust in those vaccines is low. President Donald Trump has made public statements pushing for a vaccine by Election Day, and many people in the US think the development process is political, not scientific. Anti-vaccine groups are feeding off that mistrust.