YouTube cracks down on anti-vaccine and coronavirus misinformation

  • The video-sharing giant said it would remove users and videos that spread false information claiming Covid-19 vaccines are dangerous
  • The policy will extend to videos about other routine immunisations, such as measles and Hepatitis B
Agence France-Presse

YouTube announced it would start removing misinformation about the coronavirus vaccine and other vaccines in general. Photo: Reuters

YouTube announced on Wednesday that it would remove videos, and some high-profile users, that falsely claim approved vaccines are dangerous, as social networks seek to crack down on health misinformation around Covid-19 and other diseases.

The video-sharing platform has already banned posts that spread false information about coronavirus treatments, including ones that share inaccurate claims about Covid-19 vaccines that have already been shown to be safe.

But the site said its concerns about the spread of medical conspiracy theories went beyond the pandemic.

“We’ve steadily seen false claims about coronavirus vaccines spill over into misinformation about vaccines in general,” YouTube said in a statement.

“We’re now at a point where it’s more important than ever to expand the work we started with Covid-19 to other vaccines.”

YouTube said the expanded policy will apply to “currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO (World Health Organisation).”

It will also remove false claims about routine immunisations for diseases like measles and Hepatitis B.

These include cases where vloggers claim that approved vaccines do not work, or wrongly link them to chronic health effects.

Content that “falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them” will also be taken down.

People in the US state of Ohio protest against Covid-19 vaccine mandates. In the US, there’s been a lot of resistance not only against vaccines, but masks and social distancing as well. Photo: AFP

“As with any significant update, it will take time for our systems to fully ramp up enforcement,” YouTube added.

It stressed there would be exceptions to the new guidelines, with personal testimonials of negative experiences with vaccines still allowed, so long as “the channel does not show a pattern of promoting vaccine hesitancy.”

YouTube said it had removed more than 130,000 videos since last year for violating its Covid-19 vaccine policies.

On Tuesday, the company told German media that it had blocked the German-language channels of Russia’s state broadcaster RT for violating its Covid misinformation guidelines.

YouTube said it had issued a warning to RT before shutting the two channels down, but the move has prompted a threat from Moscow to block the video site.

YouTube is not the only social media giant grappling with how to handle the spread of Covid-19 conspiracy theories and medical misinformation more broadly.

Earlier this month, Facebook launched a renewed effort to tackle extremist and conspiracy groups, beginning by taking down a German network spreading Covid misinformation.
