News & Events, Tech

YouTube removing 30,000 COVID-19 vaccine misinformation videos

Only use reliable sources for vaccine information.

words by: Sahar Khraibani
Mar 25, 2021

The first wave of vaccination misinformation started surfacing on YouTube in October 2020. Now, in a move to combat misinformation and encourage people to get vaccinated, YouTube is removing close to 30,000 videos that propagate false claims about the COVID-19 vaccines.


The roughly 30,000 videos, uploaded over the last six months, were found to violate YouTube’s misinformation policies.


According to a recent report, the platform has taken down more than 800,000 videos containing misleading or false information about the coronavirus since February 2020. In October, it expanded its COVID-19 medical misinformation policy to cover vaccine-related claims. Videos are first flagged by the platform’s AI systems and human reviewers, and then go through another round of review before being taken down.


YouTube’s policy defines pandemic misinformation as any claim that contradicts the guidance of health experts and reputable authorities such as the World Health Organization. When these rules are violated, the company applies a “strike system” that can result in users’ accounts being permanently disabled and banned.


In the policy statement on its website, YouTube details:


“YouTube doesn’t allow content about COVID-19 that poses a serious risk of egregious harm. YouTube doesn’t allow content that spreads medical misinformation that contradicts local health authorities’ or the World Health Organization’s (WHO) medical information about COVID-19. This is limited to content that contradicts WHO or local health authorities’ guidance on:

  • Treatment
  • Prevention
  • Diagnosis
  • Transmission
  • Social distancing and self-isolation guidelines
  • The existence of COVID-19

What this policy means for you if you’re posting content:

Don’t post content on YouTube if it includes any of the following:

Treatment misinformation:

  • Content that encourages the use of home remedies, prayer, or rituals in place of medical treatment such as consulting a doctor or going to the hospital
  • Content that claims that there’s a guaranteed cure for COVID-19
  • Other content that discourages people from consulting a medical professional or seeking medical advice

Prevention misinformation: Content that promotes prevention methods that contradict local health authorities or WHO.

  • Claims that there is a guaranteed prevention method for COVID-19
    • Claims that any medication or vaccination is a guaranteed prevention method for COVID-19
  • Claims about COVID-19 vaccinations that contradict expert consensus from local health authorities or WHO
    • Claims that an approved COVID-19 vaccine will cause death, infertility, or contraction of other infectious diseases
    • Claims that an approved COVID-19 vaccine will contain substances that are not on the vaccine ingredient list, such as fetal tissue
    • Claims that an approved COVID-19 vaccine will contain substances or devices meant to track or identify those who’ve received it
    • Claims that an approved COVID-19 vaccine will alter a person’s genetic makeup
    • Claims that any vaccine causes contraction of COVID-19
    • Claims that a specific population will be required (by any entity except for a government) to take part in vaccine trials or receive the vaccine first”


To stay up-to-date on the latest COVID-19 news, visit