In a move sure to be applauded by some and held up by others as an example of tyranny, YouTube has banned all anti-vax videos from its platform.
The announcement came yesterday in a blog post, in which YouTube informed users that new medical misinformation guidelines were being put in place immediately. The site was already actively targeting anti-vax videos spreading lies about the COVID-19 vaccine, but the new policies broaden the scope of that scrutiny. The blog post specifies that any videos falsely claiming "approved vaccines" cause chronic health issues or do not work as intended will be removed, as will videos making false claims about what is in the approved vaccines. YouTube clarifies that this includes videos making false statements that "approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them."
The post also sought to redeem the company in the eyes of critics who had long been calling for these changes. The post claims this isn't the beginning of the company's fight against anti-vax videos, saying it has already removed upwards of 130,000 such videos since last year.
The post also notes that some content touching on vaccines will still be allowed, provided those videos don't violate other guidelines. For example, a testimonial video about experiencing a bad side effect from a vaccine would be allowed, as would a video about the history of failed vaccines.
According to CNBC, many high-profile anti-vax accounts have already been removed from the site, including Joseph Mercola, Erin Elizabeth, Sherri Tenpenny, and the Children's Health Defense Fund. However, YouTube's blog post warns that while the new policies go into effect immediately, it will take time for enforcement of them to "ramp up."
Some argue we've already waited long enough for YouTube's anti-vax videos to disappear. In response to YouTube's announcement, Slate's Aaron Mak wrote a piece outlining how the platform has been chronically "late to the party" when it comes to combatting misinformation. Mak points out that Facebook and Twitter's own bans on vaccine misinformation came in February and March respectively, with YouTube lagging behind for most of the year. And vaccines aren't the only area in which the platform falls behind: it was also significantly slower than its peers in dealing with voter fraud misinformation and QAnon conspiracy theories. This is particularly significant, argues Mak, because much of the misinformation begins on YouTube and then spreads to other platforms like Twitter and Facebook.
Of course, as Mak himself pointed out in March, YouTube isn't the only place to find that kind of misinformation. Just as those banned from Twitter and/or Facebook for spreading far-right misinformation are finding more welcoming homes on Parler and Gettr, the Canada-based Rumble has become a destination for those no longer welcome on YouTube. It's a sure bet that site will see some increased traffic in the weeks to come.