What YouTube is doing to stop the spread of misinformation

3 steps the platform is taking.

words by: Sahar Khraibani
Mar 25, 2022

YouTube has released a fresh overview of its developing efforts to counteract the spread of misinformation through YouTube clips, which sheds some light on the platform’s numerous issues and how it’s exploring options for dealing with them. It’s a serious problem: YouTube, like Facebook, is frequently identified as a source of false and potentially dangerous content, with its recommendations dragging users down ever-deeper rabbit holes of misinformation.


What YouTube is doing

YouTube says it’s working on the problem, and is focusing on three critical areas in this effort.



The first is identifying misinformation before it spreads, which YouTube says can be especially difficult with emerging conspiracy theories and disinformation campaigns because it can’t update its automated detection algorithms without a considerable volume of video to train its systems on.


Automated detection systems learn from examples, which works well for older conspiracy theories because YouTube has enough data to train its classifiers on what to detect and limit. Newly emerging narratives, however, offer no such body of examples, posing a fresh problem.
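To make the limitation concrete, here is a minimal, purely illustrative sketch of an example-based detector. Everything in it — the keyword model, the threshold, the sample phrases — is a hypothetical stand-in, not YouTube’s actual system; the point is simply that a detector trained on past examples has nothing to match against a brand-new narrative.

```python
from collections import Counter

def build_keyword_model(labeled_examples):
    """Collect word counts from texts already labeled as misinformation."""
    counts = Counter()
    for text in labeled_examples:
        counts.update(text.lower().split())
    return counts

def score(model, text, threshold=2):
    """Flag text if it shares enough words with the known examples."""
    hits = sum(1 for word in text.lower().split() if word in model)
    return hits >= threshold

# A well-documented old conspiracy theory: plenty of training examples.
model = build_keyword_model([
    "the moon landing was staged",
    "staged moon landing hoax",
])

print(score(model, "proof the moon landing was staged"))  # True: overlaps known examples
print(score(model, "new miracle cure suppressed"))        # False: emerging narrative, no training data
```

The second call fails to flag anything not because the content is benign, but because the model has never seen examples of it — which is exactly the cold-start gap YouTube describes for emerging conspiracy theories.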


YouTube says it’s looking into methods to improve its systems in this area, and minimize the spread of harmful content as it emerges, particularly around breaking news items. This, in theory, will increase its ability to recognize and limit emergent narratives, but this will remain a difficult task in many ways.


Cross-platform sharing and amplification

The second area of attention is the sharing and amplification of YouTube content beyond YouTube itself. The platform can make whatever modifications it wants within its own app, but once people re-share videos on other platforms or embed YouTube material on other websites, the spread becomes much harder for YouTube to control.


In a blog post, the platform shares: “One possible way to address this is to disable the share button or break the link on videos that we’re already limiting in recommendations. That effectively means you couldn’t embed or link to a borderline video on another site. But we grapple with whether preventing shares may go too far in restricting a viewer’s freedoms. Our systems reduce borderline content in recommendations, but sharing a link is an active choice a person can make, distinct from a more passive action like watching a recommended video.”


This is a crucial point: if a video spreads harmful misinformation but doesn’t technically violate the platform’s rules, how far can YouTube go in restricting it without crossing the line?


In the same blog post, they continued: “Another approach could be to surface an interstitial that appears before a viewer can watch a borderline embedded or linked video, letting them know the content may contain misinformation. Interstitials are like a speed bump – the extra step makes the viewer pause before they watch or share content. In fact, we already use interstitials for age-restricted content and violent or graphic videos, and consider them an important tool for giving viewers a choice in what they’re about to watch.”


Some may consider each of these proposals excessive, but they could help limit the spread of harmful content. None of these questions has a clear answer, but it’s fascinating to consider the numerous factors at play.


Expand efforts abroad

Finally, YouTube says it is expanding its anti-misinformation efforts abroad, since mindsets and approaches to information sources differ from region to region. Accounting for that regional nuance largely means hiring more staff in each location and building more specific content moderation centers and processes.


As always, there are no final answers, but it’s worth thinking about the various issues YouTube confronts as it attempts to improve its operations.


In related news, read more on YouTube Shorts, the platform’s answer to TikTok.


Photo via Chris Ratcliffe/Bloomberg via Getty Images