YouTube has long been notorious for having one of the most active, yet toxic, offensive, and hateful comment sections on the web. Anyone who has spent time browsing through its endless videos knows that the comments section is a universe of its own.
Following the lead of other social media platforms, most recently Instagram with its anti-bullying tool, YouTube has announced that it will take extra measures to combat offensive and hateful comments.
The video giant is set to debut a new feature that prompts commenters to reconsider their words before publishing them. This may, of course, have limited effect, since bullies are unlikely to be stopped by a moral nudge. As explained in YouTube's blog post, the feature aims to keep the comments section clean by showing users a "Keep comments respectful" warning when it detects a potentially offensive comment about to be posted. It also directs users to the platform's Community Guidelines for a fuller explanation of what is considered appropriate. Users are encouraged to edit their comments, but they can still proceed and post them as written if they wish.
To further protect its digital environment, YouTube is also adding a filter that lets creators manage hateful comments: creators can choose to hide, approve, or report comments without having to read all of their content. These are meaningful steps that social platforms are taking to make digital spaces, which are very much a part of our lives, as safe as possible.