YouTube to Warn Users About Offensive Comments

YouTube has announced new measures aimed at combating potentially offensive comments on its platform. According to YouTube Vice President Johanna Wright, the company will test a filter in YouTube Studio that automatically holds potentially inappropriate or offensive comments for review, so creators won't have to read them unless they choose to. YouTube is also improving its comment moderation tools to make reviewing held comments easier for creators.

New Feature to Encourage Thoughtful Commenting

Another new feature will warn users if the comment they are about to post might be offensive to others. The company wants to give users a chance to reconsider before publishing messages that could be seen as crossing the line. These prompts won’t appear for every comment, but will pop up when YouTube’s system detects content that seems similar to offensive language. When the warning appears, users can either go ahead and post the comment or take extra time to edit it.

Improved Detection and Support for Diverse Communities

“In addition, we have invested in technology that helps our systems better identify and remove hateful comments, taking into account the video’s topic and the content of the comment,” Wright added.

To support communities that face discrimination, starting in 2021 YouTube will ask content creators to voluntarily share their gender, race, ethnicity, and sexual orientation. This will help the platform better analyze how videos from creators in different communities perform and are treated on the site.
