YouTube is introducing new features to its platform to support diverse communities and encourage respectful interactions. YouTube will warn users when a comment they are about to post may be offensive to others, giving them the option to reflect before posting. YouTube will also test a new filter in YouTube Studio for potentially inappropriate and hurtful comments that have been automatically held for review, so that channel owners won't have to look at those comments if they don't want to.
YouTube said that the platform is working to close any existing gaps in how its products and policies work for everyone, specifically the Black community. The new Community Guidelines reminder for potentially harmful comments is rolling out on Android. If commenters still want to post the comment after seeing the pop-up notification, they can do so, or they can choose to edit or delete it instead. Users who think they've been wrongly flagged can report this to YouTube through the same notification pop-up.
YouTube announced the new features in a blog post. The company said it had invested in technology to help its systems better detect and remove hateful comments by taking into account the topic of the video and the context of the comment. YouTube will also look into possible patterns of hate, harassment, and discrimination that could affect some communities more than others.
In an effort to identify gaps in the system that could limit a creator's opportunity to reach their full potential, YouTube will, starting next year, ask creators on a voluntary basis to provide their gender, sexual orientation, race, and ethnicity. YouTube will then examine how content from different communities is treated in its systems, such as search, discovery, and monetisation.