New data released by YouTube reveals that most content violating the platform's community guidelines is removed before receiving a substantial number of views. The information comes as part of YouTube's Community Guidelines Enforcement Report, an in-depth look at all of the content that gets removed from YouTube for violating its policies. Overall, the report suggests that YouTube is handling violative content well and continuing to make adjustments to further combat the issue.

YouTube's Community Guidelines Enforcement Report was introduced in 2018 as a way for the company to show accountability with regard to the content that's allowed on the incredibly popular streaming platform. It discloses the number of videos removed by YouTube, how those videos were identified, why specifically they were removed, and other statistics. The latest metric YouTube is releasing points to increased efforts to push back against problematic content.

Related: Why YouTube Disabled The Number Of Video Dislikes For Some Viewers

In a blog post, YouTube's Trust and Safety Director, Jennifer O'Connor, confirmed that 75 percent of the content YouTube flags is removed before reaching ten views. Furthermore, a new metric called the Violative View Rate shows that only 16 to 18 of every 10,000 views on YouTube are of violative content. Impressively, those numbers are down 70 percent since YouTube began measuring the statistic back in 2017, according to O'Connor.
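To put those figures in perspective, the reported Violative View Rate can be expressed as a percentage of all views. The snippet below is an illustrative sketch using the bounds from the report (16 and 18 per 10,000 views); the function name is ours, not anything YouTube publishes.

```python
# Illustrative only: convert YouTube's reported Violative View Rate (VVR)
# bounds into percentages. The 16-18 per 10,000 figures come from the
# report; the helper name is hypothetical.

def vvr_percent(violative_views: int, total_views: int = 10_000) -> float:
    """Express a violative-view count per `total_views` as a percentage."""
    return 100 * violative_views / total_views

low = vvr_percent(16)
high = vvr_percent(18)
print(f"VVR is between {low}% and {high}% of all views")
```

In other words, fewer than two views in every thousand land on content that breaks the rules.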

YouTube Making Progress, But There's Room For Improvement


Clearly, YouTube is actively working to remedy a very real problem the platform has with content. The numbers released in the Community Guidelines Enforcement Report suggest that its efforts, like automated moderation technology, are having a significant impact on keeping users from viewing violative or potentially harmful content. However, YouTube's job is far from done. O'Connor acknowledges as much, stating that it's important for YouTube's Trust and Safety team to stay vigilant about taking the necessary steps to crack down on violative content. After all, 25 percent of violative videos still receive at least ten views before removal. Given that 500 hours of content are uploaded to the platform every single minute, that's still a sizable amount to moderate.

While YouTube has been caught up in controversy over political videos and accounts that violate its guidelines in recent months, it deserves recognition for the moves it's making to tackle harmful content overall. Transparency among tech giants isn't common, and YouTube is helping to change that by releasing these metrics and statistics. In doing so, it's also helping to make YouTube a safer space for everyone on the internet to enjoy.

Next: How To Mute Gambling & Alcohol Ads On YouTube

Source: YouTube