YouTube has released its latest Community Guidelines Enforcement report, which outlines the actions the platform took over rule violations between April and June 2020.
YouTube’s enforcement efforts in the most recent quarter were impacted by COVID-19, with the platform noting that it had to rely more heavily on automated systems to detect potential standards violations, as its human moderation capacity was reduced by shutdowns in different regions.
“Earlier this year, we shared some of the steps we have taken to protect our employees and extended workforce during the COVID-19 pandemic. One major step was to rely more on technology to quickly identify and remove content that violates our Community Guidelines so that our teams that review content could safely remain at home. The second quarter of 2020 was the first full quarter we operated under this modified enforcement structure. Because of choices we made to prioritize the safety of the community, we removed the most videos we’ve ever removed in a single quarter from YouTube.”
That also likely means that some content was removed in error, which YouTube acknowledges, but it says it chose to over-enforce rather than let more problematic content slip through, especially in certain contexts.
“For certain sensitive policy areas, such as violent extremism and child safety, we accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible. This also means that, in these areas specifically, a higher amount of content that does not violate our policies was also removed.”
That’s no doubt been a headache for YouTube creators, but with moderation centers slowly coming back to full capacity, it shouldn’t be a long-term concern.
By the numbers, YouTube removed 11,401,696 videos for rule violations in Q2, with the vast majority of them being automatically flagged by its systems.
Child safety concerns were the biggest cause of removal, with spam and nudity/sexualized content also among the key reasons.
YouTube has been working to update its guidelines on child safe content after the US Federal Trade Commission hit the platform with a record $170 million penalty last year as part of a settlement over an investigation into the privacy of children’s data on the Google-owned video site.
YouTube has since revised its data collection processes on such clips, while its updated approach has also caused some headaches for creators seeking to monetize their content. The numbers here reflect that ongoing push, with the platform still evolving its policies to better protect younger users.
Hateful or abusive content made up 80k removals (0.7% of the total), while almost a million videos were removed for promoting violent extremism or similar (8.1%).
Interestingly, YouTube doesn’t list videos removed due to misinformation. Like all platforms, YouTube has been working to address the spread of COVID-19 misinformation across its network, and it has been the focus of several investigations into conspiracy theory rabbit holes driven by recommended videos on certain subjects.
This will be a key area for YouTube to address going forward, so it’s likely to become more of a focus in future enforcement reports.
By region, the US saw the most videos removed, followed by India, which makes sense given that the two are among YouTube’s biggest user markets. In addition, YouTube removed more than 2 billion comments in the period, with over 80% of those removals due to spam or harassment.
The numbers show that YouTube is indeed working to keep offensive content off its platform, while its automated systems are getting better at detecting violations, and erasing them before they’re ever seen.
YouTube still has areas of concern to address, as noted, but the figures here underline the sheer volume of issues the platform has to deal with. That’s a consequence of scale, of course, and therefore an inevitable element of growth, but it also highlights the lengths every platform needs to go to in order to manage it.
You can read YouTube’s full Community Guidelines Enforcement report for Q2 2020 here.