YouTube wants the world to know that it's doing a better job than ever of enforcing its own moderation rules. The company says that fewer people than ever are watching problematic videos on its site, such as videos containing graphic violence, scams, or hate speech, before they're removed.
During the last months of 2020, as many as 18 out of every 10,000 views on YouTube were on videos that violated company policies and should have been removed before anyone could see them. That's down from 72 out of every 10,000 views in the fourth quarter of 2017, when YouTube first started tracking the figure.
But the numbers come with an important caveat: while they measure YouTube's performance in limiting the spread of concerning clips, they are ultimately based on which videos YouTube believes should be removed from its platform, and the company still allows some obviously troubling videos to stay up.
The statistic is a new addition to YouTube's Community Guidelines Enforcement Report, a quarterly transparency report detailing the types of videos removed from the platform. The new figure is called the Violative View Rate, or VVR, and it tracks how many views on YouTube happen on videos that violate its guidelines and should be removed.
The figure is essentially a way for YouTube to measure how well it is doing at moderating its own site, based on its own rules. The higher the VVR, the more violative videos are being watched before YouTube can catch them; the lower the VVR, the better YouTube is doing at removing banned content.
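To make the arithmetic concrete, here is a minimal sketch in Python of how a rate like the VVR can be expressed per 10,000 views, using the figures cited above. The function name and the straight ratio calculation are illustrative assumptions for this example, not YouTube's published methodology.

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Share of views that landed on rule-breaking videos,
    scaled to 'views per 10,000' as in YouTube's report.
    (Illustrative only; not YouTube's actual methodology.)"""
    if total_views == 0:
        return 0.0
    return violative_views / total_views * 10_000

# Figures cited in the article: roughly 18 violative views per 10,000
# in late 2020, down from 72 per 10,000 in Q4 2017.
print(violative_view_rate(18, 10_000))   # 18.0
print(violative_view_rate(72, 10_000))   # 72.0
```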
YouTube shared a chart showing how the number has fallen since it began measuring it for internal use:
The sharp drop from 2017 to 2018 came after YouTube began relying on machine learning to detect problematic videos, rather than waiting for users to report them, said Jennifer O'Connor, YouTube's product director for trust and safety, during a briefing with journalists. The goal is to "get that number as close to zero as possible."
Videos that violate YouTube's advertising guidelines but not its broader community guidelines are not counted in the VVR figure, since they don't warrant removal. Nor does the figure account for so-called "borderline content," videos that brush up against the rules without actually breaking them.
O'Connor said the YouTube team uses the figure internally to understand how well the company is doing at protecting users from concerning content. If it ticks up, YouTube can try to figure out what kinds of videos are slipping through and prioritize developing its machine learning to catch them. "Our team's North Star is to keep users safe," O'Connor said.