YouTube says it’s better at removing videos that violate its rules, but those rules are in flux


YouTube is shedding new light on how it moderates its sprawling video platform, which has billions of views each day.

On Tuesday, the company released for the first time a statistic called the “violative view rate,” a new data point YouTube plans to include in its Community Guidelines enforcement reports. In the last quarter of 2020, for every 10,000 views on the platform, about 16 to 18 were of videos that violate YouTube’s rules, which currently forbid everything from hate speech to medical misinformation about Covid-19 to spam.
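For scale, that works out to well under 1 percent of all views. Here’s a minimal sketch of the conversion, using the figures from YouTube’s announcement (the helper function is ours, for illustration):

```python
# Convert YouTube's "violative view rate" from views-per-10,000 to a
# percentage. The 16-18 range is the figure YouTube reported for Q4 2020.
def per_ten_thousand_to_percent(views: float) -> float:
    return views / 10_000 * 100

low, high = 16, 18
print(f"{per_ten_thousand_to_percent(low):.2f}%")   # prints 0.16%
print(f"{per_ten_thousand_to_percent(high):.2f}%")  # prints 0.18%
```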

In a blog post published on Tuesday, YouTube argues those stats are a sign of progress, sharing that the “violative view rate” is down 70 percent since 2017 thanks to improvements in the artificial intelligence the company uses for content moderation. “We’ve made a ton of progress, and it’s a very, very low number,” YouTube’s product management director for trust and safety, Jennifer Flannery O’Connor, told reporters, “but of course we want it to be lower, and that’s what my team works day in and day out to try to do.”

YouTube shared this new information as politicians and users have grown increasingly concerned about how technology companies are moderating their platforms amid an “infodemic” of Covid-19 misinformation, and following the insurrection at the US Capitol and a presidential election cycle last year that was marked by conspiracy theories.

At the same time, YouTube’s stats on violative content bolster a narrative some YouTube executives have promoted in the past: that its systems generally do a good job of catching bad content and that, overall, the problem of nefarious videos on its site is comparatively small. YouTube also said on Tuesday that its automated flagging systems catch 94 percent of content that breaks its rules, and that the vast majority of those videos are taken down before they get 10 views. Overall, YouTube claims it has removed more than 83 million videos since it started releasing enforcement transparency reports three years ago.

“We have a large denominator, meaning we have lots of content,” CEO Susan Wojcicki told Recode back in 2019. “When we look at it, what all the news and the concerns and the stories have been about [is] this fractional 1 percent.”

But the numbers YouTube released on Tuesday have limitations. Here’s how the company calculated them: YouTube samples a set of views, meaning instances in which a user watches a particular video (YouTube did not release the number of videos that factored into this statistic). It then sends the videos getting those views to its content reviewers, who evaluate each one and determine which violate the company’s rules, allowing YouTube to estimate the percentage of views that happened on “violative videos.”
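To make that process concrete, here’s a rough simulation of how such a sampled estimate behaves. Everything in it is hypothetical: the “true” rate, the sample size, and the reviewer stand-in are assumptions for illustration, not YouTube’s actual methodology or numbers.

```python
import random

# Hypothetical sketch: estimate a violative view rate by sampling views
# and having a reviewer label each sampled video. Both constants below
# are made up for illustration.
TRUE_VIOLATIVE_RATE = 0.0017   # roughly 17 views per 10,000
SAMPLE_SIZE = 100_000          # number of views sampled for review

def reviewer_labels_violative() -> bool:
    """Stand-in for a human reviewer judging one sampled view."""
    return random.random() < TRUE_VIOLATIVE_RATE

violative_views = sum(reviewer_labels_violative() for _ in range(SAMPLE_SIZE))
estimate = violative_views / SAMPLE_SIZE
print(f"Estimated violative view rate: {estimate * 10_000:.1f} per 10,000 views")
```

The point of the sketch is that the published number is an estimate derived from a sample, so its precision depends on how many views get reviewed, a detail the company did not fully disclose.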

Keep in mind that YouTube’s own reviewers — not independent auditors — decide what counts as a violation of YouTube’s guidelines. While Facebook last year committed to an independent audit of its community standards enforcement metrics, Flannery O’Connor said on Monday that the video platform had yet to make a similar commitment.

YouTube is often slow to decide what types of controversial content it will ban. The platform didn’t change its hate speech policy to ban neo-Nazi content and Holocaust denial until 2019. And while researchers had warned about the spread of the right-wing conspiracy theory QAnon for years, YouTube only moved to ban “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence” in October of last year.

There’s also a lot of content that YouTube doesn’t take down. Some videos don’t violate the company’s rules but skirt the line, and some critics believe they shouldn’t be permitted on the platform. YouTube sometimes calls these kinds of controversial videos “borderline content.” Given how huge YouTube is, it’s hard to study just how prevalent borderline content is. But we know it’s there: the company has, for example, kept up videos containing election misinformation.

A major example of YouTube declining to remove offensive and harmful content came in 2019, when the company faced outcry after deciding to leave up videos from conservative YouTuber Steven Crowder that included racist and homophobic harassment of then-Vox journalist Carlos Maza. (Under intense pressure, YouTube eventually took away Crowder’s ability to run ads.) Later that year, Wojcicki told creators that “[p]roblematic content represents a fraction of 1 percent of the content on YouTube” but that it had a “hugely outsized impact.”

YouTube does remove ads from creators who post content that violates the platform’s monetization rules, and it does down-rank borderline content. But YouTube isn’t releasing similar stats on how prevalent this type of content is or how many views it typically gets.

As to why YouTube is releasing this particular statistic right now, Flannery O’Connor said the company had used the number internally for several years to study YouTube’s progress on safety and spikes in views of violative videos, and to set goals for its machine learning team. “We felt like [it’s] best to just be transparent and use the same metrics internally and externally,” she said.

YouTube’s announcement is part of a broader pattern of social media companies saying that their platforms are not, in fact, dominated by nefarious content — while critics, researchers, and journalists continue to point to the large number of views and clicks such content often attracts. Even when YouTube removes these videos, they sometimes have already succeeded in sharing harmful ideas that spread off the platform — for instance, the Plandemic video, which spread false Covid-19 conspiracies last year, captured millions of views on the platform before it was taken down.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.