Less than 2% of users watch the majority of fringe YouTube content

By Chris Stokel-Walker

The vast majority of views on the worst kinds of YouTube videos—including extremist and fringe conspiracy content—come from people who already have a tendency to hold such beliefs, according to a new study.

An analysis by researchers from across the United States, published today in Science Advances, suggests the fear that YouTube's recommendations sent users down rabbit holes of conspiracy theories and extremism may be overblown—at least for the vast majority of people. (The authors could not rule out that such rabbit holes existed prior to 2019, when YouTube carried out a massive change to its recommendation algorithm.)

“There’s been a lot of concern about the potentially harmful effects of social media,” says Brendan Nyhan, a professor at Dartmouth College and one of the coauthors of the paper. “YouTube has been a central focus of those concerns, with people suggesting it may be radicalizing thousands, hundreds of thousands, or even millions of people. It seemed important to empirically evaluate what people were seeing on the platform—and how they were seeing it.”

To analyze whether YouTube radicalized users through its recommendations, the researchers tracked the YouTube habits of 1,181 Americans, oversampling people who had previously expressed extreme views in survey responses. The average participant was tracked for 133 days using a browser extension that recorded which YouTube videos they watched. Around 15% of participants viewed at least one video from what the authors describe as an “alternative” channel (for example, Joe Rogan), while around 6% watched at least one video from an extremist channel.

According to the study, 1.7% of participants accounted for 80% of the total watch time on alternative channels, while just 0.6% of participants accounted for 80% of the watch time on extremist channels.

“The story we observe is consistent with one where the people consuming potentially harmful content on YouTube already have more extreme attitudes and are—at least, as far as we’re able to evaluate—seeming to seek that type of content out,” says Nyhan. “That doesn’t excuse the company’s role in providing that content to them. But it’s a different problem than the one the conversation today has focused on.”

Jeremy Blackburn, a computer science professor at Binghamton University who wasn’t involved in the Science Advances study, echoes Nyhan’s point. “This paper provides very strong evidence that, at least with the major changes YouTube made, there is a very low chance that the recommendation algorithm is trapping people,” he says. 

YouTube spokesperson Elena Hernandez tells Fast Company: “Recommendations are a critical way we deal with misinformation on the platform by connecting people to high-quality content. In recent years, we’ve overhauled our recommendation systems to lower consumption of borderline content that comes from our recommendations to significantly below 1%.”

Hernandez adds that videos can still get views through other sites linking to, or embedding videos from, YouTube. And that’s a real concern for Nyhan. “I’m more worried, based on our data, about people who already have extreme views consuming very large quantities of potentially harmful content that might inspire them to take actions in the real world,” he says, “or might otherwise contribute to their extremism taking some more consequential or malevolent form.”

Fast Company
