Study Finds Extremist YouTube Content Mainly Viewed by Those Seeking it Out

YouTube rabbit holes are rare, but an SIO Scholar finds the platform can still help alternative and extremist channels build audiences.

Stanford Internet Observatory Postdoctoral Scholar Ronald Robertson co-authored research that finds YouTube’s algorithms rarely send users down “rabbit holes” of alternative or extremist content, but do recommend that content to a relatively small audience that actively seeks it out.

The study, published today in the leading academic journal Science Advances, also highlights that despite the lack of evidence for a rabbit hole effect, YouTube helps extremist channels build audiences of people who are seeking out potentially harmful views about gender, race, and other topics.

The research probes a 2019 change made by YouTube to reduce recommendations for harmful or misleading content while still allowing those videos to remain available on the site. The study tracks how participants engage with and are exposed to potentially harmful content on the platform, and assesses how exposure to that content is affected by participants’ racial or gender resentment, YouTube channel subscriptions, and browsing histories.

The team of distinguished social and computer scientists found that extremist videos were rarely recommended to users who did not already subscribe to channels producing those videos.

The study examined a “rabbit hole” effect in which users follow algorithmic recommendations to videos more extreme than the one they were watching, from channels they were not already subscribed to. During the 2020 observation period, only 3% of participants experienced a rabbit hole. Although this percentage is small, YouTube operates at a massive scale as the most-used social media platform in the United States.

Drawing on nearly 1,200 participants, the study finds that only a small group sought out extreme or hateful content that reinforced their existing beliefs. Most views of that content came from a few individuals, who tended to express high levels of hostile sexism and racial resentment, subscribe to alternative or extremist channels, and reach the videos through external sources.

The study concludes that while unintentional exposure to hateful content from the YouTube algorithm was rare in their 2020 dataset, the site still hosts alternative and extremist content and channels, and helps those channels build dedicated audiences.
