A team of researchers at universities in the U.S., Switzerland, and Brazil found that YouTube’s recommendation algorithm is likely to surface racist right-wing channels. The work is detailed in “Auditing Radicalization Pathways on YouTube,” a paper published on arXiv.org last week, which argues that consumption of “alt-right” content is a fitting proxy for radicalization because of its association with mass shootings and its promotion of intergroup conflict.
Researchers arrived at their findings after breaking videos into three groups — the Intellectual Dark Web (IDW), Alt-lite, and Alt-right — in an effort to discern varying degrees of racism and radicalization.
The audit defines IDW content as occasionally offensive or associated with homophobia, Islamophobia, or racism, while Alt-lite content is often the work of right-wing activists who flirt with ideas of white supremacy. Alt-right content openly espouses the creation of a white ethnostate.
The researchers then searched YouTube using keywords associated with each of the three communities and recorded the first 200 English-language results. They also added broadly related channels, such as those of well-known individuals associated with the alt-right.
“Our analyses show that, particularly through the channel recommender system, Alt-lite channels are easily discovered from IDW channels, and that Alt-right channels may be reached from the two other communities,” the audit report reads. “YouTube is extremely popular, especially among children and teenagers, and, if the streaming website is actually radicalizing individuals, this can push fringe ideologies like white supremacy further into the mainstream.”
The findings are based on a dataset manually annotated by labelers who each had more than 50 hours of experience watching alt-right content. Altogether, the analyzed dataset comprises more than 330,000 videos, 360 channels, 79 million comments, 2 million video recommendations, and 10,000 channel recommendations.
Of the 360 channels compiled, 90 were categorized as IDW, 114 as Alt-lite, and 88 as Alt-right; the remaining channels were mainstream media sources used as a control.
To account for potential location bias in video and channel recommendations, the researchers used VPNs to collect data. Three of the VPNs were located in the United States and two were in Canada, with one each in Brazil and Switzerland.
A proxy for radicalization
To track radicalization, the researchers categorized commenters as “lightly infected” if they left one or two comments on Alt-right videos, “mildly infected” if they left three to five, and “severely infected” if they left six or more. About 10% of commenters were lightly infected, and 4%, or more than 9,000 people, were mildly or severely infected.
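As a rough illustration of that bucketing rule, the thresholds can be expressed as a simple function over per-user comment counts. This is a minimal sketch based on the article's description; the function name and the toy input format are hypothetical, not taken from the paper.

```python
from collections import Counter

def infection_level(num_comments: int) -> str:
    """Bucket a commenter by how many comments they left on Alt-right videos.

    Thresholds follow the article's description: 1-2 comments is "light",
    3-5 is "mild", and 6 or more is "severe"; zero comments means no exposure.
    """
    if num_comments == 0:
        return "none"
    if num_comments <= 2:
        return "light"
    if num_comments <= 5:
        return "mild"
    return "severe"

# Hypothetical input: one (user_id, video_category) pair per comment.
comments = [("u1", "Alt-right"), ("u1", "Alt-right"), ("u2", "IDW"), ("u3", "Alt-right")] * 3
alt_right_counts = Counter(user for user, cat in comments if cat == "Alt-right")
levels = {user: infection_level(n) for user, n in alt_right_counts.items()}
print(levels)  # e.g. {'u1': 'severe', 'u3': 'mild'}
```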
Comments were used as a signal because the majority of comments were found to be in agreement with one another, and because in 2018 alt-right videos received roughly one comment for every five views.
“We argue that this finding comprises significant evidence that there has been, and there continues to be, user radicalization on YouTube,” the paper reads.
However, the researchers stopped short of declaring YouTube a radicalization pipeline, because their results could not account for personalization.
“In 2018, for all kinds of infections, roughly 40% of commenting users can be traced back from cohorts of users that commented only in Alt-lite or IDW videos in the past,” the report reads.
Observers note that many of the channels now dedicated to alt-right content got started with videos on very different topics, like video games or working out, but that politics became an “increasingly occurring” topic.
The audit also found significant user flow between the three categories, which increasingly share the same commenting base.
In an annual report last fall, the U.S. Department of Justice said hate crimes had risen for the third consecutive year; the researchers found a corresponding rise in YouTube views, likes, and videos in each of the three categories since 2015.
For their audit, the researchers carried out 10,000 simulations that followed the recommendation algorithm and collected video and channel recommendations using custom-made crawlers, gathering multiple rounds of recommendations: 22 rounds of channel recommendations and 19 rounds of video recommendations.
“If we follow the recommender system 5 times, approximately 1 out of each 25 times we will have spotted an Alt-right channel,” the researchers wrote.
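A minimal sketch of that kind of measurement, assuming a channel-to-channel recommendation graph with a label per channel, might look like the following. The graph, labels, and starting channel here are invented for illustration; the paper's actual crawling and sampling procedure is more involved.

```python
import random

# Hypothetical toy recommendation graph: channel -> recommended channels.
RECOMMENDATIONS = {
    "idw_a": ["idw_b", "altlite_a"],
    "idw_b": ["idw_a", "altlite_b"],
    "altlite_a": ["altlite_b", "altright_a"],
    "altlite_b": ["idw_a", "altright_a"],
    "altright_a": ["altright_a", "altlite_a"],
}
LABELS = {channel: channel.split("_")[0] for channel in RECOMMENDATIONS}

def walk_hits_altright(start: str, steps: int = 5) -> bool:
    """Follow random channel recommendations for `steps` hops and report
    whether an Alt-right channel was encountered along the way."""
    current = start
    for _ in range(steps):
        current = random.choice(RECOMMENDATIONS[current])
        if LABELS[current] == "altright":
            return True
    return False

# Estimate how often a 5-hop walk starting from an IDW channel reaches Alt-right content.
trials = 10_000
hits = sum(walk_hits_altright("idw_a") for _ in range(trials))
print(f"{hits / trials:.1%} of walks reached an Alt-right channel")
```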
Using content from a handful of mainstream news sources as a control, the researchers found much less migration from the control channels to the Alt-right, along with higher levels of engagement with Alt-right content, which by 2018 was receiving about one comment for every five views.
The researchers note that the trend toward radicalization may be helped along by skepticism toward mainstream media and the growing number of people getting news stories from their social media feeds.
The audit was performed by researchers from the Berkman Klein Center for Internet and Society at Harvard University, UFMG in Brazil, and EPFL in Switzerland. They suggest future work could account for personalization and analyze comment content more deeply to track word usage patterns and the evolution of tone.