People watch more than a billion hours of video on YouTube every day. Over the past few years, the video-sharing platform has come under fire for its role in spreading and amplifying extreme views.
YouTube’s video recommendation system, in particular, has been criticised for radicalising young people and steering viewers down rabbit holes of disturbing content.
The company claims it is trying to avoid amplifying problematic content. But research from YouTube’s parent company, Google, indicates this is far from straightforward, given the commercial pressure to keep users engaged via ever more stimulating content.
But how do YouTube’s recommendation algorithms actually work? And how much are they really to blame for the problems of radicalisation?
The fetishisation of algorithms
Almost everything we see online is heavily curated. Algorithms decide what to show us in Google’s search results, Apple News, Twitter trends, Netflix recommendations, Facebook’s newsfeed, and even pre-sorted or spam-filtered emails. And that’s before you get to advertising.
More often than not, these systems decide what to show us based on their idea of what we are like. They also use information such as what our friends are doing and what content is newest, as well as built-in randomness. All this makes it hard to reverse-engineer algorithmic outcomes to see how they came about.
Algorithms take all the relevant data they have and process it to achieve a goal - often one that involves influencing users’ behaviour, such as selling us products or keeping us engaged with an app or website.
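To make that concrete, here is a minimal, purely illustrative sketch (in Python) of how a recommender of this general kind might combine a few signals, such as personal interests, friends' activity, freshness and a little built-in randomness, into a single score. Every field name, weight and data point below is an assumption made for illustration; none of it is YouTube's actual code or feature set.

```python
import random

def score_item(item, user, now, noise=0.1):
    """Toy scoring function: combine personalisation, social and freshness
    signals into one number, plus a dash of randomness.
    All weights and field names are illustrative, not YouTube's."""
    personal = sum(user["interests"].get(tag, 0.0) for tag in item["tags"])
    social = 1.0 if item["id"] in user["friends_watched"] else 0.0
    freshness = 1.0 / (1.0 + (now - item["published"]))  # newer => larger
    return 2.0 * personal + 0.5 * social + 1.0 * freshness + random.uniform(0, noise)

def recommend(items, user, now, k=3):
    """Return the k highest-scoring items for this user."""
    return sorted(items, key=lambda it: score_item(it, user, now), reverse=True)[:k]

# Example usage with made-up data
user = {"interests": {"music": 0.9, "politics": 0.2},
        "friends_watched": {"vid2"}}
items = [
    {"id": "vid1", "tags": ["music"], "published": 9},
    {"id": "vid2", "tags": ["politics"], "published": 10},
    {"id": "vid3", "tags": ["music", "politics"], "published": 3},
]
print([it["id"] for it in recommend(items, user, now=10)])
```

Even in this toy version, the random term means two otherwise identical users can be shown different results, which hints at why algorithmic outcomes are so hard to reverse-engineer.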
At YouTube, the “up next” feature is the one that receives the most attention, but other algorithms are just as important, including search result rankings, homepage video recommendations, and trending video lists.
How YouTube recommends content
The main goal of the YouTube recommendation system is to keep us watching. And the system works: it is responsible for more than 70% of the time users spend watching videos.
When a user watches a video on YouTube, the “up next” sidebar shows videos that are related but usually longer and more popular. These videos are ranked according to the user’s history and context, and newer videos are generally favoured.
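As a rough, hypothetical sketch of that ranking step (again, the fields, weights and the 30-day recency window are assumptions for illustration, not anything YouTube has published), the logic might look something like this:

```python
from datetime import datetime, timedelta

def up_next(current_video, candidates, watch_history, now):
    """Illustrative 'up next' ranking: start from videos related to the one
    being watched, then order them by (assumed) personal relevance,
    popularity and recency. Weights and fields are invented for this sketch."""
    related = [v for v in candidates
               if set(v["topics"]) & set(current_video["topics"])]

    def rank(video):
        relevance = sum(1 for past in watch_history
                        if set(past["topics"]) & set(video["topics"]))
        popularity = video["views"] / 1_000_000            # crude popularity signal
        age_days = (now - video["published"]).days
        recency_boost = max(0.0, 1.0 - age_days / 30.0)    # favour videos under 30 days old
        return 1.5 * relevance + 1.0 * popularity + 2.0 * recency_boost

    return sorted(related, key=rank, reverse=True)

# Toy data
now = datetime(2019, 10, 1)
current = {"id": "a", "topics": ["news"]}
history = [{"id": "h1", "topics": ["news", "politics"]}]
candidates = [
    {"id": "b", "topics": ["news"], "views": 2_000_000, "published": now - timedelta(days=2)},
    {"id": "c", "topics": ["news", "politics"], "views": 500_000, "published": now - timedelta(days=40)},
    {"id": "d", "topics": ["cooking"], "views": 9_000_000, "published": now - timedelta(days=1)},
]
print([v["id"] for v in up_next(current, candidates, history, now)])
```

Because popularity and recency carry so much weight in a scheme like this, newer and more heavily watched videos tend to float to the top of the list.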
This is where we run into trouble. If more watching time is the central objective, the recommendation algorithm will tend to favour videos that are new, engaging and provocative.
Yet algorithms are just pieces of the vast and complex sociotechnical system that is YouTube, and there is so far little empirical evidence on their role in processes of radicalisation.
In fact, recent research suggests that instead of thinking about algorithms alone, we should look at how they interact with community behaviour to determine what users see.