
YouTube’s ‘recommendations’ still pushing harmful videos

YouTube’s recommendation algorithm is still reportedly surfacing videos featuring misinformation, violence, hate speech, and other content that violates its own policies.

With more than 2 billion monthly visitors and a billion hours of content viewed every day, YouTube remains the undisputed mecca of long-form video content in 2021.

Outside of bingeing our favourite channels on loop, many of us lurch from video to video via the platform’s recommendation algorithm – which remains embroiled in long-running controversy.

The code behind this recommendation feature is designed to keep us glued to YouTube for as long as possible, but it has come under fire in recent years for pushing viewers towards content rife with speculation and light on facts.

We’ve all been tempted by those gaudy thumbnails on conspiracy videos, am I right? What do you mean no?

Driving 70% of what users watch, this machine-learning system has landed YouTube in hot water several times recently.
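YouTube has never published its ranking code, but the shape of the criticism is easy to sketch: a system optimised purely for predicted watch time has no concept of whether a video is accurate. The toy Python below is our own illustration – every name, field, and number in it is invented, not YouTube’s actual system – of how accuracy simply never enters that kind of sort.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical model output
    factual_accuracy: float         # 0 to 1; deliberately unused below

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    # Rank purely on predicted engagement – accuracy never enters the sort.
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]

feed = [
    Video('Measured ten-minute news explainer', 4.2, 0.9),
    Video('Lurid 40-minute conspiracy deep-dive', 11.7, 0.1),
]
for video in recommend(feed):
    print(video.title)  # the conspiracy video comes out on top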

Despite multiple statements that it’s working to promote ‘authoritative and popular videos,’ reports emerging this week suggest the platform is still promoting videos heavy in misinformation, violence, hate speech, and other content in violation of its own policies.

Mozilla Foundation, the American non-profit behind the Firefox web browser, headed up a crowdsourced investigation to find out exactly how much progress YouTube has made on that front.


The extent of the issue today

Impressively, Mozilla managed to gather 37,000 YouTube users to act as watchdogs on the lookout for harmful or extreme content between July 2020 and May 2021.

Volunteers who stumbled across recommended videos that should have already been caught – for misinformation, political conspiracy theories, violent or graphic content, or sexualised content disguised as children’s cartoons – flagged them through ‘regret reports.’

These reports were filed through a dedicated browser form and eventually sent to the University of Exeter for analysis. In total, 3,362 ‘regrettable’ videos were recorded across 91 countries, and 71% of the reports came directly from suggestions pushed by YouTube’s algorithm.

9% of the videos recorded in this study – a total of 189 – were later removed by YouTube, many of them having been actively recommended to those browsing the site. Worryingly, reported videos generated around 70% more views per day than videos volunteers found through active searches.
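For anyone who wants to sanity-check the headline figures, the arithmetic is simple enough to run yourself. Only the 71% and 70% shares below come from the report; the baseline view count is a hypothetical we’ve picked for illustration.

# Quick arithmetic on the study's quoted figures.
total_reports = 3362
from_recommendations = 0.71 * total_reports
print(f'Reports traced to algorithmic suggestions: ~{from_recommendations:.0f}')

# A reported video gathering '70% more views per day' than a searched one:
searched_views_per_day = 10_000  # hypothetical baseline, not from the report
recommended_views_per_day = searched_views_per_day * 1.7
print(f'Equivalent recommended-video views per day: {recommended_views_per_day:,.0f}')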

‘That is just bizarre,’ stated Mozilla’s senior manager of advocacy Brandi Geurkink. ‘The recommendation algorithm was actually working against their own abilities to police the platform.’

Reports from countries where English isn’t the primary language came in at a rate 60% higher than from majority-English-speaking countries – with Brazil, Germany, and France registering particularly high numbers.

Unsurprisingly, given the global vaccine rush over the last year, misinformation made up the majority of regrettable reports overall.


What’s next for YouTube?

While this study’s validity is questionable (it recruited willing volunteers rather than a random sample of YouTube’s audience), it’s clear that YouTube still has a sizeable content problem to contend with.

Having made a reported 30 revisions to the recommendation system over the last year, YouTube states that suggested videos drive 200 million views a day and draw on more than 80 billion pieces of user data. That’s a lot to keep on top of.

With this in mind, Mozilla has provided YouTube with some advice that may help to limit the load, as well as tips for those at risk of consuming harmful content.

The report authors believe YouTube should set up independent audits of its algorithms to assist in finding a comprehensive fix, and stated that data transparency is key from YouTube’s end. ‘If YouTube won’t release us data, we can just ask people to send it to us instead,’ stated Geurkink.

Mozilla suggests that policymakers should now demand performance data from any automated system able to amplify content on a wide scale online. YouTube, to date, has been very coy on its quarterly engagement metrics.

‘We can’t continue to have this paradigm where researchers raise issues, companies say “OK, it’s solved,” and we go on with our lives,’ Geurkink concluded.

Lastly, and most crucially, Mozilla believes users should have the option to opt out of personalisation features like recommended videos.

If you want to limit the chance of inciting clickbait or graphic content popping up during your next visit to YouTube, Mozilla advises reviewing your ‘watch’ and ‘search’ history and clearing anything you don’t want influencing future recommendations.
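If you’d rather audit that history in bulk before pruning it in YouTube’s own settings, Google Takeout lets you export it. The sketch below assumes the JSON export layout (a list of entries with ‘title’ and ‘subtitles’ fields) – check it against your own file, as the format isn’t guaranteed – and simply tallies which channels dominate your viewing.

import json
from collections import Counter

# Summarise a Google Takeout 'watch-history.json' export (assumed layout).
with open('watch-history.json', encoding='utf-8') as f:
    history = json.load(f)

channels = Counter(
    entry['subtitles'][0]['name']
    for entry in history
    if entry.get('subtitles')  # some entries (e.g. deleted videos) lack one
)
for name, count in channels.most_common(10):
    print(f'{count:5d}  {name}')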

YouTube is no stranger to controversy, but with this report out in the ether it will have to respond in a big way. In the meantime, safe browsing all.
