YouTube’s recommendation algorithm reportedly continues to surface videos featuring misinformation, violence, hate speech, and other content that violates the platform’s own policies.
With more than 2 billion monthly visitors and a billion hours of content viewed every day, YouTube remains the undisputed mecca of long-form video content in 2021.
Outside of bingeing our favourite channels on loop, many of us lurch from video to video via the platform’s recommendation algorithm – which remains embroiled in long-running controversy.
The code behind this recommendation feature is designed to keep us glued to YouTube for as long as possible, but it has come under fire in recent years for pushing viewers towards content rife with speculation and light on facts.
We’ve all been tempted by those gaudy thumbnails on conspiracy videos, am I right? What do you mean no?
Driving 70% of what users watch, this machine-learning system has landed YouTube in hot water several times recently.
Despite YouTube’s repeated statements that it is working to promote ‘authoritative and popular videos,’ reports emerging this week suggest the platform is still promoting videos heavy in misinformation, violence, hate speech, and other content in violation of its own policies.
The Mozilla Foundation, the American non-profit behind the Firefox web browser, headed up a crowdsourced investigation to find out exactly how much progress YouTube has made on that front.
The extent of the issue today
Impressively, Mozilla managed to gather 37,000 YouTube users to act as watchdogs on the lookout for harmful or extreme content from July 2020 to May 2021.
Volunteers who stumbled across recommended videos that should already have been caught – for misinformation, political conspiracy theories, violent or graphic content, or sexualised content disguised as children’s cartoons – logged them through ‘regret reports.’
Submitted via a dedicated browser tool and eventually sent to the University of Exeter for analysis, 3,362 instances of ‘regrettable’ videos were recorded across 91 different countries. 71% of these reports came directly from suggestions pushed by YouTube’s algorithm.
9% of the videos recorded in the study – 189 in total – were later removed by YouTube, many of them after having been recommended to users browsing the site. Worryingly, reported videos generated around 70% more views per day than videos participants found through active searches.
‘That is just bizarre,’ stated Mozilla’s senior manager of advocacy Brandi Geurkink. ‘The recommendation algorithm was actually working against their own abilities to police the platform.’
Reports came in at a 60% higher rate from countries where English is not the primary language – particularly Brazil, Germany, and France.
Unsurprisingly, given the global vaccine rush over the last year, misinformation made up the majority of regret reports overall.