Fact checkers label YouTube a ‘major conduit’ for misinformation

Organisations involved in fact-checking have come together to demand YouTube finally address its misinformation problem.

YouTube has a little something for everyone, whether you’re looking for an amusing review of a takeaway chain or a comprehensive history lesson on the Mayans.

The trouble with allowing almost anything onto the platform, so long as it doesn't brazenly breach the terms of service, is that information can circulate rapidly without ever being verified as accurate.

While most of us are partial to a novelty conspiracy video on Area 51, for example, in times of real political strife, election campaigns, or a full-blown pandemic, the spread of misinformation can be used to exploit or inadvertently harm people.

What's worse is that YouTube's decision to remove dislike counts from the front end of the site means users can no longer tell off the bat which videos are, and which perhaps aren't, reliable sources of information.

In the case of anti-vax theories – which continue to populate YouTube on a daily basis – there have been many reported instances of people actually dying after being persuaded not to get jabbed by hoax or unverified material.

So, following years of discussion on the matter, where exactly are we at?

https://www.youtube.com/watch?v=kxOuG8jMIgI


Fact checkers demand change

After the QAnon and ‘New World Order’ conspiracies, it seems the constant flow of vaccine misinformation over the last two years has really frayed the patience of fact checkers.

So much so, that 80 such outfits from Europe, Africa, Asia, the Middle East, and the Americas have now penned a joint letter to the Google-owned company demanding it crack down on misinformation in a more forthright manner.

Its key requests call for more transparency from YouTube, stricter action against repeat offenders, added context and debunks rather than outright video deletions, and increased efforts to tackle misinformation in languages other than English.

On the last point, the cohort of fact checkers allege false content originating from developing countries often goes under the radar entirely.

Seditious or unreliable material shared in English is far likelier to be caught by YouTube's algorithms, but the letter claims the platform's recommendation feature hasn't received enough of an overhaul either.

‘YouTube should fix its recommendation algorithms to ensure it does not actively promote disinformation to its users, or recommend content coming from unreliable channels,’ the letter states.

Looking to prompt a reaction – and no doubt stir up some media attention – it scathingly points to YouTube as a ‘major conduit of online disinformation.’


YouTube’s initial response

In comments to the Guardian, YouTube spokesperson Elena Hernandez attempted to reassure sceptics that the company was already investing in ways ‘to connect people to authoritative content’ whilst reducing ‘the spread of borderline misinformation and violative videos.’

‘We’re always looking for meaningful ways to improve and will continue to strengthen our work with the fact checking community,’ she said.

To its credit, YouTube has shown a willingness to respond when previously critiqued on the subject, most notably by deleting videos posted by Brazilian President Jair Bolsonaro, who defended the use of unproven drugs to treat Covid-19.

Given the sheer volume of content and creators joining the platform every day, there's no doubt that policing it all is an incredibly difficult task. Perhaps now, after several inquiries into its mechanics, YouTube should concede that outside intervention is necessary.

As Full Fact chief Will Moy recently stated, 'bad information ruins lives.' Failing to act swiftly now will only invite more contempt from the experts.