Organisations involved in fact-checking have come together to demand YouTube finally address its misinformation problem.
YouTube has a little something for everyone, whether you’re looking for an amusing review of a takeaway chain or a comprehensive history lesson on the Mayans.
The only issue with allowing almost anything onto the platform – so long as it doesn’t brazenly breach the terms of service – is that information can circulate rapidly without ever being verified as accurate.
While most of us are partial to a novelty conspiracy video on Area 51, for example, in times of real political strife, election campaigns, or a full-blown pandemic, the spread of misinformation can be used to exploit or inadvertently harm people.
What’s worse is that YouTube’s decision to remove dislike counts from the front end of the site means users can no longer tell at a glance whether a video is a reliable source of information.
In the case of anti-vax theories – which continue to populate YouTube on a daily basis – there have been many reported instances of people dying after being persuaded not to get jabbed by hoax or unverified material.
So, following years of discussion on the matter, where exactly are we at?
https://www.youtube.com/watch?v=kxOuG8jMIgI
Fact checkers demand change
After the QAnon and ‘New World Order’ conspiracies, it seems the constant flow of vaccine misinformation over the last two years has really frayed the patience of fact checkers.
So much so, that 80 such outfits from Europe, Africa, Asia, the Middle East, and the Americas have now penned a joint letter to the Google-owned company demanding it crack down on misinformation in a more forthright manner.
Its key requests call for more transparency from YouTube’s end, stricter action against repeat offenders, prioritising context and debunks over outright video deletions, and increased efforts to tackle misinformation in languages other than English.
On the last point, the cohort of fact checkers allege false content originating from developing countries often goes under the radar entirely.
Misleading or unreliable material shared in English is far likelier to be caught by YouTube’s algorithms, but the letter claims the platform’s recommendation system hasn’t received enough of an overhaul either.
‘YouTube should fix its recommendation algorithms to ensure it does not actively promote disinformation to its users, or recommend content coming from unreliable channels,’ the letter states.
Looking to prompt a reaction – and no doubt stir up some media attention – it scathingly points to YouTube as a ‘major conduit of online disinformation.’