
Search engines are amplifying misinformation

A study published in Nature last week has uncovered that using online search engines to vet conspiracy theories can actually increase the probability that someone will believe them.

In November, UNESCO’s Director-General Audrey Azoulay sounded the alarm on the intensification of misinformation and hate speech online, which she said poses ‘major risks to social cohesion, peace, and stability.’

Her warning came on the back of a UNESCO-commissioned survey which found that more than 85 per cent of people are worried about the impact of online misinformation on their country’s politics.

In the interest of tackling the worsening issue head-on, experts have begun conducting deeper investigations into misinformation, aiming to understand exactly why such a substantial portion of the population is so easily swayed by what they consume on the Internet.

Most recently, New York University’s Centre for Social Media and Politics (CSMaP) has turned its attention to the phenomenon, publishing a paper in Nature on the impact of search engines’ output on their users – an area that remains relatively under-studied.

According to their findings, searching online to evaluate the truthfulness of fake news or conspiracy theories (an approach encouraged by technology companies and government agencies) actually increases the probability that someone will believe them.

The authors point to a known problem in this area called ‘data voids,’ whereby there is occasionally little high-quality information available to counter misleading headlines or ‘fringe theories.’ This means that when someone sees an article about a certain topic and starts a casual search based on relevant keywords, they may find articles that reaffirm their bias.

In other words, there may be false information out there, but not the corresponding true information to correct it.

‘This points to the danger that ‘data voids’ – areas of the information ecosystem that are dominated by low quality, or even outright false, news and information – may be playing a consequential role in the online search process, leading to low return of credible information or, more alarming, the appearance of non-credible information at the top of search results,’ says lead author Kevin Aslett.

‘The question here was what happens when people encounter an article online, they’re not sure if it’s true or false, and so they go look for more information about it using a search engine,’ adds co-author Joshua Tucker. ‘You see exactly this kind of suggestion in a lot of digital literacy guides.’

As Tucker explains, the CSMaP team was particularly interested in seeing how people verify news that’s just happened and hasn’t yet had a chance to be verified by fact-checkers.

The results from the study’s first experiment revealed that people who’d been nudged to look for more information online were 19 per cent more likely to rate a false or misleading article as fact, compared to those who hadn’t.

‘What we find over and over again is that people are over-relying on these search engines or their social connections. They put this blind faith in them,’ Chirag Shah – a professor of information science at the University of Washington who wasn’t involved in the study – tells Vice.

‘They think they’ve done their due diligence and checked, but it makes it worse than not checking.’

Unfortunately, this is something we’ll be grappling with for years to come as large language models and generative AI continue to grow and flood the Internet with even more misinformation. ‘The four most dangerous words are “do your own research”,’ says Shah, who deems it the responsibility of technology companies to offer tools that help people separate fact from fiction.

‘We should be equipping them with the right tools. Those tools could come and should come from tech companies and search service providers,’ he finishes.

‘We also need to have that awareness of “just because you’re doing your research, that doesn’t mean that’s enough.” The more awareness people have, the more chance we have of having people think twice about the information they’re reading.’
