
Facebook algorithm under fire for blocking wildfire posts

The platform has faced criticism for blocking multiple emergency posts relating to United States wildfires.

Facebook has recently found itself in the line of fire – and not for spreading misinformation or political propaganda, but for blocking emergency warnings.

As wildfires spread across Northern California this month, individuals took to social media to share updates and concerned messages with friends and family.

This kind of digital hand-holding has become commonplace whenever a natural disaster strikes a community. But recent attempts to share updates online have been blocked by Facebook itself, in an algorithm failure that has outraged users internationally.

Earlier this week, the Washington Post shared reports from those affected by California’s latest wildfires, including Lauri Hutchinson. A retired firefighter herself, Lauri was confused to find her Facebook post – which included real-time updates of a fire ripping through the small town of Clearlake – had been deleted shortly after it was published.

In its place was a private note from Facebook, saying the post had been flagged as spam. ‘It looks like you tried to get likes, follows, shares or video views in a misleading way,’ the message read.

Further research found that the social media platform had been flagging and removing dozens of posts containing links and screenshots from Watch Duty, a widely relied-upon wildfire alert app used in the US.

Those impacted included local people, as well as volunteer responders, fire and sheriff departments, news stations, and disaster non-profit workers across California state.


While Facebook has misflagged posts as spam during emergencies before, disaster groups and wildfire trackers say that the issue has reached ‘critical mass’.

The company’s AI-driven moderation system is designed to flag content that seems produced to drive interaction in a ‘misleading’ way. But it’s a broad net – one that’s snaring emergency messages along with clickbait and spam.
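Facebook’s actual moderation models are proprietary and far more sophisticated than any keyword rule, but a toy heuristic illustrates why the net is so broad: emergency alerts and engagement bait often share the same surface signals (urgency language, pleas to share). The function below is purely a hypothetical sketch, not Facebook’s system.

```python
def looks_like_engagement_bait(post_text: str) -> bool:
    """Toy spam heuristic (illustrative only): count urgency/share signals.

    Real moderation systems use ML classifiers, not keyword lists, but the
    failure mode is similar - genuine alerts carry the same signals as bait.
    """
    bait_signals = ["urgent", "act now", "please share", "spread the word"]
    text = post_text.lower()
    hits = sum(signal in text for signal in bait_signals)
    return hits >= 2  # two or more signals -> flagged as bait


# A genuine wildfire alert trips the same signals as clickbait:
alert = "URGENT: Fire near Clearlake spreading fast. Please share and act now."
bait = "You won't BELIEVE this trick! Act now and please share for more!"

print(looks_like_engagement_bait(alert))  # True - a false positive
print(looks_like_engagement_bait(bait))   # True - a true positive
```

Without context (is there actually a fire nearby? is the linked source a real alert service like Watch Duty?), the classifier cannot tell the two apart, which is exactly the gap the article describes.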

But the fallout feels somewhat predictable. Wildfires aren’t new. Emergency services relying on social media for real-time updates isn’t new. Facebook messing it up with overly aggressive content moderation? If you look at the recent controversies plaguing the platform, this isn’t new either.

This is the same Facebook that touts its ‘Crisis Response’ feature, which is supposed to help users stay informed during natural disasters. Yet, here we are, watching the platform block basic fire alerts because its algorithm can’t distinguish between contexts.

It’s not just a technical glitch – it’s a failure of priorities. Facebook has been hyper-focused on cleaning up its platform ever since the misinformation scandals of the past decade. But wildfires, it seems, have fallen victim to Zuckerberg’s obsession with engagement policing.

As a platform built on the premise of connection, it’s now getting in the way of people trying to connect with vital information. And it’s doing so in the most detached, bureaucratic way possible: with algorithmic decisions made by systems that don’t know what a wildfire is.

The larger issue here is Facebook’s over-reliance on automation. Sure, the platform needs to filter out junk content, but in its quest to avoid human error, Facebook has embraced machine-driven errors with far graver consequences.

On Reddit, users have called out the platform for its algorithmic issues.

‘The fact that they haven’t got this handled as quickly as possible is insane to me. They have a vast amount of resources. They could at least take some responsibility to protect people whenever they’re in terrifying danger,’ said one comment.

Another defended the platform, suggesting it was a reflection of social media’s capacity more generally, rather than a failure to uphold standards within the company.

‘Sometimes it feels like social media just can’t handle these kinds of warnings properly. It would be great if they stepped up, but we all know they’re more focused on engagement than serving the public good.’

For now, local governments are advising residents to look elsewhere for critical updates. But that’s a sad indictment of one of the world’s largest social media platforms.

It’s an issue that speaks to a broader concern: that for all its technological advancements, social media is still worryingly out of touch with the real world.

The algorithm, after all, is doing exactly what it was designed to do: indiscriminately protect Facebook from anything that smells like manipulation, even when that manipulation doesn’t exist.
