
Facebook ramps up moderation team amidst Covid-19 mental health crisis

Facebook is gearing up for an increase in anxious and depressed users, as the world is forced to hunker down in self-isolation.

This week has certainly been an unusual one, to say the very least.

As most of Europe and the US begin working from home en masse due to the Coronavirus pandemic, big social media platforms including Facebook are preparing themselves for an increase in traffic and anxious users. The site already has to monitor itself closely to stop the spread of conspiracy theories and misinformation regarding COVID-19, and now, according to The Verge, Zuckerberg has outlined how his company will be prepping for an increase in global tension.

Included are a restructuring of Facebook’s content moderation team, an increase in full-time staff, and new informative features that will appear at the top of your news feed. We’ve detailed below what you can expect over the next few months as the changes come into effect. We’re all going to be spending a lot more time indoors – so it’s probably best to know how your social media experience is set to adjust.

How is Facebook dealing with incorrect Coronavirus posts?

I don’t know about you, but I’ve already seen false rumours and misinformation spreading across Facebook over the past few days.

Zuckerberg and co are aware of how easily false statistics and stories can spread, and in this case the consequences could be life-threatening. Given the seriousness of the situation, it shouldn’t come as much of a surprise that Facebook has announced a new feature that pushes official, vetted information permanently to the top of your timeline.

A collection of useful links and approved news on COVID-19, taken directly from the World Health Organisation, will appear within the next few days, and you may have already seen this come into effect. In the meantime, Facebook will continue to promote separate links to the official WHO website on both its own site and Instagram.

What will Zuckerberg be doing to tackle an increase in anxiety on Facebook?

Usually, Facebook’s moderation and spam reports are handled by third-party companies, but some of this will be changing in the near future. Zuckerberg has stated that these duties will be shifted to full-time employees, and that additional staff are being brought in to take on the expected surge of self-harm and inappropriate content.

The company is also currently developing its machine-learning classifiers to automate more of the nuanced moderation on the site. Put simply, Facebook is attempting to improve its algorithms so they more accurately pick up on content around anxiety, self-harm, depression and so on. It’s a deliberate pushback against the knock-on effects of a worldwide pandemic.
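If you’re curious what a “machine-learning classifier” actually means in this context, here’s a deliberately simplified toy sketch using the open-source scikit-learn library and a handful of made-up example posts. To be clear, this is not Facebook’s code and bears no resemblance to the scale or sophistication of its real systems – it just illustrates the general pattern: train a model on labelled examples, then score new content and route anything over a threshold to human moderators.

```python
# Toy illustration only -- not Facebook's system. Shows the basic pattern:
# vectorise post text, train on labelled examples, score new posts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set: 1 = flag for a human moderator, 0 = leave alone.
posts = [
    "I can't cope any more, I feel completely hopeless",
    "I'm so anxious about the lockdown I can't sleep",
    "Check out the banana bread I baked today",
    "Anyone else rewatching old sitcoms this week?",
]
labels = [1, 1, 0, 0]

# TF-IDF text features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score a new post; in practice, anything above a chosen threshold
# would be routed to a human moderator for review.
new_post = ["I feel hopeless and don't know what to do"]
print(model.predict_proba(new_post)[0][1])  # probability the post needs review
```

The real classifiers Facebook describes are trained on vastly larger datasets and far subtler signals, but the idea is the same: automate the first pass so human moderators can focus on the posts that genuinely need attention.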

All of this effort is important. Our lives are going to be defined by our phones, desktops, and internet experiences more than ever before this year. Zuckerberg’s worries over rising anxiety are warranted, and I commend Facebook for giving the public real-time updates on its immediate next steps. Social media sites have as much of a responsibility to look after the population as governments and official institutions do, especially where information and emotional wellbeing are concerned.

We’ve yet to hear anything as detailed from Google, Amazon, or Twitter – but hopefully it’s only a matter of time.
