
Sexual harassment is already plaguing the metaverse

Just days into beta testing, a woman reported being groped on a metaverse platform called ‘Horizon Worlds’. Immersive VR on the platform makes such experiences all too real, generating calls for stronger safety features.

Recently, the term ‘metaverse’ has become unavoidable, whether you’ve got a keen eye for keeping up with the latest tech and gaming trends or not.

The metaverse is a rapidly growing virtual-reality world where users can go to school, work, play games, watch concerts, go shopping, and interact with others in the community without ever leaving home.

Tech giant Facebook recently changed its name to Meta with the intention of bringing the metaverse further into the mainstream. It has already developed a new metaverse platform that is now accessible to the general public.

Enter ‘Horizon Worlds’, Meta’s virtual-reality social media platform, which has been compared to Minecraft thanks to its colourful interface and graphics.

The fun and light-heartedness of this digital world didn’t last long, though. In late November, one beta tester was groped by a stranger while navigating the platform in VR. She promptly reported her experience in the beta testing group on Facebook.

Taking appropriate safety measures

By now, software developers are well aware that the negative social behaviours found in the physical world are just as likely – if not more likely – to occur in digital spaces. In fact, many anticipate these kinds of problems when designing their interfaces.

Upon reviewing the woman’s account of her experience on Facebook, developers at Meta said that she should have activated a tool called ‘Safe Zone’ – one of the key safety features built into the Horizon Worlds platform.

Activating this tool places a virtual ‘safety bubble’ around the user, rendering others in the metaverse unable to touch, talk to, or otherwise interact with them until the user decides to deactivate the Safe Zone feature.

While this tool is helpful when a user feels triggered by another’s actions, it does not stop harassment in digital spaces from happening in the first place.

Nor does it protect the victim from the psychological or physiological responses that follow – especially on platforms that use immersive VR, which makes the experience feel extremely realistic.

Acknowledging the severity of the problem

Responses to the woman’s post recounting being groped in Horizon Worlds were varied, but many Facebook users attempted to diminish her experience, saying that what had happened was ‘no big deal’ simply because it didn’t take place in the physical world.

However, researchers at the Digital Games Research Association have pointed out that instances of toxic behaviour (such as sexual harassment and bullying) in virtual spaces can be just as harmful as they are in person.

When immersive VR is used, these negative experiences are heightened even further, meaning the social implications of virtual and verbal actions can be extremely triggering for those targeted.

‘At the end of the day, the nature of virtual-reality spaces is such that it is designed to trick the user into thinking they are physically in a certain space, that their every bodily action is occurring in a 3D environment,’ said Katherine Cross, a researcher of online harassment at the University of Washington.

When people have a screen to hide behind, it’s easier for them to act more recklessly than they would in person, given the lack of tangible consequences. Researchers have termed this phenomenon the ‘online disinhibition effect’.

And just as people of all age groups became accustomed to using Facebook during the early 2010s, it’s likely large numbers of people will begin entering the metaverse on a daily basis in the not-so-distant future.

Before this happens though, Meta (and other metaverse developers) should make sure they’ve covered as many safety bases as possible – such as disallowing features or character actions that have the potential to be abused or used to make other users feel unsafe and uncomfortable.

Speaking on the subject, Horizon Worlds’ vice president called the beta testing incident ‘absolutely unfortunate’ and described the beta tester’s feedback as valuable. He added that the company will continue to improve the functionality and accessibility of the platform’s ‘block user’ feature going forward.
