Instagram has pledged to extend its ban on self-harm-related content to cover cartoons, drawings, and memes in its bid to become a safer platform for young people.
This week Instagram confirmed its plan to flag and remove all images, drawings, cartoons, and memes depicting any form of self-harm or suicide as it ramps up efforts to make the platform as safe as possible for young users.
Instagram is widely regarded as the ‘go-to’ social hub for young people, with over 70% of 13–17-year-olds worldwide owning personal profiles, according to Sprout Social. As a result, the Facebook-owned network is under mounting pressure from the media, charity organisations, and the public to ensure it is adequately protecting users from exposure to harmful or coercive content.
This latest move is one of many precautions taken since the tragic death of British teenager Molly Russell in 2017, who took her own life after viewing graphic content on the platform. Since February, Instagram chief Adam Mosseri has focused on restricting the circulation of content with suicidal themes, including still images and videos. But campaigners maintain that further work needs to be done.
Florida-based internet safety campaigner Dr Free Hess revealed that, despite Instagram’s efforts, harmful content is still spreading on the platform. At a lecture on online safety in New Jersey, she showed attendees multiple examples of unrestricted graphic photographs, videos, and memes advocating suicide.
In response, Instagram released a statement claiming that it has doubled the amount of material removed since the first quarter of 2019, reportedly taking down 834,000 pieces of content – 77% of which had not been actively reported by users. Mosseri conceded that ‘there is very clearly more work to do’ and declared that the ‘work never ends’.