Photo via Getty Images


Instagram Promises To Offer "Better Support" To Those Struggling With Self-Harm

It's implementing new measures following the suicide of a teen

DISCLAIMER: This post speaks about self-harm and suicide. If you are struggling with your mental health, the National Suicide Hotline is a great resource. You can call it any time at 1-800-273-8255.

Instagram will be implementing measures to restrict users from seeing or being recommended content that shows self-harm.

Following the suicide of 14-year-old Molly Russell, who, according to her parents, engaged with self-harm content on Instagram and Pinterest prior to her death, Instagram head Adam Mosseri penned an op-ed in which he detailed the new measures. He says that he was "deeply moved" by Russell's story, and claims that "nothing is more important to us than the safety of the people in our community." As a result, Instagram is "revisiting our policies around what content is allowed on Instagram, investing in technology to better identify sensitive images at scale and working to make them less discoverable."

While Mosseri notes that "we still allow people to share that they are struggling," meaning that content involving acts like cutting may still be permitted, the service will be "applying sensitivity screens to all content we review that contains cutting." These posts will no longer show up in searches, hashtags, or account recommendations. Mosseri says that, since "these images will not be immediately visible," it will be "more difficult for people to see them."

Beyond this change, Mosseri says the platform is actively working to "make it harder for people to find self-harm images," noting, "We have put in place measures to stop recommending related images, hashtags, accounts, and typeahead suggestions." Instagram will also connect users who post or search for this content with resources, aiming to "better support people who post images indicating they might be struggling with self-harm or suicide."