

Report Shows Facebook Knows How Toxic Instagram Is, Because Of Course They Do

At the same time, TikTok is unveiling more mental health resources on the platform.

It’s not exactly revelatory that Instagram is damaging to your self-esteem, but internal research — which is being made publicly available for the first time after being obtained by the Wall Street Journal — shows that Facebook is aware of how toxic Instagram is, and still has done little to change it.

According to the article, researchers at Instagram found that 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse. (We’ll venture that’s severely underreported!) A 2019 slide, which was presented to Facebook’s internal message board, read, “Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.” Researchers also found that among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the thinking to Instagram.

Despite knowing the depth of the problem, Facebook has continued to downplay the app’s negative effects. Instagram executive Adam Mosseri told reporters that the research he’d seen shows the effects on teens’ mental health are “quite small.” The WSJ report says Facebook has done little to address the issues — while publicly patting itself on the back for small changes like getting rid of Instagram “likes.”

TikTok seized on the bad Instagram press on Tuesday to announce it’s expanding mental health resources on the platform, especially around search interventions. Now, when someone searches for hashtags like #suicide, the app will direct users to the Crisis Text Line and the National Suicide Prevention Lifeline. Search results will also surface videos from creators with information on where to seek support and how to talk to loved ones about mental health issues.


And when a user searches for terms whose videos might be distressing to some people, the search results page will be covered with a warning reading “the results for the words you’re searching may be distressing for some viewers,” and the user must click “show results” to see the content.

The problem obviously goes much deeper than tacking a suicide hotline onto a search result — and the WSJ report also noted that Instagram’s own research found social comparison is worse on Instagram than on TikTok or Snapchat, in large part because Instagram’s currency is body and lifestyle. Meanwhile, Facebook is still working on Instagram Youth, a platform designed for children under the age of 13, which Democrats are fighting hard to stop — but which will probably happen anyway.