If you were on Tumblr in the late 2000s and early 2010s, you’ll know that among the pseudo-artsy black-and-white photography, memes, and scene hairstyles, the platform had an issue with self-harm content. The issue became so prevalent that in 2012, Tumblr announced it officially prohibited blogs "that actively promote self-harm," with policy changes moderating blogs that "glorify or promote anorexia, bulimia, and other eating disorders." Despite this intention, pro-ana communities (short for pro-anorexia, a subculture that promotes eating disorders) have still found a place on the site and across social platforms generally. MySpace, Tumblr, Pinterest, Instagram, and Reddit have all faced pro-ana problems, with a recent report finding that Instagram still fails to protect people at risk for eating disorders from pro-anorexia messaging. More recently, however, it’s the video-sharing app TikTok that is slowly becoming the home base for this harmful content.
TikTok is largely known as the social platform for young people and teenagers — about 60% of its users belong to Generation Z — making the issue of pro-ana content on the app particularly pressing, as it reaches a wide, young, and impressionable audience. With teenagers and children spending more time online than ever before, it appears to be more than coincidental that in less than a decade, the rate of eating disorders in kids under 12 has risen by 119%. While TikTok officially does not allow content promoting or glorifying eating disorders, last year The Guardian found that a variety of harmful pro-anorexia hashtags remained searchable on the app.
While pro-ana hashtags are an obvious way to spot harmful content, pro-ana material is often insidiously disguised as “thinspo” (short for “thinspiration”) tips or wellness trends promoting healthier living. This means teenagers may be exposed to pro-ana content without realizing they are entering a dangerous algorithmic rabbit hole. Two such trends that have emerged on the app are “body checking” videos (involving frequent, compulsive checking of one’s appearance in the mirror or of the circumference of body parts) and “what I eat in a day” videos, which can encourage extreme restriction.
Devaney Sparrow, a 25-year-old based in New York, has been in recovery from disordered eating for the past few years, but says they’ve struggled with it for 13 years. An active TikTok user with more than 80,000 followers, Sparrow says they find many videos on the app to be triggering. “Disordered eating is always lurking in the shadow of your psyche, waiting for the perfect moment to get you,” they say. “Videos like these are exactly that perfect moment. When people body check, show how little they eat, or how long and intense their workouts are, it’s extremely harmful.”
Sparrow has tried to self-regulate exposure to this content by selecting the “not interested” button on videos or trends promoting diet culture but says it “seems hopeless,” as new videos always find their way back to their algorithm. “Ideally, we’d be able to regulate videos and trends like these,” they say. “My concern with the algorithm is that the only people’s videos that would end up being monitored and regulated are by Black and brown creators,” Sparrow notes, a recurring issue with the app’s regulatory processes. “Meanwhile, thin white people would be able to keep finding new and innovative ways to blow up and go viral by being ‘thinspo.’”
Sparrow says TikTok content is beginning to remind them of the days of being an active Tumblr user as a teenager and getting sucked into the thinspo trap. “The main difference is how much more covert TikTok is about it compared to Tumblr,” they say. “It’s all disguised as wellness or being healthy. It’s much more difficult for me to differentiate what’s taking care of my body and what’s my eating disorder trying to creep back into my world.” Such content also circulates on the app under the language of “healing your gut,” which still features before-and-after weight loss videos.
Kirsten Oelklaus, co-founder and program director at Bellatore Recovery, says the core difference between a helpful and healthy video and a harmful one is often the framework used by the person posting the video. “I have seen some of these posts by individuals who are solidly in recovery, sharing joy and healthy connection in their relationship with food, and it can be very inspiring,” she says. On the other hand, “videos become harmful if they set up comparisons for the viewer, have subtle, unhealthy messages such as ‘good foods versus bad foods’ or moralizing messages such as eating ‘clean.’”
For the average viewer, making this distinction can feel like an impossible task. After all, “eating clean” rhetoric is so synonymous with today’s wellness culture, it can be hard for users to differentiate what videos are contributing to the problem. It’s also nearly impossible to determine the intentions of a person from a 15-second video. This is why all creators in the wellness space need to continuously examine their own content. “I think it is most helpful if the person posting the content has others that they can ask to review their posts for potential triggers,” says Oelklaus. “I believe the motivation behind posts can be very pure; however, what is not triggering to one person may be to someone else.”
Because of the nature of an algorithm, after engaging with pro-ana content, you’re even more likely to be shown more harmful content. This is why model Raha Europ, who is concerned about her siblings and other young women falling down the rabbit hole, says better education is necessary to teach teenagers how to navigate content online. “I think younger girls can misinterpret what they see and decide it’s what they should do,” she says. “Instead, I think that it’s important for everyone to learn about what works for them and not try to replicate what they see online. We should all learn to love our bodies and understand that we are all different, with different tolerances and metabolisms.”
The best way to keep such content from resurfacing in your feed is to ignore it, avoid engaging with it, or report it. This, of course, is easier said than done for teenagers struggling with body image issues. Under the pressure of social media perfection, the onus for regulating pro-ana content on TikTok can’t rest on individual users; it belongs with mental health professionals and thoughtful regulation from tech companies. While some safeguards do exist, the fastest-growing social media platform on the Internet requires even faster solutions to a problem that’s spreading like wildfire.