“Pro-ana” communities—websites, blogs, forums, and social media spaces dedicated to promoting the worsening of eating disorders like anorexia—have been a fixture of the web more or less since its inception. So it’s no surprise that, as BuzzFeed reported last month, some TikTok users have found disturbing pro-ana content on their For You page, a personalized section of the platform that displays videos users are likely to enjoy.
Discovering, damage-controlling, and deleting pro-ana content has become a rite of passage for web companies. In 2001, Yahoo removed 113 pro-ana websites from its servers. MySpace, Tumblr, Instagram, Pinterest, Reddit, and many other social media platforms have faced pro-ana problems. This well-publicized history makes it frustrating that TikTok wasn’t better prepared, beyond claiming it doesn’t allow “content that promotes eating habits that are likely to cause health issues.” But now that TikTok’s policies are under a microscope, what guidance will the company take from a longer history of regulating online pro-ana communities, and exactly how worried should its users be?
Dr. Ysabel Gerrard is a lecturer in digital media and society at the University of Sheffield. Her research on social media content moderation has been featured in venues like The Guardian and The Washington Post. She also consults for social media companies, including Instagram.
The problem TikTok has right now is that its For You page is working exactly as it should: It gives users a personalized and therefore pleasurable experience by showing them what they likely want to see. I’ve previously written about the same problem playing out on Instagram, Pinterest, and Tumblr. Recommendation algorithms like this are the bread and butter of social media platforms. The happier you are on a platform, the likelier you are to stay, and the longer you stay, the longer the company can keep profiting from the data you generate.
But the problem—a problem most major social media companies have faced—is that recommendation algorithms aren’t really trained to make moral and health-related judgments about the kinds of content they recommend. Do you like cats? TikTok thinks you do, based on what you’re liking and searching for, so its algorithm will show you more cats. Yay cats! But the exact same formula applies to potentially harmful forms of content. Do you have anorexia? TikTok thinks you do, so here’s a bunch of triggering videos. Have at it!
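The content-agnostic nature of this feedback loop can be illustrated with a deliberately simplified, hypothetical content-based recommender. (TikTok’s actual system is proprietary and vastly more sophisticated; the video data, tags, and scoring here are invented for illustration.) The point is that the scoring logic has no notion of whether a topic is benign or harmful—it only measures engagement overlap:

```python
from collections import Counter

def recommend(engagement_history, candidate_videos, k=3):
    """Toy content-based recommender: rank candidates by how much
    their tags overlap with tags the user has already engaged with.
    The scoring is indifferent to what the tags mean."""
    # Count how often each tag appears in the user's engagement history.
    interests = Counter(
        tag for video in engagement_history for tag in video["tags"]
    )

    def score(video):
        # A video scores higher the more its tags match past engagement.
        return sum(interests[tag] for tag in video["tags"])

    return sorted(candidate_videos, key=score, reverse=True)[:k]

# A user who engages with cat videos gets recommended more cat videos...
history = [{"tags": ["cats", "funny"]}, {"tags": ["cats"]}]
candidates = [
    {"id": 1, "tags": ["cats", "kittens"]},
    {"id": 2, "tags": ["cooking"]},
    {"id": 3, "tags": ["cats", "funny"]},
]
top = recommend(history, candidates, k=1)
```

Swap the tags for anything else and the ranking behaves identically, which is exactly why a purely engagement-driven system can steer a vulnerable user toward more of whatever they are already seeing.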
In a recent BuzzFeed article, some TikTok users shared anecdotes of randomly receiving recommendations for pro-ana videos through their For You page. It is difficult to describe pro-ana behaviors without triggering readers, but they might involve sharing diet tips and purging methods, writing personal stories, and pairing up with a “buddy” to further encourage weight loss. We know from charities like Beat that eating disorder patients often report feeling “triggered” by certain images or words. If a TikTok user continuously sees triggering posts on their For You page, this could very well harm them. But one of the frustrations social media researchers have is that the inner workings of recommendation systems like the For You page are notoriously opaque, making it difficult to figure out why particular users see certain recommendations while others don’t. A recent New Media & Society article notes how social media users often create elaborate theories for figuring out how recommender systems work, what the author calls “algorithmic gossip.”
Without dismissing anyone’s claims about their For You recommendations, readers should know that users who are not engaging with videos related to eating disorders are highly unlikely to have them randomly recommended. A TikTok spokesperson explained that users can also adjust the content they see by, for example, “hearting” videos, clicking “not interested,” and following users. “In doing so, through time users will see more of the content they prefer.”
Whenever stories like BuzzFeed’s appear, I always worry that social media companies will respond by panicking and prohibiting all content relating to eating disorders, even if it’s about recovery or support.
Researchers have long known that social media and older online communities can offer support for people with stigmatized conditions like eating disorders. For example, Reddit’s decision to remove the r/proED sub in 2018 was met with outcry from community members who explained that, despite its name, the sub wasn’t actually used as a space to promote eating disorders and functioned more like a support group.
When moderated more appropriately, there’s no reason TikTok can’t offer an extra space for people to express their feelings and share their experiences in a highly creative way. TikTok could also become a helpful resource for people struggling with eating disorders. Secrecy is one of the hallmarks of an eating disorder, meaning social media sometimes exists as a sufferer’s only form of support. With this in mind, TikTok could develop genuinely useful eating disorder resources beyond sending users a list of contact details for local charities, “the 2020 equivalent of handing a teen a tri-fold brochure,” as psychiatrists Neha Chaudhary and Nina Vasan recently wrote in WIRED. Pinterest, for example, has pioneered a series of wellbeing exercises that it recommends to users searching for self-harm-related Pins.
I wholeheartedly agree that “social media companies are uniquely suited to be a psychiatrist’s biggest ally”—and this is precisely why researchers get so frustrated that social media companies increasingly restrict our access to potentially life-saving data.
I spend part of my time advising social media companies about their policies on eating disorders—particularly Instagram, as I sit on their Suicide and Self-Injury Advisory Board—and I have some advice for TikTok’s policy team.
If you or someone you know is affected by an eating disorder, please contact the National Eating Disorders Association helpline.
First, TikTok should work with independent experts who research the relationship between social media and eating disorders to develop more nuanced and helpful policies. They should partner with local and global eating disorder charities to seek advice on best practices for content moderation. When requested to comment on this article, a TikTok spokesperson confirmed that the company is forging such partnerships:
“TikTok was built to provide a positive place for creativity, and we prioritize the safety and wellbeing of our users. We care deeply about the complex and multi-faceted issue of eating disorders, and are focused on expanding our partnerships, building upon our product, policies and protective measures to provide additional in-app resources for this community. Content that supports or encourages eating disorders is strictly against our Community Guidelines and will be removed.”
This is reassuring, as social media companies with userbases the size of entire countries probably shouldn’t leave policymaking to teams of non-experts, especially policies that affect highly vulnerable users.
While TikTok clearly needs to improve its recommendation algorithms, where eating disorders are concerned, removing content and tweaking these algorithms are band-aids on bullet wounds. These small fixes often don’t do enough to help the people whose sole mental health resource is often social media. TikTok’s core teenage user base is the demographic likeliest to experience an eating disorder, meaning the platform could seize this opportunity to become a space for genuinely transformative help and support.
Finally, TikTok ought to be transparent about the criteria it uses to police posts about eating disorders, and about how it uses mental-health-related data. While transparency is beneficial for lots of reasons, I am not naive enough to think that full transparency will ever be achieved, nor is it always wise. For example, releasing full lists of banned search terms would inevitably help users to find workarounds. But the rules social media companies have about eating disorders, and the extent to which they publicly discuss them, reflect the condition’s position in society. The more opaque rule-making is, the less chance there is for discussion, debate, and destigmatization.
Although social media platforms are no longer newcomers to the tech world, their approaches to dealing with eating disorders are still in their infancy. Platforms like Pinterest are only just starting to move away from offering now-standardized help messages and resource lists that show up when users search for worrying phrases. This industry-wide sluggishness gives TikTok a chance to be a force for change and develop meaningful mental health resources and communities.
WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here. Submit an op-ed at firstname.lastname@example.org.