Earlier this year, TikTok announced that it was changing its community guidelines to enhance its members’ “safety, security, and well-being.” Among the changes are a ban on dead-naming, misgendering, and misogyny, as well as the removal of anything that promotes disordered eating.
Although TikTok explicitly prohibits anything that promotes or glorifies eating disorders, and enforces this through a combination of human and AI moderation, the app has previously been criticized for allowing “pro-ED” content to flourish.
In response to this criticism, TikTok has attempted to control harmful content on the platform in recent years. In 2020, the company imposed restrictions on weight-loss ads. In 2021, it launched a feature that provides a support phone number to users who search hashtags related to eating disorders. However, some harmful content still makes its way onto the platform, and videos documenting individuals’ recovery are frequently removed in error.
To make the app a safer place, the company has announced that videos promoting symptoms of disordered eating, such as short-term fasting and over-exercising, will be removed. According to a statement on the company’s website:
“We’re making this change, in consultation with eating disorders experts, researchers, and physicians, as we understand that people can struggle with unhealthy eating patterns and behaviour without having an eating disorder diagnosis. […] This is an incredibly nuanced area that’s difficult to consistently get right, and we’re working to train our teams to remain alert to a broader scope of content.”
TikTok is correct in emphasizing that this is an “incredibly nuanced area.” Content moderation involving mental health conditions such as eating disorders is difficult to get right. The boundary between content that “promotes” eating disorders and content produced by people documenting their recovery is frequently fuzzy. What if a person posts a “What I Eat In A Day” video that shows an amount of food that is an improvement for them, but appears frighteningly small to others? How can you tell the difference between “thinspo” content and content that simply features a thin body?
When the mental health of real TikTok users (frequently young women) is at stake, it’s critical that problems like these are acknowledged and addressed.
The ambiguity surrounding what is and isn’t acceptable on TikTok makes it difficult for users to understand what they can and can’t upload. That ambiguity may endanger people who use the app to find community and support while recovering. Removing a user’s video or banning their account can significantly worsen their mental health.
We recognize and respect the hard work that physicians, academics, and other specialists are putting in to limit the amount of harmful content on TikTok. But when it comes to judging what constitutes content that “promotes” disordered eating, will the app’s moderators be sensitive to all of these nuances? Will the company be more forthcoming about how it intends to moderate these videos? Only time will tell.
In related news, we think everyone should be trained in mental health first aid.
Photo via Bark