TikTok launches mental health guide after research shows Instagram is harmful to teens

TikTok is looking to support users who may be struggling with their mental health.


TikTok shared a handful of new features on Tuesday designed to support users' mental well-being, including guides on how to engage with people who may be struggling and updated warning labels for sensitive content. The changes come as Facebook's research into its photo-sharing app Instagram, which last year launched TikTok competitor Reels, has reportedly raised concerns about Instagram's impact on the mental health of teens.

"While we don't allow content that promotes, glorifies or normalizes suicide, self-harm or eating disorders," TikTok said in a blog post, "we do support people who choose to share their experiences to raise awareness, help others who might be struggling and find support among our community."


To more safely support these conversations and connections, TikTok is rolling out new well-being guides for anyone sharing their personal experiences on the video app. The guides were developed with the International Association for Suicide Prevention, Crisis Text Line, Live For Tomorrow, Samaritans of Singapore and Samaritans (UK), and are available in TikTok's Safety Center. They also provide tips on how to engage with people who might be struggling.

The social video app is also sharing a new Safety Center guide on eating disorders for teens, caregivers and educators. The guide was developed with experts including the National Eating Disorders Association, National Eating Disorder Information Centre, Butterfly Foundation and Bodywhys, and offers information, support and advice on eating disorders. Earlier this year, TikTok added a feature that directs users searching for terms related to eating disorders to appropriate resources.

In addition, when someone searches for words or phrases like #suicide, they're pointed to local support resources like the Crisis Text Line helpline to find information on treatment options and support. TikTok also said it's updating its warning label for sensitive content, so that when a user searches for terms that could surface distressing content, such as "scary makeup," the search results page will show an opt-in viewing screen. Users can tap "Show results" to view the content.

The site is also showcasing content from creators sharing their personal experiences with mental well-being, information on where to get help and advice on how to talk to loved ones. "These videos will appear in search results for certain terms related to suicide or self-harm, with our community able to opt-in to view should they wish to," TikTok said.

On Tuesday, The Wall Street Journal reported that in studies conducted over the past three years, Facebook researchers have found Instagram is "harmful for a sizable percentage" of young users, particularly teenage girls.

For years, child advocates have expressed concern over the mental health impact of sites like Instagram, where it can be hard to separate reality from altered images. Advocacy groups and lawmakers have long criticized Instagram and parent Facebook for harboring harmful content and fostering anxiety and depression, particularly among younger audiences. A 2017 report by the UK's Royal Society for Public Health found that Instagram is the worst social media platform for young people's mental health.

Reports earlier this year revealed Instagram is planning to launch a platform for kids under 13, stirring up more criticism from child health advocates who are concerned about threats to children's online privacy and their mental well-being. In response to criticism, both Facebook and Instagram said in May that they'd give all users the option to hide the number of likes their posts get from the public and to choose whether they can see like counts on all posts in their feed. Of course, concerns about the impact of technology on young minds extend to TikTok as well, which last month added more features aimed at protecting the privacy and safety of teenagers who use the app. TikTok was also sued in April over allegations it illegally collects and uses children's data, with the company saying those claims "lack merit."

If you're struggling with negative thoughts or suicidal feelings, here are 13 suicide and crisis intervention hotlines you can use to get help.