TikTok’s ‘For You’ feed risks pushing children and young people towards harmful mental health content

Some readers might find information in this press release triggering.

– Technical research in partnership with the Algorithmic Transparency Institute and AI Forensics using automated accounts showed that after 5-6 hours on the platform, almost 1 in 2 videos were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health.

– There was an even faster “rabbit hole” effect when researchers manually rewatched mental health-related videos suggested to “sock puppet” accounts mimicking 13-year-old users in Kenya, the Philippines and the USA.

– Between 3 and 20 minutes into our manual research, more than half of the videos in the ‘For You’ feed were related to mental health struggles, with multiple recommended videos in a single hour romanticizing, normalizing or encouraging suicide.

– TikTok’s very business model is inherently abusive and privileges engagement to keep users hooked on the platform, in order to collect ever more data about them. It unequally applies protections for users around the world.

TikTok’s content recommender system and its invasive data collection practices pose a danger to young users of the platform by amplifying depressive and suicidal content that risks worsening existing mental health challenges, two companion reports released today by Amnesty International show.

The two reports – Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation and “I Feel Exposed”: Caught in TikTok’s Surveillance Web – highlight the abuses experienced by children and young people using TikTok, and the ways in which these abuses are caused by TikTok’s recommender system and the underlying business model.

The findings of a joint technical investigation, with our partners – the Algorithmic Transparency Institute (ATI) at the National Conference on Citizenship and AI Forensics – show how children and young people who watch mental health-related content on TikTok’s ‘For You’ page are quickly being drawn into “rabbit holes” of potentially harmful content, including videos that romanticize and encourage depressive thinking, self-harm and suicide.

“The findings expose TikTok’s manipulative and addictive design practices, which are designed to keep users engaged for as long as possible. They also show that the platform’s algorithmic content recommender system, credited with enabling the rapid global rise of the platform, exposes children and young adults with pre-existing mental health challenges to serious risks of harm,” said Lisa Dittmer, Amnesty International Researcher.

The Issue: ‘For You’ Feed

Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation details how TikTok’s relentless pursuit of young users’ attention risks exacerbating mental health concerns such as depression, anxiety and self-harm.

TikTok’s ‘For You’ feed is a highly personalized and infinitely scrollable page of algorithmically recommended content, picked out to reflect what the system has inferred to be a user’s interests.

Technical research was conducted using more than 30 automated accounts set up to represent 13-year-olds in Kenya and the USA to measure the effects of TikTok’s recommender system on young users. An additional manually run simulation involved an account each in Kenya, the Philippines and the USA.

The technical research revealed that after 5-6 hours on the TikTok platform, almost 1 in 2 videos shown were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health.

There was an even faster “rabbit hole” effect when researchers manually rewatched mental health-related videos suggested to research accounts mimicking 13-year-old users in Kenya, the Philippines and the USA.

Between 3 and 20 minutes into our manual research, more than half of the videos in the ‘For You’ feed were related to mental health struggles, with multiple recommended videos in a single hour romanticizing, normalizing or encouraging suicide.

Addictive by Design

Focus group discussions, interviews and simulations of children’s TikTok accounts in Kenya, the Philippines and the USA, as well as existing evidence from the fields of social media harms research and public health, reveal how TikTok’s platform design encourages the unhealthy use of the app.  


*Luis, a 21-year-old undergraduate student in Manila who has been diagnosed with bipolar disorder, told Amnesty International of his experience with TikTok’s ‘For You’ feed.

“It’s a rabbit hole because it starts with just one video. If one video is able to catch your attention, even if you don’t like it, it gets bumped to you the next time you open TikTok and because it seems familiar to you, you watch it again and then the frequency of it in your feed rises exponentially,” said Luis.

*Francis, an 18-year-old student in Batangas Province, Philippines, observed: “When I watch a sad video that I could relate to, suddenly my whole ‘For You’ Page is sad and I’m in ‘Sadtok’. It affects how I’m feeling.”

Another focus group participant explained, “The content I see makes me overthink [even] more, like videos in which someone is sick or self-diagnosing. It affects my mentality and makes me feel like I have the same symptoms and worsens my anxiety. And I don’t even look them (videos) up, they just appear in my feed.”

 *Joyce, a 21-year-old woman in the Philippines said, “I deleted it [TikTok] for a while because I was very addicted to it… I would spend so many hours on TikTok just scrolling through videos because you can’t help but wonder what goes up next when you scroll down.”

Children and young people interviewed in Kenya said that they felt their TikTok use affected their schoolwork, social time with friends and led them to scroll through their feeds late at night instead of catching enough sleep.

These testimonies were corroborated by various adolescent psychologists consulted by Amnesty International as part of the research.

While young people’s individual responses and contextual factors affecting their social media use may vary, like other social media platforms, TikTok has made design choices intended to maximize users’ time spent on the platform.

“Our research shows that TikTok may expose children and young people to serious health risks by persisting with its current business model, which is geared more towards keeping eyes glued to the platform than towards respecting the right to health of children and young people,” said Lisa Dittmer, Amnesty International Researcher.

TikTok should be safe by design, not addictive by design.

TikTok’s addictive feature, the ‘For You’ feed, a highly personalized and endlessly scrollable page of algorithmically recommended content, taps into what psychologists describe as the “reward pattern of winning or losing on a slot machine”.

TikTok is designed to tap into users’ desire to be rewarded, which can lead users to develop habits that encourage addictive use.

The Surveillance Web

“I Feel Exposed”: Caught in TikTok’s Surveillance Web shows how TikTok’s rights-abusing data collection practices both underpin and are sustained by its harmful user engagement practices.

Amnesty International’s research shows that TikTok’s very business model is inherently abusive and privileges engagement to keep users hooked on the platform, in order to collect ever more data about them. TikTok then uses this data to create profiles of users and draw inferences about them, which allows it to cluster users in groups to target them with highly personalized content to keep them engaged. These groups and categories are also made available to advertisers so that they can target users with personalized ads.

To the extent that TikTok has put in place policies and practices to ensure greater respect for children’s rights, these differ from region to region, leaving children and young people in some parts of the world exposed to exploitative data collection from which their peers in other regions are shielded.

“TikTok targets users, including children, with more invasive data harvesting practices in parts of the world where people have fewer protections for their data under local laws and regulations – meaning children living in countries with weak regulation, including many countries of the Global Majority, are subject to the worst abuses of their right to privacy,” said Lauren Armistead, Amnesty Tech Deputy Programme Director.

“TikTok must respect the rights of all its younger users, not just those in Europe, by banning all targeted advertising aimed at those younger than 18 globally.”

TikTok must also stop hyper-personalizing the ‘For You’ feed by default, and instead let users actively choose, on the basis of their informed consent, whether they want a personalized feed at all and which interests shape their content recommendations.

While Amnesty International calls on TikTok to take these urgent steps towards a rights-respecting business model, binding regulation is also needed to protect and fulfil children and young people’s rights.

The best way to protect children from abuse of their personal data online is for governments to ban by law all targeted advertising based on the invasive collection of personal data.

Responding to our findings, TikTok pointed us to its Community Guidelines, which set out which types of content are banned and thus, if reported or otherwise identified, removed from the platform. These include a ban on content “showing, promoting, or providing instructions on suicide and self-harm, and related challenges, dares, games, and pacts”, “showing or promoting suicide and self-harm hoaxes” and “sharing plans for suicide and self-harm.”

TikTok stated that it is in the process of developing a “company-wide human rights due diligence process which will include conducting periodic human rights impact assessments.” The company did not provide details on which specific risks to children and young users’ human rights it has identified. That TikTok currently does not have a company-wide human rights due diligence process in place is a clear failure of the company’s responsibility to respect human rights.

TikTok makes money by collecting data about you – such as who you are and what you like.
Children are no exception.

This corporate surveillance for profit undermines children’s right to have control over their personal information.
It’s an abuse of the right to privacy and freedom of thought.

Young Person’s Perspective

Written by Luis (pseudonym), a 21-year-old student from Manila.

What is TikTok?

I use TikTok primarily for entertainment. TikTok is a platform that allows you to watch a variety of short videos. While other social media such as Facebook, Twitter, or Instagram would focus on texts or images, TikTok really capitalizes on videos.

On other video-centric social media such as YouTube, content is recommended too, but you have more control over what you watch. You still have to search and click.

On TikTok, the content comes to you. TikTok feeds you content, rather than offering you content. You end up scrolling through a long, perhaps endless, list. Since the videos are short, you wouldn’t notice the time pass, and suddenly you’re there for hours. It’s addictive because it’s fast-paced and spectacle-based. It makes the decisions for you.

Falling into the rabbit hole

In your ‘For You’ feed, you would notice that it is indeed based on your past viewing. Even if you stay just for a short while on a particular video, it will then show the same type of content as you scroll through. I view it as a rabbit hole. With just taking a peek, you risk falling down, and then that type of content starts to bombard you.

It rises not just in frequency, but also in intensity. It all starts with just one video – that curious rabbit leading you down into a ‘wonderland’, that red-haired clown down the sewers.


As someone with Bipolar II disorder, I used TikTok for both my hyperactive and depressive periods. When I’m hyperactive, TikTok is appealing, because the fast-paced cacophony is able to stimulate my mind and give me that ‘rush’.

The videos would be all bright and energetic, inducing a mental and bodily response that, if sustained, would ultimately be dangerous. It’s essentially a prolonged ‘high’.

When I’m down, I would ‘escape’ my mind by mindlessly wandering through the feed. I would then encounter videos that affirm my emotions, and I would get trapped in that type of content for a long time.

When I’m depressed for instance, I would ‘hyperfixate’ or get stuck with just one video of sad literature and photos. The next thing I see would be a barrage of videos on self-harm and even death, mixed with videos using psychological language – psychospeak – that would claim to ‘unpack’ my feelings.

Background

Both reports add to evidence explored in Amnesty International’s previous research reports. Surveillance Giants exposed how the business model of Facebook and Google is inherently incompatible with the right to privacy and poses a threat to a range of other rights including freedom of opinion and expression, freedom of thought, and the right to equality and non-discrimination.

Amnesty International’s reports on Myanmar: The social atrocity: Meta and the right to remedy for the Rohingya and Ethiopia: Meta’s failures contributed to abuses against Tigrayan community during conflict in northern Ethiopia both reveal how the Facebook platform’s engagement-based business model can result in devastating impacts through the amplification of extreme content that incites violence, hatred and discrimination, ultimately contributing to serious human rights violations and abuses.

Together, these reports contribute to the growing evidence base of Amnesty International’s global campaign for corporate accountability and redress for human rights abuses associated with the surveillance-based business model of Meta, Google, TikTok and other “Big Tech” platforms.

TikTok can lead you to very dark places

Make TikTok Safer for Children and Young People

Call on TikTok to ensure that protections are in place for children and young people against harmful content and addictive behavior