Some young TikTok users are being shown potentially dangerous content which could encourage eating disorders, self-harm and suicide, an online safety group has claimed.
Research into the TikTok algorithm by the Center for Countering Digital Hate (CCDH) found that test accounts were repeatedly served content about eating disorders and other harmful topics within minutes of joining the platform.
The group created two accounts in each of the US, UK, Australia and Canada posing as 13-year-olds. One account in each country was given a female name and the other was given a similar name but with a reference to losing weight included in the username.
The content served to both accounts in their first 30 minutes on TikTok was then compared.
The CCDH said it used this username method because previous research has shown that some users with body dysmorphia express this through their social media handles.
In its report methodology, the CCDH also said accounts used in the study expressed a preference for videos about body image, mental health, and eating disorders by pausing on relevant videos and pressing the like button.
In addition, the report does not distinguish between content posted with positive intent and content with clearly negative intent. The CCDH argued that in many cases it was not possible to definitively determine a video's intent, and that even well-intentioned videos could still be distressing to some viewers.
The online safety group’s report argues that the sheer speed with which TikTok recommends such content to new users is itself harmful.
During its test, the CCDH said one of its accounts was served content referencing suicide within three minutes of joining TikTok, and that eating disorder content was served to one account within eight minutes.
It said that, on average, its accounts were served videos about mental health and body image every 39 seconds.
The research also indicated that the more vulnerable accounts, those with the reference to losing weight in the username, were served three times as much harmful content and 12 times as much self-harm and suicide-related content.
The CCDH said the study had also identified an eating disorder community on TikTok which uses both coded and open hashtags to share material on the site, with videos that have amassed more than 13 billion views.
The video-sharing platform includes a For You page, which uses an algorithm to recommend content, refining its suggestions as users interact with the app and it gathers more information about their interests and preferences.
Imran Ahmed, chief executive of the CCDH, accused TikTok of “poisoning the minds” of younger users.
“It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food,” he said.
“Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from big tech billionaires, their unaccountable social media apps and increasingly aggressive algorithms.”
In the wake of the research, the CCDH has published a new Parents’ Guide alongside the Molly Rose Foundation, which was set up by Ian Russell after his daughter Molly ended her own life, having viewed harmful content on social media.
The guide encourages parents to speak “openly” with their children about social media and online safety and to seek help from support groups if concerned about their child.
In response to the research, a TikTok spokesperson said: “This activity and resulting experience does not reflect genuine behaviour or viewing experiences of real people.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.
“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”