TikTok has introduced more audience controls for creators, allowing them to restrict their content to adults only.

The company announced the change, which puts the responsibility on creators to keep inappropriate content away from children.

Creators could already restrict content on TikTok Live, meaning some livestreams wouldn’t show up for under-18s.

The same technology will now be used to allow creators to ban children from viewing their standard videos on the app.

TikTok says its “strict policies prohibiting nudity, sexual activity, and sexually explicit content” will still apply to all creators, including those who limit content to adults only.

In a post announcing the new feature, TikTok said: “Our goal has always been to make sure our community, especially teens on our platform, have a safe, positive and joyful experience when they come to TikTok.

“We’ve already taken significant strides to help ensure their feeds are full of content that is appropriate for them, and these improvements mark an important next step to meet that goal.”

It comes a month after an online safety group claimed some young TikTok users are being shown potentially dangerous content which could encourage eating disorders, self-harm and suicide.

Research into the TikTok algorithm by the Center for Countering Digital Hate (CCDH) found certain accounts were repeatedly being served content around eating disorders and other harmful topics in the minutes after joining the platform.

The group created two accounts posing as 13-year-olds in each of four countries: the US, UK, Australia and Canada. One account in each country was given a female name, and the other was given a similar name with a reference to losing weight included in the username.

During its test, the CCDH said one of its accounts was served content referencing suicide within three minutes of joining TikTok and eating disorder content was served to one account within eight minutes.

It said on average, its accounts were served videos about mental health and body image every 39 seconds.

Imran Ahmed, chief executive of the CCDH, accused TikTok of “poisoning the minds” of younger users.

“It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food,” he said.

“Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from big tech billionaires, their unaccountable social media apps and increasingly aggressive algorithms.”

In response to the research, a TikTok spokesperson said: “This activity and resulting experience does not reflect genuine behaviour or viewing experiences of real people.

“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.

“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”