Angus Crawford, BBC News Investigations
TikTok's algorithm recommends pornography and highly sexualised content to children's accounts, according to a new report by a human rights campaign group.
Researchers created fake child accounts and activated safety settings but still received sexually explicit search suggestions.
The suggested search terms led to sexualised material including explicit videos of penetrative sex.
The platform says it is committed to safe and age-appropriate experiences and took immediate action once it knew of the problem.
In late July and early August this year, researchers from campaign group Global Witness set up four accounts on TikTok pretending to be 13-year-olds.
They used false dates of birth and were not asked to provide any other information to confirm their identities.
Pornography
They also turned on the platform's "restricted mode", which TikTok says prevents users seeing "mature or complex themes, such as… sexually suggestive content".
Without doing any searches themselves, investigators found overtly sexualised search terms being recommended in the "you may like" section of the app.
Those search terms led to videos of women simulating masturbation.
Other videos showed women flashing their underwear in public places or exposing their breasts.
At its most extreme, the content included explicit pornographic films of penetrative sex.
These videos were embedded within otherwise innocent content, successfully evading the platform's content moderation.
Ava Lee from Global Witness said the findings came as a "huge shock" to researchers.
"TikTok isn't just failing to prevent children from accessing inappropriate content - it's suggesting it to them as soon as they create an account".
Global Witness is a campaign group which usually investigates how big tech affects discussions about human rights, democracy and climate change.
Researchers stumbled on this problem while conducting other research in April this year.
Videos removed
They informed TikTok, which said it had taken immediate action to resolve the problem.
But in late July and early August this year, the campaign group repeated the exercise and once again found the app recommending sexual content.
TikTok says that it has more than 50 features designed to keep teens safe: "We are fully committed to providing safe and age-appropriate experiences".
The app says it removes nine out of 10 videos that violate its guidelines before they are ever viewed.
When Global Witness informed TikTok of its findings, the company said it had taken action to "remove content that violated our policies and launch improvements to our search suggestion feature".
Children's Codes
On 25 July this year, the Online Safety Act's Children's Codes came into force, imposing a legal duty to protect children online.
Platforms now have to use "highly effective age assurance" to stop children from seeing pornography. They must also adjust their algorithms to block content which encourages self-harm, suicide or eating disorders.
Global Witness carried out its second research project after the Children's Codes came into force.
Ava Lee from Global Witness said: "Everyone agrees that we should keep children safe online… Now it's time for regulators to step in."
During their work, researchers also observed the reaction of other users to the sexualised search terms they were being recommended.
One commenter wrote: "can someone explain to me what is up w my search recs pls?"
Another asked: "what's wrong with this app?"