A new investigation claims TikTok recommends pornography and sexualised clips to children. Researchers created fake child accounts, switched on safety settings, and still saw explicit search prompts. These prompts led to videos simulating masturbation and, in some cases, explicit pornography. TikTok says it acted quickly once informed and insists it is committed to keeping young people safe on the app.
Child profiles reveal sexual suggestions
In July and August, researchers from Global Witness set up four TikTok accounts. They posed as 13-year-olds by entering false birth dates. The app did not request any further proof of age. Investigators enabled TikTok’s “restricted mode”, which the company promotes as protection against mature or sexual content. Despite this, the accounts received sexualised search prompts in the “you may like” section. These led to videos of women flashing underwear, exposing breasts and simulating masturbation. In the most extreme cases, explicit pornography appeared, hidden inside ordinary-looking clips to avoid detection.
Global Witness reacts with alarm
Ava Lee from Global Witness described the results as a “huge shock”. She said TikTok not only fails to protect children but actively steers them toward harmful content. Global Witness usually investigates the role of big tech in climate issues, human rights and democracy. The group first encountered TikTok’s explicit material during unrelated research in April.
TikTok claims strong protections
Researchers alerted TikTok earlier this year. The company said it removed the flagged material and corrected the issue. But when Global Witness repeated its test in late July, sexual content reappeared. TikTok says it has more than 50 safety features for teenagers. It claims nine out of ten violating videos are deleted before anyone views them. After the report, the company stated it upgraded its search tools and removed more harmful content.
Online Safety Act raises pressure
On 25 July, the Children’s Codes under the Online Safety Act came into force. These rules require platforms to use strict age verification and to stop minors from accessing pornography. Algorithms must also block material linked to suicide, self-harm and eating disorders. Global Witness repeated its research after the codes took effect. Ava Lee urged regulators to act, stressing that children’s online safety must now be guaranteed.
Users question the app
During the research, investigators also monitored user reactions. Some expressed confusion at the sudden sexualised recommendations. One asked: “can someone explain to me what is up with my search recs pls?” Another commented: “what’s wrong with this app?”