TikTok is one of three video-sharing platforms the regulator looks at in a report published on 14 December. Photograph: Bloomberg/Getty

Ofcom investigates TikTok over parental control information

Regulator to look at whether the platform broke the law by giving inaccurate information

The UK communications regulator has opened an investigation into whether TikTok gave “inaccurate” information about its parental controls to the watchdog.

Ofcom said it had asked the social video platform for information about its Family Pairing system and that it has “reason to believe that the information it provided was inaccurate”.

The watchdog said it had, therefore, opened an investigation into whether the Chinese-owned platform has breached the 2003 Communications Act.

TikTok said the problem was due to a technical issue that may have led to some of the information provided to the watchdog being inaccurate. It said it had identified the problem several weeks ago and raised it with Ofcom.

The regulator said it had also asked for information as part of a report into how video-sharing platforms (VSPs) were protecting users from harmful content.

“The available evidence suggests that the information provided by TikTok in response to the notice may not have been complete and accurate,” said Ofcom, adding that it would provide an update to the investigation in February.

Ofcom made the announcement as it published a report on Thursday into how three leading VSPs – TikTok, Snap and the video game streaming service Twitch – were protecting children from encountering harmful videos.

The report cited research showing that more than a fifth of children aged eight to 17 have an adult online profile with an account stating their age is 18 or over. A third of children aged eight to 15 have an account with a user age of 16 or over, Ofcom added.

The regulator said the statistics called into question whether a policy of self-declaring a user’s age when they sign up was sufficient, and called on the platforms to step up attempts to find out the age of users.

“We therefore expect platforms to explore additional methods to gain a better understanding of the age of their users to be able to tailor their experiences in ways that are more appropriate to their age and that protect them from harm,” said Ofcom.

The regulator said TikTok used undisclosed technologies to detect keywords that flagged a potentially underage account, while Twitch used several measures including language analysis tools and Snap relied on people reporting underage users.

TikTok told Ofcom that the number of underage accounts it had removed in the 12 months to March 2023 represented just over 1% of its monthly active user base. Twitch said over the same period it had removed 0.03% of its total UK user base and Snap had removed up to 1,000 accounts.

More on this story

  • TikTok hackers target Paris Hilton, CNN and other high-profile users

  • TikTok opens datacentre in Dublin in bid to combat European privacy concerns

  • TikTok to auto-flag AI videos – even if created on other platforms

  • TikTok to be fined for breaching children’s privacy in EU

  • Universal signs TikTok deal allowing artists back on platform

  • TikTok stars clean up: the influencers saving Indonesia’s polluted rivers and beaches

  • Why is US threatening to ban TikTok and will other countries follow suit?

  • EU threatens TikTok Lite with ban over reward-to-watch feature

  • ‘Ice-cream so good’: how are TikTok creators making money from bizarre gestures and phrases on a loop?

  • TikTok says it will fight US ban or forced sale after bill passes
