
TikTok fined £12.7m for illegally processing children’s data

12th Apr 2023

The Information Commissioner’s Office (ICO) has fined TikTok £12.7 million for a breach of data protection law, including failure to use children’s personal data lawfully.

Despite TikTok’s own rules not allowing children under 13 to create an account, the ICO estimates that up to 1.4 million UK children under 13 were allowed to use the platform without parental consent.

Serious failures

The ICO concluded that TikTok breached several articles of the UK GDPR, including:

  • Article 5 by failing to ensure that users’ personal data was processed lawfully, fairly and in a transparent manner
  • Article 8 by failing to obtain the required parental consent
  • Articles 12 and 13 by failing to provide information in an accessible way to users about how their data is collected, used and shared.

TikTok sets 13 as the minimum age to create an account, but the ICO said the platform had done ‘very little if anything’ to check who was using it and to remove underage users. Users simply self-declare that they are over 13; no further checks are carried out. As a result, TikTok has been collecting and using these children’s personal data and potentially serving them inappropriate content. The ICO says the fine reflects the serious impact these failures may have had on underage users and the lack of safeguards TikTok had in place.

Inadequate response

The ICO found that concerns about the lack of age verification had been raised internally, but that TikTok did not respond ‘adequately’. A TikTok spokesperson said the company is invested in keeping under-13s off the platform and that its 40,000-strong safety team works hard to keep it safe.

TikTok has introduced training for its safety team to spot signs that an account may be used by a child under 13, so that suspect accounts can be flagged for review and requests from parents to remove underage accounts can be dealt with promptly.

The government has been accused of moving too slowly and failing in its duty to keep young people safe online. This has led to calls for it to introduce stronger online protections and toughen existing regulatory requirements.

A reduced fine

The ICO reduced the fine from an anticipated £27 million after deciding not to pursue an initial finding that the platform had unlawfully used ‘special category data’ such as ethnic and racial origin, political opinions, religious beliefs, sexual orientation or health data. That potential infringement was not included in the final fine of £12.7 million.

The fine comes alongside Australia’s decision to ban TikTok from all government devices over security concerns. Australia joins the US, Canada, New Zealand, Norway and the UK in enforcing a partial TikTok ban, while Afghanistan and India have both banned the app outright.

If you’d like to speak to our data protection team about any issues raised in this article, or would like help reviewing your own data protection policies, please contact Alex Craig on 0191 2117911 or email [email protected].
