An investigation by the ICO (Information Commissioner’s Office) estimated that TikTok allowed as many as 1.4 million children under the age of 13 in the UK to use the platform in 2020, despite a minimum age requirement of 13 to create an account.
As a result, TikTok used children’s data without the parental consent required under Article 8 of the UK General Data Protection Regulation (UK GDPR). The ICO found that TikTok failed to obtain that consent, even though it ought to have been aware that under-13s were using the platform, and further failed to carry out adequate checks to identify and remove underage users.
The ICO also fined TikTok for failing to provide the people using the app with easy-to-understand information about how their data is collected, used, and shared. Without such information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it. The ICO’s investigation found that children’s data may have been used to present potentially harmful or inappropriate content to them whilst scrolling.
The ICO further found that TikTok had failed to ensure that the personal data of its UK users was processed lawfully, fairly and in a transparent manner, breaching Article 5 of the UK GDPR.
TikTok received one of the largest fines issued by the ICO to date – a significant £12.7 million, reduced from the £27 million originally proposed.
Alongside its enforcement work, the ICO has published the Children’s Code (the Age Appropriate Design Code) to help protect children in the digital world.
If you provide online services which may be accessed by children, you must make sure you are aware of the Code and of your obligations under it. The ICO’s fine is a stark reminder of how costly it can be to get your data processing obligations wrong.