The UK Information Commissioner’s Office (ICO) has fined TikTok £12.7 million for violating several articles of the UK General Data Protection Regulation (GDPR).
The monetary penalty notice (MPN) sets out the ICO’s views on important GDPR provisions relating to children’s data and transparency. This article will explain where TikTok went wrong in the eyes of the UK’s regulator.
Children’s Personal Data (Article 8)
The bulk of the ICO’s issues with TikTok were around the accessibility of the platform to users under 13.
The issues here mostly relate to TikTok’s legal basis for processing children’s data. However, the ICO’s conclusions suggest that TikTok would struggle to ever establish an appropriate legal basis for serving content to children under 13.
While TikTok’s terms of service prohibit children under 13 from opening an account, the evidence suggests that many young children used TikTok.
The MPN states:
“TikTok provided its services to UK users under the age of 13 and processed their personal data without consent given or authorised by the holder of parental responsibility over such child users, and without identifying any lawful basis for processing other than consent.”
There are several issues here, some of which are not straightforward.
Legal basis: Consent
Article 8 of the GDPR contains a rule for controllers that provide “information society services” (online services) directly to a child. When such a controller relies on consent as its legal basis for processing, the consent must be given by a parent or guardian.
TikTok relied on consent for the provision of targeted advertising. Although the platform had users under 13, the ICO states that TikTok failed to obtain parental consent for those users.
While targeted advertising was not the focus of the ICO’s investigation, the MPN notes several times that TikTok makes money from targeted advertising.
Legal basis: Contract
TikTok processes personal data to deliver “core” (non-advertising) content. The company relies on “contract” for this activity.
Unlike with consent, the GDPR does not require controllers to obtain parental authorisation when relying on the legal basis of “contract”.
However, the Commissioner argues that "contract" was not an appropriate legal basis for processing personal data about children under 13, as young children lack the capacity to agree to a contract.
TikTok’s legal basis conundrum
With contract and consent excluded, the only relevant legal basis remaining for TikTok to serve children under 13 is “legitimate interests”. However, it is unlikely that TikTok would be able to establish a legitimate interest that overrides the rights and safety of children.
As such, the only apparent way for TikTok to have avoided these violations would have been to verify the age of all users and find an effective way to exclude younger children from the platform.
This practice would raise its own data protection issues that are not addressed in the decision. The ICO’s Children’s Code provides some guidance on age verification techniques.
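The Article 8 rule described above is essentially conditional logic: a user's own consent suffices only at or above the UK's digital age of consent (13); below that, consent must be given or authorised by a holder of parental responsibility. As a minimal sketch, assuming a hypothetical service that stores each user's date of birth and two consent flags (all names here are illustrative, not from any real platform's codebase):

```python
from datetime import date

# UK GDPR Article 8: age below which parental authorisation is required.
# (13 in the UK; EU member states may set this between 13 and 16.)
UK_DIGITAL_AGE_OF_CONSENT = 13

def age_on(date_of_birth: date, today: date) -> int:
    """Whole years elapsed between date_of_birth and today."""
    had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
    return today.year - date_of_birth.year - (0 if had_birthday else 1)

def consent_is_valid(date_of_birth: date, user_consented: bool,
                     parent_consented: bool, today: date) -> bool:
    """Return True if 'consent' is available as a legal basis for this user.

    Below the threshold, the child's own consent is not enough: Article 8
    requires consent "given or authorised by the holder of parental
    responsibility over the child".
    """
    if age_on(date_of_birth, today) >= UK_DIGITAL_AGE_OF_CONSENT:
        return user_consented
    return user_consented and parent_consented
```

The check is only as good as the date of birth behind it, which is exactly the self-declaration problem the ICO's decision highlights: TikTok's terms barred under-13s, but nothing reliably verified the ages users entered.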
Transparency (Articles 12 and 13)
Regarding Article 12, the MPN states:
“TikTok failed to take appropriate measures to provide the information required under Article 13 UK GDPR to data subjects in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular in relation to information addressed specifically to children.”
And on Article 13:
“TikTok failed to provide data subjects with the information required under Articles 13(1) and (2) UK GDPR.”
Articles 12 and 13 both relate to the information that controllers must provide to data subjects.
In the Commissioner’s words, “…while Article 13 sets out the specific information requirements, Article 12 provides for the way in which such information must be communicated”.
The ICO says TikTok violated Article 12 because the language in the company’s privacy notice “is not clear or plain, and is difficult to understand, in particular for child users (whether under the age of 13 or older).”
The MPN highlights some sections of the privacy notice that were deemed particularly unclear, including references to “internal operations, including troubleshooting, data analysis, testing, research, statistical and survey purposes” and the term “associated metadata”.
The MPN assesses TikTok’s privacy policies against each requirement under Article 13. Among other alleged violations, the ICO states that TikTok:
- Failed to provide contact details for its data protection officer (DPO).
- Did not link each category of personal data it collects to the purposes and legal basis for processing that category.
- Did not properly explain its legitimate interests.
- Did not name the specific recipients of each category of personal data shared with third parties.
- Failed to identify all of the third countries to which it transferred personal data, or provide sufficiently clear information about which safeguards were in place.
- Did not include enough information about how long different types of personal data would be stored.
- Failed to provide sufficiently clear information about how data subjects can exercise their rights.
The ICO’s interpretation of Article 13 is relatively strict.
The ICO finds violations in contentious areas, in line with recent decisions from the European Data Protection Board (EDPB) and the Court of Justice of the European Union (CJEU).
This, along with the Commissioner’s strict reading of the GDPR’s provisions on “contractual necessity”, suggests that the regulator is taking a hard line on GDPR violations – even if enforcement in the UK is rarer than in some EU member states.