TikTok received a €345 million fine from the Irish Data Protection Commission (DPC) in September 2023 following an investigation spanning more than three years.
The DPC made six findings involving violations across eight GDPR provisions, from security to data minimization and transparency. All violations related to how TikTok treated child users of its platform.
This article explores the Irish DPC’s six findings—and explains why one violation in particular (which the DPC was forced to find after an intervention by other European regulators) has significant implications for thousands of companies’ data collection practices.
GDPR violations found by the Irish DPC
Across its six main findings, the Irish DPC found violations of the following GDPR provisions:
- Article 5(1)(a): Lawfulness, fairness, and transparency
- Article 5(1)(c): Data minimization
- Article 5(1)(f): Integrity and confidentiality (security)
- Article 12(1): Clarity and accessibility of transparency information
- Article 13(1)(e): Information about recipients of personal data
- Article 24(1): Responsibility of the controller to ensure GDPR compliance
- Article 25(1): Data protection by design
- Article 25(2): Data protection by default
The DPC’s investigation covers the period between 31 July 2020 and 31 December 2020, referred to as the “relevant period” throughout the decision.
TikTok says it has fixed most of these problems—in some cases several years ago or before the investigation even began. TikTok has also announced that it will appeal the DPC’s decision.
Now, let’s explore how TikTok allegedly went wrong in each of these areas.
Finding 1: Default ‘public’ setting for children’s accounts
Throughout the relevant period, TikTok made children’s accounts public by default.
This policy meant that anyone with access to TikTok (whether or not they had an account) could view a child’s TikTok content and comments unless the child took steps to make the content private.
The DPC notes that during account setup, “the Child User was prompted to select between ‘Go Private’ and remaining public”. However, “the Child User could opt to simply ‘skip’ this”.
The DPC found that this public-by-default setting violated GDPR provisions relating to data minimization, data protection by design, and data protection by default.
Finding 1 earned TikTok €100 million of its total fine.
Findings 2 and 4: Failure to properly assess the risks of the ‘public’ setting
The Irish DPC’s second and fourth findings relate to Article 24(1) of the GDPR, which requires the controller to “implement appropriate technical and organizational measures” to ensure it can comply with the GDPR.
The DPC identified “serious” risks associated with making children’s TikTok accounts public, including that “dangerous individuals” could use children’s TikTok content and comments for malicious purposes.
TikTok had implemented certain tools and safeguards to protect children’s privacy, but the DPC found that these safeguards were not proportionate to the risks involved.
Findings 2 and 4 did not directly contribute to TikTok’s fine.
Finding 3: ‘Family Pairing’ feature
The DPC’s third finding involved TikTok’s “Family Pairing” feature.
Family Pairing ostensibly enabled parents to link their accounts to their children’s accounts, allowing parents to send direct messages to their children and implement higher privacy controls on their children’s behalf.
However, the DPC found that the Family Pairing feature was poorly implemented, as it enabled any adult user to connect their account with—and send direct messages to—any child user over 16.
As such, the DPC found that TikTok had failed to meet the requirements of the GDPR’s “integrity and confidentiality” (security) principle and the requirement for “data protection by design”.
Finding 3 earned TikTok €65 million of its total fine.
Finding 5: Transparency
The Irish DPC found that TikTok failed to make children fully aware of the risks associated with creating public TikTok accounts.
To the limited extent that TikTok provided this sort of information, the DPC found that it failed to provide it in a “concise, intelligible manner” and in a form that was “easily accessible” and used “clear and plain language” appropriate for a child.
As such, the DPC found a violation of the GDPR’s rules on providing information about data processing at Article 12(1).
Additionally, the DPC found that TikTok did not fulfil the Article 13(1)(e) GDPR requirement to disclose the “recipients or categories of recipients” of personal data.
Because children’s accounts were public by default, their personal data could be “received” by an “indefinite and unrestricted audience”—a fact not mentioned by TikTok in its privacy notices.
Finding 5 earned TikTok the remaining €180 million of its total fine.
Finding 6: ‘Fairness’ principle
Finally, the Irish DPC found a violation of the GDPR’s “fairness” principle, part of the “lawfulness, fairness, and transparency” principle at Article 5(1)(a) of the GDPR.
The DPC did not want to make this finding but was directed to do so by the European Data Protection Board (EDPB) as part of the “binding decision” process.
Several other EU data protection authorities argued that TikTok’s registration process “nudged” users towards certain privacy-diminishing choices. The German data protection authorities claimed TikTok was, therefore, employing “dark patterns” in violation of the “fairness” principle.
Finding 6 did not directly contribute to TikTok’s fine.
However, the EDPB’s position—that dark patterns violate the fairness principle—has major implications for future GDPR investigations.
This finding might be the most broadly applicable and could prompt many website and app providers to review their account setup and consent-request processes to ensure they do not encourage users to make potentially harmful choices.