Introduction

The Data Protection Commission ("DPC") recently imposed a reprimand and a fine of €345 million on TikTok for GDPR violations in respect of its processing of children's data, and ordered TikTok to bring its processing operations into compliance by taking certain specified actions within three months of the date of the DPC's decision.

The fine was imposed following the conclusion of the DPC's own-volition inquiry into TikTok's processing of the data of child users aged between 13 and 17 ("Child Users"). The DPC concluded that, during the five-month period between 31 July 2020 and 31 December 2020 (the "Relevant Period"), TikTok did not adequately protect the personal data of Child Users in line with its obligations under the GDPR. In particular, the DPC found that TikTok:

  • infringed the principles of data minimisation (Article 5(1)(c)) and data protection by design and by default (Article 25(1)), and its obligations as a data controller (Article 24(1)), by making Child User accounts public by default, meaning that anyone (on or off TikTok) could view the content posted by the Child User;
  • infringed the security principle (Article 5(1)(f)) and the principle of data protection by design and by default (Article 25(1)), by setting up its 'Family Pairing' option in a manner that failed to ensure that adult users purportedly overseeing the accounts of Child Users were verified as the parent or guardian of the Child User;
  • failed to provide sufficient information to Child Users, infringing the transparency rules in Articles 12(1) and 13(1)(e) of the GDPR; and
  • used 'dark patterns' to nudge Child Users into choosing more privacy-intrusive settings, during the registration process and when posting videos, infringing the principle of fairness (Article 5(1)(a) GDPR). This finding followed an objection raised by the Berlin Data Protection Authority and upheld by the EDPB, and was not included in the DPC's original findings.

The DPC acknowledged that TikTok had since taken steps to bring its processing of Child User data into compliance with the GDPR. TikTok published a response to the DPC's decision on 15 September 2023, noting many of the improvements it had already made.

Background

The DPC's engagement with TikTok dates back to January 2021, when TikTok's processing of Child Users' personal data came under review. Following requests from multiple supervisory authorities, the DPC commenced an own-volition inquiry into TikTok's compliance with GDPR obligations when processing Child Users' personal data on the TikTok platform.

In particular, the DPC's inquiry concerned two distinct sets of processing operations by TikTok, namely:

  1. the processing of Child Users' personal data in the context of TikTok's platform settings, both mobile application and website-based, in particular public-by-default platform settings in relation to Child Users' accounts, videos, comments, downloading and 'Family Pairing'; and
  2. the processing by TikTok of the personal data of children under 13 years old, in the context of the TikTok platform, both mobile application and website-based, in particular for the purposes of age verification.

Finally, with regard to the processing of personal data of persons under 18 years of age in the context of the TikTok platform (including any such processing in connection with websites or applications which provide access to the TikTok platform), the inquiry examined whether TikTok had complied with its transparency obligations to provide information to data subjects in the form and manner required by Articles 12(1), 13(1)(e), 13(2)(a), 13(2)(b), and 13(2)(f) GDPR.

As the processing under examination in the inquiry constituted "cross-border processing", the DPC, acting as Lead Supervisory Authority ("LSA"), was required to submit its draft decision to concerned supervisory authorities ("CSAs") in the EU/EEA, pursuant to the one-stop-shop procedure set out in Article 60 GDPR. 

The CSAs of Italy and Berlin raised objections to the DPC's draft decision. The Berlin CSA objected to the absence of a finding of a violation of the Article 5(1)(a) GDPR fairness principle as regards 'dark patterns'. The Italian CSA objected to the DPC's finding that TikTok had complied with Article 25 GDPR in its approach to age verification during the Relevant Period.

As the DPC and the CSAs were unable to reach agreement on the subject-matter of these objections, the DPC referred the objections to the European Data Protection Board ("EDPB") for determination, pursuant to the dispute resolution mechanism in Article 65 GDPR. The EDPB directed the DPC to amend its draft decision to include a violation of Article 5(1)(a) GDPR, as proposed by the Berlin CSA. However, the EDPB did not direct the DPC to include a violation of Article 25(1) as regards TikTok's age verification approach, as requested by the Italian CSA, although it did express "serious doubts" about the effectiveness of the age verification measures adopted by TikTok during the Relevant Period.

DPC Decision

The DPC found that, during the Relevant Period, all new TikTok accounts were set to public by default, including Child Users' accounts. A pop-up notification was displayed to Child Users allowing them to 'Go Private' or 'Skip'. When users of public accounts posted a video, the video was published to 'Everyone' by default. TikTok had a 'Family Pairing' setting, which enabled a Child User to link their account to a non-Child User's account. From November 2020, this non-Child User could select whether the Child User's account was public or private, see the Child User's liked videos, limit who was able to comment on videos, and select whether the Child User's account would be suggested to other Child Users. There was no verification of the relationship between the Child User and the non-Child User.

The DPC made findings in relation to the following three issues:

  1. Platform Settings - TikTok's processing of Child Users' personal data infringed Articles 5(1)(a), 5(1)(c), 5(1)(f), 24(1), 25(1) and 25(2) GDPR.
  2. Age Verification - TikTok's age verification measures infringed Article 24(1) GDPR.
  3. Transparency - TikTok's transparency practices infringed Articles 12(1) and 13(1)(e) GDPR.

Issue A: Platform Settings - Assessment of whether TikTok complied with its obligations in respect of data minimisation (Article 5(1)(c)), data security (Article 5(1)(f)), obligations of a controller (Article 24) and data protection by design and by default (Article 25) concerning its platform settings for Child Users

Finding 1

The DPC concluded as follows:

  • During the Relevant Period, TikTok implemented public-by-default account settings for Child Users which permitted anyone (on or off TikTok) to view social media content posted by Child Users.
  • In this regard, TikTok failed to implement appropriate technical and organisational measures to ensure that, by default, only personal data necessary for each specific purpose of TikTok's processing were processed.
  • In particular, this processing was performed 'to a global extent', in circumstances where TikTok did not implement measures to ensure that, by default, a Child User's social media content was not made accessible (without the user's intervention) to an indefinite number of persons.
  • The processing was, as a result, not in compliance with the data protection by design and default obligation under Articles 25(1) and 25(2) GDPR, and contrary to the data minimisation principle under Article 5(1)(c) GDPR.

Finding 2

The DPC concluded that:

  • The public-by-default account setting implemented by TikTok for Child User accounts during the Relevant Period (enabling anyone, on or off TikTok, to view a Child User's social media content) gave rise to potentially severe risks to the rights and freedoms of those Child Users.
  • As TikTok did not take account of the risks this processing posed to Child Users, it failed to implement appropriate technical and organisational measures to ensure that the processing was performed in compliance with its obligations as a controller under the GDPR, contrary to Article 24(1) GDPR.

Finding 3

Thirdly, the DPC examined the 'Family Pairing' setting, through which a user (intended to be the Child User) could scan a QR code generated by another user (intended to be the Child User's parent / guardian), which then enabled the intended parent / guardian to: manage screen time; add stricter restrictions on available content; disable search feature access; or enable / disable direct messages for users above the age of 16. From November 2020, the intended parent / guardian could also make the account private; decide whether other users could view 'Liked Videos'; limit comments; and decide whether the account could be suggested to Child Users.

The DPC found that two primary concerns arose in relation to the 'Family Pairing' setting:

  • It allowed an unverified non-Child User to access and control a Child User's TikTok platform settings. While it was intended that this be the parent / guardian of the Child User, in effect, any other user could pair their account with the Child User's account and it was not restricted solely to the parent / guardian of that Child User.
  • It allowed the intended parent / guardian to make certain platform features less strict, for example by enabling direct messages for users over the age of 16.

The DPC concluded that:

  • During the Relevant Period, TikTok's 'Family Pairing' platform setting allowed Child Users and non-Child Users to pair their accounts, enabling direct messaging for Child Users above the age of 16. This processing presented potentially severe risks to the rights and freedoms of Child Users.
  • If a supposed parent / guardian enabled direct messages on the Child User's account, this would result in TikTok processing the Child User's personal data such that third parties could directly message the Child User, amounting to unauthorised processing of their personal data.
  • On the basis that: (a) this processing did not ensure appropriate security of personal data (including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, by means of appropriate technical or organisational measures), and (b) TikTok did not implement appropriate safeguards into the processing in order to protect Child Users, the DPC found that this processing was not in compliance with TikTok's data security, and privacy by design and by default, obligations under Article 5(1)(f) and Article 25(1) GDPR, respectively.
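
Both concerns lend themselves to a brief illustration. The following Python sketch is purely hypothetical (the `Account` and `PairingService` names and every check are the author's own assumptions, not TikTok's implementation): it honours a pairing request only where the adult account has passed a guardian verification step, and it permits a paired guardian to tighten, but never relax, a Child User's protective settings.

```python
# Illustrative sketch only: a pairing flow in which (a) pairing is honoured
# only for adults whose guardian relationship has been verified, and (b) a
# paired guardian may tighten, but never loosen, a child's defaults.
# All names and checks are hypothetical, not TikTok's actual system.

from dataclasses import dataclass, field

@dataclass
class Account:
    user_id: str
    age: int
    guardian_verified: bool = False        # e.g. set after an identity check
    direct_messages_enabled: bool = False  # restrictive default for minors
    paired_guardians: set = field(default_factory=set)

class PairingService:
    def pair(self, child: Account, adult: Account) -> None:
        if child.age >= 18:
            raise ValueError("pairing is only for child accounts")
        # The DPC's first concern: without this check, *any* user could pair.
        if not adult.guardian_verified:
            raise PermissionError("adult is not a verified parent/guardian")
        child.paired_guardians.add(adult.user_id)

    def set_direct_messages(self, guardian: Account, child: Account,
                            enabled: bool) -> None:
        if guardian.user_id not in child.paired_guardians:
            raise PermissionError("not paired with this child account")
        # The DPC's second concern: pairing should not be able to *relax*
        # protections, so this sketch only allows disabling, never enabling.
        if enabled:
            raise PermissionError("pairing may only restrict features")
        child.direct_messages_enabled = False

if __name__ == "__main__":
    child = Account("child-1", age=15)
    stranger = Account("adult-1", age=30)   # unverified adult
    service = PairingService()
    try:
        service.pair(child, stranger)       # rejected
    except PermissionError as e:
        print("pairing refused:", e)
```

The point of the sketch is the direction of control: pairing is gated on a verified relationship, and paired controls can only move settings towards greater protection.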

Issue B: Age Verification - Assessment of whether TikTok complied with its obligations as a controller under Article 24, and its privacy by design and by default obligations under Article 25 GDPR, with regard to its approach to age verification

Finding 4

During the Relevant Period, TikTok did not assess the specific risks associated with children under 13 years of age gaining access to the TikTok platform. The DPC noted that, while TikTok had carried out a data protection impact assessment ("DPIA") regarding children's data and age appropriate design, the DPIA failed to identify the specific risk of children under the age of 13 accessing the platform, and the further risks that may ensue from that.

The DPC acknowledged that there was no single method of ensuring that children under 13 years of age could not gain access to the platform. The measures implemented by TikTok to prevent or remove underage users included:

  • an age-gate, which did not inform users that their date of birth was being used for age-gating purposes, and which operated by having users sign in via third-party accounts such as Google or Facebook (a more transparent approach is sketched below);
  • a 12+ age rating in the Apple App Store and Google Play Store;
  • the closure and deletion of detected underage accounts; and
  • the review of underage accounts reported by parents or other third parties; the Privacy Policy invited individuals to contact TikTok, via a linked web form, if they suspected an underage user.
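
By way of illustration only, the following sketch shows a minimal date-of-birth age gate that tells the user why the date of birth is collected (the transparency gap the DPC noted) and refuses registration below the age of 13. The threshold, notice wording and function names are hypothetical assumptions, not TikTok's actual mechanism.

```python
# Illustrative sketch only: a date-of-birth age gate that discloses its
# purpose to the user and blocks under-13 registrations. All names and
# messages are hypothetical examples.

from datetime import date
from typing import Optional

MINIMUM_AGE = 13
PURPOSE_NOTICE = ("We ask for your date of birth to check that you are old "
                  "enough to use this service.")

def age_on(today: date, dob: date) -> int:
    """Whole years between dob and today."""
    years = today.year - dob.year
    # Subtract one if the birthday has not yet occurred this year.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def age_gate(dob: date, today: Optional[date] = None) -> bool:
    today = today or date.today()
    print(PURPOSE_NOTICE)  # disclose *why* the date of birth is collected
    return age_on(today, dob) >= MINIMUM_AGE

if __name__ == "__main__":
    print(age_gate(date(2012, 6, 1), today=date(2020, 9, 1)))  # False: under 13
    print(age_gate(date(2005, 6, 1), today=date(2020, 9, 1)))  # True
```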

In circumstances where TikTok did not properly take into account the risks posed by the processing to children under 13 years of age, the DPC found that TikTok failed to implement appropriate technical and organisational measures to ensure and be able to demonstrate that the processing was performed in accordance with its obligations as a controller under the GDPR, contrary to Article 24(1) GDPR.

However, the DPC did not find any violation by TikTok of its data protection by design and by default obligation under Article 25(1) GDPR as regards the age verification approach, given the measures undertaken by TikTok towards age verification, and TikTok's efforts to ensure its platform was accessible only to users over 13 years of age.

Issue C: Transparency - Assessment of whether TikTok complied with its transparency obligations pursuant to Articles 5(1)(a), 12 and 13 GDPR

Finding 5

During the Relevant Period, TikTok provided both a Privacy Policy and a summary of that Privacy Policy for Child Users. The DPC considered whether, on review of the Privacy Policy and its summary, Child Users were able to determine the scope and consequences of registering as a TikTok user, in particular the fact that their account would be made public by default.

The DPC found that, though both documents informed users that a public account would make users' content accessible to any other user, neither document noted that users' public accounts would be viewable on the TikTok website by an indefinite number of people, including those who were not registered TikTok users. TikTok's portals and centres did not inform users of this fact either. The DPC noted that the pop-up notification allowing users to choose to 'Go Private' or 'Skip' stated that videos of Child Users with public accounts could be viewed by anyone, but did not indicate whether this referred solely to other registered TikTok users or to anyone at all. The pop-up notification also did not allow users to navigate to the Privacy Policy or 'Summary for Users U18' to decipher who 'anyone' referred to (and even if they had, neither document provided such information).

The DPC noted that the Privacy Policy stated that content would be visible to third parties "such as search engines, content aggregators and news sites", with no reference to non-registered users. It commented that the reference to 'search engines' was insufficient, as it did not necessarily follow from that reference that individuals could access content through such search engines without registering as TikTok users.

The DPC ultimately found that, as TikTok did not provide Child Users with information on the categories of recipients of their personal data, it breached its GDPR obligations. Where TikTok did provide such information, it used the word "may" in respect of the recipients mentioned. The DPC commented that "may" is a conditional term which indicates that TikTok did not communicate, in a clear, plain and transparent manner, which recipients would definitely receive Child Users' personal data. It further commented that an umbrella term such as "third parties" is unclear, as it does not allow Child Users to access specific information on who receives their personal data. The language was not clear and plain, and was not provided in a concise, transparent and intelligible form that was easily accessible, as required by the GDPR. TikTok further breached these obligations by not providing Child Users with information on the scope and consequences of public-by-default processing in an easily accessible and transparent manner.

Accordingly, the DPC found that TikTok had breached its transparency obligations under Articles 12(1) and 13(1)(e) GDPR. However, the DPC did not find any infringement of the transparency principle under Article 5(1)(a) GDPR. While TikTok should have informed data subjects that non-registered persons could view their public accounts, having regard to the particular circumstances and the information that was provided, this particular informational deficit did not amount to an overarching infringement of the transparency principle.

Finding 6

Following an objection raised by the Berlin Data Protection Authority, and in line with the EDPB's Article 65 determination, the DPC found that TikTok infringed the principle of fairness pursuant to Article 5(1)(a) GDPR in the context of the Registration Pop-Up and the Video Posting Pop-Up practices. TikTok was found to have utilised Registration Pop-Ups and Video Posting Pop-Ups in order to "nudge" users to make certain decisions and lead them "subconsciously" to decisions which violated their privacy interests. In other words, users were encouraged towards choosing the public-by-default setting. The EDPB found that such a practice makes it more difficult for data subjects to make a choice in favour of the protection of their personal data, and constitutes unfair processing. This was particularly the case where the data subjects were children who merit specific protection with regard to their personal data.

Corrective Powers

The DPC exercised the following corrective powers (pursuant to Section 115 of the Data Protection Act 2018 and Article 58(2) GDPR) with regard to the infringements:

  • ordered TikTok to bring its processing into compliance with the GDPR within a period of three months from the date on which TikTok was notified of the DPC's decision;
  • issued a reprimand regarding the identified infringements of the GDPR; and
  • imposed three administrative fines totalling €345 million, comprising:
    • €100 million for Finding 1, regarding infringements of the data minimisation principle in Article 5(1)(c), and the data protection by design and by default obligations in Articles 25(1) and 25(2) GDPR;
    • €65 million for Finding 3, regarding infringements of the security principle in Article 5(1)(f), and the data protection by design and by default obligation in Article 25(1) GDPR; and
    • €180 million for Finding 5, regarding infringements of the transparency obligations in Articles 12(1) and 13(1)(e) GDPR.

In imposing the administrative fines, the DPC noted that TikTok had implemented changes following the Relevant Period. However, it commented that it is "not always possible to retrospectively correct a past lack of control, as personal data has already been published and data subjects may already have suffered consequential damage as a result".

That said, the DPC acknowledged that TikTok's actions may have decreased the likelihood of further damage to data subjects following the GDPR infringements. As a result, the DPC formed the view that such actions provided limited mitigation of the damage to data subjects, and therefore considered them to be of "mitigating value".

Impact of DPC Decision

Although this decision was directed at TikTok, it has implications for any EU/EEA organisation that processes children's personal data.

The key takeaway from the decision is that organisations must review their technical and organisational measures to ensure that they adopt a privacy by design and by default approach to processing personal data. Where organisations process children's personal data, they must take account of the specific risks posed to children. This is particularly so where such risks include the possibility that bad actors could directly communicate with children as a result of the organisation's technical designs and default settings. These risks must be clearly identified in a data protection impact assessment, and where an organisation could process personal data of children under the age of 13, this must be addressed as a specific risk.
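
To make the privacy by design and by default point concrete, the following sketch shows one way account defaults could be derived from a user's age so that a child's account starts from the most protective position, reflecting the Article 25(2) GDPR standard the DPC applied. The field names and thresholds are hypothetical, illustrative choices, not a description of any platform's actual configuration.

```python
# Illustrative sketch only: deriving account defaults from the user's age so
# that a child's account starts with the most protective settings. Field
# names and thresholds are hypothetical, not any platform's real configuration.

from dataclasses import dataclass

@dataclass(frozen=True)
class DefaultSettings:
    account_public: bool              # is the profile visible to everyone?
    videos_visible_to_everyone: bool  # are posted videos public by default?
    suggest_account_to_others: bool   # may the account be recommended?

def defaults_for(age: int) -> DefaultSettings:
    if age < 18:
        # Most protective defaults for children: nothing is made accessible
        # to an indefinite number of persons without the user's own
        # intervention (the Article 25(2) GDPR standard cited by the DPC).
        return DefaultSettings(account_public=False,
                               videos_visible_to_everyone=False,
                               suggest_account_to_others=False)
    # Adults also start private in this sketch; they can opt in to public
    # visibility through an explicit, informed choice after registration.
    return DefaultSettings(account_public=False,
                           videos_visible_to_everyone=False,
                           suggest_account_to_others=True)

if __name__ == "__main__":
    print(defaults_for(15))  # all protective settings for a Child User
    print(defaults_for(30))
```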

In addition, privacy notices must clearly identify the categories of recipients of data subjects' personal data. Vague and broad language such as "may" and "third parties" is unclear and fails to communicate the actual recipients of personal data in a plain and transparent manner. The DPC expects privacy notices to use clear and definitive terms to identify recipients, or categories of recipients.

Finally, organisations must ensure that they do not use dark patterns to subconsciously encourage or "nudge" users to choose more privacy-intrusive settings to the detriment of their privacy interests. Data subjects should be encouraged to make decisions that promote the protection of their personal data.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.