Data Protection Commission fines TikTok €345 million over GDPR infringements in processing children’s personal data

Following its inquiry into TikTok Technology Limited’s (“TikTok”) approach to certain aspects of processing child users’ data during a defined period in 2020, the Data Protection Commission (“DPC”) has issued a total fine of €345 million against TikTok. 

The inquiry focused on (i) certain TikTok platform settings, including public-by-default settings and the ‘Family Pairing’ feature; (ii) age verification as part of the registration process; and (iii) TikTok’s transparency obligations, particularly in connection with the extent of information provided to child users in relation to default settings. As a result of the decision, TikTok has been issued: (i) a reprimand; (ii) an order to bring its data processing into compliance within three months of the decision; and (iii) a fine of €345 million.

The DPC’s inquiry was limited to TikTok’s processing of personal data between 31 July 2020 and 31 December 2020, and focused on whether TikTok had complied with certain of its obligations under the GDPR as a controller in respect of children’s data. As the GDPR treats anyone under the age of 18 as a child, and TikTok provides the TikTok platform to those aged 13 and up, the inquiry concerned users of the platform aged between 13 and 17 (“Child Users”). The investigation into TikTok’s processing of Child Users’ data was conducted under three headings: (1) Platform Settings; (2) Age Verification; and (3) Transparency.

The DPC adopted its final decision on 1 September 2023, following a binding decision of the European Data Protection Board (the “EDPB”) under Article 65 of the GDPR (the “Article 65 Decision”), which directed the DPC to make certain changes to the draft decision that the DPC had submitted to other data protection authorities on 13 September 2022.

Platform Settings

Public-by-default - During the registration process for the TikTok platform, users were generally offered the choice to “Go Private” or to ‘skip’ that decision (which in effect amounted to accepting the default setting of a public account). Public accounts were viewable by any registered user and by any user accessing TikTok from a web browser (i.e. any internet user could access public content on the TikTok platform without having a registered TikTok account). In addition to the prompt during the registration process, TikTok noted that, even on the default public setting, on the first attempt to post a video users were prompted again to choose between “Post Now” and “Cancel”. The public-by-default setting also meant that any registered TikTok platform user (adult or child) could comment on or otherwise engage with the video.

The DPC noted that “…by setting accounts to public by default, [TikTok] ensured that the scope of processing social media content of Child Users was potentially very extensive, being made accessible without restriction to an indeterminate global audience”. The DPC further noted that the public-by-default processing “… could lead in the first instance to Child Users losing autonomy and control over their data and, in turn, they could become targets for bad actors, given the public nature of their use of the TikTok platform. This could also lead to a wide range of potentially deleterious activities, including online exploitation or grooming, or further physical, material or non-material damage where a Child User inadvertently or advertently reveals identifying personal data. There is the identified risk of social anxiety, self-esteem issues, bullying or peer pressure in relation to Child Users”.

Family pairing - In addition to the public-by-default settings, the DPC also reviewed certain settings within the ‘Family Pairing’ feature. This feature allowed a non-Child User (who could not be verified as a parent or guardian) to pair their account to a Child User’s account and, among other things, to enable “direct messages” for Child Users above the age of 16 where that setting had not been enabled by the Child User. In contrast to the other aspects of ‘Family Pairing’, this allowed the non-Child User to apply less strict privacy settings than the Child User had chosen.

The DPC made the following findings in connection with the public-by-default and ‘Family Pairing’ platform settings:

  • TikTok implemented default account settings which allowed anyone to view the TikTok platform content of Child Users.  The DPC found that TikTok failed to implement appropriate protections and to limit data processing to that which was necessary for the purpose of the processing.   The DPC noted that, in particular “…this processing was performed to a global extent and in circumstances where [TikTok] did not implement measures to ensure that, by default, the social media content of Child Users was not made accessible (without the user's intervention) to an indefinite number of natural persons”.  The DPC found that this was contrary to the principles of data protection by design and default under Articles 25(1) and 25(2) of the GDPR, and contrary to the data minimisation principle of Article 5(1)(c).
  • The DPC found that the default account settings utilised by TikTok posed potentially severe risks to the rights and freedoms of Child Users. The DPC found that TikTok did not implement appropriate technical and organisational measures to ensure that the processing was performed in accordance with the GDPR, contrary to Article 24(1).
  • In relation to the ‘Family Pairing’ feature, which allowed a Child User to pair their account with a non-Child User’s account, the DPC noted that this feature also allowed the non-Child User to enable direct messages for Child Users above the age of 16 where the Child User had not enabled them. The DPC found that TikTok did not ensure appropriate security of the personal data, and that it failed to implement technical and organisational measures designed to give effect to the integrity and confidentiality principle or to put in place the safeguards necessary to meet the requirements of the GDPR and protect the rights of data subjects, contrary to Article 5(1)(f) and Article 25(1) GDPR.

Age Verification

The DPC noted that TikTok had implemented a number of measures to prevent children under the age of 13 from accessing the TikTok platform, and acknowledged that there is no “perfect age verification method”. However, the DPC emphasised the high level of risk arising as a result of the processing (high both in terms of likelihood and severity), and noted that these risks “would apply equally, if not more severely to children under 13”.

The DPC was critical of the fact that, while TikTok conducted a data protection impact assessment (“DPIA”) in relation to children’s age and age-appropriate design, the DPIA did not identify the risk of children under the age of 13 accessing the TikTok platform. The DPC found that TikTok had not put in place appropriate technical and organisational measures to ensure and demonstrate that the processing complied with the GDPR (contrary to Article 24(1) of the GDPR), as it failed to assess the specific risks of children under the age of 13 gaining access to the TikTok platform.

Transparency

The DPC considered (i) whether Child Users were made sufficiently aware, in a concise, transparent, intelligible and easily accessible form, of the various public and private account settings in accordance with the GDPR, and (ii) whether Child Users were appropriately made aware, as users of the TikTok platform, of the default public account settings in accordance with the GDPR.

TikTok outlined a number of transparency measures that it had in place (e.g., a privacy policy, a summary for users under the age of 18, ‘just in time’ notifications and a ‘Youth Portal’). It also submitted an expert’s report on the intelligibility of the wording used in its privacy notices. However, the DPC’s decision placed particular focus on the lack of transparency around the fact that publicly uploaded content could be accessed by any internet user from a web browser (i.e. without any requirement to be a registered TikTok platform user). The DPC found that TikTok did not provide Child Users with information on the categories of recipients of personal data, contrary to Article 13(1)(e) of the GDPR. The DPC was not persuaded by the expert’s report and also found that TikTok did not provide Child Users with information on the scope and consequences of the public-by-default processing in a concise, transparent and intelligible manner and, as such, did not comply with Article 12(1) of the GDPR.

Fairness

Following an instruction by the EDPB to include a finding of an infringement of the fairness principle, the DPC also found that TikTok had infringed Article 5(1)(a) of the GDPR, for the reasons set out by the EDPB in the Article 65 Decision. In assessing the principle of fairness in this case, the EDPB stressed the importance of (i) the autonomy of data subjects, (ii) the avoidance of deception, (iii) the power balance between the parties, and (iv) the requirement of truthful processing. The EDPB noted that the fairness principle means that data subjects should not be presented with options in such a way that they are ‘nudged’ towards allowing the controller to collect more data than they would have allowed had no ‘nudging’ been applied.

Two practices of TikTok in particular were assessed for fairness: the “Registration Pop-Up” and the “Video Posting Pop-Up”. In both cases, although users were presented with a ‘Skip’ option, selecting it left their accounts set to public by default. The EDPB also noted that the placement of the ‘Skip’ button (on the right hand side of the screen) would lead a majority of people to select it, as users have learned through muscle memory that, on mobile social media apps, the option on the right side of the screen is usually the one that advances to the next step of the process. This ‘nudging’ effect was further amplified, in the context of the “Video Posting Pop-Up”, by the default option to post the video being presented in darker, bolder text. The EDPB also noted that, although reference was made to the possibility of changing preferences, no direct link to those settings was provided.

Calculation of fine

The DPC’s decision contains a detailed description of the various considerations, including mitigating and aggravating factors, taken into account in determining the amount of the fine to be imposed for the infringements found to have occurred. Notably, despite submissions by TikTok to the contrary, the DPC found that some of the infringements were negligent and some were intentional, and these findings operated as aggravating factors.

Next steps for TikTok

In addition to issuing a reprimand and imposing the fine of €345 million, the DPC ordered TikTok to take the following actions in connection with Child Users within three months of the date of the DPC’s decision (noting that TikTok has already made a number of amendments to the relevant platform settings since the period under inquiry, including setting Child Users’ accounts to private by default):

  1. Implement appropriate technical and organisational measures in respect of any ongoing public-by-default processing to ensure that, by default, only personal data which is necessary for each specific purpose of the processing is being processed;
  2. Provide Child Users with information in a clear and transparent form on the purposes of the public-by-default processing; and
  3. Bring processing (in the context of the “Registration Pop-Up” and “Video Posting Pop-Up”) into compliance with the principle of fairness, and eliminate deceptive design patterns.

TikTok has since initiated a statutory appeal against the DPC’s decision in the Circuit Court, as well as a High Court challenge by way of judicial review seeking to set aside the decision.

Also contributed to by Lisa Leonard

This document has been prepared by McCann FitzGerald LLP for general guidance only and should not be regarded as a substitute for professional advice. Such advice should always be taken before acting on any of the matters discussed.