Why Was TikTok Fined?
The Data Protection Commission (DPC), Ireland's data protection regulator, has fined TikTok Technology Limited (TikTok) €345 million for breaching the EU GDPR's principle of fairness and requirements for data protection by design and default when processing data relating to users aged between 13 and 17, and children under 13.
Both the EU GDPR and the UK GDPR (collectively the GDPR) contain several prescriptive requirements organisations are obliged to comply with to protect children's personal data, including ensuring that children are addressed in plain, clear language that they can understand.
The GDPR requires this specific protection for children as they are likely to be less aware of the risks, consequences and their rights in relation to personal data.
As part of ensuring personal data is sufficiently protected, the GDPR requires appropriate technical and organisational measures to be adopted by organisations in every aspect of their processing activities.
The purpose of these organisational measures should be to implement the GDPR's principles effectively and safeguard individual data protection rights.
This requirement is known as "data protection by design and default".
The DPC found a number of data protection breaches by TikTok, including a failure to implement data protection by design and default: child users' accounts were set to public by default, meaning that anyone could see the content of a child's account and comment on it.
The DPC's action follows the Information Commissioner's Office's £12.7 million fine of TikTok in April 2023 for several breaches of data protection law, including failing to process children's personal data lawfully.
The use of online services accessed by children has recently been under wider public scrutiny.
Meta announced its plans to implement end-to-end encryption on its Instagram and Facebook Messenger services, raising concerns that doing so will remove the ability to detect child abuse on the platforms.
The Home Secretary has urged Meta not to implement end-to-end encryption in the absence of more robust safety measures to ensure that children are safeguarded from sexual assault and exploitation facilitated via messaging services.
Whether implementing such encryption tools will also invite scrutiny from data protection regulators for potential breaches of data protection law remains to be seen.
What is Data Protection by Design and Default?
Data protection by design and default is a key element of the GDPR's focus on accountability, i.e., that organisations utilising personal data can demonstrate that they are complying with their obligations under the GDPR.
Data protection by design means that an organisation considers privacy and data protection issues at the design phase of any system, software, service, product, or process, and continues to do so throughout its lifecycle.
Data protection by default requires an organisation to ensure it only processes the personal data necessary to achieve the specific purpose.
This reflects the GDPR's core principles of data minimisation and purpose limitation.
What Should Organisations Do Next?
Organisations should adopt a proactive approach and implement measures that aid compliance with the GDPR to achieve data protection by design and default.
Measures may include:
- Designing any system, software, service, product or other business practice to protect personal data automatically.
- Minimising the processing of personal data.
- Pseudonymising personal data wherever possible.
- Appointing individuals internally to monitor the organisation's data processing activities and enforce internal practices that aid compliance with the GDPR.
Organisations that offer online products and services which are likely to be accessed by children in the UK must also comply with the Age Appropriate Design Code (the Children's Code).
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.