ARTICLE
10 January 2025

The Effects of Deepfake Technology on Personal Data and the PDPL's Protection Mechanisms

Kilinc Law & Consulting

A. Introduction

In recent years, the rapid development of artificial intelligence technologies has brought many innovations into our lives. Deepfake technology, one of these innovations, uses AI-supported algorithms to produce realistic but fabricated images and videos by manipulating the faces or voices of individuals.

Deepfake technology carries numerous risks, including the manipulation of personal data, the dissemination of false information and violations of privacy. Alongside its creative uses, the technology also gives rise to various legal and ethical problems, and its use has generated considerable debate, particularly in the context of personal data protection law. This article analyses the legal dimensions of deepfake technology with respect to personal data.

B. Usage Methods of Deepfake Technology

'Deepfake' is a relatively new term formed by combining the words 'deep learning' and 'fake'. The technology enables the digital manipulation of individuals' images and voices using artificial neural networks and machine learning techniques. Deepfakes can be used in creative fields such as cinema, education and media, but also in acts of identity theft, fraud and forgery, by manipulating a person's visual or audio data to make it appear that they are doing things they have not actually done.

Deepfake technology can likewise be used to manipulate the public or spread false information, and the use of a person's visual or audio data without their permission may give rise to privacy violations. Malicious use of deepfakes can damage the reputation of targeted individuals and cause serious psychological harm to them and to those around them; in this respect, the psychological and social effects of deepfake technology cannot be denied.

C. Evaluation of Deepfake Technology within the Framework of Personal Data Protection Law

The Law No. 6698 on the Protection of Personal Data ("PDPL") is the legal regulation that ensures personal data are processed in accordance with the law and aims to protect individuals' rights over their data. Under the PDPL, personal data means any information relating to an identified or identifiable natural person. Elements such as an individual's face, voice and biometric data fall within this scope. Therefore, deepfake products created using an individual's face or voice are considered personal data.

Pursuant to Article 5 of the PDPL, personal data cannot be processed without the explicit consent of the data subject. Where deepfake technology involves the use of biometric data such as a person's face or voice without their consent, this constitutes unlawful data processing. In such a case, the data subject may exercise the rights listed in Article 11 of the PDPL, including the right to (i) learn whether his/her data is being processed, (ii) learn whether the data is used in accordance with its purpose, (iii) request the correction or deletion of inaccurate or manipulated data, and (iv) claim compensation for damages arising from unlawful data processing.

Where the data subject exercises one of these rights, the data controller is obliged to fulfil the request without delay. Data controllers are also obliged to ensure the security of personal data and to prevent unlawful access pursuant to Article 12 of the PDPL. In this context, where deepfake technology is involved, data controllers must take effective measures to prevent the dissemination of manipulated data.

Under the PDPL, data controllers are obliged to ensure that personal data are processed in accordance with the law. Individuals and institutions that develop or use deepfake technology qualify as data controllers and must fulfil, without exception, their obligations arising from the legislation on the protection of personal data. Otherwise, they face the risk of administrative and criminal sanctions.

Although the PDPL provides a fundamental framework for the protection of personal data, it is questionable whether the existing regulations are adequate to deal with the new threats posed by deepfake technology. Indeed, the PDPL contains no specific provisions on manipulated data or on threats arising from artificial intelligence. For new technologies such as deepfake, questions such as 'How can manipulated visual and audio data be deleted at its source?' and 'Can the producers of deepfakes be held legally liable?' remain open.

The PDPL obliges data controllers to take technical and administrative measures; however, it may fall short of providing an effective monitoring and prevention mechanism against sophisticated manipulations such as deepfakes. More detailed regulations, such as the GDPR, provide additional protection mechanisms against deepfake technology. It is therefore important that the PDPL be updated in this area and brought closer to international standards. In concrete cases, in addition to the sanctions stipulated under the PDPL, the offences of 'recording of personal data' under Article 135 and 'unlawful delivery or acquisition of data' under Article 136 of the Turkish Criminal Code come to the fore, affording individuals both protection and a means of punishing offenders.

D. Conclusion

Deepfake technology has emerged as an innovation with profound effects on both individuals and social structures. It poses multifaceted threats to personal data protection, privacy, reputation and information security. In Turkey, the PDPL provides a framework for the protection of personal data; however, given the complexity and rapidly evolving nature of deepfake technology, the existing legal regulations are not fully sufficient in this area.

The protection mechanisms introduced under the PDPL constitute an important basis, especially in terms of the obligations of data controllers and the rights of data subjects. However, legal gaps, technical infrastructure deficiencies and the need for international harmonisation create obstacles to the effective prevention of deepfake-related risks. In order to minimise the negative effects of deepfake technology on personal data and privacy, various measures need to be taken at both legal and technical levels.

(i) Revision of Legal Regulations:

  • Special regulations should be introduced regarding deepfake technology. Specific laws can be created to prevent the unauthorised use of manipulated images and sounds.
  • In addition to the PDPL, more detailed legislation on artificial intelligence technologies should be implemented.

(ii) Development of Technological Tools:

  • Algorithms capable of detecting and flagging deepfakes should be developed, and these technologies should be integrated with the legal framework.
  • Data controllers should establish systems that can quickly detect and remove manipulated content (an illustrative sketch of such a screening workflow follows this list).

(iii) Raising Awareness:

  • Individuals and organisations should be informed about the potential risks of deepfake technology.
  • Data subjects should be made aware of how to exercise their rights and obtain legal assistance.

(iv) International Cooperation:

  • Turkey's adoption of regulations in line with international standards such as the GDPR would increase the effectiveness of efforts to combat the misuse of deepfake technology.
  • An international mechanism should be established to monitor and remove cross-border deepfake content.

(v) Legal and Technical Cooperation:

  • Lawyers, software developers and ethics experts should collaborate on the legal and technical aspects of deepfake.
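
To make recommendation (ii) above more concrete, the short Python sketch below illustrates, in purely schematic form, how a data controller's screening workflow might score an uploaded media item, quarantine suspected manipulations and keep an auditable record of the decision. The code is an assumption-laden illustration rather than a description of any real product or of the PDPL's requirements: the function score_deepfake_probability is a placeholder standing in for an actual detection model, and the threshold value and all names are invented for the example.

from dataclasses import dataclass
from datetime import datetime, timezone

# Threshold above which uploaded media is treated as likely manipulated.
# The value is illustrative; a real deployment would calibrate it on test data.
DEEPFAKE_THRESHOLD = 0.8


@dataclass
class MediaItem:
    media_id: str
    uploader: str
    path: str


def score_deepfake_probability(item: MediaItem) -> float:
    """Hypothetical stand-in for a real deepfake-detection model.

    In practice this is where a trained classifier (for example, one analysing
    facial artefacts or audio inconsistencies) would be invoked; here it simply
    returns a fixed placeholder value so the workflow is runnable.
    """
    return 0.9  # placeholder score, not a real prediction


def screen_upload(item: MediaItem) -> dict:
    """Screen one uploaded media item and record the decision.

    Flagged items are quarantined for removal and the decision is logged,
    reflecting the kind of technical and administrative measure a data
    controller might adopt to document how manipulated content is handled.
    """
    score = score_deepfake_probability(item)
    decision = "quarantine_for_removal" if score >= DEEPFAKE_THRESHOLD else "allow"
    audit_record = {
        "media_id": item.media_id,
        "uploader": item.uploader,
        "score": score,
        "decision": decision,
        "checked_at": datetime.now(timezone.utc).isoformat(),
    }
    return audit_record


if __name__ == "__main__":
    sample = MediaItem(media_id="vid-001", uploader="user-42", path="uploads/vid-001.mp4")
    print(screen_upload(sample))

The sketch deliberately separates detection (the placeholder scoring function) from the decision and audit trail, since it is the latter records that would help a data controller demonstrate compliance with its security obligations.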

Deepfake technology requires a comprehensive response in both its legal and technical dimensions. Although the existing protection mechanisms of the PDPL provide an important basis for addressing these new threats, they should be supported by current technological and legal developments. Effective protection requires acting with a shared awareness across the legal, technical and social dimensions.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
