1 May 2025

Recent Developments In Türkiye Regarding The Protection Of Children In Digital Environments

Paksoy

Contributor

Paksoy is an Istanbul-based independent Turkish law firm with over 120 employees, offering legal advice and counselling to foreign investors and the Turkish business community. We provide a wide range of services to meet the needs of local and international businesses in almost every field, including corporate law, capital markets, mergers and acquisitions, competition law, banking and finance, tax, real estate and project development, project finance, energy and infrastructure, litigation and arbitration.

Since the beginning of 2026, notable legislative and policy developments have taken place in Türkiye concerning the protection of children in digital environments, reflecting an accelerated effort to establish a comprehensive framework to protect children and young people from online risks, strengthen data protection standards and ensure the safe and age-appropriate use of digital platforms.

On 1 May 2026, the Law on Amendments to the Social Services Law and Certain Other Laws (“Law”) was published in the Official Gazette, introducing substantial changes to the Internet Law No. 5651 (“Internet Law”) aimed at enhancing the protection of minors online. This legislative development complements the government’s Action Plan 2026–2030 for the protection of children in digital environments and signals a broader regulatory shift towards a more structured and child-focused approach.

Overview of the amendments to the Internet Law on the protection of minors in digital environments

The Law introduces wide-ranging changes to the Internet Law aimed at strengthening the protection of minors in digital environments. It focuses on age restrictions and age assurance for social network providers, introduces new concepts for the gaming ecosystem, expands compliance duties for platforms, and tightens oversight and sanctions.

The Law entered into force upon publication in the Official Gazette, save for the provisions relating to social network providers and gaming actors, which will take effect six months from the date of publication.

Age restrictions and verification. Under the Law, social network providers are no longer permitted to offer services to children under 15 and are required to implement technical measures, including age verification, to ensure compliance. For users aged 15 to 18, platforms are required to offer a differentiated experience designed for minors, separate from adult services, and publicly explain the measures they have put in place.

Obligations for gaming actors. The Law introduces, for the first time, definitions of “game”, “gaming distributor”, “game developer” and “gaming platform”, clarifying the roles of entities that develop games and those that distribute or make them available to users.

Gaming platforms may not offer games that have not been duly classified by age criteria and must remove non-compliant content, without prejudice to their responsibilities as content or hosting providers.

Foreign gaming platforms exceeding a daily access threshold of 100,000 users from Türkiye are required to appoint a local representative, notify the Turkish Information and Communication Technologies Authority (“ICTA”) and publish the representative’s contact details.

Parental control measures. Both social network providers and gaming platforms are now required to implement effective parental control tools under the Law. These tools are intended to enable parents to manage account settings, control or approve purchases and subscriptions, and monitor or limit usage time.

Supervision and information requests. The Law strengthens supervision and introduces enhanced information request mechanisms. In-scope gaming platforms are required to provide ICTA, upon request, with information and documentation relating to their corporate structure, IT systems and data processing mechanisms, without delay and in any event within fifteen days.

Sanctions. Enforcement is strengthened through administrative fines and measures, with different regimes applicable to social network providers and gaming platforms that do not meet their obligations.

Under the Law, social network providers and gaming platforms may face administrative fines for non-compliance (including failure to follow ICTA requests or orders, implement parental controls and age verification, provide information to ICTA, prevent misleading advertising, or comply with ICTA-imposed measures such as modifying or restricting services).

Gaming platforms are subject to a graduated administrative fine regime, whereby failure to comply following an initial notification may result in administrative fines ranging from TRY 1 million to TRY 10 million (approx. EUR 18,800 to EUR 188,000), and, upon continued non-compliance, from TRY 10 million to TRY 30 million (approx. EUR 188,000 to EUR 564,000). In addition, if non-compliance persists, the ICTA may apply to the criminal court of peace for bandwidth restriction measures which may be imposed at a rate of 30% and subsequently up to 50%.

For social network providers, non-compliance may trigger a separate enforcement mechanism, including a ban on receiving advertisements from Türkiye, followed by bandwidth restriction measures of up to 50% and, ultimately, up to 90% upon continued non-compliance.

Additional platform obligations and urgent content removals. Separately from child-specific provisions, the Law introduces additional general duties for social network providers, including measures to prevent deceptive advertising practices. Platforms with more than 10,000,000 daily connections from Türkiye are also required to implement urgent content removal decisions immediately, and in any event within one hour, and to take the necessary measures to prevent the publication of the relevant content on their websites.

Action Plan and DPA investigations

Prior to the enactment of the Law, the Turkish Ministry of Family and Social Services had published the Action Plan for Empowering Children in the Digital World (2026–2030). The Action Plan adopts a strategic approach focused on awareness raising, prevention, intervention tools, and strengthening legal and institutional frameworks. It places particular emphasis on harmful content, protection of children’s personal data and risks such as cyberbullying and online exploitation. It also signals potential future amendments to the existing legal framework and highlights the role of the Turkish Data Protection Authority (“DPA”) in a broader review of rules affecting children’s personal data.

In parallel, the DPA has initiated ex officio investigations into major social media platforms, such as TikTok, Instagram, Facebook, YouTube, X and Discord, to assess how children’s personal data is processed and what safeguards are implemented in practice.

Taken together, these steps indicate a notable and accelerating regulatory shift in Türkiye. Social network providers, digital content platforms and gaming companies operating in or targeting Türkiye should monitor developments closely and reassess their compliance position, particularly around age assurance, parental controls, content classification and the processing of children’s personal data.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

