ARTICLE
30 July 2025

Hong Kong Privacy Commissioner For Personal Data Completes Compliance Checks On The Use Of AI And Data Privacy

Mayer Brown

Artificial intelligence ("AI") has rapidly transitioned from experimental use to widespread adoption across Hong Kong. Organisations are now leveraging AI models to enhance customer service, improve risk management, and expedite research and development activities.
Hong Kong Privacy

Introduction

Artificial intelligence ("AI") has rapidly transitioned from experimental use to widespread adoption across Hong Kong. Organisations are now leveraging AI models to enhance customer service, improve risk management, and expedite research and development activities. Against this backdrop, the Office of the Privacy Commissioner for Personal Data ("PCPD") carried out a round of compliance checks in February 2025, which covered 60 local organisations from various sectors. The review offers first-hand insight into the state of AI governance, data protection, and AI risk management in Hong Kong. In this article we discuss the key findings, regulatory expectations, and practical implications for organisations deploying AI in Hong Kong.

Background and Scope of the Compliance Checks

Following the previous round of compliance checks, conducted between August 2023 and February 2024 and covering 28 local organisations, the PCPD undertook a new round of compliance checks in 2025 covering 60 organisations from sectors including telecommunications, banking and finance, insurance, beauty services, retail, transportation, education, medical services, public utilities, social services and government departments. The purpose of the exercise was two-fold: first, to assess organisations' compliance with the Personal Data (Privacy) Ordinance ("PDPO") when collecting, using and/or processing personal data with the aid of AI tools; and second, to examine organisations' implementation of the PCPD's "Artificial Intelligence: Model Personal Data Protection Framework" ("Model Framework") (see our previous Legal Update on the Model Framework).

Key Findings

Use of AI and related data processing practices

Of the 60 organisations reviewed, 48 (80%) used AI in their day-to-day operations, a 5% increase on last year's results. Notably, 42 of these 48 organisations had been using AI for over a year, and more than half (26) used three or more AI systems. The most common use cases included customer service, marketing, administrative support, compliance and risk management, and research and development.

Half (50%) of the organisations that used AI in their day-to-day operations collected and/or used personal data through AI systems. These entities had formulated Privacy Policy Statements and Personal Information Collection Statements specifying the purposes for which the personal data would be used and the potential data transferees. Approximately 29% of these organisations provided Privacy Policy Statements that also covered the application of AI.

Of these, the majority retained the personal data collected through AI systems and specified retention periods for that data.

Data Security and Privacy Measures

All organisations handling personal data via AI systems implemented appropriate data security measures. These measures included access controls, penetration testing, data encryption, and data anonymisation. A subset (29%) went further by activating AI-specific security alerts and conducting red teaming drills.

As regards data minimisation, 67% of the organisations that collected and/or used personal data through AI systems relied on anonymised or pseudonymised data in their AI systems, and 29% implemented advanced privacy-enhancing technologies such as synthetic data and federated learning.

AI Governance, Risk Assessment and Incident Response

Among the 24 organisations collecting and/or using personal data through AI systems, 79% had established AI governance structures, such as AI governance committees and/or designated responsible personnel, to oversee the use of AI in the organisation. Furthermore, approximately 46% conducted internal audits and/or independent assessments regularly to ensure compliance with the organisation's AI strategies and/or policies.

96% of the organisations which collected and/or used personal data through AI systems conducted pre-implementation testing to ensure the reliability, robustness and fairness of the AI systems, and around 83% performed privacy impact assessments before implementation. All organisations conducted risk assessments in the procurement, use and management of AI systems, considering factors such as data security, legal requirements, data volume, quality and sensitivity, the potential impact of the AI systems, and mitigating measures.

92% of the organisations had formulated data breach response plans, with around one third of them specifically addressing AI-related incidents.

Regulatory Expectations and Recommendations

The PCPD confirmed in its report that no contravention of the PDPO was found during the 2025 compliance checks. The PCPD also set out its expectations and recommended best practices for organisations to follow when adopting AI tools:

  • Continuous Monitoring: Regularly monitor and review AI systems and adopt measures to ensure compliance with the PDPO requirements.
  • AI Strategy and Governance: Establish clear AI strategies, AI governance structures, and provide appropriate employee training.
  • Comprehensive Risk Assessment: Identify, analyse and evaluate risks, and tailor risk management measures for each AI system's risk profile.
  • Incident Response: Prepare AI-specific response plans to address and mitigate potential risks arising from AI system failures or other data breaches.
  • Regular Internal Audits and Independent Assessments: Conduct regular internal audits and independent assessments to ensure system security, data security, and compliance with the organisation's data and AI policies.
  • Transparency and Engagement: Communicate and engage with stakeholders and respond to stakeholders' feedback.

Apart from the Model Framework, organisations should also refer to the PCPD's "Checklist on Guidelines for the Use of Generative AI by Employees", which provides guidance on developing internal policies or guidelines for employees' use of generative AI at work.

Conclusion

The findings from the PCPD compliance checks highlight the growing integration of AI across diverse sectors in Hong Kong and the critical importance of robust data protection and governance practices. As AI technologies and regulatory expectations evolve, regular assessments and a commitment to best practices will be essential to maintaining public trust and supporting the responsible development and deployment of AI in Hong Kong.

Visit us at mayerbrown.com

Mayer Brown is a global services provider comprising associated legal practices that are separate entities, including Mayer Brown LLP (Illinois, USA), Mayer Brown International LLP (England & Wales), Mayer Brown (a Hong Kong partnership) and Tauil & Chequer Advogados (a Brazilian law partnership) and non-legal service providers, which provide consultancy services (collectively, the "Mayer Brown Practices"). The Mayer Brown Practices are established in various jurisdictions and may be a legal person or a partnership. PK Wong & Nair LLC ("PKWN") is the constituent Singapore law practice of our licensed joint law venture in Singapore, Mayer Brown PK Wong & Nair Pte. Ltd. Details of the individual Mayer Brown Practices and PKWN can be found in the Legal Notices section of our website. "Mayer Brown" and the Mayer Brown logo are the trademarks of Mayer Brown.

© Copyright 2025. The Mayer Brown Practices. All rights reserved.

This Mayer Brown article provides information and comments on legal issues and developments of interest. The foregoing is not a comprehensive treatment of the subject matter covered and is not intended to provide legal advice. Readers should seek specific legal advice before taking any action with respect to the matters discussed herein.

