In June of last year, the federal government introduced Bill C-27, the Digital Charter Implementation Act. Bill C-27 proposes to reform federal private-sector privacy law and introduce new legislation to regulate the design, development, and use of artificial intelligence (AI) systems. Bill C-27 passed second reading in the House of Commons and is currently being considered by the Standing Committee on Industry and Technology (Committee).

Earlier this week, the Minister of Innovation, Science and Industry submitted a summary of the government's planned amendments to Bill C-27 to the Committee as it advances its study of the Bill. The Minister highlighted three changes to the Consumer Privacy Protection Act (CPPA) and proposed more extensive changes to the Artificial Intelligence and Data Act (AIDA) based on feedback received from stakeholders and parliamentarians.

Stakeholder Feedback

Various stakeholders, including non-profit policy institutes, academic researchers, and the Privacy Commissioner of Canada (Commissioner), have shared their submissions on Bill C-27 with the Committee. Prior to this, feedback was provided by Members of Parliament during Bill C-27's second reading. The Minister notes that the government's planned amendments are informed by this feedback. While the amendments do not encompass all suggested revisions, they do address some key themes reiterated by the Commissioner and other stakeholders with respect to the CPPA: privacy is a fundamental right, the rights of minors must be protected, and enhanced enforcement mechanisms are needed. Similarly, the planned amendments to AIDA reflect stakeholder comments relating to alignment with other proposed regulatory regimes and the need to appropriately distribute the regulatory burden across the AI value chain.

Amendments to the CPPA

Privacy as a Fundamental Right

Many commentators have noted that the CPPA does not explicitly recognize privacy as a fundamental right. The government plans to add such an acknowledgement to the preamble of the CPPA.

Rights of Minors

The CPPA currently deems the personal information of minors to be sensitive personal information and provides additional protections for such information. The government's planned amendments would go further by requiring organizations to consider the special interests of minors when determining whether personal information is being collected, used, or disclosed for an appropriate purpose.

Enforcement Mechanisms

The proposed CPPA has stronger enforcement mechanisms than the existing Personal Information Protection and Electronic Documents Act (PIPEDA). In its current form, the CPPA would give the Commissioner order-making power and the authority to recommend that a monetary penalty be imposed. It would also allow the Commissioner to enter into compliance agreements with organizations it believes have contravened the CPPA. However, only the Personal Information and Data Protection Tribunal (another body that would be established under Bill C-27) would have the authority to impose a monetary penalty.

The government's planned amendments would give the Commissioner additional flexibility by permitting compliance agreements to include financial considerations.

Amendments to AIDA

The government's planned changes to AIDA are more extensive than the planned changes to the CPPA, perhaps in part because AIDA, as proposed, leaves many key details to future regulations. The planned changes attempt to bring greater clarity to AIDA.

Defining High-Impact Systems

If passed, AIDA will regulate the design, development, and use of AI systems in the private sector, with a focus on mitigating the risks associated with "high-impact" AI systems. Much of AIDA's substantive content is set to be established by future regulations, including the definition of a "high-impact" AI system. However, the government's planned amendments would set out an initial list of specific classes of high-impact AI systems, namely systems used for:

  • Employment-related determinations, such as hiring and remuneration.
  • Determining whether to extend services to an individual, determining the costs and types of such services, and prioritizing the provision of such services.
  • Processing biometric data for identification purposes or for determining an individual's behaviour or state of mind.
  • Content moderation on "online communications platforms", including search engines and social media, and the "prioritization of the presentation" of such content (i.e., content recommendation).
  • Healthcare and emergency services.
  • Decision-making by courts and administrative bodies.
  • The exercise and performance of law enforcement powers.

The government could subsequently expand this list.

Alignment with EU and OECD Standards

The government plans to amend AIDA to align it with evolving international standards, including those developed in the European Union (EU) and by the Organisation for Economic Co-operation and Development (OECD). For example, the government plans to amend the definition of an AI system to align with the OECD definition. Other planned amendments would impose more stringent requirements on high-impact systems at various stages of the system lifecycle, including pre-commercialization, which would bring AIDA closer to the EU's proposed Artificial Intelligence Act.

Obligations across the "AI Value Chain"

Consistent with the Minister's earlier statements in the AIDA Companion Document, the government's planned amendments would differentiate between obligations that apply to developers of machine-learning models that are intended for high-impact use, developers of high-impact systems, persons who make high-impact systems available for use, and persons who manage the operations of high-impact systems.

Addressing General-Purpose AI Systems

The planned amendments would specifically regulate general-purpose AI systems, such as ChatGPT and similar chatbots, which may not be categorized as "high-impact" but are nonetheless widely used in a variety of contexts. Developers of general-purpose AI systems would be required to perform certain risk assessments and mitigation testing during pre-market development. Once the system is on the market, the developer would need to make available a plain-language description of the system's capabilities and limitations, and continuously monitor for harms and risks.

Notably, anyone who manages a general-purpose AI system would need to ensure that individuals can identify AI-generated content. For example, art that is created using AI would need to be identified as such. This layer of transparency may increase consumers' trust in AI systems and allow them to appropriately contextualize and evaluate content.

Looking Forward

The Minister's submission notes that the government remains open to collaborating on other issues relating to Bill C-27, hinting that this may not be the last of the changes made to the CPPA and AIDA.

Our Privacy and Cybersecurity group will continue to monitor and provide updates on the CPPA and AIDA as the Committee's study of Bill C-27 progresses.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.