As artificial intelligence ("AI") continues to transform industries across the world, businesses in Canada are increasingly grappling with AI privacy issues and the dispute risks that accompany them. The use of AI tools can raise a range of privacy-related issues, particularly regarding the collection, use, and disclosure of personal information. This blog post provides a roadmap of the key Canadian privacy laws that may apply to the use of AI tools and highlights the need to mitigate the privacy dispute risks that may arise as a result.
Given that we focus on current privacy dispute risks, this blog post does not address recent legislative proposals that would impact how AI is regulated in Canada (e.g., Canada's Bill C-27, the Artificial Intelligence and Data Act). It also does not address foreign laws or proposals that may impact Canadian businesses operating abroad (e.g., the European Union's Artificial Intelligence Act). See our other blog posts here and here for more information on these topics.
The Canadian Privacy Law Landscape
The federal Personal Information Protection and Electronic Documents Act ("PIPEDA") governs the collection, use, and disclosure of personal information in the course of commercial activities across Canada. Among other things, PIPEDA generally requires organizations to obtain meaningful consent before collecting, using, or disclosing personal information, and to maintain security safeguards to protect personal information in their possession or custody. Although PIPEDA does not expressly address AI tools, PIPEDA's requirements may apply to the collection, use, or disclosure of personal information in connection with AI tools.
PIPEDA applies across Canada, except in three provinces that have enacted their own statutes deemed "substantially similar" to PIPEDA: British Columbia, Alberta, and Québec. Unlike PIPEDA, these provincial statutes are not limited to commercial activities and may apply to any collection, use, or disclosure of personal information in the province. Like PIPEDA, they may apply to the collection, use, or disclosure of personal information in connection with AI tools.
Privacy law in Québec recently saw a major shift with the enactment of Law 25, which now requires organizations in Québec to take a proactive approach to privacy and data management. For example, organizations in Québec must conduct privacy impact assessments before engaging in certain activities, which can include the use of AI tools, and new transparency requirements apply to automated decision-making that relies on personal information. Our blog series on Law 25 covers these topics in more detail.
AI Privacy Dispute Risks
AI tools are often trained on a wide variety of data, which may include personal information. These tools may engage privacy laws when that personal information is collected, used, and/or disclosed. Two collection methods are noteworthy from a privacy law perspective. First, AI developers often collect training data from the internet. Second, generative AI tools often use end-user inputs as further training data. Both collection methods can involve personal information and therefore engage privacy laws.
AI tools can create dispute risks at many points in the data use cycle, from consent and collection to storage and disclosure. Businesses relying on AI tools for commercial use, whether for internal operations or customer-facing applications, should be aware of these risks and take proactive steps to mitigate them and to ensure compliance with all applicable legal requirements. While general requirements may include obtaining consent and maintaining security safeguards, the applicable requirements will always depend on the circumstances and the jurisdiction.
Privacy dispute risks relating to AI tools are already materializing: a number of lawsuits, including proposed class actions, have been filed in Canada and elsewhere challenging the use of personal information to train AI tools. These lawsuits raise a range of causes of action, including alleged breaches of privacy laws, invasion of privacy, intrusion upon seclusion, and unjust enrichment. It remains to be seen how Canadian courts will approach these claims.
Privacy dispute risks can also arise when using AI tools licensed from third parties. PIPEDA generally provides that personal information transferred to a third party for processing must receive a comparable level of protection through contractual or other means. Businesses should therefore take steps to understand the privacy policies and practices of potential third-party data processors and mitigate any resulting AI privacy dispute risks.
AI tools can also raise dispute risks related to AI-assisted decision-making based on personal information. For example, while Canadian privacy laws do not generally prohibit AI-assisted decision-making tools, care must be taken to ensure that these tools do not produce discriminatory outcomes for affected individuals. Otherwise, disputes can arise over AI-assisted decisions affecting those individuals.
AI Privacy Disputes in the Regulatory Context
AI privacy disputes can also arise in the regulatory context. PIPEDA and the substantially similar provincial statutes are enforced by privacy commissioners appointed under each statute. These commissioners have the power to investigate both individual and commissioner-initiated complaints and, in some jurisdictions, to impose penalties and/or bring court proceedings.
AI tools are top of mind for Canadian privacy commissioners. The federal, British Columbia, Alberta, and Québec commissioners recently launched a joint investigation into the compliance of OpenAI, the company behind ChatGPT. Privacy commissioners across Canada have also published joint guidance setting out principles for the privacy-protective use of generative AI. Compliance with these and other standards is important for businesses using AI tools.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.