ARTICLE
17 March 2025

Legal Considerations In Canada Related To Deepfake Videos And Images

McMillan LLP


The proliferation of AI generated videos and images, or "deepfakes," and consideration of their legal implications have been drawing increased attention in the public sphere.

In mid-February 2025, Scarlett Johansson spoke out publicly about her experience as a victim of deepfake technology. A deepfake video that appeared to show Ms. Johansson and a number of other celebrities, including Adam Levine and Mila Kunis, condemning antisemitism went viral on the internet.1 Ms. Johansson has since urged U.S. elected officials to pass legislation protecting individuals against such uses of artificial intelligence.2 Another celebrity victim, Taylor Swift, was reported to be considering legal action after explicit deepfake images of her circulated on social media last year.3

In our recent bulletin, Legal Considerations in Canada related to "Voice Cloning",4 we discussed potential causes of action that one may have in Canada should they become a victim of voice cloning. The legal recourses available to voice cloning victims are also applicable in the deepfake context.5

Causes of Action Common to both Voice Cloning and Deepfakes

1. Violation of Privacy: Use of a "likeness"

In our previous bulletin, we discussed how various provincial statutory privacy claims could arise if someone's voice were used without their authorization.6 Deepfakes would be just as likely to give rise to such statutory claims, given that the relevant provincial statutes7 were enacted to protect individuals against the unauthorized use of their "likeness". For example, The Privacy Act of Manitoba provides:

2(1) A person who substantially, unreasonably, and without claim of right, violates the privacy of another person, commits a tort against that other person.

3 Without limiting the generality of section 2, privacy of a person may be violated

[...]

(c) by the unauthorized use of the name or likeness or voice of that person for the purposes of advertising or promoting the sale of, or any other trading in, any property or services, or for any other purposes of gain to the user if, in the course of the use, that person is identified or identifiable and the user intended to exploit the name or likeness or voice of that person;8

[emphasis added]

Arguably, such statutory language would be sufficiently broad to capture wrongful acts related to deepfakes.

2. Appropriation of Personality

The tort of appropriation of personality is committed when one's personality is exploited for a "commercial purpose".9 This tort is not new. For example, in Athans v. Canadian Adventure Camps Ltd. et al.,10 George Athans Jr., a well-known professional water-skier, successfully claimed the tort against Canadian Adventure Camps Ltd. and its public relations firm (collectively, "CAC"). CAC had used a drawing derived from a photograph of Mr. Athans in a brochure advertising a camp that offered water-skiing as one of its activities. CAC was found liable for commercially benefiting from Mr. Athans' personality. As it is foreseeable that bad actors may commercially benefit from one's personality through the use of deepfakes, such misuse of deepfake technology may likewise be subject to claims of appropriation of personality.

In some provinces, privacy statutes specifically address the commercial exploitation of one's image. For example, in the province of British Columbia, there is a statutory tort for the unauthorized use of the name or portrait of another for the purpose of advertising:

3(2) It is a tort, actionable without proof of damage, for a person to use the name or portrait of another for the purpose of advertising or promoting the sale of, or other trading in, property or services, unless that other, or a person entitled to consent on the other's behalf, consents to the use for that purpose.11

[emphasis added]

In British Columbia, however, there is an open question as to whether appropriation of personality should be recognized at common law, owing to an ongoing debate about whether B.C. courts should recognize breach of privacy claims (of which appropriation of personality is a subcategory) at common law. In the recent decision of Bao v. Welltrend United Consulting Inc.,12 the British Columbia Court of Appeal declined to uphold the lower court's finding of appropriation of personality,13 but left open the possibility that the tort could be recognized in the future:

Although recently this Court in Tucci v. Peoples Trust Company, 2020 BCCA 246 at paras. 55, 64–68, suggested that it may be time to reassess whether B.C. needs a common law tort of privacy to address personal data breaches, that is a problem distinct from appropriation of someone's personality for commercial gain—a cause of action arguably covered by section 3 of the Privacy Act. In any event, the question of whether the statutory cause of action for breach of privacy in B.C. precludes recognition of a common law tort is a challenging one: Situmorang v. Google, LLC, 2024 BCCA 9 at para. 88. As Justice Horsman observed in that case, its resolution "would at least require an analysis of whether the Privacy Act evidences a legislative intent to create a comprehensive and exclusive code": at para. 88. Not surprisingly, given the absence of any submissions, the judge did not undertake this analysis. In my view, it would not be appropriate to address the issue on appeal in these circumstances.

[emphasis added]

Nevertheless, whether under statute or common law, deepfake victims in Canada will likely have a legal recourse when their personality is exploited for commercial purposes.

3. False Light

In Canada, the tort of false light requires that:

  1. the false light in which the person was placed would be highly offensive to a reasonable person; and
  2. the individual had knowledge of or acted in reckless disregard as to the falsity of the publicized matter and the false light in which the other would be placed.14

The applicability of the tort of false light to deepfakes is similar to its applicability to voice cloning.15 However, false light remains a new tort and, as of the date of this bulletin, is only fully recognized in the province of Ontario. The latest treatment of the tort in the nine common law provinces of Canada is summarized as follows:

Province                     Recognition of the Tort of False Light
Newfoundland and Labrador    Yet to be considered
Prince Edward Island         Yet to be considered
Nova Scotia                  Rejected16
New Brunswick                Yet to be considered
Ontario                      Recognized17
Manitoba                     Not recognized18
Saskatchewan                 Yet to be considered
Alberta                      Not recognized19
British Columbia             Unsettled20

Of note is the recent decision of Sampson v. TD Insurance Meloche Monnex.21 There, the Nova Scotia Supreme Court explicitly stated that the tort of false light should not be recognized because it duplicates other causes of action, namely defamation.

While the tort of false light may not be a cause of action that is available across Canada, it could be a potential recourse for deepfake victims as more jurisdictions consider its applicability.

Conclusion and Further Reading

Similar causes of action, namely violation of privacy, appropriation of personality and false light, are likely applicable in both the voice cloning and deepfake contexts. We anticipate that Canadian case law will continue to develop in this area, given the proliferation of deepfakes and voice cloning in recent years. We will continue to monitor further developments in these newer and developing causes of action.

McMillan LLP has extensively explored the various legal implications of deepfake technology. For further reading, please see our prior publications on deepfake technology.

Footnotes

1 Elizabeth Wagmeister, "Scarlett Johansson calls for AI laws after fake video of celebrities condemning Kanye West's antisemitism goes viral", CNN, (13 February 2025).

2 Ibid.

3 Imran Rahman-Jones, "Taylor Swift deepfakes spark calls in Congress for new legislation", BBC, (27 January 2024).

4 Pablo Tseng, Carina Chiu & Aki Kamoshida, "Legal Considerations in Canada related to 'Voice Cloning'", McMillan LLP, (6 December 2024), available here.

5 As of the date of this bulletin, we have not identified any civil decisions in Canada where a victim of deepfake technology has brought a legal action against the creator of deepfake content. However, a Quebec criminal court decision, R c. Larouche, 2023 QCCQ 1853, is an example where the court discusses the wrong arising from the creation of deepfake videos in the child pornography context. We discussed this decision in a prior bulletin: Pablo Tseng and Paola Ramirez, "What Has the Law Done About "Deepfake"?", McMillan LLP, (10 May 2023), available here.

6 Supra note 4.

7 Privacy Act, RSBC 1996, c 373; The Privacy Act, RSS 1978, c P-24; The Privacy Act, CCSM c P125; Art 35 CCQ, CQLR c CCQ-1991; Privacy Act, RSNL 1990, c P-22.

8 The Privacy Act, CCSM c P125, ss 2-3.

9 Pablo Tseng, Carina Chiu & Aki Kamoshida, "Legal Considerations in Canada related to 'Voice Cloning'", McMillan LLP, (6 December 2024), available here; Gould Estate v Stoddart Publishing Co., 1996 CanLII 8209 (ON SC).

10 Athans v. Canadian Adventure Camps Ltd. et al., 1977 CanLII 1255 (ON SC).

11 Privacy Act, RSBC 1996, c 373.

12 Bao v. Welltrend United Consulting Inc., 2025 BCCA 3. This is an appeal of a decision, 2023 BCSC 1566, that we referred to in our previous bulletin, "Legal Considerations in Canada related to 'Voice Cloning'", available here.

13 Note that the BCCA's decision not to uphold the lower court's finding was based on the fact that the tort was not argued by the plaintiff and that the lower court had improperly embarked on its own analysis of the tort.

14 Yenovkian v Gulian, 2019 ONSC 7279, at para 170.

15 Pablo Tseng and Paola Ramirez, "What Has the Law Done About "Deepfake"?", McMillan LLP, (10 May 2023), available here.

16 Sampson v. TD Insurance Meloche Monnex, 2025 NSSC 19.

17 Yenovkian v. Gulian, 2019 ONSC 7279, most recently discussed in Fowlie v. Spinney, 2024 ONSC 5080.

18 Galton Corporation v. Riley, 2023 MBKB 73 states that "Manitoba has yet to accept the existence of the tort."

19 Benison v McKinnon, 2021 ABQB 843; although the tort of "false light" specifically is yet to be fully considered in Alberta, this decision states that there is no common law breach of privacy in Alberta.

20 Although in Durkin v Marlan, 2022 BCSC 193, the court engaged in a hypothetical exercise applying the tort of false light, it may also be subject to the question of whether there is a common law tort of breach of privacy in British Columbia: Veeken v British Columbia,

21 Sampson v. TD Insurance Meloche Monnex, 2025 NSSC 19.

The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.

© McMillan LLP 2025
