3 September 2025

Facebook: Federal Court Of Appeal Clarifies Requirements For Consent Under PIPEDA

Lerners LLP

Contributor

Lerners LLP is one of Southwestern Ontario’s largest law firms with offices in London, Toronto, Waterloo Region, and Strathroy. Ours is a history of over 90 years of successful client service and representation. Today we are more than 140 exceptionally skilled lawyers with abundant experience in litigation and dispute resolution (including class actions, appeals, and arbitration/mediation), corporate/commercial law, health law, insurance law, real estate, employment law, personal injury and family law.

In 2018, during Donald Trump's first presidency, news media reported that the U.K.-based company Cambridge Analytica had used personal information obtained from Facebook users without authorization. Cambridge Analytica used this information to build a system that profiled individual U.S. voters and targeted them with personalized political advertisements.

The scandal led to a congressional hearing before the U.S. Senate, where Facebook's chief executive, Mark Zuckerberg, testified. Following the media reports, the Privacy Commissioner of Canada (the Commissioner) received a complaint regarding Facebook's compliance with the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA).

After the investigation, the Commissioner concluded that Facebook had failed to obtain valid and meaningful consent for disclosing users' information to third-party applications and had not adequately safeguarded users' data. As a result, in February 2020, the Commissioner commenced an application in the Federal Court pursuant to paragraph 15(a) of PIPEDA, asking the court to determine whether Facebook had breached PIPEDA.

The underlying facts, lower court decision and decision of the Court of Appeal are summarized below, but here are the key takeaways:

  • Meaningful consent under PIPEDA requires that a reasonable person would understand the nature, purpose and consequences of data collection, use and disclosure — not just that users clicked "agree" to lengthy policies.
  • Organizations must take active steps to ensure users are properly informed and must not rely on complex, lengthy or obscure policies to claim consent.
  • Safeguarding obligations require more than contractual assurances; platforms must actively monitor and enforce privacy practices of third parties with access to user data.
  • The context of digital platforms and consumer contracts of adhesion means courts will scrutinize claims of consent and safeguarding more closely, especially where default settings favour disclosure.
  • The decision clarifies that privacy obligations under PIPEDA are to be interpreted from the perspective of a reasonable person, not based on subjective or expert evidence about individual users.

Background

In 2007, Facebook launched its platform technology, which allowed third parties to build and operate applications on Facebook. By installing these applications, users could have personalized social and entertainment experiences on Facebook, such as sharing photos and listening to music (Canada (Privacy Commissioner) v. Facebook, Inc., 2024 FCA 140, at para. 5). To enable these third-party applications to receive information from users, Facebook provided a communication protocol, called "Graph API," which went through two phases of revisions.

Under the first version (the "v1"), third-party applications could obtain information about both the installing users and their friends; under the second version (the "v2"), applications were prohibited from accessing information about friends of the installing users, with a few limited exceptions.

However, before switching to v2, Facebook gave existing applications a one-year grace period to continue operating under v1 and obtain information about the installing users' Facebook friends.

The third-party application at issue, "thisisyourdigitallife" (TYDL), was developed by Dr. Aleksandr Kogan, a former professor at the University of Cambridge. TYDL, presented as a personality quiz, accessed information including the installing users' Facebook profiles as well as information from their Facebook friends. Through approximately 272 Canadian users, TYDL obtained data from over 600,000 Canadian Facebook users. The user data collected by TYDL was later sold to Cambridge Analytica to develop the psychographic models used to target political messages toward Facebook users during the 2016 U.S. presidential election.

The alleged breaches of PIPEDA occurred during the v1 phase, between November 2013 and December 2015. During this time, Facebook had two platform-wide policies in place: the Terms of Service and the Data Policy. Users were required to consent to both policies in order to sign up for Facebook. The Terms of Service, which was approximately 4,500 words in length, provided that third-party applications may ask the user for permission to access their information. The Data Policy, which was approximately 9,100 words in length, explained how information was shared on Facebook.

The Federal Court decision

The Federal Court dismissed the Commissioner's application because the Commissioner had failed to meet the standard of proof. In addressing the first allegation that Facebook had not obtained meaningful consent from users and their Facebook friends, Justice Michael Manson held that the evidentiary vacuum was detrimental to the Commissioner's case (Canada (Privacy Commissioner) v. Facebook, Inc., 2023 FC 533, at para. 71).

The Commissioner did not compel evidence from Facebook pursuant to s. 12.1 of PIPEDA, nor did the Commissioner provide any expert evidence indicating what Facebook could have done differently. The Commissioner also did not submit any subjective evidence from users about their expectations of privacy. As a result, Justice Manson rejected the Commissioner's submissions, which would have required the court to speculate and draw unsupported inferences.

Regarding the second allegation, whether Facebook had adequately safeguarded user information, Justice Manson agreed with Facebook that PIPEDA's "safeguarding obligations end once information is disclosed to third-party applications." In the absence of evidence to the contrary, Justice Manson declined to conclude that Facebook's contractual agreements and policies failed to provide adequate protection for users' information.

The Federal Court of Appeal decision

The Federal Court of Appeal allowed the appeal and granted the Commissioner's application in part.

The meaningful consent analysis

The Federal Court of Appeal rejected the Federal Court's requirement for subjective and expert evidence. The court held that the standard for meaningful consent is the reasonable person standard as prescribed by the legislation, since both s. 6.1 and clause 4.3.2 of PIPEDA expressly adopt the word "reasonable." Therefore, subjective evidence is not necessary when determining the reasonable person standard.

Unlike assessing the conduct of a reasonable professional, which might require expert evidence because judges have limited familiarity with that particular profession, a judge in this case could rely on everyday life experience to inform their decision. Consequently, expert evidence was not necessary. Further, evidence of the surrounding circumstances, including the disclosure to Cambridge Analytica and Facebook's policies and practices, was sufficient for the court to assess whether the reasonable person standard had been met.

The Federal Court of Appeal clarified the double reasonableness standard in clause 4.3.2 of PIPEDA. The legislation states as follows:

The principle requires "knowledge and consent." Organizations shall make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used. To make the consent meaningful, the purposes must be stated in such a manner that the individual can reasonably understand how the information will be used or disclosed [Emphasis added].

The first prong of the test focuses on organizations requesting information, requiring them to make reasonable efforts to inform an individual about the collection and use of data. The second prong centres on individuals, which requires the form of consent to be informative enough that an individual can reasonably understand the use and disclosure of information.

Whether a form of consent is sufficiently meaningful depends on the specific circumstances. In this case, the court identified two distinct circumstances based on two groups of Facebook users: those who installed third-party applications and their Facebook friends.

While users who installed third-party applications had the opportunity to directly review the application's privacy policies and consent to the collection and use of data, friends of these users did not. The only policy available to them was Facebook's high-level Data Policy, which the court found to be too broad and ineffective. It did not sufficiently inform users about the disclosure of information related to their friends' use of third-party applications, nor did it contemplate large-scale data scraping such as what TYDL did. Therefore, the court concluded that the Facebook friends of those users who installed third-party applications did not meaningfully consent to the disclosure of their information.

The Court of Appeal also held that the users who installed the TYDL application did not provide meaningful consent to the disclosure of their data either. The underlying question is whether a reasonable person would have understood that in downloading a third-party application, like a personality quiz in this case, they consent to the scraping of their data and to the use of it in a manner contrary to Facebook's rules, such as developing models to target political advertisements. The court considered the following contextual factors in reaching its conclusion:

  1. Facebook obtained users' consent to the Data Policy in a manner contrary to PIPEDA's requirements, as users were deemed to accept the Data Policy when they accepted the Terms of Service;
  2. Mark Zuckerberg himself testified before the U.S. Senate that he imagined that most people do not read the policies;
  3. Facebook did not warn users of the possibility of bad actors on its platform, such as TYDL; Facebook had no robust preventative measures in place, which a reasonable user would have expected;
  4. When TYDL requested access to unnecessary information while transitioning to v2 in 2014, Facebook failed to act promptly despite labelling it as a "red flag"; and
  5. Facebook's Terms of Service and Data Policy were adhesion contracts not subject to negotiation, which gave rise to heightened scrutiny.

After considering these contextual factors, the Court of Appeal concluded that users who installed TYDL did not meaningfully consent to the disclosure of their information.

The safeguarding obligations analysis

The Federal Court of Appeal overturned the Federal Court's finding and held that Facebook had not met its safeguarding obligations to adequately protect user data. Although the Federal Court of Appeal agreed with the Federal Court that the safeguarding principles apply only to data in an organization's possession, the appellate court took issue with Facebook's inaction prior to the disclosure of data to third parties. The court noted Facebook's pattern of inaction. For instance, Facebook failed to review the content of the privacy policies of third-party applications or to act on TYDL's suspicious request for access to information in 2014. Although Facebook discovered TYDL's breach of its policies in December 2015, it did not notify affected users or ban Kogan and Cambridge Analytica from Facebook until March 2018, contrary to its own policies. The court concluded that, having invited users to its platform, Facebook must ensure compliance with PIPEDA.

Balancing interests under PIPEDA

The Federal Court of Appeal emphasized the importance of context when conducting a PIPEDA analysis, highlighting that an organization has no inherent right to information. PIPEDA balances individuals' right of privacy with organizations' need to collect, use and disclose personal information. The organization's need for information must be considered with regard to the nature of its business.

For example, Facebook's business model aims at obtaining information and selling advertising, and therefore, the direct link between the collected information and Facebook's profits informs the degree required for meaningful consent. The interpretation of meaningful consent must take into account all relevant factors, including "the demographics of the users, the nature of the information, the manner in which the user and the holder of the information interact, whether the contract at issue is one of adhesion, the clarity and length of the contract and its terms and the nature of the default privacy settings."

The estoppel and officially induced error arguments

Between 2008 and 2009, the Commissioner investigated Facebook's privacy policies and made several recommendations. In 2010, the Commissioner informed Facebook that it had satisfied its commitments arising from those recommendations. Relying on this representation, Facebook argued that the Commissioner was estopped from pursuing the application. The court rejected the argument on three grounds. First, as technology is always evolving, Facebook should be expected to adapt its privacy measures. Second, the application was a de novo hearing, and consequently, the court owed no deference to the Commissioner's earlier report. Third, estoppel in public law has narrow application, and the Commissioner cannot be estopped from exercising statutory duties based on a representation made over a decade ago.

The implications

This case highlights the importance for organizations to obtain meaningful consent and safeguard user data when conducting their business. Courts are encouraged to consider all relevant contextual factors in determining whether PIPEDA has been complied with. The reasonable person standard under PIPEDA developed by the Federal Court of Appeal, which does not necessarily require expert or subjective evidence, will make it easier for the Commissioner to establish breaches of PIPEDA. Organizations should be cautious when adopting non-negotiable boilerplate privacy policies, as such consumer contracts of adhesion may be subject to heightened scrutiny.

Originally published by Law360 Canada.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
