Artificial intelligence ("AI") programs have been permeating public discourse, with the popularization of large language models and AI image generators raising novel legal questions. Recently, the Federal Court in Haghshenas v. Canada (Citizenship and Immigration), 2023 FC 464, considered one such question: what are the legal implications of using AI in the administrative decision-making process?

Haghshenas involved a work permit decision of an immigration officer. On judicial review, the applicant argued that the decision was reached with the help of AI. The Federal Court held that, in the circumstances before it, the alleged assistance of AI was not relevant, writing: "Whether a decision is reasonable or unreasonable will determine if it is upheld or set aside, whether or not artificial intelligence was used. To hold otherwise would elevate process over substance."

This is the first reported decision in Canada to comment on the appropriateness of AI assistance in administrative decision-making, and it may encourage broader adoption of such tools. As AI becomes more prevalent, we can expect further case law on its role in decision-making processes.

Majid Haghshenas, the applicant, was a 33-year-old citizen of Iran who applied for a Labour Market Impact Assessment-exempt work permit under the C11 category, which is targeted towards entrepreneurs and self-employed candidates seeking to operate a business. In refusing the application, the immigration officer emphasized that, given the purpose of the visit, he was not satisfied that Mr. Haghshenas would leave Canada at the end of his stay. Mr. Haghshenas subsequently sought judicial review of the decision, challenging it on both procedural and substantive grounds.

With respect to the use of AI, Mr. Haghshenas argued that the officer's decision was procedurally unfair because it was reached with the assistance of artificial intelligence, specifically the "Chinook" software application. He further argued that the decision was unreasonable due to concerns about Chinook's reliability and efficacy, and suggested that a decision made using Chinook could not be considered reasonable until it had been explained to all stakeholders how machine learning had replaced human input and what effect this had on application outcomes.

Chinook is a Microsoft Excel-based tool developed by Immigration, Refugees and Citizenship Canada ("IRCC") for processing temporary resident applications. It was first piloted by IRCC in 2018 and officially launched in 2019. According to IRCC, Chinook displays information stored in IRCC's processing system and system of record in a more user-friendly way, simplifying the visual presentation of an applicant's information for the decision-maker. IRCC denies that Chinook uses AI or advanced analytics for decision-making and states that Chinook has no built-in decision-making algorithms. IRCC maintains that it is always an IRCC officer, not Chinook, that makes the final decision on an application, although it notes that Chinook's various modules can assist IRCC with pre- and post-decision management.1

In dismissing the application, the court accepted that AI had been used in the decision-making process. However, it found that the decision was made by the officer, not by Chinook, while acknowledging that the decision relied on input assembled with the assistance of AI. The court emphasized that the use of AI was irrelevant to the application for judicial review because: (1) the officer ultimately made the administrative decision; and (2) on judicial review, the court must determine whether that decision was procedurally fair and reasonable, not whether artificial intelligence assisted in the decision-making process.

The court's analysis in Haghshenas may prompt greater adoption of AI in government decision-making. However, as Mr. Haghshenas alluded to in his submissions, courts may need to address concerns about decisions made with heavy reliance on information provided by AI, and how much human input is required in the decision-making process for a decision to qualify as a "human-made" decision.

Footnote

1. CIMM - Chinook Development and Implementation in Decision-Making - February 15 & 17, 2022

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.