ARTICLE
28 July 2025

Artificial Intelligence In Nigeria: Key Regulatory Considerations

Balogun Harold

Contributor

Balogun Harold is a specialist law firm for investment and financing transactions focused on Africa. We routinely undertake debt finance, private equity, project finance, venture capital, market entry and technology transactions on behalf of clients. We deliver proven, guaranteed and exceptional outcomes by always aiming for the best level of legal and transactional support necessary to achieve our clients' strategic goals.


A recent report by Google titled "Our Life With AI: From Innovation to Application" suggests that generative AI adoption in emerging markets is significantly higher than global averages. Within that context, we examine some regulatory touch points within the emerging legal framework for artificial intelligence services in Nigeria.

1. No Dedicated Artificial Intelligence Law Yet

At present, Nigeria has no dedicated law or statute that regulates artificial intelligence ("AI") or large language models specifically. There is no AI Act or AI licensing regime for model training, deployment, or explainability. However, in 2025, the National Information Technology Development Agency ("NITDA") released the National Artificial Intelligence Policy ("NAIP"), setting out a high-level framework for the responsible development and use of AI in Nigeria. In addition, there is currently a bill before the National Assembly proposing the establishment of a sector-specific regulatory agency for AI oversight. Thus, foreign artificial intelligence corporations with Nigerian subscribers are currently regulated under the same general framework that applies to foreign SaaS corporations. Kindly refer to our article titled: Are U.S. Foreign SaaS Corporations Regulated by Nigeria? for an initial assessment.

2. AI-Specific Considerations

It is useful to note that there are some emerging AI-specific regulatory concerns that distinguish Large Language Models ("LLMs") from traditional SaaS platforms. The most immediate regulatory touchpoint appears to be the Nigeria Data Protection Act, 2023 (the "NDPA"), which applies extraterritorially. The NDPA is particularly relevant to LLMs because of their ability to ingest, retain, and infer personal data at scale, even where the data is not structured in a traditional database. LLM platforms that collect user input (prompts), generate responses based on personalized context, or allow account creation are likely to be treated as processing personal data under Nigerian law.

We also think that LLMs raise distinct legal concerns that are likely to become central to regulatory activity in Nigeria, including:

(a) Algorithmic Explainability: The NDPA's provisions on automated decision-making suggest a growing expectation that individuals must be able to understand how algorithmic outputs are generated, particularly when used in high-stakes contexts.

(b) Bias and Fairness Audits: Though not yet mandated in Nigeria, the growing international norm, which is also reflected in NAIP's principles, is that AI providers should evaluate their models for discriminatory outputs. Thus, LLM companies serving Nigerian users may increasingly be asked to document bias mitigation efforts.

(c) Model Training and Data Provenance: LLM developers may need to account for the sources of training data, especially if such data includes scraped content from Nigerian websites or publicly available datasets involving Nigerian individuals. Issues of copyright, data protection, and misappropriation may arise.

(d) Content Liability for AI-Generated Output: While Nigerian law has not yet established liability rules specific to generative content, it is prudent for foreign LLM companies to anticipate that offensive, defamatory, or harmful content generated for Nigerian users could trigger scrutiny under local consumer protection and defamation norms.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
