By Kayode Lawrence-Omole
Introduction: Whose Rules Govern Your Data?
When you ask an AI tool a question, where does your data go? Lagos? London? A data centre in Dublin or Dubai? For most users, and even for many companies, the answer is far from clear. In the age of artificial intelligence, data no longer stays in one place. It flows across borders, fuels machine learning systems, and sometimes gets embedded into vast models in ways that are difficult to trace.
This reality poses a new question for Nigeria. What does “data sovereignty” mean under the Nigeria Data Protection Act (NDPA) 2023 when AI is part of the equation?
The NDPA was designed to safeguard personal data, build digital trust, and position Nigeria within the global data economy. But AI is testing its boundaries. To remain relevant, lawyers, businesses, and regulators must rethink how the NDPA’s provisions on sovereignty apply in this new landscape.
Data Sovereignty under the NDPA
At its core, data sovereignty means that data about Nigerians is subject to Nigerian law, no matter where that data travels. It is a claim of jurisdiction, a way of saying: our rules follow our citizens’ data.
The NDPA gives life to this idea through:
- Lawful bases for processing personal data (consent, contract, legal obligation, vital interest, public interest, legitimate interest).
- Cross-border transfer restrictions, requiring either adequate protection in the receiving jurisdiction or the use of specific safeguards like standard contractual clauses.
- Supervisory oversight, vested in the Nigeria Data Protection Commission (NDPC), which is empowered to enforce compliance and issue further regulations.
This framework is relatively clear when dealing with traditional cloud services. If a Nigerian company wants to store customer data on servers in the US, it must ensure protections are in place. But with AI, the story becomes more complicated.
How AI Complicates the Picture
AI systems, especially large language models (LLMs) and machine learning platforms, introduce unique challenges:
- Cross-border processing by default
AI models are often trained and deployed using infrastructure spread across multiple jurisdictions. A chatbot accessed in Lagos might route data through servers in Frankfurt, California, or Singapore, sometimes all at once. Unlike conventional data transfers, AI processing rarely respects geographic boundaries.
- Data residency vs. data flow
The NDPA assumes you can know where your data is being processed. With AI, however, “location” becomes blurred. Models may process fragments of personal data, not store them, yet still “use” them in ways that affect sovereignty.
- Opacity of AI systems
AI models are sometimes “black boxes.” Even developers may struggle to explain exactly how data moves through them. This lack of transparency clashes with NDPA principles like purpose limitation and accountability.
- Risk of re-use in training
Data used in prompts or uploaded to AI platforms might be incorporated into training sets, making it difficult to enforce the NDPA’s provisions on consent, deletion, and accuracy.
In short: AI doesn’t break the rules of the NDPA, but it stretches them in ways that demand reinterpretation.
New Interpretations Emerging
As Nigerian regulators and practitioners begin to grapple with these realities, several new interpretations of the NDPA are emerging:
- Consent and transparency redefined
In AI contexts, true consent requires that individuals understand not only that their data is being processed, but also that it may travel across borders and feed into automated systems. This sets a higher bar for transparency than traditional cloud services.
- Cross-border transfers under stress
The NDPA’s adequacy and safeguard tests were designed for bilateral transfers (Nigeria → UK, for example). But if an AI model processes data simultaneously across 10 jurisdictions, which test applies? There is growing consensus that controllers in Nigeria must take a “systems view”, assessing the AI ecosystem as a whole rather than treating each transfer in isolation.
- Accountability beyond the vendor
Some organisations argue that if they license an AI platform, liability should rest with the vendor. But under the NDPA, the data controller (the Nigerian business using the tool) cannot fully delegate responsibility. This is shifting expectations. Firms must actively audit vendors, not just trust contracts.
- Explainability as compliance
The NDPA does not explicitly use the term “explainability,” but its emphasis on fairness and purpose limitation suggests that AI-driven processing must be understandable. When AI systems make decisions that affect Nigerians, organisations should be prepared to show how personal data influenced those outcomes.
Global Parallels & Lessons
Nigeria is not navigating these waters alone. Other jurisdictions are facing similar challenges, and their approaches provide useful context:
European Union
The EU’s General Data Protection Regulation (GDPR) has long shaped global data rules, but the new EU AI Act adds a layer of obligations. It introduces a risk-based approach: “high-risk” AI systems must meet strict transparency, quality, and oversight requirements. This directly links sovereignty with AI accountability. Nigerian firms dealing with EU clients may need to meet these standards regardless of NDPA enforcement.
United Kingdom
The UK has opted for a principles-based approach. Regulators emphasise that existing duties (fairness, accountability, security) already apply to AI, but firms must show how they have assessed and mitigated risks. This mirrors the NDPA’s structure and could guide Nigerian practice.
United States
The United States has no comprehensive federal data protection law, but regulators such as the Federal Trade Commission (FTC) have signalled that misrepresenting AI capabilities or failing to secure data can amount to unfair practices. The US approach is more sectoral and flexible, but its enforcement-first stance is instructive.
South Africa
Under the Protection of Personal Information Act (POPIA), cross-border transfers face conditions similar to the NDPA. Discussions are underway on how AI use aligns with these requirements, with emphasis on professional standards rather than heavy legislation.
Nigeria can draw from these models but must tailor its interpretation to its own context: a fast-growing digital economy, an ambitious tech sector, and a legal system seeking to assert sovereignty without stifling innovation.
Implications for Nigerian Businesses and Lawyers
These evolving interpretations are not just abstract legal debates; they have real consequences:
For companies
- Contracts with AI vendors must now address cross-border data flows, liability for misuse, and compliance with NDPA safeguards.
- Firms using AI for customer interactions (chatbots, automated decision-making, HR screening) must ensure they can explain data use and obtain informed consent.
- Reliance on “global” tools without due diligence could expose companies to NDPC enforcement.
For lawyers
- Advising clients will increasingly require knowledge of both data protection law and AI systems.
- Legal opinions must anticipate questions of jurisdiction, liability, and explainability.
- There is a growing opportunity for Nigerian lawyers to leverage the NDPA’s novelty to lead regional conversations on AI and data governance.
For regulators
- The NDPC faces the challenge of issuing sector-specific guidance to clarify how NDPA rules apply to AI in finance, healthcare, HR, and government services.
- There is also an opportunity to position Nigeria as a standard-setter in Africa, balancing sovereignty with openness to global AI ecosystems.
The Road Ahead: Localisation or Interoperability?
The key policy question is whether Nigeria will lean towards data localisation, requiring AI tools to process data within its borders, or towards trust-based interoperability with international systems. Localisation advocates argue this is the only way to guarantee sovereignty, protect sensitive data, and build local infrastructure. Interoperability supporters counter that strict localisation risks isolating Nigeria’s digital economy, raising costs for businesses, and limiting access to cutting-edge AI innovations.
The NDPA, as it stands, supports interoperability, provided adequate safeguards are in place. But political and regulatory pressure may push toward localisation, especially in sensitive sectors like defence, health, and finance.
Conclusion: Rethinking Sovereignty in the Age of AI
Data sovereignty under the NDPA was always about control, ensuring that Nigerians’ data is governed by Nigerian law. AI challenges this by making data flows more complex, less transparent, and harder to pin down geographically.
The Act itself remains robust enough to cover AI use, but only if interpreted with creativity and foresight. Consent must mean informed consent about AI use. Accountability must mean explainability. Cross-border transfer rules must adapt to AI ecosystems, not just bilateral flows.
Ultimately, AI is forcing Nigeria to ask whether sovereignty is about physical control of data, or about enforceable accountability wherever data goes.
Key Contact: Kayode Lawrence-Omole, Compliance and Risk Expert, Email: olukayode.lawrence-omole@dentons.com, Tel: +2348077771670