The 2026 National E-Governance Bill: What Nigeria Is and Isn't Getting Right
march 2026
nigeria is drafting the rules that will govern AI in public life. this is a first look at what the bill actually proposes.
for the past five years, the integration of artificial intelligence into public administration in nigeria has operated largely in a regulatory vacuum. biometric citizen registries, algorithmic tax assessments, and automated passport issuance systems have relied on proprietary algorithms with minimal transparency. the introduction of the draft 2026 National E-Governance Bill represents the federal government's first comprehensive attempt to establish statutory boundaries around algorithmic decision-making in public life. it is a deeply significant piece of legislation, but a close examination reveals profound compromises.
the context: AI in the nigerian public sector
nigeria's public sector has aggressively pursued digitization. federal agencies like the National Identity Management Commission (NIMC), the Federal Inland Revenue Service (FIRS), and the Nigeria Customs Service process vast datasets covering the finances, biometric data, and physical movements of over 200 million citizens. as these agencies increasingly deploy machine learning models to detect fraud, allocate resources, or predict compliance, the nature of administrative law fundamentally changes. when a taxpayer is flagged for an audit based on an algorithmic anomaly, the traditional mechanisms of administrative appeal become strained. you cannot cross-examine a neural network.
the draft bill seeks to modernize the administrative framework. it aims to ensure that digital public infrastructure is built on principles of accountability, interoperability, and human oversight. however, as with all legislation attempting to catch a moving technological target, the devil is entirely in the definitions.
what the bill proposes: accountability and transparency
at its core, the bill mandates a standard of "algorithmic transparency" for all federal ministries, departments, and agencies (MDAs). section 14 of the draft requires that any MDA deploying an automated decision-making system (ADMS) that materially affects the rights, privileges, or financial standing of a citizen must maintain a public register of that system.
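the draft does not prescribe a schema for that public register, but the idea is easy to make concrete. a minimal sketch of what one register entry might capture, where every field name is my own illustrative assumption rather than anything fixed by the draft text:

```python
from dataclasses import dataclass, asdict

@dataclass
class RegisterEntry:
    # all fields are illustrative assumptions; section 14 does not fix a schema
    system_name: str
    deploying_mda: str          # which ministry, department, or agency runs it
    purpose: str                # what decision the system automates
    affected_rights: list       # rights, privileges, or financial standing it touches
    human_review_contact: str   # route for contesting an automated outcome

# hypothetical entry for an FIRS-style audit screener
entry = RegisterEntry(
    system_name="tax-anomaly-screener",
    deploying_mda="FIRS",
    purpose="flag tax filings for human audit",
    affected_rights=["financial standing"],
    human_review_contact="audit-appeals@example.gov.ng",
)

# a public register would publish entries in a machine-readable form
print(asdict(entry))
```

the point of the sketch is that a register is cheap to maintain; the hard questions are which systems "materially affect" rights and who audits the entries.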
more significantly, section 18 introduces a statutory right to explanation. if a citizen is denied a government service, licence, or welfare benefit through an automated process, the agency must, upon request, provide a plain-language explanation of the primary variables that influenced the decision. this is a direct, albeit localized, reflection of Article 22 of the GDPR, establishing that citizens cannot be subjected to purely automated decisions without a human fallback mechanism. the draft bill explicitly mandates that all critical public-sector ADMS must have a "human-in-the-loop" mechanism to review contested outcomes.
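what a "plain-language explanation of the primary variables" looks like in practice is left open by the draft. one simple sketch, assuming a linear scoring model whose weights the agency actually has access to; all names and numbers below are hypothetical, and real deployed models are rarely this legible:

```python
def explain_decision(weights: dict, applicant: dict, top_n: int = 2) -> str:
    """Rank the variables that most influenced a score and name them plainly."""
    # contribution of each variable = model weight * applicant's value
    contributions = {k: weights[k] * applicant.get(k, 0.0) for k in weights}
    # rank variables by absolute influence on the final score
    ranked = sorted(contributions, key=lambda k: abs(contributions[k]), reverse=True)
    primary = ", ".join(ranked[:top_n])
    return f"the primary variables influencing this decision were: {primary}"

# hypothetical welfare-benefit screening model
weights = {"declared_income": -0.8, "household_size": 0.3, "region_code": 0.05}
applicant = {"declared_income": 5.0, "household_size": 2.0, "region_code": 1.0}
print(explain_decision(weights, applicant))
# → the primary variables influencing this decision were: declared_income, household_size
```

the sketch also shows why section 18 is harder than it sounds: for a deep model there is no single weight per variable, and the agency would need attribution tooling, not a one-liner.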
the glaring omissions: what it does not cover
while the public sector provisions are robust, the bill's greatest flaw is its scope. the 2026 E-Governance Bill strictly limits its jurisdiction to public authorities and private entities acting under direct government contract. private sector AI deployments operate entirely outside its purview.
this creates a dangerous legal asymmetry. if a federal agency uses an algorithm to deny you a passport, you have a statutory right to an explanation. but if a private commercial bank uses an opaque proprietary algorithm to deny you a mortgage, or if a tech platform relies on biased biometric software to lock you out of your digital wallet, the E-Governance Bill offers zero protection. the heavy lifting for private sector algorithm governance is left entirely to the relatively broad principles of the Nigeria Data Protection Act (NDPA) and consumer protection laws, neither of which is specifically tailored to the unique complexities of generative or predictive AI.
furthermore, the bill is frustratingly silent on liability. when an MDA deploys a flawed machine learning model that unlawfully revokes the business licences of five hundred SMEs due to a data processing error, the bill establishes the mechanism to challenge the decision, but it fails to establish a clear framework for statutory damages or state liability. it treats algorithmic harm as an administrative error to be corrected, rather than a tortious injury to be compensated.
the european comparison: echoing the AI act
it is impossible to read the draft bill without recognizing the shadows of the European Union's AI Act. nigeria's approach mimics the EU's risk-based classification system, designating certain systems (like biometric surveillance and judicial scoring) as "high-risk" public systems requiring mandatory prior impact assessments. however, unlike the EU Act, which imposes strict ex-ante testing and certification requirements orchestrated by a central AI office, the nigerian bill places the burden of assessment largely on the deploying agencies themselves. this self-policing model in public administration often leads to superficial compliance.
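the classification logic the draft implies can be sketched in a few lines. the high-risk categories below are taken from the examples the draft itself names; the tier labels and the default tier are my assumptions, not statutory language:

```python
# categories the draft flags as high-risk public systems, per its own examples
HIGH_RISK_CATEGORIES = {"biometric surveillance", "judicial scoring"}

def classify(system_category: str) -> str:
    """Return the oversight tier a deploying MDA would apply to itself."""
    if system_category in HIGH_RISK_CATEGORIES:
        return "high-risk: mandatory prior impact assessment"
    # tier name below is an assumption; the draft's default duties are the
    # section 14 register entry and the section 18 explanation duty
    return "standard: register entry and explanation duty"

print(classify("judicial scoring"))
```

the triviality of the sketch is the criticism: when the deploying agency both picks the category and runs the assessment, the classification step is where superficial compliance creeps in.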
the role of the NDPC
the draft bill cleverly avoids creating a new regulatory body, instead expanding the mandate of the Nigeria Data Protection Commission (NDPC). the NDPC becomes the de facto auditor of public sector algorithms. while this prevents bureaucratic bloat, it places immense pressure on a commission that is already stretched thin enforcing the 2023 NDPA across the massive private sector. whether the NDPC has the specialized technical capacity—the data scientists and algorithmic auditors required to effectively evaluate deep learning models deployed by the defence or finance ministries—remains a critical point of concern raised by civil society.
practical implications for organisations
if enacted in its current form, the implications for the private sector are indirect but significant. any technology vendor, consultancy, or cloud provider bidding to build or host digital infrastructure for the nigerian government will have to comply with the bill's transparency and explainability mandates. black-box proprietary algorithms will become incredibly difficult to sell to the state.
for now, tech practitioners and legal observers must focus on the public consultation phase. the bill gets the fundamental principles right—human oversight, transparency, and the right to appeal—but without extending these safeguards to the private sector and establishing firm liability regimes, it risks becoming a brilliant framework for a very limited slice of the nigerian digital economy.
if this is relevant to your situation, tell me what you are trying to solve.