By Folarinwa M. Aluko
Distinguished Senior Advocates, Elders and Members of the NBA Calabar Branch: artificial intelligence may not be the first technology to unsettle the law, but it is the first to mimic its reasoning. Every major technological wave in human history has redrawn the boundaries of legal practice, ethics, and the marketplace. The invention of the printing press extended the Lawyer’s reach beyond the spoken word. The telegraph and telephone collapsed distance, while the personal computer and the internet increased access to information. Each technological leap has augmented and reshaped the practice of Law in ways that were measurable and obvious.
AI is different. It presses on something deeper: the lawyer’s intellectual sovereignty, the ability to decide what the law means and why. By reading, predicting, and drafting in language that sounds like ours, AI trespasses on the lawyer’s sacred ground of legal interpretation, blurring the line between judgment and suggestion.
The amount of processing power accessible to the average User is simply astounding. A few months ago, while consulting for a UK-based private equity firm preparing to acquire a portfolio of Nigerian Fintech companies, I was asked if I could “run a predictive analytics model on the target companies’ IP filings and contract dispute histories to forecast litigation risk.”
That was a fair question from a team accustomed to data-driven due diligence. But the honest answer was less flattering. Our “database” for such information isn’t a single, queryable platform. The only way to conduct that due diligence was through painstaking manual searches, because the relevant data is scattered across a maze of court registries, fragmented filings at Ministries, Departments and Agencies, and archives that are both unstructured and largely offline.
In truth, Nigerian data isn’t ready for anyone, much less an algorithm. The point here for the legal profession is that while foreign Law Firms use AI to interpret data, we are still struggling to create it. Yet we keep speaking of innovation as if the underlying data framework already exists. We have mastered the vocabulary of technology but not its infrastructure. That is the illusion of progress our profession must confront. Every young Lawyer who has spent hours searching a dusty registry knows this truth. Our challenge is not ignorance of technology but the stubborn endurance of paper.
Two Kinds of Policy, and Why the Difference Matters
There are two distinct layers of governance through which law must engage AI.
The first is a National AI Agenda. This refers to a public framework defining the limits of acceptable behavior across sectors and setting standards for privacy, transparency, liability, and accountability. It governs what the State and its Citizens, natural or corporate, may build and how those systems should behave.
Nigeria already has fragments of this architecture in the National Information Technology Development Agency Act and the Nigeria Data Protection Act 2023. Yet these frameworks, often conceived secretly and in haste, blur constitutional lines. Privacy is a residual matter reserved for the states. A national law may set guiding principles, but state-level legislation is better suited to reflect local realities and safeguard rights. We do not need another sweeping federal statute filled with good intentions but bad outcomes; we need a coordinated system that empowers States to regulate responsibly while ensuring that both data and the machines that process it remain accountable. In countries like the United States, France, and China, such laws anchor a National Strategy while sub-national governments build predictability for investors and innovators.
The second layer is an AI Policy for the Legal Profession. This refers to a narrower but more demanding policy level that translates broad technology principles into obligations of competence, confidentiality, and professional integrity. This Policy should dictate how Lawyers in the Legal Profession and the Justice Sector can use AI responsibly: how client data is stored, how AI tools are used, how AI-generated research is verified, and how algorithmic evidence is authenticated in Court.
Nigeria needs both layers. A National Policy cannot substitute for professional ethics, and Bar Guidelines cannot shoulder the responsibility of state regulation. The National Law sets the floor while the Professional Policy builds the ceiling. These frameworks should only be designed after a period of rigorous public advocacy and debate that ensures inclusion, participation and public ownership.
The Psychology of Leapfrogging
Nigerian optimism often suffers from the Leapfrog Fallacy, the belief that we can vault over necessary stages of development simply because we can see the summit. It’s an illusion of progress that masks the absence of preparation.
This pattern is reinforced by two cognitive biases. Present Bias drives our hunger for immediate recognition, the prestige of announcing a “national AI strategy” over the slow, unglamorous work of building infrastructure. Overconfidence Bias convinces Policymakers that (good) intention can replace systems. That we can legislate a Digital Future into being by Committee, Decree or Faith.
The result is a policy mirage. We celebrate the announcement, neglect the architecture and organize Roundtables and Workshops to discuss why our Systems keep failing: this is the Nigerian Strategy for public policy.
We announce the launch of ‘Digital Filing’, supply computers to Registries that still rely entirely on paper filing, and wonder why no one uses the new system.
There are no shortcuts to achieving justice or building technological systems. To transition into a digital environment, we need to have built the internal capacity to sustain the shift. Our underlying realities demonstrate the mismatch between rhetoric and reality. For context, Nigeria’s internet penetration rate supposedly stands above 65%, yet stable high-speed broadband which is the oxygen of modern AI, is available to less than a quarter of Users.
The national grid, with an installed capacity of 13,000 MW, produces less than 5,000 MW on an average day. Anecdotally, more than 95% of Court Judgments, especially from lower courts, are not available as scanned documents while less than 5% of appellate case law is structured in a machine-readable format. AI cannot run on ambition alone. Every skipped step must eventually be climbed, or else we run the risk of failure before we even begin.
To be fair, there are glimmers of progress. Platforms such as LawPavilion and Legalpedia provide judgments from Nigerian appellate Courts in searchable formats, and some state judiciaries, including Lagos, Oyo and the FCT, are piloting e-judgment repositories.
The Architecture of Readiness
Like the foundation of a building, each layer of technological maturity must set before the next is laid. The pathway from manual systems to digital intelligence has at least five levels: digitization, connectivity, automation, analytics, and intelligence, with each layer supporting the next.
Digitization turns paper into machine-readable text. Connectivity allows systems to talk to one another. Automation codifies routine processes for efficiency. Analytics extracts insight from multiple data sets. Only then can intelligence (learning systems that predict and adapt) emerge meaningfully.
Nigerians are known for thinking and talking in grand visions while ignoring the quiet discipline of details, forgetting that the devil we fear in development usually hides in the very minutiae we overlook. In 2025, Nigeria hovers somewhere between the first and second rung, yet our discourse, impatient and aspirational, begins at the fifth. When policy vaults ahead of practice, the result is hollow modernity: the appearance of advancement without the substance of readiness.
Legal Logic vs. Machine Logic
Law and AI reason in different languages. Law is normative in that it asks what is right, what is fair, what fits within principle. AI, on the other hand, is probabilistic: it asks what is likely, based on prior patterns. Legal reasoning in Nigeria is intensely contextual, steeped in oral argumentation and cultural nuance. AI, however, flattens context into data points. A model trained on a foreign legal corpus will misinterpret our statutes, cultural nuances and native intelligence.
For AI to serve our justice system, it must learn law and be trained on local judgments, statutes, and idioms, and fine-tuned to reflect the norms of Nigerian practice. Otherwise, we risk automating misunderstanding at scale.
Ethics as Risk Management
Ethics in the legal profession is not a static code; it is a dynamic system of risk management on which the integrity of the entire system relies. Therefore, each ethical risk introduced by AI requires a corresponding human control. We will examine a few.
- Accuracy and Automation Bias: Machines generate plausible text with extraordinary confidence; the problem is that confidence is not truth. This danger is compounded by automation bias, the psychological tendency to trust machines over our own judgment.
A lawyer who accepts an AI-generated case summary without checking the underlying judgment abdicates their duty of competence. Verification must now be part of ethics: a disciplined skepticism that treats every AI citation as unverified until proven otherwise.
- Confidentiality and Data Integrity: Uploading client materials into public AI tools is not convenience, it is a breach of privilege. Rule 19 of the Rules of Professional Conduct for Legal Practitioners (2023) makes this duty explicit, extending confidentiality to “any information acquired in the course of professional employment.” When such data is uploaded to public AI models hosted on foreign servers, the lawyer risks breaching not only ethics but data residency obligations.
The duty of confidentiality extends beyond sealed lips and locked filing cabinets to server configurations, encryption standards, and data residency. With AI, ethical compliance demands technological literacy: in essence, Lawyers must know not only what AI can do but where their Client’s data goes when they use it.
- Bias and Fairness: An algorithm trained on historical data inherits historical inequities. Bias amplification in AI is not hypothetical; it is structural. When the system learns from skewed sources, it reproduces those distortions at scale.
Ethical lawyering now requires vigilance against confirmation bias, our own tendency to seek data that supports our case. AI can make this bias invisible by embedding it in the data itself. Ethical practice must therefore include bias audits, impact assessments, and transparent review processes.
- Cognitive Offloading and the Atrophy of Judgment: The most subtle danger posed by AI is cognitive. Overreliance on AI fosters cognitive offloading: the gradual outsourcing of memory, synthesis, and analytical reasoning. We saw this before, when calculators dulled our instinct for numbers.
The slippery slope begins with small conveniences, such as letting the AI find citations or summarize arguments; over time, the brain adapts to disuse. This atrophy of judgment erodes the lawyer’s defining skill: the ability to synthesize law, fact, and human consequence into reasoned judgment. In a profession built on discernment, to surrender judgment is to surrender identity.
- Accountability and Oversight: Even when AI is used responsibly, its decisions must remain attributable to a human actor. The principle of non-delegable responsibility is timeless: if your brief cites a fabricated case, you stand liable for disciplinary action, not the algorithm. Maintaining “the lawyer’s mind in the loop” means designing workflows that ensure oversight, documentation, and traceability for every machine-assisted task.
The courts have yet to rule definitively on whether AI-generated materials meet the authentication standards of Section 84 of the Evidence Act. Until judicial or procedural clarity emerges, prudence demands that every AI-assisted document be verified and authenticated by a human lawyer before filing. The machine may assist, but only the Lawyer should certify. The challenge lies in proving the provenance and integrity of AI-generated materials under Section 84(4) of the Evidence Act, which demands proof of the device and its operator; in the case of cloud-based AI, both are often beyond the user’s control.
Earlier this year, a content creator client used an AI tool to draft an agreement with an international distribution company. The AI, trained primarily on U.S. templates, inserted a clause that was unenforceable under Nigerian copyright law and detrimental to that Client’s interest. Fortunately, the client sought legal review before execution, and the clause was removed. The lesson here was not technological failure, but professional complacency, a reminder that while AI can assist, it can never absolve the lawyer of the duty to think.
This approach echoes the emerging global consensus reflected in the American Bar Association’s 2024 Guidelines on AI and Professional Responsibility and the Council of Bars and Law Societies of Europe’s 2023 AI Charter, both of which emphasize human accountability, transparency, and competence as non-negotiable principles.
These risks are not theoretical. They unfold daily in law offices, court registries, and boardrooms where decisions depend on both machines and men. The question is not whether AI will enter the profession, but how we can meet it on our own ethical terms.
Building Justice by Design
Colleagues of the NBA Calabar Branch, artificial intelligence in Nigeria is not merely a question of technology; it is a question of justice. Use-case scenarios demonstrate that AI’s most transformative potential for Nigeria lies in Equitable Access and not in Elite Efficiency.
Imagine, for instance, an AI-powered assistant that helps a trader in Aba understand the need to register a limited liability company over a business name, or guides a farmer in Odukpani through the basics of a land lease in Efik, Ibibio, or Pidgin English.
Contrary to popular belief, Nigeria’s legal profession is under-lawyered, not over-lawyered. Far too many Nigerians go through life without meaningful access to legal services. Properly designed AI can bridge that gap by explaining rights, simplifying procedures, and directing users to Legal Practitioners for real counsel.
This vision aligns with Section 36 of the Constitution of the Federal Republic of Nigeria (1999), which guarantees every citizen access to justice. If AI can expand that access responsibly, then it becomes an instrument of constitutional realization, not disruption. This is what justice by design means: deploying technology to make rights not only known but usable. If the law is to remain a tool of empowerment, not exclusion, then it falls to us as lawyers to shape the use of AI in ways that extend justice, not automate inequality.
Recommendations
What NBA Calabar (and every Branch of the Bar) can do
- Digitize Locally: Each Branch of the Bar ought to initiate a digitization program within its jurisdiction, consistent with the Bar’s constitutional duty to promote access to justice under Section 3 of the NBA Constitution. The project should include the compilation and scanning of judgments from the High Court, Magistrate Courts, and Tribunals. A local database, however modest, becomes a building block for a National repository.
- Connect Institutions: Each Branch ought to foster collaboration between the Courts, Ministries, and Agencies like the Corporate Affairs Commission and the Land Registry. Advocate for data-sharing protocols and interoperable systems. A connected ecosystem is what allows legal data to become legal intelligence.
- Learn, Re-learn and Unlearn: Branches of the NBA have a responsibility to integrate AI Literacy, Data Ethics, and Automation Psychology into Branch Continuing Legal Education (CLE). Organize workshops to help members understand not only how to use AI, but when not to. Competence in this era includes knowing the limits of technology and the irreducible value of human judgment.
- Govern Responsibly: Work with the NBA National Executive Council to establish Branch-driven ethical guidelines on AI use. These guidelines should cover verification standards, data protection, and disclosure obligations. The guidelines should be inclusive and led by public debate and interrogation for the Bar to take ownership.
- Localize Innovation: Encourage collaboration with Nigerian developers and Universities to train AI models on Nigerian legal data and local languages. The law’s voice must sound like the people it serves. Let Calabar be known not only for its festivals but also for pioneering locally trained legal models that understand our idioms and realities.
Legal Intelligence comes before Artificial Intelligence
The real risk before us is not that AI will replace the Nigerian lawyer, but that lawyers, dazzled by the illusion of effortless competence, may forget why judgment matters. The opportunity, however, is far greater: to design a profession where intelligence, human, artificial, and yes, even spiritual (because as Africans we know some cases defy logic alone), work together in partnership.
If every Branch begins with its own house, digitizing, educating, and governing responsibly, we will not merely adapt to the age of AI; we will define it.
In conclusion, every revolution in technology has posed a question of relevance for the legal profession. The printing press threatened the Scribes; the typewriter threatened Law Clerks; the Computer threatened the Legal Researcher. Each time, the profession adapted and deepened. The danger with AI is subtler. It will not make lawyers obsolete; it will make unthinking lawyers redundant.
We cannot automate what we have not yet articulated. Before we build artificial intelligence, we must build legal intelligence; ensuring that our systems, our ethics, and our minds are fit for purpose. Only then will technology serve justice, not the other way around.
Thank you Mr. Chairman, Branch Exco and the distinguished members of the NBA Calabar Branch for this invitation.
Presented at the October 2025 Meeting of the Nigerian Bar Association, Calabar Branch
Folarinwa Aluko is a legal practitioner and partner in the Law Firm of Trumann Rockwood Solicitors. He can be reached at fmaluko@trumann-rockwood.com or by phone at 08038601052.
