In recent years, many universities, exam boards and online-learning platforms have adopted AI proctoring tools—software that monitors students during exams using cameras, microphones, and sometimes facial recognition and voice detection. These tools are often deployed in remote exam settings, or where human invigilation is costly or impractical.
But for students in Africa, two major obstacles often go unaddressed: first, accent and dialect bias—where the software misinterprets or penalises speech that is different from “standard” accents or dialects; second, power outages and unreliable electricity or internet supply, which can disrupt exams and bias outcomes. Unless these issues are audited and corrected, AI proctoring may actually deepen inequality rather than help reduce cheating or improve fairness.
Several studies show that speech technologies, including automatic speech recognition (ASR) systems, struggle with non-native English accents and dialects. For example, systems often misinterpret, or perform much worse for, speakers of African American English than for those using Standard American English. In African countries, meanwhile, unreliable infrastructure means that power outages and internet disruptions during online or computer-based exams are commonplace. Scheduled power cuts (known as “load-shedding” in countries such as South Africa) not only interrupt exams but can lead to unfair penalties: time lost, inability to submit work, and so on.
So, the intersection of accent bias and infrastructural instability means that many African students are at risk of being disadvantaged by systems that claim to be “objective”.

What Recent Audits and Studies Reveal
Audits of AI systems often reveal technical bias—accuracy varies with speaker accent, background noise, skin tone, gender, and more. One relevant project is AfriSpeech-200, a Pan-African dataset designed to give speech recognition systems more representative training data across many African accents. Its existence reflects a recognition that current ASR systems handle this diversity poorly.
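As a rough illustration of what an accent-focused audit might measure, here is a minimal sketch that computes word error rate (WER) per accent group. The transcripts and accent labels below are hypothetical placeholders; a real audit would draw samples from a dataset such as AfriSpeech-200 and compare many speakers per group.

```python
# Minimal sketch of a per-accent ASR audit: compute word error rate (WER)
# for each accent group. All sample data below is hypothetical.

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via Levenshtein edit distance over word tokens."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# Hypothetical audit samples: (accent_group, reference, ASR output)
samples = [
    ("nigerian_english", "please submit your answer now", "please submit your answer now"),
    ("nigerian_english", "open the next question", "open the text question"),
    ("us_english", "please submit your answer now", "please submit your answer now"),
]

by_accent: dict[str, list[float]] = {}
for accent, ref, hyp in samples:
    by_accent.setdefault(accent, []).append(wer(ref, hyp))

for accent, rates in by_accent.items():
    print(f"{accent}: mean WER = {sum(rates) / len(rates):.2f}")
```

A systematic gap in mean WER between accent groups on matched material is exactly the kind of signal an audit should surface before a tool is deployed.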
Similarly, the systematic review “A Systematic Review on AI-based Proctoring Systems: Past, Present and Future” highlights that many AI-proctoring tools have limited peer-reviewed evidence on how they perform under conditions common in Africa (e.g. lower quality internet, interruptions, diverse accents).
On the infrastructure side, reports from Nigeria’s exam body and media show that in the 2025 Unified Tertiary Matriculation Examination (UTME), many candidates encountered blank questions, login failures, and power cuts, which prevented them from completing exams properly. In some cases, a “technical glitch” caused by power failures or system issues led to markedly poor pass rates.
In South Africa, too, load-shedding has disrupted online exams. Students have had exams cut short, or have been forced to forfeit attempts, because power went off mid-session. These disruptions are not rare—they are systemic, especially in regions where the electric grid is under strain or backup infrastructure is weak.
Together, these findings suggest that unless AI proctoring tools are audited specifically for accent bias and resilience to power/internet disruption, they may reinforce existing educational inequities.
How These Problems Manifest in Everyday Exam Situations
To understand fully how accent bias and power outage issues combine to disadvantage students, consider some of the more concrete scenarios:
- Misflagging speech or behaviour because of accent: An AI tool might misinterpret pauses, pronunciation, or tonal inflexions typical of West African English (or Nigerian English) as suspicious behaviour. For example, it might “flag” background noise or accents as a disturbance, or misrecognise speech, penalising students wrongly.
- Sudden power cut during an exam: A student might lose electricity in the middle of an exam, causing software to freeze or disconnect from the internet. Depending on the exam rules, this could lead to the exam being “forfeited” or the student losing time which cannot be recovered. Even if an extension is offered, emotional stress and technical frustration might reduce performance.
- Unreliable internet during proctoring: Even where there is power, internet disruptions can break video or audio transmission. AI proctoring tools might mistake the loss of video feed for cheating or attempt to reconnect in ways that waste time, or recordings may be corrupted.
- Psychological stress: The knowledge that accent or dialect might be misread, or that power might fail, adds a layer of anxiety. Some students might overprepare or avoid using online exams or AI-monitored formats entirely, giving up potential opportunities.
These issues are not hypothetical. There are reports—such as from UTME candidates in Nigeria and students in South Africa—confirming that power cuts and system glitches have led to confusion, cancellations or complaints.
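One concrete mitigation for the disconnection scenario above is for the proctoring client to treat a lost connection as a recoverable event rather than evidence of cheating: retry with exponential backoff and log the outage window for human review. The sketch below is a hypothetical illustration of that pattern, not any vendor's actual behaviour.

```python
import time

def reconnect_with_backoff(connect, max_attempts=5, base_delay=0.1):
    """Retry a connection with exponential backoff, recording the outage
    window so a human reviewer can see it was a network event, not cheating."""
    outage_started = time.monotonic()
    for attempt in range(max_attempts):
        try:
            connect()
            downtime = time.monotonic() - outage_started
            return {"reconnected": True, "attempts": attempt + 1,
                    "downtime_seconds": round(downtime, 3)}
        except ConnectionError:
            time.sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...
    return {"reconnected": False, "attempts": max_attempts,
            "downtime_seconds": round(time.monotonic() - outage_started, 3)}

# Simulated flaky link: fails twice, then succeeds.
state = {"calls": 0}
def flaky_connect():
    state["calls"] += 1
    if state["calls"] <= 2:
        raise ConnectionError("link down")

result = reconnect_with_backoff(flaky_connect)
print(result)  # reconnected after 3 attempts
```

The logged `downtime_seconds` value is what would let an exam authority credit lost time back to the candidate instead of flagging the gap as suspicious.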

What Can Be Done: Strategies for Fair Audits and Better Practice
If AI proctoring is going to be fair for African students, institutions, developers and regulators need to work together. Here are some key recommendations:
- Include African accents and dialects in training and testing: AI systems must be trained on speech datasets that reflect the range of accents common in Africa; projects like AfriSpeech-200 are good models. Testing should include real-world speech samples (with background noise, local pronunciations, etc.) to reveal bias.
- Conduct regular bias audits: Beyond initial testing, companies and educational institutions should commission independent, periodic audits of AI proctoring systems, focusing on how they handle speech from non-standard accents and on what happens when technical disruptions (such as power or internet outages) occur.
- Build robust infrastructure and fallback mechanisms: Institutions should provide backup power (generators, solar), offline exam modes, or human invigilation as a fallback when power fails or the network goes down. Exam rules should also allow rescheduling or making up time lost to disruptions outside the candidate’s control.
- Set transparent rules about flags and penalties: Students must know in advance what kinds of behaviours or events are flagged, how accent and dialect differences are handled, and what happens after a disconnection or power loss. There should be a clear appeals process for when the AI errs.
- Use hybrid oversight (AI plus human review): Automated flags should not lead immediately to negative outcomes. Human review should mediate, especially in ambiguous cases of speech detection or after a disconnection.
- Establish policy and regulation: Governments and regulatory bodies should set guidelines or standards for AI in exams that require fairness, inclusivity, and accommodation of infrastructural constraints; accreditation and examination authorities can enforce them.
- Invest in awareness and training: Educators, exam authorities, and students should all understand the limitations and risks of AI proctoring. Training can reduce unintended penalties: for example, showing students how to ensure good audio quality and a stable connection, and what to do during a power disruption, while teaching developers to design with these constraints in mind.
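To make the "regular bias audits" recommendation concrete, one simple periodic check is to compare how often the system flags students across accent groups and compute a disparity ratio. The audit log below is invented for illustration; a real audit would use the institution's actual flag records and far larger samples.

```python
# Hypothetical sketch of a periodic bias audit: compare the rate at which
# a proctoring system flags students across accent groups. A large
# disparity ratio suggests flags track accent rather than behaviour.

from collections import Counter

# (accent_group, was_flagged) -- illustrative audit log, not real data
audit_log = [
    ("nigerian_english", True), ("nigerian_english", True),
    ("nigerian_english", False), ("nigerian_english", True),
    ("us_english", False), ("us_english", True),
    ("us_english", False), ("us_english", False),
]

totals, flagged = Counter(), Counter()
for accent, was_flagged in audit_log:
    totals[accent] += 1
    if was_flagged:
        flagged[accent] += 1

rates = {accent: flagged[accent] / totals[accent] for accent in totals}
for accent, rate in rates.items():
    print(f"{accent}: flag rate = {rate:.2f}")

# Disparity ratio: highest group flag rate divided by lowest
disparity = max(rates.values()) / min(rates.values())
print(f"disparity ratio = {disparity:.1f}")
```

An institution could set a threshold (the exact value is a policy choice) above which human review of the flagging model is mandatory before the next exam cycle.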

Conclusion
AI proctoring tools carry great promise—scaled monitoring, reduced cheating, and convenience for remote or online exams. Yet, for many African students, accent bias and infrastructural instability (especially power cuts and poor internet) threaten the fairness of these tools. If institutions simply deploy proctoring software without auditing for such risks, they risk perpetuating inequality.
To ensure fair access to education, we need focused audits of AI proctoring tools specifically for African accents and for resilience to power outages and connectivity failures. Only then can we claim that “online exams” or “remote proctored tests” are accessible and just for everyone, not only those in regions with perfect infrastructure or “standard” speech patterns.