How Automated Test Proctoring Software Discriminates Against Disabled Students

Because of the COVID-19 pandemic, most students can no longer attend classes or take exams on campus. From K-12 classrooms to higher education, virtual learning has become the new norm – and so has virtual proctoring for millions of tests. Virtual proctoring software algorithmically profiles students for suspicious behavior, creating anxiety and fears about surveillance in the exam room. For disabled students, it can be much worse – merely being disabled may be enough for the software to flag a student as suspicious, which can exacerbate underlying anxiety and trauma.

Virtual proctoring can involve using webcams to record audio and video footage of students, both to monitor their body, head, or eye movements for potentially suspicious behavior and to use facial recognition to identify test-takers. It can also involve wearable biometric monitoring devices that track movement and body temperature, real-time video surveillance by strangers, and even keystroke logging and remote access to students’ computers.

Schools, colleges, and universities turn to virtual proctoring because they want to prevent cheating and increase the efficiency of exam proctoring. But disabled people have long confronted accusations of cheating, lying, or faking – especially about our own disabilities. We’ve also often faced significant barriers to accessing digital technologies and remain disproportionately affected by the digital divide. It’s no wonder that we are deeply concerned about increasing surveillance, invasions of privacy, and discriminatory discipline arising from expanded use of automated proctoring.

Because being disabled can affect how we move, what we look like, how we communicate, how we process information, and how we cope with anxiety, virtual proctoring software places us at higher risk for being flagged as suspicious – and for being outed as disabled. 

Real-time remote proctoring or video recording software with AI analysis poses specific risks to students’ privacy and mental health.

The presence of an invisible virtual proctor or a known remote proctor can trigger anxiety in nondisabled students, and exacerbate it for students with chronic anxiety or post-traumatic stress disorder (PTSD). Video recordings can reveal that a student has a personal care attendant or support person, whether a paid professional or family member, that a student is low-income and shares a small living space with other people, or that a student has a service animal or emotional support animal present. AI software can flag an additional person’s presence as evidence of cheating – and the student’s disability status could be disclosed to anyone with access to the video recording.

Additionally, one automated proctoring company recently had to suspend services after a security breach, raising further concerns about whether such companies can adequately protect students’ sensitive data, including video or audio recordings that show where a student lives, who lives with them, and whether they have a readily apparent disability.

Video recording software with AI analysis risks flagging disabled students as suspicious based on disability-specific movement, speech, and cognitive processing.

AI video analysis might flag students with attention deficit disorder (ADD) who get up and pace around the room. It could flag students with Tourette’s who have motor tics, students with cerebral palsy who have involuntary spasms, or autistic students who flap or rock. It could flag students with dyslexia who read questions out loud, or blind students using screen-reader software that speaks aloud. It could flag students with Crohn’s disease or irritable bowel syndrome who need to leave to use the bathroom frequently. It could flag blind or autistic students who have atypical eye movements. Because all of these movements and responses are naturally occurring characteristics of many types of disabilities, there is no way for algorithmic virtual proctoring software to accommodate disabled students. The point is to identify and flag atypical movement, behavior, or communication; disabled people are by definition going to move, behave, and communicate in atypical ways.
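To make that mechanism concrete, here is a deliberately simplified, hypothetical sketch of the kind of threshold rule such software relies on: score how far a test-taker deviates from a “normed” baseline of gaze, movement, speech, and presence, then flag anything atypical. The field names and cutoff numbers below are assumptions for illustration, not any vendor’s actual algorithm or code.

```python
# Hypothetical illustration only: not any vendor's actual code or thresholds.
# It shows why a rule that flags deviation from a "normed" baseline will
# inevitably flag naturally occurring disability-related behavior
# (tics, rocking, pacing, atypical eye movements, reading aloud, breaks).

from dataclasses import dataclass


@dataclass
class ExamSession:
    gaze_offscreen_ratio: float   # fraction of frames where gaze leaves the screen
    movement_variance: float      # variance of head/body position across frames
    speech_seconds: float         # seconds of detected speech during the exam
    absences: int                 # number of times the student left the frame


# Illustrative cutoffs, assumed to be "normed" on a majority, nondisabled population.
THRESHOLDS = {
    "gaze_offscreen_ratio": 0.15,
    "movement_variance": 0.30,
    "speech_seconds": 10.0,
    "absences": 0,
}


def suspicion_flags(session: ExamSession) -> list[str]:
    """Return the behaviors that exceed the normed thresholds."""
    flags = []
    if session.gaze_offscreen_ratio > THRESHOLDS["gaze_offscreen_ratio"]:
        flags.append("atypical eye movement")      # e.g. blind or autistic students
    if session.movement_variance > THRESHOLDS["movement_variance"]:
        flags.append("excessive movement")         # e.g. tics, spasms, rocking, pacing
    if session.speech_seconds > THRESHOLDS["speech_seconds"]:
        flags.append("talking during the exam")    # e.g. reading aloud, screen readers
    if session.absences > THRESHOLDS["absences"]:
        flags.append("left the testing area")      # e.g. bathroom breaks
    return flags


# A student with motor tics who reads questions aloud and takes a bathroom break:
student = ExamSession(gaze_offscreen_ratio=0.40, movement_variance=0.80,
                      speech_seconds=120.0, absences=1)
print(suspicion_flags(student))
# ['atypical eye movement', 'excessive movement', 'talking during the exam',
#  'left the testing area']
```

Accommodating disabled students in a system like this would mean changing the thresholds themselves, which is exactly what software built to flag deviation from the norm cannot do without abandoning its premise.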

Keystroke logging and remote desktop access software with AI analysis also risks flagging disabled students as suspicious based on use of necessary assistive technology.

Keystroke logging and remote desktop access software may not understand how to interpret movements or input from students who use speech-to-text dictation software, touch typing mode, sip-and-puff input, or other assistive technologies. The software may then flag such usage as suspicious because it does not meet the normed standards for mouse and keyboard movement.
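As a hedged illustration of that failure mode (again, a sketch under assumed names and thresholds, not any vendor’s real implementation): dictation and sip-and-puff input tend to produce whole answers with few or no individual keystrokes, so a check built around “normed” typing patterns reads the input as impossible and flags it.

```python
# Hypothetical sketch of keystroke-pattern checking, for illustration only.
# Speech-to-text dictation or sip-and-puff input often inserts whole phrases
# with few or no individual keydown events, so a "normed" keystroke model
# treats the answer text as pasted or non-human and flags the student.

def keystroke_flags(chars_entered: int, keydown_events: int,
                    mean_interkey_ms: float) -> list[str]:
    flags = []
    # Assumption: a typical touch-typist produces roughly one keydown per character.
    if keydown_events < 0.5 * chars_entered:
        flags.append("text appeared without matching keystrokes")
    # Assumption: inter-key timing outside this range looks "non-human" to the model.
    if not 40.0 <= mean_interkey_ms <= 400.0:
        flags.append("typing rhythm outside the expected range")
    return flags


# A student answering by dictation: 600 characters of answer text, a dozen
# physical keystrokes for corrections, and long pauses between them.
print(keystroke_flags(chars_entered=600, keydown_events=12, mean_interkey_ms=2500.0))
# ['text appeared without matching keystrokes', 'typing rhythm outside the expected range']
```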

Facial recognition and similar technologies put privacy at risk when they do work, but they may fail to recognize some disabled people at all.

Facial recognition, facial detection, and similar automated technologies are unreliable for detecting and identifying Black people, East Asian people, South Asian people, Central American people, and trans and nonbinary people. How might they fare in properly identifying students with ectodermal dysplasia, Down syndrome, Crouzon syndrome, or other disabilities that create disfigurement or facial differences? If they fail to properly detect and identify students with particular disabilities, then those students could be prevented from even accessing tests that require facial recognition for identity confirmation. Further, even if facial recognition technologies eventually become able to accurately identify disabled students, their use in the school context can heighten fears about whether and to what extent schools could increase surveillance of their students in other ways.
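To picture the access barrier, here is a minimal hypothetical sketch (not a real verification API): identity confirmation typically comes down to a face-match score compared against a fixed threshold, so a student the model matches poorly may simply never be admitted to the exam, no matter how many times they retry.

```python
# Hypothetical identity-gate sketch, not a real facial recognition API.
# If the face-matching model never scores a student above the admission
# threshold, the student is locked out before the exam even begins.

MATCH_THRESHOLD = 0.80  # assumed cutoff for "identity confirmed"


def can_start_exam(match_score: float) -> bool:
    """Admit the student only if the face-match score clears the threshold."""
    return match_score >= MATCH_THRESHOLD


# A model that performs poorly on a student's facial difference might top out
# well below the threshold on every attempt.
for attempt, score in enumerate([0.41, 0.37, 0.44], start=1):
    print(f"attempt {attempt}: admitted = {can_start_exam(score)}")
# attempt 1: admitted = False
# attempt 2: admitted = False
# attempt 3: admitted = False
```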

One of my teaching colleagues shared that many of her students in other classes – especially science, technology, engineering, and math classes – have been subjected to Proctorio, one of the largest vendors of virtual proctoring software. One student was kicked out of her exam because her eyes moved atypically and had to beg to be allowed back in. Another student read questions out loud to herself and was terrified she’d be flagged for cheating. Another student, whose exam was meant to be open-note, had no idea what Proctorio would consider cheating since the software flags as suspicious the presence of any other material on a student’s desk. Another student had to ask their roommate to avoid entering the room for three hours just to be able to take the test, and yet another student had no private room in which to take the test at all.

Some students are organizing to ban facial recognition and automated proctoring on college campuses, both because of the high likelihood of discrimination and because of fears about increased surveillance and mandatory data collection. But other students, especially much younger ones, may not be able to organize as effectively – and are learning to treat this kind of proctoring as the norm.

As use of virtual proctoring technologies expands during the pandemic, disabled students who belong to multiple marginalized communities will remain at heightened risk of harm from technology-driven, punitive surveillance. Automated proctoring presumes that there is one normal kind of body and one normal kind of learning. It assumes that all students have the ability to demonstrate their knowledge through conventional assessments. It also assumes that all students have equal access to safe, private, clean, and quiet testing environments. Virtual proctoring could only be fair if it accounted for the many ways that students move, learn, process information, and demonstrate knowledge. It would be extraordinarily difficult to design and implement a virtual proctoring program that actually supports multiple ways of learning and knowing, and so schools should not use these programs. Disabled students deserve access to learning and teaching environments that use multiple modes of instruction and assessment, and that focus not on surveilling and punishing, but on learning and teaching.