Image: Getty Images / Composition: Jason Koebler

Students Are Rebelling Against Eye-Tracking Exam Surveillance Tools

Invasive test-taking software has become mandatory in many places, and some companies are retaliating against those who speak out.
Janus Rose
New York, US

As a privacy-minded computer science student preparing to start his first year at Miami University, Erik Johnson was concerned this fall when he learned that two of his professors would require him to use the digital proctoring software Proctorio for their classes. The software turns students’ computers into powerful invigilators—webcams monitor eye and head movements, microphones record noise in the room, and algorithms log how often a test taker moves their mouse, scrolls up and down on a page, and pushes keys. The software flags any behavior its algorithm deems suspicious for later viewing by the class instructor.


In the end, Johnson never had to use Proctorio. Not long after he began airing his concerns on Twitter and posted a simple analysis of the software’s code on Pastebin, he discovered that his IP address was banned from accessing the company’s services. He also received a direct message from Proctorio’s CEO, Mike Olsen, who demanded that he take the Pastebin posts down, according to a copy of the message Johnson shared with Motherboard. Johnson refused to do so, and is now waiting to see if Proctorio will follow up with more concrete legal action, as it has done to other critics in recent weeks.

“If my professors weren’t flexible, I’d be completely unable to take exams,” Johnson said. “It’s insane to think that a company [or] CEO can affect my academic career just for raising concerns.”

His case is just one example of how college campuses are revolting against the use of digital proctoring software, and the aggressive tactics employed by proctoring companies in response to those efforts. In recent weeks, students have started online petitions calling for universities across the world to abandon the tools, and faculty on some campuses, like the University of California Santa Barbara, have led similar campaigns, arguing that universities should explore new forms of assessment rather than subjecting students to surveillance.

“We need to really think long and hard about how we are adapting,” Jennifer Holt, a film and media studies professor at UCSB, told Motherboard. “We’re supposed to be protecting our students.”


Surveillance at Home

Algorithmic proctoring software has been around for several years, but its use exploded as the COVID-19 pandemic forced schools to quickly transition to remote learning. Proctoring companies cite studies estimating that between 50 and 70 percent of college students will attempt some form of cheating, and warn that cheating will be rampant if students are left unmonitored in their own homes.

Like many other tech companies, they also balk at the suggestion that they are responsible for how their software is used. While their algorithms flag behavior that the designers have deemed suspicious, these companies argue that the ultimate determination of whether cheating occurred rests in the hands of the class instructor. The companies consider the algorithms proprietary, and Proctorio, in particular, has reacted swiftly to prevent anyone with access to its training material or underlying code from disclosing their analyses.

“Any plan that calls for schools to just ‘stop using’ proctoring will make cheating more common than it already is, escalating a severe threat to all higher education,” Scott MacFarland, the CEO of ProctorU, another proctoring vendor, wrote in an email to Motherboard. Comparing his product’s deterrent effect to that of more ubiquitous surveillance technology, he added, “we may not love the idea of being on camera every time we visit a bank or go to a convenience store, but no one is suggesting taking them down.”


There is little peer-reviewed evidence about how digital proctoring affects students’ honesty and test-taking ability, and the little research that has delved into the issue doesn’t offer a clear consensus. A 2018 study tracking 2,686 students across 29 courses found that those whose exams weren’t monitored using Proctorio received grades 2.2 percent lower than those whose exams were. The authors concluded that the gap was likely a result of cheating by students not using Proctorio. But a 2019 study involving 631 students found that test takers who felt higher levels of anxiety during exams performed worse, and that the cohort of students monitored by proctoring software felt more anxiety than those who weren’t.


A slide from Proctorio's training materials, detailing how the system measures "suspicion levels" while students take exams.

Students’ and educators’ objections to exam proctoring software go beyond the privacy concerns around being watched and listened to in their bedrooms while they take a test. As more evidence emerges about how the programs work, and fail to work, critics say the tools are bound to hurt low-income students, students with disabilities, students with children or other dependents, and other groups who already face barriers in higher education.

Every day for the last week, Ahmed Alamri has opened ExamSoft and attempted to register for the practice version of the California state bar exam. Every time, the software’s facial recognition system has told him the lighting is too poor to recognize his face. Alamri, who is Arab-American, has attempted to pass the identity check in different rooms, in front of different backgrounds, and with various lighting setups. He estimates he’s attempted to verify his identity as many as 75 times, with no success. “It just seems to me that this mock exam is reading the poor lighting as my skin color,” he told Motherboard.


Alamri isn't alone. Law students around the country are organizing to fight against the use of any kind of digital proctoring software like ExamSoft on bar exams. In California, two students have filed an emergency petition with the state supreme court requesting that it cancel the exam entirely and institute a new form of assessment. A similar effort is underway in Illinois, while Louisiana, Oregon, and Wisconsin have already scrapped their upcoming bar exams as a result of student pressure. Other states, including New York, are fumbling for solutions as deadlines for the exams quickly approach; at one point, New York's test proctor announced it was going to ban the use of "desktop computers" to take the test.

In their petition, the students say the use of ExamSoft discriminates against people of color because facial recognition technology has been shown on numerous occasions to be worse at recognizing people with darker skin tones, and particularly women of color. The California bar exam would require test takers to verify their identity with facial recognition checks eight separate times, according to the petition, and a single failure would end the test.

The petitioners also conducted a survey of 1,413 law students who were preparing to take the bar exam. “78.8 percent of African-American/Black respondents, 91.7 percent of Alaskan Indian or Native American respondents, 71.5 percent of Asian/Pacific Islander respondents, 81.4 percent of Southwest Asian North African respondents, and 75.9 percent of Latinx, Latino/a, Hispanic respondents” reported that they either would not have reliable internet during the exam, or were unsure whether their connection would be reliable.


“They aren’t taking into consideration people from underprivileged communities,” Alamri said. “This sort of online exam is really measuring a person’s generational wealth and not their knowledge of the law.”


Another major point of contention between proctoring companies and university communities has been the algorithmic techniques the software uses to detect potential cheating.

In training documents Proctorio provides to universities, the company explains that its software determines whether a test-taker’s “suspicion level” at any given moment is low, moderate, or high by detecting “abnormality” in their behavior. If a student looks away from the screen more than their peers taking the same exam, they are flagged for an abnormality. If they look away less often, they are flagged for an abnormality. The same goes for how many keystrokes a student makes while answering a question, how many times they click, and a variety of other metrics. Variation outside the standard deviation results in a flag.
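The flagging scheme described in the training documents amounts to a simple statistical outlier test. The sketch below illustrates the general idea in Python; the metric names, the one-standard-deviation threshold, and all of the numbers are illustrative assumptions, not Proctorio’s actual code.

```python
# Hypothetical illustration of standard-deviation-based "abnormality"
# flagging, as described in Proctorio's training materials. All names,
# thresholds, and values here are assumptions for illustration only.
from statistics import mean, stdev


def flag_abnormal(metric_by_student: dict) -> dict:
    """Flag any student whose metric falls outside one standard
    deviation of the class mean -- in either direction."""
    values = list(metric_by_student.values())
    mu, sigma = mean(values), stdev(values)
    return {
        student: abs(value - mu) > sigma
        for student, value in metric_by_student.items()
    }


# Example metric: how many times each student looked away from the screen.
looks_away = {"ana": 12, "ben": 11, "cal": 13, "dee": 30, "eve": 2}
flags = flag_abnormal(looks_away)
# Both the student who looked away far more often than the class (dee)
# and the one who barely looked away at all (eve) are flagged.
```

Note how the test cuts both ways: deviating from the class average in either direction produces a flag, which is exactly the property critics say penalizes students whose test-taking behavior differs from the norm for any reason.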

That methodology is likely to lead to unequal scrutiny of people with physical and cognitive disabilities or conditions like anxiety or ADHD, Shea Swauger, a research librarian at the University of Colorado Denver’s Auraria Library who studies educational technology, told Motherboard. “These coders are defining, mathematically, the ideal student body: how often it does, or doesn’t do, these certain attributes, and anything outside of that ideal is treated with suspicion,” he said.


A slide from Proctorio's training materials detailing how the software detects "abnormalities" by analyzing keystroke patterns.

Proctorio and other proctoring companies strongly disagree with that assessment. “The biggest thing is that we’re not making any sort of academic decisions, we’re just providing a quicker way [for teachers] to review places in the exam based on the things they’re looking for,” Olsen, Proctorio’s CEO, told Motherboard in an interview. Teachers can choose which types of behaviors to monitor, and it’s up to them to decide whether an abnormality constitutes cheating, he added.

Students from multiple schools across the US told Motherboard that while teachers ultimately choose whether and how to use exam-monitoring software like Proctorio, they often do so with no guidance or restrictions from the school's administration.

"Each academic department has almost complete agency to design their curriculum as far as I know, and each professor has the freedom to design their own exams and use whatever monitoring they see fit," Rohan Singh, a computer engineering student at Michigan State University, told Motherboard. 

Singh said that students at the school objected after professors began using an exam-monitoring software called Respondus without proper notice at the end of the spring semester, when many universities began converting to online learning. He added that while it's ultimately up to the instructor how the software is used, it generally helps teachers who are predisposed toward doling out Academic Dishonesty Reports, or ADRs. "As a rule of thumb, the professors who choose to use Respondus are the professors more inclined to use their discretion to hand out ADRs," he said.


Nearly a dozen other students told Motherboard that they or their peers had objected to professors' use of exam-monitoring software at other state universities across the US.

In April, Swauger, who is organizing an effort to convince the University of Colorado system to drop Proctorio, published a peer-reviewed article critical of algorithmic proctoring in the journal Hybrid Pedagogy. In response, Proctorio sent a letter to the journal demanding a retraction. The journal’s editors declined. 

The company’s response to Ian Linkletter, a learning technology specialist at the University of British Columbia, in Vancouver, was even sharper. After Linkletter began sharing Proctorio training videos and documents that explained the company’s abnormality methodology on Twitter, the videos were removed from YouTube, and Proctorio filed for a court injunction to prevent Linkletter from sharing its training material. Linkletter declined to comment for this article due to the pending legal case.

Olsen said Proctorio welcomes public critiques of its service, but takes action when critics share records the company hasn’t made public.

Other proctoring companies have also been litigious when faced with criticism. 

In March, after students approached faculty members at the University of California Santa Barbara, the faculty association sent a letter to the school’s administration raising concerns about whether ProctorU would share student data with third parties. The faculty asked UCSB to terminate its contract with the company and discourage professors from using similar services.

In response, a ProctorU attorney threatened to sue the faculty association for defamation and violating copyright law (because the association had used the company’s name and linked to its website). He also accused the faculty association of “directly impacting efforts to mitigate civil disruption across the United States” by interfering with education during a national emergency, and said he was sending his complaint to the state’s Attorney General.

Although ProctorU never filed a lawsuit against the UCSB faculty association, the threat had a chilling effect on professors’ willingness to discuss the software.

Holt, one of the faculty members who first raised questions about proctoring software, declined to talk to Motherboard specifically about the ordeal or ProctorU. But in general, she remains worried about the spread of proctoring tools on campuses.

“We must do better than subjecting our students to surveillance and violations of their privacy,” she said. “We must do better than allowing algorithmic policing through biometric surveillance as the new normal for education.”