Scientists Asked Students to Try to Fool Anti-Cheating Software. They Did.

During the pandemic, Proctorio sold schools invasive software that it claims stops cheating. A new study shows otherwise.

Proctorio, popular anti-cheating software used by schools around the world, failed to detect any of the cheaters in a controlled testing environment, a study found, confirming previous Motherboard reporting that cheating is fairly easy despite the company’s claims of preventing it.

The researchers, from the University of Twente in the Netherlands, concluded the software is “best compared to taking a placebo: it has some positive influence, not because it works but because people believe that it works, or that it might work.”

Proctorio requires students to install a browser extension or an app on their computers, which tracks their eye movements and body language during exams and automatically flags “suspicious” behavior, which can be, but is not always, reviewed by a human. The study, first presented at the 13th International Conference on Computer Supported Education in April 2021, sought to find out whether the software works. The researchers recruited 30 student volunteers from the university’s computer science program and told six of them to cheat on a first-year exam supervised by Proctorio. Five others were told not to cheat but to act suspiciously, and the rest were told to take the test honestly. The researchers left it up to the students’ creativity to decide how best to fool the system.

The results confirmed that Proctorio is not good at catching cheaters: the system did not flag any of the six cheaters as cheating. Some used virtual machines, a known weakness in Proctorio’s system. The study says the software did flag those students for an “irregularity,” but it flagged honest students with the same irregularity too. Similarly, some cheaters used audio calls, yet Proctorio did not flag their audio as abnormal, while it did flag the audio of honest students taking the test in noisy environments.

An independent human review of the data and footage caught only one of the six cheaters, largely because the reviewers couldn’t see what the students were doing from the chest down, due to the angle of the webcam, and because footage from the optional “room scan” feature, which makes students film the room they’re testing in, is often blurry.

David Lux, a spokesperson for Proctorio, referred Motherboard to three other studies on “the efficacy of online proctoring” that he said were “multi-disciplinary and with a more robust sample size,” addressing shortcomings the University of Twente authors acknowledged in their own study. However, none of the three studies were controlled experiments. Instead, they used surveys or regression models to estimate cheating rates and student attitudes toward online proctoring. They were not only different study designs but were also interested in a fundamentally different question: none of them sought to find out whether online proctoring software is effective at catching cheaters.

That said, all of the studies more or less agree on the central point of the University of Twente study: the software likely deters the kind of effortless cheating that occurs when there is no semblance of a proctor at all, such as students taking the test in the same room and consulting each other in real time. Whether schools need to pay thousands of dollars per test and jeopardize their students’ privacy to get that deterrence, or could get it by, say, having one grad student sit in on the exam while everyone keeps their camera on, is perhaps a subject for another study.