Facing Tomorrow's High-Tech School Surveillance

Installed in the wake of recent high-profile mass shootings, controversial facial recognition systems that scan students’ faces could be the not-too-distant future for schools across America and beyond.

This story appears in VICE Magazine's Power and Privilege Issue.

This fall, as students file into Lockport City schools in upstate New York, they will be watched not just by teachers. For the first time in the district’s history, students will also be monitored by a sophisticated new surveillance system that scans their faces, looking for matches to the school’s security database.

It might sound like dystopian science fiction, but this could be the not-too-distant future for schools across America and beyond. Researchers at the University of California, San Diego, for instance, have already begun publishing models for how to use facial recognition and machine learning to predict student engagement. A Seattle company recently offered up an open-source facial recognition system for use in schools, while startups are already selling “engagement detectors” to online learning courses in France and China. Advocates for these systems believe the technology will make for smarter students, better teachers, and safer schools. But not everyone is convinced this kind of surveillance apparatus belongs in the classroom, that these applications even work, or that they won’t unfairly target minority faces.

Lockport’s facial recognition program has become both a local controversy and a national test case at the forefront of a wave of similar systems rolling out in American schools. To install its system, the Lockport school district was awarded $4 million through the Smart Schools Bond Act, a New York State fund. While most other schools in the state applied for funding to update computer labs or digitize books, Lockport requested specific funds for “new cameras and wiring…to provide viewing and automated facial and object recognition of live and recorded surveillance video,” plus “additional surveillance servers…to provide enhanced storage of recorded video and processing,” according to the grant application.

Earlier this year, the school district announced it would be using tech developed by SN Technologies Corp., the Canadian company behind Aegis, a surveillance platform that comes with both facial recognition software and a tool designed to flag guns that might appear on the camera footage (provided the firearm is in someone’s hand, not in a bag). In the wake of high-profile mass school shootings across the US, Lockport, a small, conservative town of around 20,000 people, has invested in Aegis out of a belief the facial recognition system will help safeguard students, even though there’s no evidence that such a system would be an effective security measure in an active shooter scenario. As this issue went to press, KC Flynn, the head of SN Technologies, told me that 20 other US school districts were considering moving forward with Aegis.

“They want to see the product up and running in Lockport” first, Flynn said.

**

Here’s what SN Technologies’ vision for Aegis looks like: A school using the platform installs a set of high-quality cameras, good enough to detect individual student faces, before determining exactly which faces it thinks should set off the system. Crucially, it’s up to each school to input those faces, which it might source from local police and mug-shot databases, or from school images of former students it doesn’t want on its premises. With those faces loaded, the Aegis system goes to work, scanning each face it sees and comparing it with the school’s database. If no match is found, the system throws that face away. If one is, Aegis sends an alert to the control room.
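
As a rough illustration of that pipeline, here is a minimal sketch in Python. It is not SN Technologies’ code: the open-source face_recognition library, the match threshold, and every name below are assumptions made purely for illustration.

```python
# Minimal sketch of a watchlist-matching loop like the one described above.
# Not SN Technologies' code: the face_recognition library, the 0.6 distance
# threshold, and all names here are illustrative assumptions.
import face_recognition

# Faces the school has chosen to register (expelled students, local watchlists, etc.)
watchlist_encodings = []   # precomputed face encodings
watchlist_names = []       # a label for each encoding

def scan_frame(frame, threshold=0.6):
    """Compare every face found in a camera frame against the watchlist.

    Faces with no match are simply discarded; a match produces an alert
    for a human operator in the control room to confirm.
    """
    alerts = []
    for encoding in face_recognition.face_encodings(frame):
        distances = face_recognition.face_distance(watchlist_encodings, encoding)
        for name, distance in zip(watchlist_names, distances):
            if distance < threshold:
                alerts.append(name)   # forwarded to the control room
    return alerts
```

As in the description above, the decision that matters most sits outside the code: which faces get loaded into the watchlist in the first place.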

The idea is that the school could get an extra few seconds of warning when an unwanted person arrives on campus, whether that’s an expelled student or an escaped felon. But critics of the system point out that the vast majority of school shooters are enrolled students—individuals who probably wouldn't be in the facial database.

Illustration by Lia Kantrowitz

To implement its system, Lockport is installing or updating a total of 417 cameras, according to an excerpt of a contract outlining the district’s planned rollout of the surveillance tech (with SN Technologies acting as a subcontractor). The excerpt was obtained by Jim Shultz, a parent in Lockport, and shared with VICE. The network will cover six elementary schools, one middle school, one high school, and an administrative building.

“It is cutting edge,” Dr. Robert LiPuma, the director of technology for the district, told the Lockport Union-Sun & Journal in March. (The district did not respond to repeated requests for comment for this story.) When it comes to school security, LiPuma said Lockport hopes to be “a model.”

The New York branch of the American Civil Liberties Union (NYCLU) is worried about just that. It’s well known that facial recognition systems are often biased in ways that disproportionately affect people of color. It’s also unclear what biometric database SN Technologies uses to train the Aegis system to detect faces. Previous cases have shown how big an impact training data has on the accuracy of these systems—and using certain databases creates a system that can, for example, incorrectly peg 28 members of Congress (a disproportionate number of them people of color) as criminals on the basis of images from a mug-shot database, as demonstrated in a recent ACLU test that used Amazon’s facial recognition tool. Flynn declined to comment on how Aegis was developed, citing the proprietary nature of the software.

In theory, the safeguard against a student of color being misidentified as a felon, for example, is that whoever is in the control room must confirm that a match is indeed correct and not a false positive. That may not be so simple, especially if the security worker is white. And what happens once the system triggers an alert is up to each school to decide.

Hundreds of documents related to Lockport’s new surveillance program, obtained by the NYCLU in late August through a Freedom of Information Law request, suggest that Lockport did not engage with the community before deciding to move ahead with installing the surveillance network, and that a security consultant who taught Lockport’s board about the tech, and was later hired by the district, holds licensing for Aegis through a separate company, CSI. The NYCLU found nothing in the documents outlining policies for accessing data collected by the cameras, or what faces would be fed to the system in the first place. And based on emails acquired through the same FOIL request, the NYCLU noted, Lockport administrators appeared to have a poor grasp of how to manage access to internal servers, student files, and passwords for programs and email accounts.

“The serious lack of familiarity with cybersecurity displayed in the email correspondence we received and complete absence of common sense redactions of sensitive private information speaks volumes about the district’s lack of preparation to safely store and collect biometric data on the students, parents and teachers who pass through its schools every day,” an editor’s note to the NYCLU’s statement on the Lockport documents reads.

The Aegis website offers little information about how the system actually works, either. It describes the facial recognition tool as something that “will be used to alert school officials if anyone from the local Sex Offenders Registry enters a school or if any suspended students, fired employees, known gang members or an affiliate enters a school.” As to where such a database of “known gang members or an affiliate” would come from, Flynn said Aegis doesn’t come with preloaded faces, so it’s on the individual school to provide the system whatever biometrics it thinks should be registered. Individual schools also get to select the duration of data storage, though in most cases, Flynn said, the system won’t be saving individual faces as it scans students moving about the school. Rather, it will attempt to match each face against those registered in the system and discard it if no match is found.

Of course, if a school wanted to put every student’s face in the system to track throughout the school year, theoretically, it could. “That hasn’t been my experience,” Flynn noted, when I raised that possibility. “That’s not how we package the system.”

Meanwhile, Jim Shultz, whose daughter currently attends Lockport High School, has been trying to organize parents to rally against the system. He sees it as not only an invasion of privacy, but a waste of money for a district that comprises around 4,500 students. Of the original $4 million Smart Schools grant, Lockport has spent over $3 million to date, putting its per-pupil spending on the tech at over $550. When Shultz tried to voice his concerns to the school administration and a security consultant working with the district, he told me, the board seemed not to take him seriously.

In Lockport, school security officers will be responsible for watching the cameras in a surveillance room, according to Flynn. At any other school, it’s still anyone’s guess who will have access to the surveillance system. This, in turn, leads the NYCLU to wonder whether undocumented students and their parents risk being flagged and turned over to US Immigration and Customs Enforcement for deportation. To complicate matters further, schools can each establish their own protocols and decide themselves who can access the information. Without knowing how long this data is stored for, and by whom, it’s hard to evaluate the potential security risks. It’s also currently unclear if students, for their part, will be allowed to opt out of facial scanning. In the US, biometric data from students of any age falls under the Family Educational Rights and Privacy Act (FERPA), a law meant to protect the privacy of student education records. But if the surveillance system is controlled by law enforcement, and not the school, then FERPA doesn’t apply.

Beyond the possibility that Aegis and similar surveillance platforms might harm some people more than others, critics say that deploying technologies in schools to begin with sets a clear tone. “They’re sending this message to kids that they’re unpredictable potential criminals,” Stephanie Coyle, the NYCLU education counsel, told me. Coyle co-wrote a letter to the New York State Education Department objecting to the use of this technology in schools. “Students should think of schools as a welcoming place to learn,” Coyle added. “They shouldn’t have to worry that their every single move is being monitored, and their pictures are going to wind up in some kind of law enforcement or immigrant database just because they decided to come to school today.”

Then again, what if teachers could use facial recognition systems to better understand students and likewise improve their lectures? What if students could leverage machine learning to help them pay attention better? Developers are teasing out such ideas in labs and startups across the globe, hoping to build, essentially, emotional surveillance systems.

Think of them as engagement detectors. One such system, Nestor, which is now mainly applied to at-home video lectures, operates in a particularly tricky fashion. At the end of the lecture, the system asks students questions only from segments where the software registered them as not being attentive. “It is impossible to pass the exam if you’re not 200 percent focused,” Marcel Saucet, the CEO of LCA Group, the company behind Nestor, told me.
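
To make that behavior concrete, here is a small, hypothetical sketch of the quiz logic in Python. The data structures, threshold, and names are invented for illustration; they are not LCA Group’s implementation, and the attention scores are treated as an input rather than computed from video.

```python
# Hypothetical sketch of the quiz behavior described above: questions are drawn
# only from lecture segments where the attention model scored the student as
# inattentive. None of these names or numbers come from Nestor itself.
from dataclasses import dataclass

@dataclass
class Segment:
    start: float      # seconds into the lecture
    end: float
    question: str     # a question covering this segment's material

def build_quiz(segments, attention_scores, threshold=0.5):
    """Return questions only for segments with low average estimated attention.

    attention_scores is a list of (timestamp, score) pairs produced by an
    engagement model, with scores between 0 (inattentive) and 1 (attentive).
    """
    quiz = []
    for seg in segments:
        scores = [s for t, s in attention_scores if seg.start <= t < seg.end]
        if scores and sum(scores) / len(scores) < threshold:
            quiz.append(seg.question)
    return quiz
```

In a real deployment the attention score would itself come from a facial analysis model; the gating of the quiz on that score is the design choice Saucet describes.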

Proponents like Saucet see the potential for these engagement detection systems to revolutionize teaching. “I really think we can change the world of education,” he said.

Students at a French business school who participated in the main public case studies involving Nestor weren’t so convinced. They couldn’t opt out of the trial, and to view lectures they had to agree to turn their webcams on, something Saucet reported the students didn’t always like: “They don’t want to be spied on,” he said. “They can be scared,” he added, “but it’s not going to change anything.” The way he sees it, there’s no stopping facial recognition tech from creeping into our lives. “Everybody is doing this,” he continued. “It’s really early and shocking, but we cannot go against natural laws of evolution.”

**

In fact, there is nothing natural or predetermined about the active decisions made by technologists, each with their own innate prejudices, in developing systems that monitor students. There’s evidence to suggest that facial recognition tech struggles to recognize non-white faces, and research also shows teachers are more likely to perceive the faces of black students as angry than the faces of white students.

“The mechanisms that perpetuate such systemic inequalities do not magically disappear when a new technological system like facial recognition is introduced into classrooms and therapeutic environments,” said Meryl Alper, an assistant professor of communication studies at Northeastern University. “History shows that they actually tend to amplify them.”

When I asked Suresh Venkatasubramanian, a University of Utah computer science professor who studies algorithmic bias, about impartial emotional surveillance, he laughed. “I could answer your question for you, but I’m not even sure it’s the most important question. I think many other questions would need to be answered first,” he replied, rattling off a few: issues of control and privacy, of who owns the data and how it’s stored, whether the training software is actually meant for this application, and if engagement is even a meaningful thing to study. “As you’re describing this to me,” Venkatasubramanian said, “a whole bunch of red flags are popping up and not one of them is about the machine learning. The machine bias is probably the least fraught question of all these.”

Illustration by Lia Kantrowitz

The idea that researchers can, and should, quantify something as slippery as “engagement” is a red flag for many of the experts I talked to. As Alper put it, “anyone who has spent time in any kind of classroom will know that attention isn’t something well-measured by the face. The body as a whole provides many more cues.”

That’s especially true for those who might not perform engagement the way an algorithm has been trained to recognize: students who don’t make eye contact with a presenter, who rock or self-stimulate, or who are working parents trying to listen to a lesson at home while feeding kids.

Saucet said that these individual differences wouldn’t be a problem over time, as Nestor would learn how to recognize “engagement” on students’ faces. But during the training period for any such system, students who don’t perform engagement the same way as others could potentially be penalized. And in order to learn like this, Nestor stores student data for a long time. For the moment, at least, the company’s own data collection dates back about two and a half years, according to LCA Group’s chief technology officer, Nicolas Delhaume. “If the goal is to track and help the student,” Delhaume said, “mostly we need to keep history during the full course of the student.” (Nestor stores data on teachers for at least two years.)

Other attempts at building these systems illustrate how difficult a thing “engagement” is to quantify. “Engagement is an abstract, somewhat hard to define state,” said Jacob Whitehill, a computer science professor at Worcester Polytechnic Institute in Massachusetts. In 2014, Whitehill copublished a paper documenting the automated detection of student engagement, in which participating instructors specifically weren’t asked to rate how engaged someone actually was. Instead, they were asked to rate “how engaged does the subject appear to be.” (“It gets around trying to read the person’s mind,” Whitehill said.) Whitehill and his coauthors ultimately found no correlation between perceived engagement and actual learning.

Venkatasubramanian, for one, thinks we’re “in a hurry to throw tech at everything,” though he’s quick to say machine learning certainly could help in classrooms. What if students controlled the system, say, or owned all their data and decided what to do with it? “We could imagine so much more,” he said.

For now, in places like Lockport, gun violence has prompted a different reaction than in Parkland, Florida, where student-activists have led rallies demanding tighter gun control laws and regulations. Instead of pushing for similar measures, Lockport is turning to technology. “Some people have taken that fear and used it as a device to generate public empathy and support for doing something about guns,” Shultz said, “but those same fears can be used to take schools in the direction of other kinds of policies like this one.

“You want to look like you’re doing something,” Shultz continued. Even if that comes with its own set of problems.