
Facing Tomorrow’s High-Tech School Surveillance

Facing Tomorrow's High-Tech School Surveillance by Rose Eveleth (Motherboard)
Installed in the wake of recent high-profile mass shootings, controversial facial recognition systems that scan students’ faces could be the not-too-distant future for schools across America and beyond.
All annotations in context.

It might sound like dystopian science fiction, but this could be the not-too-distant future for schools across America and beyond. Researchers at the University of California, San Diego, for instance, have already begun publishing models for how to use facial recognition and machine learning to predict student engagement. A Seattle company recently offered up an open-source facial recognition system for use in schools, while startups are already selling “engagement detectors” to online learning courses in France and China. Advocates for these systems believe the technology will make for smarter students, better teachers, and safer schools. But not everyone is convinced this kind of surveillance apparatus belongs in the classroom, that these applications even work, or that they won’t unfairly target minority faces.

The system comprises a multitude of cameras constantly scanning faces and comparing them against databases.

A school using the platform installs a set of high-quality cameras, good enough to detect individual student faces, before determining exactly which biometrics it thinks must set off the system. Crucially, it’s up to each school to input these facial types, which it might source from local police and mug-shot databases, or school images of former students it doesn’t want on its premises. With those faces loaded, the Aegis system goes to work, scanning each face it sees and comparing it with the school’s database. If no match is found, the system throws that face away. If one is, Aegis sends an alert to the control room.
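
To make that compare-and-alert loop concrete, here is a minimal Python sketch of the matching step the article describes. Everything specific is an assumption for illustration: the embedding size, the cosine-similarity metric, the threshold, and the function names are not drawn from Aegis, whose actual implementation is not public.

```python
import numpy as np

# Hypothetical in-memory watchlist: identity -> 128-dim face embedding.
# A real deployment would derive these from enrollment photos or, as the
# article notes, mug-shot databases; the vectors here are synthetic.
rng = np.random.default_rng(0)
WATCHLIST = {
    "former_student_A": rng.standard_normal(128),
    "police_db_entry_42": rng.standard_normal(128),
}

MATCH_THRESHOLD = 0.6  # assumed similarity cutoff; real tuning is vendor-specific


def cosine_similarity(a, b):
    """Similarity between two embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_face(embedding):
    """Compare one detected face against every watchlist entry.

    Returns the best-matching identity above the threshold, or None
    (the "throw that face away" path the article describes).
    """
    best_name, best_score = None, MATCH_THRESHOLD
    for name, reference in WATCHLIST.items():
        score = cosine_similarity(embedding, reference)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name


def alert_control_room(name):
    """Stand-in for the alert Aegis sends to the control room."""
    print(f"ALERT: watchlist match for {name}")


# Simulated frame: each detected face already reduced to an embedding.
detected_faces = [rng.standard_normal(128), WATCHLIST["police_db_entry_42"]]
for face_embedding in detected_faces:
    match = check_face(face_embedding)
    if match is not None:
        alert_control_room(match)
    # Non-matching faces are simply discarded, per the article.
```

Note that everything in a pipeline like this hinges on the threshold: set it too low and innocent students trigger alerts, set it too high and the system misses the faces it was installed to catch, which is one reason the accuracy and bias questions raised above matter so much.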

The scanning, storage, and security of this dataset show a lack of preparation for handling biometric data.

The NYCLU found nothing in the documents outlining policies for accessing data collected by the cameras, or what faces would be fed to the system in the first place. And based on emails acquired through the same FOIL request, the NYCLU noted, Lockport administrators appeared to have a poor grasp on how to manage access to internal servers, student files, and passwords for programs and email accounts.

“The serious lack of familiarity with cybersecurity displayed in the email correspondence we received and complete absence of common sense redactions of sensitive private information speaks volumes about the district’s lack of preparation to safely store and collect biometric data on the students, parents and teachers who pass through its schools every day,” an editor’s note to the NYCLU’s statement on the Lockport documents reads.

The article goes on to discuss the challenges algorithms face in identifying “engagement” in these scans of student faces.

The idea that researchers can, and should, quantify something as slippery as “engagement” is a red flag for many of the experts I talked to. As Alper put it, “anyone who has spent time in any kind of classroom will know that attention isn’t something well-measured by the face. The body as a whole provides many more cues.”

One would think that we could just as well train, trust, and pay people to sit in the classroom and monitor students.

“Students should think of schools as a welcoming place to learn,” Coyle added. “They shouldn’t have to worry that their every single move is being monitored, and their pictures are going to wind up in some kind of law enforcement or immigrant database just because they decided to come to school today.”
