In the classic sci-fi movie “Minority Report,” Tom Cruise plays a cop whose “Precrime” unit uses visions of the future to arrest murderers before they kill. The movie raised tough questions about privacy, due process, and how predicting criminal behavior can destroy innocent lives.
But what once seemed like an action fantasy is now creeping into American classrooms.
Today, public schools across the country are adopting artificial intelligence tools, including facial recognition cameras, vape detectors, and predictive analytics software that flags students considered “high risk,” all in the name of safety. But civil rights advocates warn that these technologies are disproportionately deployed in Black and low-income schools, without public oversight or legal accountability.
A recent report from the Center for Law and Social Policy (CLASP) argues that AI programs and mass surveillance aren’t making schools any safer; instead, they are quietly expanding the school-to-prison pipeline. And according to the report’s author, Clarence Okoh, the tools don’t just monitor students; they criminalize them.
“The most insidious aspect of youth surveillance in schools is how it deepens and expands the presence of law enforcement in ways that were previously impossible,” says Okoh, a senior associate at the Georgetown Law Center on Privacy and Technology. “Black students are being watched before they even act.”
Surveillance in the Name of School Safety?
The rise of school surveillance didn’t begin with AI, but the advancing technology has taken it to a new scale. According to the National Center for Education Statistics, 91% of public schools use security cameras, and more than 80% monitor students’ online activity. Yet there is little evidence that these tools improve safety, and even less to show they’ve been tested for bias.
In fact, a 2023 Journal of Criminal Justice study found that students in “high-surveillance” schools had lower math scores, fewer college admissions, and higher suspension rates, with Black students bearing the greatest impact. Such systems include facial recognition, social media monitoring, location tracking, and vape and gun detection sensors.
“The line between school and jail is being erased — not metaphorically, but digitally,” Okoh says.
In Pasco County, Florida, for example, an AI program secretly used school records to flag children as potential future criminals based on their grades, attendance, and discipline histories, leading to home visits, interrogations, and harassment.
“It wasn’t hypothetical,” Okoh says. “Kids were being watched, tracked, and punished — and families were being pushed out.”
The Pasco incident wasn’t isolated, he adds: “These tools are being marketed across the country, and the schools most likely to say yes are the ones serving Black and low-income students.”
Funded by Fear, Backed by Public Dollars
One of the report’s most alarming revelations is that schools paid for much of this AI surveillance with federal money meant to support students during the COVID-19 pandemic. Okoh’s report found that districts spent CARES Act and American Rescue Plan funds on unvetted AI tools. Some vendors even advertised their products, including predictive policing tools and vape sensors, as eligible expenses under COVID-era guidelines.
And yet, according to Okoh, most districts fail to assess whether these tools meet federal civil rights obligations before deploying them. That, he warns, is more than an oversight; it’s a violation of federal funding laws.
“Title VI, disability rights, and data privacy laws are supposed to apply to any district that receives federal funds,” he says. “But there isn’t a single school I know of that has actually tested these technologies for civil rights compliance before buying them.”
The Cost of Being Watched
As school surveillance grows, students are increasingly aware they’re being watched, and Okoh says it’s affecting their well-being.
“Surveillance is easier than care,” he recalls one youth saying in a recent focus group. “And that’s the problem. These tools replace trusted relationships with punishment — and they do so under the guise of safety.”
Experts say the threat of surveillance isolates and criminalizes students. The more schools invest in these technologies, the fewer dollars go toward counselors, therapists, and restorative justice programs — the very things known to improve student outcomes.
“There’s this assumption that tech is neutral,” Okoh says. “But it’s not. These systems are built on data that already reflect bias, and then they turn that bias into decisions about who to monitor, who to discipline, and who to exclude.”
Even when the technology fails, Okoh says, the damage is already done. “Young people are saying, ‘If you truly cared about us, you wouldn’t be watching us — you’d be investing in us.’”
A Youth-Led Vision of Safety
Okoh and NOTICE, or NoTech Criminalization in Education, the coalition he co-founded, are working to make schools safe without relying on surveillance. Their vision centers on mental health supports, youth-led crisis response teams, peer mentorship, and restorative justice.
“Black students deserve schools that trust them, not track them,” he says. “They’re not asking for a world without accountability. They’re asking for one where they’re not criminalized just for being who they are.”
He’s also calling on policymakers to ban surveillance tech in schools outright — particularly tools that monitor biometric data, social media, or behavior without transparency or due process. And he urges the Black community to stay engaged and vigilant on the issue.
“These contracts pass quietly through school board meetings,” Okoh adds. “We need parents, educators, and students to all show up for our kids and say, ‘Our schools are not testing labs for private tech companies.’”
This story was originally published online with Word In Black, a collaboration of the nation’s leading Black news publishers (of which The Informer is a member).