Schools are using AI surveillance to protect students. It also leads to false alarms—and arrests
Lesley Mathis knows what her daughter said was wrong. But she never expected the 13-year-old girl would get arrested for it.
The teen had made an offensive joke while chatting online with classmates, triggering the school’s surveillance software.
Before the morning was even over, the Tennessee eighth grader was under arrest. She was interrogated, strip-searched and spent the night in a jail cell, her mother says.
Earlier in the day, her friends had teased the teen about her tanned complexion and called her “Mexican,” even though she’s not. When a friend asked what she was planning for Thursday, she wrote: “on Thursday we kill all the Mexico’s.”
Mathis said the comments were “wrong” and “stupid,” but that in context they were clearly not a threat.
Surveillance systems in American schools increasingly monitor everything students write on school accounts and devices. Thousands of school districts across the country use software like Gaggle and Lightspeed Alert to track kids’ online activities, looking for signs they might hurt themselves or others. With the help of artificial intelligence (AI), technology can dip into online conversations and immediately notify both school officials and law enforcement.
Educators say the technology has saved lives. But critics warn it can criminalize children for careless words.
‘Teachable moment’
In a country weary of school shootings, several states have taken a harder line on threats to schools. Among them is Tennessee, which passed a 2023 zero-tolerance law requiring any threat of mass violence against a school to be reported immediately to law enforcement.
Gaggle’s CEO, Jeff Patterson, said in an interview that the school system did not use Gaggle the way it was intended. The purpose is to find early warning signs and intervene before problems escalate to law enforcement, he said.
“I wish that was treated as a teachable moment, not a law enforcement moment,” Patterson said.
Alexa Manganiotis, 16, said she was startled by how quickly monitoring software works. Her school, the Dreyfoos School of the Arts in West Palm Beach, piloted the surveillance program Lightspeed Alert last year.
Interviewing a teacher for her school newspaper, Alexa discovered two students once typed something threatening about that teacher on a school computer, then deleted it. Lightspeed picked it up, and “they were taken away like five minutes later,” Alexa said.
Amy Bennett, chief of staff for Lightspeed Systems, said the software helps understaffed schools “be proactive rather than punitive” by identifying early warning signs of bullying, self-harm, violence or abuse.
The technology can also involve law enforcement in responses to mental health crises. In Florida’s Polk County Schools, a district of more than 100,000 students, the school safety program received nearly 500 Gaggle alerts over four years, officers said in public Board of Education meetings. This led to 72 involuntary hospitalization cases under the Baker Act, a state law that allows authorities to require mental health evaluations for people against their will if they pose a risk to themselves or others.
False alarms
“A really high number of children who experience involuntary examination remember it as a really traumatic and damaging experience—not something that helps them with their mental health care,” said Sam Boyd, an attorney with the Southern Poverty Law Center.
Gaggle flagged more than 1,200 incidents in the Lawrence, Kansas, school district over a recent 10-month period. But almost two-thirds of those alerts were deemed nonissues by school officials—including over 200 false alarms triggered by student homework, according to an Associated Press (AP) analysis of data obtained through a public records request.
Students in one photography class were called to the principal’s office over concerns that Gaggle had detected nudity. The photos had been automatically deleted from the students’ Google Drives, but backups the students kept on their own devices showed the flagged images were a false alarm. District officials said they later adjusted the software’s settings to reduce false alerts.
Natasha Torkzaban, who graduated in 2024, said she was flagged for editing a friend’s college essay because it had the words “mental health.”
“I think ideally we wouldn’t stick a new and shiny solution of AI on a deep-rooted issue of teenage mental health and the suicide rates in America, but that’s where we’re at right now,” Torkzaban said. She was among a group of student journalists and artists at Lawrence High School who filed a lawsuit against the school system last week, alleging Gaggle subjected them to unconstitutional surveillance.
School officials have said they take concerns about Gaggle seriously, but also say the technology has detected dozens of imminent threats of suicide or violence.
“Sometimes you have to look at the trade for the greater good,” said Board of Education member Anne Costello in a July 2024 board meeting.
Two years after their ordeal, Mathis said her daughter is doing better, although she’s still “terrified” of running into one of the school officers who arrested her. One bright spot, she said, was the compassion of the teachers at her daughter’s alternative school. They took time every day to let the kids share their feelings and frustrations, without judgment.
“It’s like we just want kids to be these little soldiers, and they’re not,” said Mathis. “They’re just humans.”