💭 Are Students Prisoners of War? How AI is Peeking Through School Windows


Content warning: this article discusses gun violence in schools.

On November 30, 2021, while I was getting a haircut at a local barbershop, news started pouring in of a shooting at Oxford High School: live updates on the shooter's location, the police response, the death toll, and one heroic student’s sacrificial effort to disarm the active shooter. “This happened in my county,” I thought in disbelief. A year after the shooting at Oxford High School, news of a shooting threat caused nearby schools in Wayne, Oakland, and Macomb counties to close. The threat turned out to be a “prank”, and the student who initiated it was quickly identified through their device activity and social media use. But while the incident might have been a “prank”, the students’ fear was real.

Within three months, another shooting occurred at Michigan State University on the night of February 13, 2023, when a lone gunman injured five students and killed three. I frequented MSU to visit friends and family, and I had never considered that this campus could become a battleground. School shootings are a recurring nightmare in this nation, but they became much more visceral to me when they began to strike close to home. As Dean of Postsecondary Success at a high school in Detroit, I had never felt fear for my students’ safety more tangibly than after these repeated threats. Our administrative team turned to AI security technologies, hoping to detect weapons without compromising our school's culture of community, safety, and productive exchange.

The first option, ZeroEyes, uses AI software to detect visible weapons in camera feeds; Oxford High School adopted this tool after the shooting. Detections are reviewed by ZeroEyes' “military-trained operations center analysts” for verification so that they can alert local 911 dispatch in “3 to 5 seconds”. We already maintained a comprehensive camera system throughout the school, so adopting this product wasn’t too big a stretch. However, it posed a significant threat to Black students, since any false detection of a gun would trigger an immediate call to nearby police.

The second option was an AI surveillance system that detects concealed weapons using X-ray scanners. The system is innovative, presumably outperforming conventional X-ray machines. But anyone who has ever been through an airport knows how invasive, inconvenient, and inconsiderate TSA inspections feel.

The technical specifications revealed a profound moral and cognitive dissonance. These systems endanger Black students falsely flagged to the police. The false promises of these “solutions” exacerbate the harms of a broken system that targets Black and Brown communities through surveillance. Law enforcement officers abuse their power under the guise of “doing their jobs,” as documented across social media. Artificial intelligence would only stoke the flames of outrage within an education system already debating whether teachers should be allowed to carry guns in schools.

Emerging technology perpetuates biases and criminalizes Black and Brown youth, inhibiting their career advancement, sense of community, and safety at school, and ultimately preventing them from living fulfilling lives. False flags could lead to punitive measures, such as expulsion or a criminal charge, that tarnish students' records and block postsecondary opportunities. Society prejudges these students as adults deserving maximum punishment despite their innocence. Even well-intentioned security measures could strip dignity from our daily lives: although optimized surveillance systems may offer students a sense of safety, their impact could just as easily disempower them.

Ultimately, we decided against implementing either technology. Despite the potential benefits, we concluded that AI surveillance would disrupt our community's culture and compromise our students' sense of security. Our dedicated Dean of Culture and their team bond with students and practice restorative justice to resolve issues in our school community. Are we the heroes in this story for rejecting invasive alternatives, or the villains for denying our students optimal protection?

True safety demands radical imagination and collective action. By empowering students to co-create security protocols, establish anonymous reporting systems, and develop emergency response plans, we build a foundation of trust and preparedness. This approach doesn't just protect; it strengthens our entire school community.

Acknowledgements:

Special thanks to my fellow tech ethicists Gabrielle Hibbert and Carolyn Yang for their thorough and thoughtful input and feedback.

Edited by Estelle Ciesla and Vance Ricks
