Baltimore High School Student Detained by Cops After Artificial Intelligence System Mistook His Bag of Doritos for a Gun



An AI system missed the mark in a big way when it mistook a high school student’s bag of Doritos for a firearm and alerted local police, who believed the teen was armed.
Taki Allen, a student at Kenwood High School in Baltimore County, was hanging out with friends on Monday night, just enjoying some chips, when officers rolled up with their weapons drawn.

“At first, I had no clue what was happening until I saw them approaching with their weapons drawn, telling me to ‘Get on the ground.’ I was like, ‘What’s going on?’” Allen told WBAL-TV 11 News.
According to reports, the police ordered Allen to get on his knees, handcuffed him, and searched him for weapons—only to find nothing. Then they showed him the image that triggered the whole situation.

“I was just holding a Doritos bag—two hands and one finger out, and they said it looked like a gun,” Allen said.

How the AI Gun Detection System Works

Last year, Baltimore County high schools started using a gun detection system that uses AI to monitor surveillance cameras for potential weapons. The system is meant to flag possible threats before they happen—but this time, it completely misfired.

When the system spots something it thinks looks like a gun, it immediately sends an alert to school officials and police. In this case, that “something” was just a bag of chips.

This kind of technology was meant to protect students from violence, but instead, it scared an innocent kid and left a whole community wondering if these tools can actually be trusted.

Experts have been warning that AI often struggles with accuracy when it comes to real-world visuals, especially in diverse environments. Misidentifications like this can lead to dangerous encounters—especially when law enforcement is involved.

School District Responds After Outrage

Baltimore County Public Schools released a statement addressing the situation. In a letter to families—leaked to WBAL-TV 11 News—the district acknowledged:
“We understand how upsetting this was for the individual involved and others who witnessed the incident. Our counselors will provide direct support to the students affected and are also available for any student in need of assistance.”

The district says it is reviewing the AI system’s accuracy and the response protocol that led to the unnecessary confrontation.
Baltimore County police confirmed the details in their own statement:

“Officers from Precinct 11-Essex responded to Kenwood High School following reports of a suspicious individual with a weapon. Upon arrival, the person was searched, and it turned out he wasn’t carrying any weapons at all.”

Community Questions the Role of AI in Policing Schools

The community’s response has been mixed—some parents are calling for the removal of AI surveillance systems, while others say this is proof that human judgment should always come before technology.
Many Black families are especially concerned, pointing out that false assumptions and overreactions can have serious, even deadly, consequences when young Black men are involved.

Advocates argue that technology built without cultural and racial awareness can reinforce existing biases instead of preventing harm. As one local parent said, “AI didn’t mistake a bag—it mistook a Black boy with a snack as a threat.”

What This Means for AI Safety and Accountability

This incident shines a light on how quickly AI mistakes can escalate when humans treat its alerts like gospel. The promise of “safety through technology” starts to crumble when it leads to racial profiling and public humiliation.
While schools and police departments across the country are racing to integrate AI for safety, experts are calling for stricter oversight, transparency, and community input.

AI is supposed to help humans make better decisions, not replace them. Until systems like these are tested for accuracy, cultural bias, and accountability, we’ll keep seeing stories like this—where a kid with a bag of chips ends up face-down on the pavement.
