Eight police cars rushed to a Maryland neighborhood after a high school’s gun-detection system sent out an emergency alert. Moments later, officers arrived with guns drawn. All because an AI camera mistook a shiny Doritos bag for a firearm.

The 16-year-old student, Allen, was walking home with friends after school when he suddenly found himself surrounded by officers shouting commands.

“They made me get on my knees, put my hands behind my back, and cuffed me,” Allen told WBAL-TV.
“I didn’t know where they were going until they started walking toward me with guns, talking about, ‘Get on the ground,’ and I was like, ‘What?’”

After searching him and finding no weapon, the police revealed the source of the chaos: a blurry surveillance image flagged by Omnilert, the school’s AI-powered gun-detection system. The program had misinterpreted the reflection and shape of Allen’s crumpled Doritos bag as a gun.

“It was two hands and one finger out, and they said it looked like a gun,” Allen explained. “It was mainly like, am I gonna die? Are they going to kill me?”

The encounter left the teen shaken and raised fresh concerns about the growing use of artificial intelligence, especially in school security systems. Civil rights advocates warn that while these systems are sold as safety tools, false positives can quickly turn deadly.

Groups like the Algorithmic Justice League and Electronic Frontier Foundation (EFF) have called for stricter oversight of AI surveillance tools. They cite racial bias, technical flaws, and lack of accountability.

“We can’t trade one kind of fear for another,” one activist told local media. “Our children shouldn’t have to risk their lives because a computer can’t tell the difference between chips and a gun.”

For Allen, it’s an experience that will be hard to forget. What began as a simple walk home after school turned into a terrifying reminder of how quickly technology — and human reaction — can go wrong.