Taki Allen was hanging out with friends and waiting for a ride home after football practice at Kenwood High School in Baltimore County, Maryland, last week when police rolled up and allegedly confronted him at gunpoint about a pistol they thought he was carrying. Allen was placed in handcuffs and searched, but no gun was found.
Instead, the weapons alert issued by an AI-powered security system was apparently triggered by a bag of Doritos Allen was holding.
“I was just holding a Doritos bag – it was two hands and one finger out, and they said it looked like a gun,” Allen told WBAL.
… CNN reached out to the Baltimore County Police Department for comment. The department told WBAL officers responded to “a report of a suspicious person with a weapon” but determined the person was unarmed after a search.
Kenwood Principal Kate Smith said the school district’s security department reviewed and canceled the gun detection alert after confirming there was no weapon, according to a statement sent to parents that was shared with CNN. Smith said she reported the matter to Kenwood’s school resource officer, who called local police for support.
The principal didn’t immediately realize the alert had been canceled, a spokesperson for Baltimore County Public Schools told WBAL.
“We understand how upsetting this was for the individual that was searched as well as the other students who witnessed the incident,” Smith said in the statement. “Please know that ensuring the safety of our students and school community is one of our highest priorities.”
Omnilert, the company behind the AI-based weapons detection system, said the alert should have triggered a human review, but argues that otherwise the system worked exactly as it’s supposed to: “to prioritize safety and awareness through rapid human verification.”
Yeah, if you want to call the deployment of multiple police officers, handcuffs, and a pat down “rapid human verification,” I suppose that’s true. And that does appear to be okay with school officials.
Despite the controversial fallout, Baltimore County Public Schools Superintendent Myriam Rogers said the system worked how it was meant to.
“The program is based on human verification and in this case the program did what it was supposed to do which was to signal an alert and for humans to take a look to find out if there was cause for concern in that moment,” Rogers said.
“Humans” is a pretty broad term. What Rogers really means is that once an alert goes out, it will be law enforcement who respond to “take a look”: a look that will involve detaining someone at gunpoint, placing them in handcuffs, and grilling them about a weapon they may or may not have.
Video shows officers handcuffing Allen and searching him.
Once they realize he’s not armed, police remove the handcuffs and explain what happened, while also thanking Allen for cooperating.
Officers then allow Allen to see a picture of the alert, depicting a crumpled-up bag of chips that resembled a gun.
While huddled together assessing the picture, multiple officers conclude it looks more like a bag than a weapon.
I mean, shouldn’t that assessment have been made before arriving on campus?
It’s hard to entirely fault the officers here. They got a report about a student with a gun, after all. It’s the fact that the system used by the Baltimore County schools can look at a bag of chips and determine it’s a gun that’s the real issue.
These weapon detection systems can also fail to detect weapons, which is another issue entirely, but the false positives they can generate have the potential to do real harm. In this case, thankfully, the worst thing that happened to Allen was having a gun pointed at him, but it’s not hard to imagine a situation where shots are fired because law enforcement mistakenly believe an individual is armed. As one of the officers told Allen afterwards, “AI’s not the best.”
It’s also one thing to use these systems in K-12 schools, where firearms are generally prohibited. What happens, though, when these systems are rolled out in and around places where we can exercise our Second Amendment rights? Even if AI successfully determines someone has a pistol, it can’t know whether or not that person is lawfully carrying that firearm. Mere suspicion of carrying a firearm shouldn’t be reason enough to stop and search someone, but that doesn’t mean such stops don’t happen… especially in places where concealed carry licensees are few and far between.
AI’s getting pretty good at making Will Smith eat spaghetti, but if it can still mistake a bag of Doritos for a pistol it shouldn’t be used to identify weapons. At the very least it’s going to lead to some uncomfortable moments, but it could easily lead to an officer-involved shooting when there’s no legitimate threat at all.