
Gun Detection Tech Fails To Detect Gun, Prevent School Shooting


There are lots of things we could be doing to limit school shootings. But none of those have been tried because most people, lobbyists, and politicians continue to believe issuing “thoughts and prayers” statements while standing on children’s graves is the absolute utmost they should be expected to do.

Instead of common sense measures that have managed to keep every other First World country almost completely free of school shootings, the US continues to take a hands-off approach… I mean, not counting the pallbearers asked to deliver innocent children to their final place of rest.

One of the so-called solutions is making tech companies richer while not actually making kids any safer. Lots of firms are offering "gun detection tech" to schools, tech that seems more prone to false positives than to life-saving gun detections.

While a lot of recent attention has been directed at Evolv — due to its failures pretty much everywhere (hospitals, schools, subways) it’s been deployed — this recent tragedy adds another tech company to the list of entities that are well-meaning, but ultimately useless, when lives are on the line. Here’s the latest bad news/worse news, as reported by Nashville (TN) Fox affiliate, WZTV.

The technology system meant to prevent school shootings failed to detect the Antioch High School shooter’s gun, an official confirms.

A Metro Nashville Public Schools’ spokesperson says based on the camera location and the shooter in relation to the camera, it did not detect the weapon.

MNPS adds the camera did activate an alarm trigger when law enforcement and school resource officers arrived with their weapons.

The technology, Omnialert, is an Artificial Intelligence (AI) gun detection used in all Metro Schools.

Gun detection tech isn’t much use when it only detects weapons carried by law enforcement officers deployed to neutralize an active shooter. Obviously, everyone in the building and the underperforming AI expected an armed response to a school shooting. “Detecting” blatantly obvious things isn’t anyone’s definition of “detection,” a term that’s normally associated with acts of intuition where things not immediately apparent are sussed out by instinct, skill, or… I don’t know… reliable tech.

I’m sure Omnilert appreciates the inadvertent typo, which will help muddy the search results and brush a bit of its earned shame off its shoulders. The company is Omnilert and it claims it’s the ultimate blend of military know-how and AI magic. (Omnialert is a brand linked to other non-gun detection products.)

Our expertise in AI has roots in the U.S. Department of Defense and DARPA related to real-time target recognition and threat classification. That military focus on high reliability and precision carried through to the development of our AI threat detection that goes beyond identifying guns to finding active shooter threats.

We employ a data-centric AI methodology that prioritizes high-quality training data. While traditional methods focus on data volume, sourcing millions of gun images, we take a quality-over-quantity approach. Our training data is hand-curated with rich annotations that improve accuracy and increase reliability.

Cool cool cool. Thanks for letting us know your failure was bespoke (“hand-curated”), rather than just off-the-shelf “hey man is that a gun” detection algos that aren’t backstopped by human assistance. If nothing else, it lets us know the company has a bit of blood on its “curating” hands before we even have to enter the discovery phase of post-school shooting litigation.

The sales pitch includes up-to-date reporting on school shootings that opens with this…

503 mass shootings in the U.S. and 330 incidents in schools highlight the ongoing need to provide layers of protection including technologies such as AI visual gun detection 

… and ends with this:

Protect your people, facilities, and operations with Omnilert’s AI-powered visual gun detection. Act now to transform your security cameras into proactive, life-saving tools.

Maybe the tech is better than this very limited sample size shows. Maybe it isn’t. Either way, it failed when it mattered most, resulting in the killing of one student and the wounding of another. And only the most extreme cynic would claim that’s an acceptable loss in comparison to other mass shootings.

Taxpayers were asked (although not explicitly) to pay for a product that didn’t do the only thing it’s supposed to do when it mattered most. And most likely they’ll be expected to keep paying for it because it might do the job the next time around. There are many useful ways to limit gun violence, but this nation will never go for them. Instead, we’ll just keep sacrificing kids to the AI gods because somehow that’s more acceptable than asking citizens to subject themselves to a bit more scrutiny before being allowed to purchase and carry deadly weapons.
