Schools’ AI Weapon Detection Systems Accused Of Flaws


Artificial intelligence-powered weapons detection systems used in US schools are less effective than claimed, industry watchers say

An artificial intelligence (AI)-powered weapons detection system widely used in US schools has been accused of having significant gaps in its effectiveness.

The controversy comes as the UK and other jurisdictions develop plans for regulating the AI industry amidst rapid advances in the field.

The Intercept reported several incidents in which guns and knives had passed through detection systems made by US firm Evolv Technology, and the BBC said tests found the systems were frequently unable to detect large knives.

“Through investigation it was determined the Evolv Weapon Detection System… was not designed to detect knives,” said Brian Nolan, superintendent of schools in Utica, New York, where a deadly knife attack took place in a school last year.


School safety

Tests found that 42 percent of large knives went undetected across 24 walkthroughs, the BBC reported.

Evolv’s system is used in major stadiums in the US and the Manchester Arena in the UK, and the firm has been aggressively expanding into US schools amidst increasingly frequent attacks involving guns and knives.

Last October a Utica student was stabbed to death by another student who had passed through an Evolv system carrying the weapon.

Evolv says its systems use “sensor technology” combined with “proven artificial intelligence” that recognises the shapes of a wide range of weapons, including guns, knives and bombs.

It has encouraged schools to replace metal detectors with its systems, which can cost millions of dollars.

‘Worst fears’

Stefanie Coyle, deputy director of the Education Policy Center at the New York Civil Liberties Union (NYCLU), told The Intercept that private companies were preying on school districts’ “worst fears” to sell them “technology that’s not going to work”.

The Intercept found that more than 65 school districts had bought or tested AI gun detection systems from a range of companies since 2018, spending $45 million (£36m) in total.

Conor Healy of IPVM, which tests security equipment, told the BBC that Evolv was “one of the worst offenders” for exaggerating the effectiveness of its technology.

But he said there was an “epidemic” of schools buying “new technology based on audacious marketing claims, then finding out it has hidden flaws”.

‘Bad actors’

Evolv, which is publicly traded, declined to comment but pointed to a blog post in which its chief executive, Peter George, said the firm needed to withhold details of how its technology works.

“Marketing weapons detection security requires a delicate balance between educating stakeholders on new technology and not providing bad actors with the information they could use to do harm,” George said in the post.

Because of this, public-facing marketing materials were “intentionally not specific”, but relevant aspects of the technology, “including limitations”, were communicated directly to customers and prospective customers, he wrote.