Do Facial Recognition and Other Tech Make Schools Safer?

There have been 33 school shootings in 2023 that resulted in injuries or deaths. Many school officials are using COVID-19 funds to purchase security equipment. But the tools may not always be as beneficial as they seem.

The New York State Education Department has permanently banned the use of facial recognition technology in schools, making New York the first state to do so.

While it's too early to know if other states will follow New York's lead, the state's new policy should serve as a reminder to district leaders that they should proceed with caution when it comes to implementing facial recognition technology and any other tech that gathers biometric data on students, say experts in school security and student data privacy. They advise schools to scrutinize the claims made by vendors of these technologies and be fully aware of their drawbacks—in particular when it comes to student data privacy and school climate.

New York's ban comes as companies that sell such technologies have amped up their marketing to school districts in recent years, said Kenneth Trump, the president of National School Safety and Security Services, a consulting firm.

"Schools have been using the COVID recovery funds to buy security equipment and hardware because they have that pot of money that doesn't come from the district operating budget," he said. "But they have been used to solve political and community relation problems, not so much school safety problems. When there is gun use or confiscation on campus, we see school boards and superintendents make knee-jerk decisions and play to the emotional security needs of parents and staff."

But taking a more deliberate approach to school safety can often be easier said than done. School leaders are under immense pressure to make their campuses safe from gun violence. There have been 33 school shootings in 2023 that resulted in injuries or deaths, according to a database maintained by Education Week, and 177 total since 2018. And high-tech tools such as facial and weapons recognition technology, both powered by artificial intelligence, can be an alluring solution for school boards and superintendents looking to reassure parents that their campuses are safe, said Trump.

Trump, who has worked in school safety and security for decades, said he has seen this with other kinds of school security technology, whether cameras or panic buttons: Tech is purchased with a one-time grant, principals and school staff are not adequately trained or given all the tools they need to properly use it, and no ongoing funding from the district's general budget is dedicated to maintenance—soon rendering the technology largely useless from a school safety and security perspective.

Trump fears this is a trap that schools could easily fall into with new AI-powered tech. Districts, he said, should consider how they plan to pay for upkeep and training staff on any new security tech they invest in.

Many companies that provide AI-powered school security technologies lease the hardware to schools and sell subscriptions to the software, Trump said, so districts need to also have a plan for how they will continue to pay for these services out of their operating budgets.

Facial recognition and AI-powered weapons recognition systems take time to manage, he said, and administrators don't always take into account the tradeoffs they might have to make. Most plots to commit violence or self-harm at school are discovered by engaging with students, Trump said. It's not worth sacrificing the time staff spend being physically present and building relationships with students to manage high-tech security systems.

Student Data Privacy Is a Big Concern


There are also important data privacy concerns schools need to consider before deploying facial recognition technology, experts point out.

First, the technology has a spotty track record when it comes to accurately identifying many groups of people, including women, people of color, non-binary and transgender people, and children.

Schools also need to think about whether they want to be responsible for collecting and handling biometric data, which is both sensitive and valuable, said Irene Knapp, the director of technology at Internet Safety Labs, a nonprofit research and product-testing organization.

"You can't change your face," they said. "In privacy, one thing we think about is different kinds of identifiers. If you can reset it easily, you have more safety. It's a pain to change your email and phone number, but you can do it in principal. You can never change your face and same for any biometrics such as fingerprints."

It's also extremely difficult to know whether data collected through a particular software program is being shared with third-party developers, Knapp said.

"Modern software development practices, it's not really one program where the company that publishes the program wrote all the code—they are incorporating code from a variety of companies, which from an engineering perspective is helpful, it saves a lot of time and effort of having to solve basic problems over again," they said. But "there is an awful lot of reliance on contractual agreements where everyone kind of pinky swears not to use the data for any unauthorized purposes but, when you dig into it there is no enforcement mechanism, there is no way for any of these corporations to know what the other corporations are doing."

How the Tech Can Hurt School Climate


Surveillance technology can also undermine a positive school climate and relationships with parents. There is research showing that when students feel constantly watched by surveillance systems, it erodes their trust in schools, as well as their parents', said Amelia Vance, the president of the Public Interest Privacy Center.

There's also the real risk of mission creep: It's tempting for schools to use surveillance technology like facial recognition in ways it wasn't originally intended for, such as tracking and fining parents who are late picking their children up from school.

"There is a temptation to use it as a potential mechanism to keep tabs or enforce certain policies, and that makes people uncomfortable," said Vance, who is also the chief counsel for the student and child privacy center at AASA, the School Superintendents Association.

While Vance said in general she thinks New York made the right call in banning the technology, she said there might be a future use case that hasn't arisen yet where the benefits of facial recognition technology outweigh the risks. And there is a chance that states that ban the technology now may find themselves hamstrung if they want to reverse course later.

For example, some schools have started scanning students' fingerprints—linked to an account—to pay for school meals, which has been beneficial, said Vance, in ways that people may not have predicted even a few years ago.

"This sped up the line so much that it actually helped kids have more time to eat more nutritional food," she said. "Just getting kids through that line faster and taking away the stigma of having them have to say that they are receiving free lunch is pretty great."

New York's ban doesn't extend to the use of digital fingerprinting.

While facial recognition technology is banned in New York schools moving forward, its use has been on pause since 2020. That year, the New York Civil Liberties Union sued the state education department demanding that a local school district, the Lockport City School District, located outside of Buffalo, stop using a facial recognition security system.

The Lockport district was not going to use the security system to identify students, who would not be entered into the database of potentially threatening people to monitor. The district intended to use the technology to identify adults on school grounds, such as someone on a sex offender list.

State lawmakers passed a law placing a moratorium on the use of the technology in all schools while the state conducted a study into its risks and benefits.

It was the results of that study released in late September—which concluded that the risks of using facial recognition technology for school security purposes outweighed the benefits—that led the state education department to implement its ban.

The report noted that research has found that the vast majority of school shooters over the past three decades were current students, not adults, and that school employees would have had to know someone was a threat and entered them into a database for the system to work.

"While [facial recognition technology] vendors claim FRT offers increased school security, FRT may only offer the appearance of safer schools," the report said.


(c)2023 Education Week (Bethesda, Md.) Distributed by Tribune Content Agency, LLC.
