Police Tech Companies Also Grapple With Calls for Reform

As protests shine light on racial inequities in the U.S., tech firms that serve law enforcement are evaluating ways to avoid creating, rather than solving, problems — and at least one has pledged not to support policing.

When a video of the fatal arrest of George Floyd in Minneapolis sparked nationwide protests, it also inspired a period of reckoning about the living legacy of racism in America. The past two months have seen a deluge of editorials, open letters and other public statements from individuals and institutions calling for change in their own industries — for diverse hiring, implicit bias training and other measures to address racial inequities.

Now, in the face of so much public scrutiny on the issue, some data and police tech companies are grappling with what it would mean to be more responsible for what they put out into the world.

The Politics of Data and Design

For some companies, responsibility starts at the design phase, with understanding how the biases or oversights built into a product will shape what it does in practice, at scale. This was the impetus for the formation of Axon’s ethics board in 2018, an independent group that has had some success in dissuading the company from, for example, further developing facial recognition, on the basis of privacy concerns and racial disparities in accuracy. In short, training a face-recognition algorithm on photos of mostly white faces will make it more accurate for white people and more likely to misidentify people of color. What’s more, product engineers may not know as much about a surveillance tool’s potential for abuse in the wrong hands as experts in policy or civil liberties do.
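
In concrete terms, disparities like the ones the board raised show up as unequal error rates when a single match threshold is applied across demographic groups, which is roughly how outside reviewers quantify them. The sketch below is illustrative only: the similarity scores are synthetic, the group labels and score distributions are assumptions made for the example, and nothing in it reflects Axon’s systems or anyone’s real evaluation data.

```python
# Illustrative sketch of a per-group error audit for a face-verification
# system. All scores are synthetic; the distributions simply assume that
# one group was under-represented in training, so its genuine-pair scores
# are lower and noisier. This is not any vendor's real data.
import numpy as np

rng = np.random.default_rng(0)

scores = {
    "group_A": {  # well represented in the (hypothetical) training set
        "genuine": rng.normal(0.80, 0.05, 5000),   # same-person pairs
        "impostor": rng.normal(0.40, 0.10, 5000),  # different-person pairs
    },
    "group_B": {  # under-represented in the (hypothetical) training set
        "genuine": rng.normal(0.68, 0.09, 5000),
        "impostor": rng.normal(0.42, 0.11, 5000),
    },
}

THRESHOLD = 0.60  # one global match threshold, as deployed systems typically use

for group, s in scores.items():
    fnmr = np.mean(s["genuine"] < THRESHOLD)    # genuine pairs wrongly rejected
    fmr = np.mean(s["impostor"] >= THRESHOLD)   # impostor pairs wrongly matched
    print(f"{group}: false non-match rate {fnmr:.1%}, false match rate {fmr:.1%}")
```

Run as written, the under-represented group’s error rates come out markedly higher at the same threshold, which is the sort of disparity a review like this is meant to surface.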

Earlier this year, Axon CEO Rick Smith told Government Technology that the idea was to bring in more civil rights voices: the company gets input from police all the time, because they form its client base, but it lacked that level of natural contact with civil rights advocates.

The board also influenced Axon's development of license plate reading technology, including default privacy controls such as limits on how long data is stored. While developing that technology, Smith said, the company's engineers have engaged in an "iterative loop" with the board, gathering feedback and then tweaking the product accordingly.

“It’s more important that we get it right than that we move fast and break things,” Smith said.

For some companies, such as the data consulting firm DataMade in Chicago, recent attention on policing has made them reconsider what sort of projects they undertake in the first place. In a blog post on July 13, the company went so far as to say it would “never again build tools or technology that supports policing or incarceration,” arguing that the company’s work had contributed to irresponsible policing by uncritically presenting biased historical data.

DataMade founder Derek Eder said the company shut down two websites his team had built, CrimeAround.us and CAPSure, so as not to promote the narrative that the west and south sides of Chicago are in need of more police.

“We are very likely to use that crime report data again, but we’re more likely now to use it for a project, say, to hold police to account, but not just to say ‘there were 15 armed robberies in this area at this time.’ That’s not a productive way to show off that data, we think,” he said. “One of the prevailing tropes in technology is that technology is neutral, that it’s not political, and that’s false. Everything you build … has real political implications, and there’s no way to be neutral. What you choose to highlight, what you choose to help and empower with technology, those are political decisions.”

For all the companies that recently issued statements of solidarity with the cause of racial equity, Eder didn’t know of any that had taken quite the same stance as DataMade. But short of that, he said, civic tech companies in particular, being focused on projects for the public good, are in an ideal position to at least think critically about their own work.

“From the inception of the project: who’s at the table, who’s asking for it, who’s empowered by this, who’s potentially harmed by this? These are questions that often times aren’t asked in different tech spaces,” he said. “There’s a lot of technology that I think should not be pursued at all, and I encourage anybody in the tech space (to see this as) a critical test that’s often missed and needs to be put up front, before you spend time building something.”

Eder said that as population demographics change and protests bring these questions to the fore, he is hopeful a new generation of tech entrepreneurs will be more engaged with them than the last.

“Now that these systems like Facebook and Twitter are so big that you can’t possibly imagine a society without them, now that it’s almost too late, they’re trying to reckon with the power of the system they’ve created and design decisions they made 10 years ago, and who made those decisions,” he said. “Because by then, how do you pull the plug on a billion-dollar product?”

Audits and Accountability

Some organizations that have been hammering away at these problems for years agree the attention is overdue, because the technology in question is only becoming more widely used over time. Since 2016, the Algorithmic Justice League, founded by computer scientist Joy Buolamwini and funded in part by the Ford Foundation and MacArthur Foundation, has been raising awareness about problems with artificial intelligence in predictive policing.

There’s also the Policing Project, a center at New York University’s School of Law, founded in 2015 and run by a dozen full-time employees and several consultants, students and volunteers. Executive Director Farhang Heydari said the goal of the Policing Project, as a research, consulting and advocacy organization, is to work with tech companies to create front-end accountability for police — limitations and rules in advance, as opposed to back-end accountability that kicks in only after something has gone wrong. He said that will require more regulation of police technology in general, whether for facial recognition, predictive algorithms or body cameras.

“The way the courts have interpreted the Fourth Amendment, people get very little protection when they’re in public, and that’s where a lot of policing technology operates — from drones to facial recognition to license plate readers to predictive algorithms,” he said. “They suck in data based on where we are in the public, so we get very little constitutional protection. It means police and companies can roll out these products without any real legal constraint.”

But innovation doesn’t have to be a threat to civil liberties. Heydari granted that a lot of police tech, such as body cameras, can be a net good for all concerned if tech companies engage civil rights experts and citizens before going to market. That was the thinking behind the Policing Project’s “responsible tech audits,” which give the companies that make such technology concerns and recommendations regarding civil liberties, racial justice or privacy, based on an evaluative framework detailed on the organization’s website.

“I don’t think any policing tech company is going to do this right if all the development is coming from their in-house folks. I’ve never met a tech company that has meaningful voices from over-policed communities from across the country, and has civil rights lawyers and privacy lawyers,” Heydari said. “They don’t have to do tech audits from the Policing Project, but they'd better be getting meaningful feedback from somewhere.”

In 2019, ShotSpotter asked the Policing Project to review their ShotSpotter Flex product, a gunshot detection system. The resulting report, which is available online, found the system’s microphones didn’t pose a significant risk of voice surveillance, but it made recommendations regarding data sharing and storage. ShotSpotter Senior VP of Marketing Sam Klepper said the audit helped ease concerns from potential clients.

“The Oakland privacy commission in late 2019 was reviewing ShotSpotter, because the surveillance ordinance in the city is considered one of the strictest in the country,” he said. “The audit was included as part of this review that they had, and the group unanimously voted in favor of approving ShotSpotter.”

Speaking for a company that has installed gunshot detection equipment in more than 100 cities globally, Klepper said it’s important for police tech innovators to be open and transparent about what they make and how it works.

“It’s also important, at this time particularly, to have products that help with precision policing, to use objective data and get cops to places where something is actually happening, with the right situational awareness, so there’s not a community response of, ‘why are they here, they’re harassing us,’” he said. “We believe that our products provide that kind of objective use of data and pinpoint accuracy as events are unfolding.”

The question of what to do with biased data, the historical record amassed across decades of policing concentrated in neighborhoods with larger populations of racial and ethnic minorities, is a big one, and Heydari didn’t have a concrete answer.

“The alternative cannot be to throw out all the data and not have a data-driven approach, because that doesn’t get us to the right place either,” he said. “I think it starts with better data by police.”

Heydari proposed ways to improve the data. He mentioned victimization surveys, which ask who in a community has actually experienced crime, as distinct from the number who reported crimes or the number who were arrested for them, because those aren’t the same things. He said there’s a role for relatively objective technologies, such as ShotSpotter, in measuring incidents rather than police activity. And he said he hopes that audits like those the Policing Project conducts, or some kind of certification process, can become an industry standard.
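
To make that distinction concrete, the toy comparison below sketches two hypothetical neighborhoods with identical underlying victimization but different reporting rates and patrol levels. Every figure is invented for the example and describes no real place; the point is only to show how far apart victimization, reported crime and arrest counts can sit.

```python
# Toy illustration: three ways of "counting crime" in two invented
# neighborhoods. All numbers are made up for the example.
neighborhoods = {
    # identical underlying victimization, different reporting and patrol levels
    "Neighborhood_1": {"victimizations": 400, "reporting_rate": 0.55, "arrests": 60},
    "Neighborhood_2": {"victimizations": 400, "reporting_rate": 0.35, "arrests": 140},
}

for name, d in neighborhoods.items():
    reported = d["victimizations"] * d["reporting_rate"]
    print(f"{name}: {d['victimizations']} victimizations, "
          f"{reported:.0f} reported crimes, {d['arrests']} arrests")

# A dataset built from arrests (or even from reports) would suggest one
# neighborhood has far more crime than the other, even though the
# underlying victimization is identical. A model trained on that dataset
# would direct more patrols there, producing still more arrests.
```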

“We hear a lot from executive-level people in government that they wish they had these (audits), because the procurement folks don’t know how to meaningfully audit technologies,” he said. “Even police departments don’t know how to meaningfully audit technologies, and the last thing they want to do is buy a technology that happens to be spying on users or has a racist algorithm. So we’ve heard from a lot of procurement folks that they wish more companies had these kind of stamps of approval or rejection, so they’d know who they should buy from.”

Government Technology is a sister site to Governing. Both are divisions of e.Republic.
