What Can Local Government Do to Avoid Inequitable Tech?

As use of new technology by government continues to increase, experts and advocates in the space say that public servants should be keenly aware of these tools’ potential to exacerbate long-standing biases.

As government agencies increasingly deploy new tech — ranging from artificial intelligence to data-driven decision-making — advocates say public-sector leadership should be aware of the potential to reinforce pre-existing biases and inequities.

The conversation about public-sector technology bias is certainly not a new one. For many years, however, it has primarily been framed in the context of law enforcement. Concerns around government surveillance date back to when technologies like recorders and cameras were first rolled out. In recent years, that ongoing conversation has evolved to include newer technologies, specifically artificial intelligence and facial recognition, culminating in a United Nations panel issuing a human rights warning on the subject in late 2020.

The chief warning from the panel was that machines can also be wrong, and that algorithms can inadvertently reinforce biases that have long existed. Now, as agencies at all levels of government start to use technology to solve challenges outside of law enforcement, that same potential for bias carries over, experts say.

This includes housing departments that use mortgage data influenced by redlining, procurement staff that purchase products from vendors not beholden to the same regulations as government, and communications departments that create visualizations without vetting them against rapidly evolving equitable language standards.

As this potential evolves, however, so does a set of solutions for public-sector technologists committed to equitable work. But first it is important to understand how long-existing inequities can be reinforced with technology.


The Risks


Samara Trilling works as a staff software engineer for New York City’s housing justice technology group, JustFix.nyc, and has also been a fellow at the Aspen Tech Policy Hub, where she did work related to the regulation of algorithms for online mortgage decisions.

Mortgages and housing serve as a prime example of how tech can reinforce existing biases.

“If you feed machine learning algorithms historical data and try to make a decision based on that past data,” Trilling said, “you’re going to reinforce the traditional biases we see in the world.”

What that looks like specifically is fewer families from underserved communities being approved for mortgages, often along racial lines because of historical redlining practices. While many places now have laws to guard against such discrimination, it’s possible the data feeding decision-making algorithms will predate those laws. There is also risk in algorithms that rely heavily on credit scores, which have their own history of discrimination.
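To make that mechanism concrete, consider the minimal sketch below, written in Python with entirely synthetic data. The income gap, approval threshold and denial rate are all invented for illustration, and logistic regression simply stands in for whatever model a lender might actually use. It shows how a model trained only on a seemingly neutral feature can reproduce a historical disparity, and how a simple approval-rate comparison by group can surface it.

```python
# A minimal sketch with synthetic data, not real mortgage records.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical protected attribute (0/1) and an income feature that
# already encodes historical inequity between the two groups.
group = rng.integers(0, 2, n)
income = rng.normal(50 + 10 * group, 15, n)

# Historical approvals: mostly income-driven, plus direct past
# discrimination in which some group-0 applicants were denied outright.
approved = ((income > 45) & ~((group == 0) & (rng.random(n) < 0.3))).astype(int)

# Train only on income; the protected attribute never enters the model,
# but income acts as a proxy for it.
model = LogisticRegression().fit(income.reshape(-1, 1), approved)
predicted = model.predict(income.reshape(-1, 1))

# A simple audit: compare predicted approval rates across groups.
for g in (0, 1):
    print(f"group {g}: predicted approval rate = {predicted[group == g].mean():.1%}")
```

The numbers are toys, but the pattern matches Trilling’s point: the disparity survives even though the protected attribute is never given to the model, which is why auditing outcomes matters more than simply omitting sensitive fields.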

Given the powerful nature of artificial intelligence and machine learning, Trilling noted, people tend not to think critically about where those technologies draw their data from. What this means is that while an individual human’s judgment might be questioned or analyzed, a machine-driven decision may not be, even if it was made based on data collected with a human bias.

This is all something developers would do well to keep in mind.

“Data bias can creep in at each step of the process when you’re building a new tech tool,” Trilling said.

Modern data collection can also be biased, because some populations avoid sharing information with power structures due to long histories of mistrust. This means that even data collection efforts conducted in good faith can be skewed by omissions. Another potential problem area is procurement, especially if jurisdictions do not hold third-party vendors responsible for discrimination in the same ways as government.
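One lightweight check for that kind of skew, sketched below with made-up figures, is to compare group shares in the collected data against an external benchmark such as census estimates and flag groups that appear underrepresented. The group names, shares and 0.9 threshold are all placeholders, not a standard.

```python
# Assumed benchmark shares (e.g., census estimates) and collected counts.
# All numbers here are placeholders for illustration.
benchmark_share = {"group_a": 0.55, "group_b": 0.30, "group_c": 0.15}
collected_counts = {"group_a": 6_100, "group_b": 2_600, "group_c": 800}

total = sum(collected_counts.values())
for name, expected in benchmark_share.items():
    observed = collected_counts[name] / total
    # Flag any group whose observed share falls well below its benchmark.
    flag = "  <- possible undercoverage" if observed / expected < 0.9 else ""
    print(f"{name}: expected {expected:.0%}, observed {observed:.0%}{flag}")
```

A check like this cannot say why a group is missing from the data, only that follow-up outreach may be needed before the data drives decisions.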

Finally, public-sector technologists must also be acutely aware of the potential for bias when they communicate work to the public, said Jonathan Schwabish, a senior fellow in the Income and Benefits Policy Center at the Urban Institute. Schwabish was one of the authors of the Do No Harm Guide, which, as its name implies, offers guidance on how data storytelling can avoid doing harm.

What this means for government, Schwabish said, is largely spending more time thinking about and collecting data on how people want to be described and visualized within data storytelling. It means meeting them in their native languages as well, and generally investing effort to ensure that this work is welcoming for everyone, in the terms they understand and prefer.
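As one small illustration of what that can look like in practice, the sketch below (hypothetical survey numbers, with matplotlib chosen arbitrarily) labels groups with the terms respondents chose for themselves and orders the bars by the data rather than an arbitrary default that could imply a ranking of groups.

```python
# A minimal sketch with placeholder numbers, not real survey results.
import matplotlib.pyplot as plt

# Assumed self-reported preferences, collected from respondents themselves.
responses = {
    "Prefer 'Hispanic'": 41,
    "Prefer 'Latino/a'": 34,
    "Another term": 13,
    "Prefer 'Latinx'": 12,
}

# Sort by value so bar order reflects the data, not an implicit hierarchy.
items = sorted(responses.items(), key=lambda kv: kv[1])
labels, values = zip(*items)

fig, ax = plt.subplots()
ax.barh(labels, values)
ax.set_xlabel("Share of respondents (%)")
ax.set_title("How respondents prefer to be described")
plt.tight_layout()
plt.show()
```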

When it comes to communications, government may not risk losing customers the way a private-sector company might, Schwabish said, but it does risk alienating the very people it must serve, creating innumerable challenges moving forward.

“We should just be thinking more carefully about these issues as data communicators,” he said. “If you’re someone who's communicating data and putting information into the world, how can you be more inclusive and equitable in how you do that?”

Solutions and Best Practices 


As the risks and concerns continue to evolve and proliferate, so too do solutions and best practices for avoiding biases and inequities in public-sector tech work.

“I encourage government to think about conducting a racial impact analysis,” said Tina Walha, who recently joined the U.S. Digital Response (USDR) as director of public digital after having spent six years in the Seattle mayor’s office working at the intersection of digital products and design.

Such a study might ask questions like: What are the implications of this technology? Who benefits from its use? What are some of its unintended consequences?

In addition to a formalized analysis, Walha said it’s important at the highest levels of government to put forth clearly articulated guiding principles. Before even delving into the use of tech, government should have already considered how its goals line up with equity, accessibility, community and associated values.

It’s also helpful, she noted, to bring in policymakers and ensure they are well versed in how data collection and application really work.

But even more important than incorporating policymakers into this work is incorporating members of the community who will be directly affected by it. This was a point echoed by Jessie Posilkin, a senior adviser to the USDR who leads the Economic Stability Program.

Human-centric design practices — which are becoming increasingly common within public-sector technology work — are a great way to accomplish this. Human-centric design focuses intently on claimant needs, working to figure out what data is necessary to actually help, while also determining what data might feel unnecessary or insensitive to real people, regardless of what government has identified as its own needs in a given scenario.

“With all technology problems, it’s so critical to look at what’s the real problem for a claimant or the residents wherever you are,” Posilkin said. “It doesn’t just have to be big picture stuff, but as you build the thing, bring it to the people who are going to use it.”

Getting back to the subject of procurement, this is an approach that vendors in the public-sector space should take as well. There are also nonprofit groups doing work to combat technology and data biases, groups like Data for Black Lives and the Ida B. Wells Just Data Lab at Princeton University.

Again, though, perhaps the most important preventive measure government can take is remaining aware that technology tends to carry forward the implicit biases of those who create it. If the majority of technologists working on a project share a similar background, the work they produce may not reflect the entirety of the community that government is tasked with serving.


