Performance Management Does Away With the Whip

The way governments are measuring results is becoming kinder -- and more effective.

The practice of performance measurement has long been an unfriendly process. Goals were set; data points were hauled out to see how they measured up against the targets. Agencies were called on the carpet and pressed to do better. 

But we’ve noticed a trend in recent years toward a kinder, gentler brand of performance management. 

“There was a ‘gotcha,’ hammer-like approach that was creating a culture problem,” says Beth Blauer, executive director of the Center for Government Excellence at Johns Hopkins University. Blauer, who is familiar with the performance culture in scores of cities, adds that “governments were creating rigid performance routines and were focused on making sure templates were populated.” Less attention was paid to how the data could actually be used to greater effect across the organization. “Thinking it’s all the data and not the discourse that follows,” Blauer says, “is total bull.”

It was an attitude that Larisa Benson, the host of the Government Performance Consortium in Washington state, knew all about when she was director of government management and performance under Gov. Christine Gregoire. As Benson recounts it, Washington had decided to kick off its Government Management Accountability and Performance (GMAP) sessions by having department heads and managers meet with the governor and other top officials to discuss the data they had gathered. 

But Benson had reservations about that approach. She and other leaders had visited Baltimore and New York City to see their performance management programs in action, and they discovered that some hard-working public officials felt diminished by the process of being drilled with questions. “It had to do with the tone of the questions,” she says, “and it didn’t make people feel like they were side-by-side looking at data.”

Benson saw how fear undermines performance. “As soon as the brain goes offline because you’re scared, your creative energy is gone,” she says. “When everyone is worried about winning or losing at the budget game, they’re not putting their energy to critical thinking.”

She and her team brought that insight back to Washington and tweaked the setup of the GMAP meetings. Although the Washington meetings were challenging, leaders were very deliberate about facing problems together instead of facing off against each other.

Here’s another variation on the same theme. The New Orleans Fire Department was having difficulties meeting its performance targets for response time, one of the primary measures being used to evaluate the department. Oliver Wise, who was director of the Office of Performance and Accountability there until late 2017, recalls that the fire department seemed to regard the work done by his group as “nothing but grief. We were a thorn in its side.” 

During Wise’s tenure, there was a major house fire that killed three people, including two children. The house, it turned out, didn’t have any smoke detectors. So rather than simply measure how long it took to get to a fire, the performance team created a model that identified neighborhoods most in need of smoke alarms. “That’s the population that is most likely to die if there is a fire in the house,” Wise says.  

Houses identified by the model received free smoke detectors. Shortly thereafter, a fire broke out in one of them. Eleven people escaped. No one died. “Having data that gave them actionable intelligence empowered the firefighters,” Wise says, “rather than leaving them feeling berated.”

Many cities, notably those that have been accepted into the Bloomberg-funded What Works Cities project, now aim to use performance data as a tool, not as a whip. 

We’ve learned that lesson, too. Some time ago, when we were involved with Governing’s Government Performance Project, we gave cities and states good grades if they could demonstrate that they were gathering outcome measures and making them widely available. Having seen how the world has changed, we’d never do it that way again. Instead, we’d dig far deeper into the actual changes an organization makes as a result of the data it gathers.
