In the late 1990s, Hickory, North Carolina, was flunking Pothole Patching 101. "Pothole patching is a big deal to citizens," says Karen Hurley, a city budget analyst, "and our numbers did not look good."
As part of a 16-city comparative-performance consortium, Hickory has immediate access to sometimes sobering statistics on its relative performance in a host of areas ranging from road maintenance to fire prevention, and even internal functions such as personnel management.
So it was easy to determine that Hickory's pothole-patching performance did not measure up when compared with its neighbors. Although 85 percent of the city's potholes were being repaired within 24 hours of being reported, the consortium average was 96 percent. (Rather than do head-to-head comparisons among members, most consortia compare to an average, which participants say allows them to see where they stand without the anxiety of being stacked up directly against a star member of the group.)
By analyzing the comparative data and learning lessons from the top performers, Hickory's highway department realized that it was using outdated techniques and equipment and was able to make a successful pitch to elected officials for a new hot-patch truck.
Hickory officials' decision to embrace bad news and use it to improve performance is probably the key characteristic of the dozens of cities that are now getting involved in regional consortia nationwide. It's still a relatively tiny slice of local governments, though. For the vast majority of cities, counties and school districts, the notion that their performance might be compared to another's is still intimidating enough -- or enough of an administrative, fiscal or political challenge -- that they seem to prefer continuing to operate in the comfort of their own small worlds.
Acting on the Numbers
Those who've decided to break out, though, contend that sharing information on procedures, performance and costs can pay real dividends. Randy Harrington, budget director for Concord, North Carolina, notes his city was able to save hundreds of thousands of dollars a year on a trash hauling contract by looking at what Concord spent on trash collection and disposal versus what other members of the consortium were spending. "We had a big solid waste contract up for renewal," he says, "and we used the benchmarking information to negotiate that, and it was tremendously helpful on both the cost and the performance side."
Specifically, the city's private-sector waste hauler wanted to jack up per-collection-point costs from $7.07 to $7.76. By showing that its costs were already higher than the consortium average, Concord was able to hold the line at $7.07 per collection point. Harrington figures that keeping the cost of collection stable is now saving the city almost $400,000 per year.
The ability to negotiate from a position of knowledge -- and therefore strength -- has certainly made the $10,000 annual cost of being part of the consortium worth it, in Harrington's view. "Really, the $10,000 has never been an issue with our city council," he says. (Concord joined the consortium in 1999.) "But you take the solid waste example. You've spent $10,000 a year to save hundreds of thousands a year."
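For readers who want to check Concord's numbers, the arithmetic can be sketched from the figures quoted above. Note that the implied number of annual collections is an inference from those figures, not a statistic reported in the article:

```python
# Back-of-envelope check on Concord's solid-waste numbers, using only
# figures quoted in the article; the implied collection volume is an
# inference, not a reported statistic.

old_rate = 7.07            # negotiated per-collection-point cost, dollars
proposed_rate = 7.76       # hauler's requested rate, dollars
annual_savings = 400_000   # approximate yearly savings Harrington cites
consortium_fee = 10_000    # annual cost of consortium membership

avoided_increase = proposed_rate - old_rate        # dollars per collection point
implied_collections = annual_savings / avoided_increase
savings_per_dues_dollar = annual_savings / consortium_fee

print(f"Avoided increase: ${avoided_increase:.2f} per collection point")
print(f"Implied collections per year: {implied_collections:,.0f}")
print(f"Savings per dollar of consortium dues: {savings_per_dues_dollar:.0f}x")
```

At roughly 580,000 implied collections a year, holding the line at $7.07 yields about a 40-to-1 return on the city's consortium dues.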
Obviously, not all the benefits of being part of such an effort are so overwhelmingly obvious, acknowledges William Rivenbark, director of the consortium, which is formally known as the North Carolina Benchmarking Project, housed in the University of North Carolina's School of Government. But since the project got rolling in 1995, Rivenbark says he has noticed a clear trend in the increased willingness of consortium members to act on the numbers the project has generated. "Of course, that's what the project is all about," says Rivenbark, "using data to improve." Last December, Rivenbark's institute released a report, "Benchmarking for Results," which includes a host of examples of how member cities used comparative data to improve performance and/or hold down costs.
The use of data to do things differently seems to signal that a consortium has matured, and in that regard Mike Lawson, director of the mother of all comparative measurement consortia -- the International City/County Management Association's Comparative Performance Measurement Program -- says his operation is clearly maturing. "It's the difference between using the numbers to wave a big foam 'We're Number One' finger in the air," says Lawson, "and using the data as a learning tool."
It's not an evolution that comes easily. Comparative measurement consortia are neither easy to put together nor to hold together, say those in the business. Changing personnel and changing priorities are just two of the reasons why consortia members may lose interest and drop out. The director of one regional measurement effort compares keeping all his member cities in the fold to "herding cats."
But with the advent of regional and statewide consortia, one problem that has long dogged efforts to create comparative consortia seems to have been solved, more or less: the old apples-versus-oranges complaint. That is, how to compare the performance of, say, St. Paul and Phoenix, when so much about each city is different, from climate to demographics.
That was always one of the more significant challenges for the ICMA consortium, says Lawson. "You have questions about how weather affects one jurisdiction's performance versus another's, or state laws, or whether one is operating in a unionized environment."
In fact, ICMA's project seemed to be stagnating, note outside observers, until the association decided to get into the regional consortium game, too. Now that ICMA is sponsoring regional consortia -- it's up to seven of them -- Lawson says there's been a significant resurgence in interest in the performance measurement program. In fact, the number of participating localities -- which each pay in the neighborhood of $5,250 per year to ICMA -- has grown from around 120 to more than 160. The various consortia ICMA oversees range from municipalities in the Puget Sound area to Virginia counties.
While regional consortia may help alleviate a lot of the comparison concerns, Hickory's Hurley believes that being part of a consortium means accepting the fact that no two jurisdictions are ever going to be exactly alike, even if they're right next door to each other. "People say they would like to compare apples to apples," says Hurley, "but how many apples are exactly the same?" Which is why even though her city is one of the smallest of the group, she still thinks that Hickory has plenty to learn from larger cities, and vice versa. "I don't care if you have a population of 100,000 or 30,000, the idea is to come away with lessons on best practices."
Also contributing to the growth in consortia is the fact that as the debates over performance measures and cost accounting have been worked out, there's just more standardization. At the same time, consortia members also argue that the whole notion of measurement has matured to the point where willing participants don't get quite so hung up on ongoing and hard-to-reconcile differences.
"One thing that our project has confirmed for us is that none of us do the same thing in the exact same way," says Kirk Bednar, assistant city manager for Brentwood, which is part of a 12-city consortium in Tennessee. "There's just some accepted fudge. Take, for example, the issue of fire department response times. We've gone back and forth on the definition, but it's been a good exercise and as a result, we're doing some things differently." The key, argues Bednar, is to be willing to sit down with other jurisdictions and discuss it all.
Still, there are often significant issues to be ironed out whenever different jurisdictions start trying to compare and contrast, even if they both get the same amount of snow every year. Trying to build consistency into what consortium members measure and report, and how each one calculates cost, can prove initially daunting.
Take, for example, the jobs of paving and sweeping streets. It would seem that coming up with a reliable, multi-jurisdictional formula for calculating both accomplishments and cost would be pretty simple. But even that can be messy, says Mark Abrahams, an expert in activity-based costing. Abrahams recently helped a consortium of localities in the greater Kansas City area to do a limited comparison project on street paving and sweeping.
"We selected those two specific activities based on the theory that they were very straightforward jobs that everyone does," says Abrahams, and therefore, the theory went, it would be easy to measure and compare among them. "But it turned out far more complicated than we thought. Do you take out a ruler and a map and figure out how far the street sweeper went? Do you measure how long they were out there for? Do you look at the odometer? How do you account for differences in equipment?" Never mind wrestling with the issues of performance quality and how each jurisdiction accounted for cost.
In the end, such issues were worked out, more or less, and participating jurisdictions seemed to get something out of the project. For example, Overland Park, Kansas, beefed up its pothole patrols and Kansas City, Missouri, adjusted the frequency of street sweeps. Nevertheless, says Abrahams, the job was tough enough that it appears to be a one-shot effort for the moment.
All consortia participants concede that such efforts involve an investment of some time and money, and that a lack of resources -- both money and staff -- might prevent some jurisdictions from joining up. Even so, North Carolina's Rivenbark calculates that for a whole city, it takes only one-quarter of one full-time-equivalent staffer to field the data for all participating departments.
Sharing Best Practices
One obvious key to a successful consortium is having a third-party coordinating body to administer the effort. "Having a third party that doesn't have an ax to grind collecting, scrubbing and reporting the data is important," says Rivenbark.
But as consortia mature, it's becoming clear that having well-scrubbed and relatively comparable data on performance and cost is only a starting point for the potential benefit of being part of one. The opportunity to sit down with peers from other jurisdictions to compare strategies and tactics is emerging as one of the most powerful benefits.
In South Carolina, Anna Berger says the consortium project she oversees at the Center for Governmental Services, which is part of the University of South Carolina, has actually moved away from the annual gathering and publishing of data that characterizes most comparative performance projects. "A year or so ago, we met with our steering committee and they wanted us to add new service areas," says Berger. "It's very time-consuming to create a new set of data and analyze and clean it, so we're moving in a new direction, away from collecting data and more toward holding local government service delivery forums in specific program areas to share knowledge. It's more about looking at best practices and less about crunching the data."
Although her consortium may be downplaying number crunching and emphasizing best practices, Berger says one piece of information of constant interest to her members is what citizens think about the job that government is doing. That's why another service her center offers is conducting citizen satisfaction polls for consortium members. "We found that city managers are really interested in that information."
And it's no wonder, because no matter where the jurisdiction or how different one may be from the other, deep potholes can lead to angry citizens, which makes for unhappy elected officials, which can result in very short tenures for city or county managers.
These articles are part of a continuing series on public performance measurement focusing on citizen involvement. Support has been provided by the Alfred P. Sloan Foundation. Although the foundation may assist some of the programs described in these articles, it had no control or influence over the editorial content, and no one at Sloan read the material prior to publication. All reporting and editing was done independently by Governing staff.