Deeds, Data and Dollars

The idea of crafting a budget based on performance measures is catching on--slowly.

When it comes to performance-based budgeting, Carolyn Lane is convinced that Louisiana is making progress. Asked how that's come about, she responds dryly, "One retirement at a time."

As the acting state planning management director for Louisiana, Lane knows that breaking away from old budget ways--of line items and object codes--and focusing on programs and results is as much a matter of longtime legislators and budget staff moving on to other pastures as it is of giving them reasons why trying to tie dollars to deeds makes sense.

Either way--relying on new blood or actually making the case--it's been slow going to put more of a performance base under how states and localities budget. Nationally, states and localities have been engaged in something of a one-step-forward, two-steps-back approach to the practice. Still, a growing number of governments seem to be successfully--if gradually--pushing budgeting away from an exercise that is overly focused on dollars, and toward a system in which program descriptions and data on results are at least a part of the conversation come budget time.

Austin, Texas, for one, has been trying to infuse performance into its budget process for years. Its first tangible breakthrough arrived with the new millennium. "Starting with this year's budget, departments actually allocated money based on business plans, versus the old personnel costs, contracting expenses, administration and those kinds of things," says Tony Mastracci, who heads up the performance-measurement effort in the city's budget office. "We still do object codes, but we use those as efficiency measures; they allow us to analyze the cost of activities, which gives us an idea of what it costs to achieve certain outcomes."

It is still an imprecise science, one that the city will be a long time in refining, Mastracci notes, but he believes that the influence of the new approach on departments--and Austin's seven city council members--is real.

Doing departmental budgets based on business plans has, for example, inspired the city's personnel department to work much harder to fulfill its role as human resources consultant to its customer agencies, as opposed to focusing narrowly on some measure such as the number of grievances filed per year, as it did in years past. And while the fire department and the city council are still intensely interested in that old emergency-services measurement standby--response times--a new measure is now being used to assess departmental effectiveness: the extent to which fire damage is contained upon the fire company's arrival on a fire scene.

LEGISLATIVE BUY-IN

Key to making measures meaningful come budget time, of course, is getting elected legislatures--state or local--to actually pay some attention to them; that's always been difficult. Expecting busy legislators to pore over dozens of outcome measures for dozens of departments simply isn't realistic. If anything, burying them with statistics has frequently had the opposite effect. One budget official in a state well known for performance-based budgeting laments that the executive budget document currently being produced is so packed with data on results, legislators now seem to regard it as little more than a fat doorstop.

That was a much-discussed problem in Oregon--the performance-measurement pioneer--when it embarked on its "Oregon Benchmarks" program. Its much-poked and prodded performance-measurement approach has had more than its fair share of ups and downs, including a major cutback of measures in 1994. "We had one large agency that actually had more performance measures than employees," says Jeffrey Tryens, executive director of the Oregon Progress Board, which oversees the benchmarking effort.

When the state went through that paring-back process six years ago, the impact was twofold: Not only did it make the whole effort more manageable, it also allowed agencies to begin to more closely link what they were reporting by way of outcomes with the broader statewide goals outlined in Oregon Benchmarks. And what budgeters in Oregon have found is that those measures of general statewide health and well-being tend to resonate more powerfully with elected officials than do the detailed data coming out of agencies.

Indeed, a number of cities and states have found it worthwhile to develop a more global set of measures with more immediate political appeal for elected officials--measures that can be used city- or statewide to gauge progress and well-being on broad fronts, such as public safety, economic prosperity and education. Agencies can then tie their more specific performance measures directly into such global goals as a way to make the connection for lawmakers between budget allocations and ultimate results.

One state that's demonstrated particular discipline in that regard is Missouri. The state's "Missouri Show Me Results" effort sets goals and tracks progress in five highly visible areas--"prosperous," "safe," "healthy" and "educated" Missourians, and "responsible government." Under those five general areas are just 25 targeted indicators, including increasing high school graduation rates, lowering rates of infant mortality, increasing the number of jobs statewide that pay more than $10 an hour, and continually improving the state's record on employing minorities and women. "What we've found is that you can draw legislators into the discussion of agency performance measures by tying them directly into the 'Show Me Results' goals," says Mark Reading, an assistant director in the budget and planning division of the Missouri Office of Administration.

In Austin, Tony Mastracci says, the city's broad scorecard includes data related to public safety; sustainable community; youth and family; and neighborhood vitality and affordability, all of which get a good deal of attention from city councilors. But just as visible, he adds, is an annual survey that asks residents what they see as priority areas for city government, and how well they think the city is doing in addressing those priority areas. The survey results carry considerable clout, Mastracci points out, because they give a citizen's voice to the scorecard data the city collects. "So when departments make their budget presentations, they're tied into both the community scorecard as well as the citizen survey; what are they trying to accomplish and what successes they've had." At the same time, the data becomes a way for department heads to make their case, if they choose to. "Some still see it all as busy work," Mastracci says, "but others are using it to tell their story."

In Maine, state budgeters have taken a slightly different and perhaps more hard-nosed tack to the well-recognized problem of outcome measures run amok. Rather than suggest that agencies try to hitch their measures to more global statewide goals, budgeters there have simply put agencies on an outcome indicator diet. In the state's next biennial budget--which will be the first to include outcome measures with every program the legislature is being asked to fund--executive budgeters have capped the number of measures that any program can submit. "Depending on the program's size, we're capping them at somewhere between three and six," says Jody Harris, strategic planning coordinator in the Maine State Planning Office. "We want them to focus on those key measures that the legislature will really be interested in." Figuring out how to distill those golden few out of piles of possible choices is hardly magic, Harris says. "You look at where most of your money goes."

TOILING IN THE TRENCHES

Digging into what's going on across the country by way of more outcome-oriented budgeting, it's become clear that selling legislators on the idea isn't what has kept the movement alive; there is ample evidence that even in states where budget documents contain thoughtfully selected and well-presented performance data, engaging legislators is still the toughest part of the game.

What's probably been most critical to the trend is that an increasing number of executive and legislative budget staff seem to simply have tired of the old way of doing business, of operating in the dark when it comes to what a state or locality is getting for its money. And so there are now key staffers working in the trenches and behind the scenes to keep building performance measures into the budget process and budget documents.

It even appears that the momentum is sweeping up budget analysts, who have traditionally been the most preoccupied with line items and object codes and the dollars that are attached to them. In an increasing number of jurisdictions, those red-pen artists are foregoing the time-honored game of "what-are-they-trying-to-slip-past-me-this-time; they-won't-get-away-with-it" with each agency budget submission. Instead, some are actually engaging agencies in meaningful discussions of resources in terms of what an agency wants to get done.

"We've been working hard to integrate managing for results, strategic planning and performance measurement into the budget process," says David A. Treasure, deputy director of budget analysis in Maryland's budget office, "and we view our budget analysts as the point people in helping agencies make that happen."

Given such significant changes in attitude at that level, it seems more than likely that performance measures will continue to infuse the budget process in increasing numbers of jurisdictions. The extent to which legislators will ever pay attention to outcome measures when it comes time to argue about budgets will remain an open question. Arguably though, as the practice embeds itself in standard budget operating procedures, lawmakers will become more measurement literate.

"There are always things that will get funded for political reasons, regardless of outcome data," says Austin's Tony Mastracci, "or just because the council views them as important." And it will never be possible for a department simply to declare the outcomes it wants to achieve and then plug those into some magic formula that tells it exactly what it will all cost. But Mastracci is confident that his city is on the path to a more sensible way of deciding how to allocate resources: Present the city council with information on citizen and community needs in concert with what the various departments are doing to meet those needs. Then tie all that as closely as is reasonably possible to budget allocations. That at least allows the opportunity for a more informed budget debate and discussion. And for those who've embarked on the path, all of them agree it also beats budgeting in the dark.