In Rich Siegel's view, there isn't a more overrated document in state and local government than the comprehensive annual financial report -- "CAFR" in public finance lingo. So much time and effort is spent on tracking money in and money out, says the head of Bellevue, Washington's budget office, and so little time and effort spent trying to figure out if the money bought anything worthwhile. CAFRs, Siegel scoffs, are little more than "the government accountant full-employment plan."
Much beloved by ratings agencies, which care first and foremost about whether a government can make good on prospective financial promises, and required for any government that wants an official clean bill of financial health, comprehensive annual financial reports have for more than 50 years been the annual official seal of approval for how a government is operating. Over the past decade, though, a small but growing group of state and local officials have been arguing that CAFRs don't tell anything like the whole story when it comes to whether government is really doing the job.
"CAFRs are very useful on a certain level," says Michael Matthes, assistant city manager for Des Moines. "But they're not a good mechanism for communicating to the public what they're getting for their money. Just because the dollars add up, that doesn't mean the money was spent well."
Which is why both Bellevue and Des Moines have joined the ranks of governmental entities that have begun to produce a different kind of annual report, known generically as "annual performance reports."
The thrust of annual performance reports is to communicate to citizens and policy makers the real story behind the mind-numbing columns of dollars included in the classic CAFR, by compiling a whole new set of data on what government is doing, whether it's doing any of it well, and -- perhaps most important to the governments that have committed to performance reports -- what citizens think of it all.
Bellevue's report, for example, looks at the basics: crime statistics, water quality, response times for 911 calls, and so forth. But it also includes survey data from citizens, asking them to rate the city in broad areas such as "neighborhood livability" and traffic congestion.
It is the citizen survey portion of such annual retrospectives that probably comes least naturally to governments, however. Doing statistically valid citizen surveys -- some jurisdictions also do focus groups and town meetings -- can be expensive and time-consuming. But if it sometimes seems like a lot of work getting citizen input, Siegel argues it's worth the trouble because too often public officials are off the mark when it comes to what local residents really care about.
For example, Siegel recalls being in on a discussion about developing some basic measures to report back to the Federal Highway Administration on state and local transportation programs. "These guys are talking about all these technical specifications around road and highway construction, and I'm arguing with them because citizens don't care about any of that, they want to know how long it's going to take them to get from point A to point B!"
Even if that seems obvious, Matthes agrees with Siegel that government officials -- including elected ones, who in theory are the most tuned in to citizen concerns of all public officials -- can badly misread voter interests and concerns. Indeed, you'd think that a small city council -- just seven members -- in a medium-sized city such as Des Moines would have a pretty good handle on what's most important to constituents. "Naturally, everyone figured it was all about police and fire," says Matthes. "But when we did our citizen-satisfaction survey, it turned out that streets were the number one thing that people cared about, and that streets were also the number one thing that citizens were unhappy about."
When the city council got its hands on the annual performance data and discovered the discrepancy, the response, says Matthes, was immediate. The council quickly voted to redirect $6 million to street resurfacing.
Carol McFarland, who collects and coordinates performance data for the Oklahoma Health Care Authority, says its annual performance report has proved to be a key document not only as a way of communicating to the public and politicians what the authority is doing but also as a guide in making adjustments to ongoing programs and policies.
For example, in tracking emergency room visits by Oklahomans eligible for Medicaid, which the authority administers, the question of chronic users came up. Because visits to the emergency room are so expensive, authority officials embarked on a program of identifying and contacting chronic emergency room visitors to see if a more preemptive and less expensive approach could be taken toward their health care. The result has been a measurable decrease in chronic use of last-resort medicine, which is being tracked and publicly reported through the authority's annual report. "It has been a fabulous resource," says McFarland. "We had a Medicaid task force touring the state and emergency room visits was one of the things that came up, and we were able to show that we're on top of the problem."
On the other hand, admits McFarland, the report also highlights some issues that are still in need of attention. "Child-immunization rates are something we struggle with. We're at 72 percent of the federal goal. That's been flagged as an area needing improvement."
Warts and All
Despite the anecdotal evidence that annual performance reports can have real-world value -- and ground-level impact -- they continue to be a very hard sell. Besides the work involved in trying to engage citizens, elected officials have, in general, evinced little interest in any in-depth analysis of the actual day-to-day and year-to-year performance of the organizations for which they're ultimately responsible; most seem happy to rely on that time-honored plebiscite known as Election Day to gauge broad citizen satisfaction with the job that they're doing.
As a result, governments have been slow to adopt performance reporting, either as a matter of homegrown policy, or as proposed and outlined more formally by such entities as the Governmental Accounting Standards Board (GASB), which sets all the rules and regulations around state and local financial accounting. "I don't think there's probably more than 50 cities that are really doing it," Matthes surmises.
One obvious reason for the lack of interest is that performance reports tend not to get much attention outside of government once they hit the streets. Matthes says boxes of Des Moines' first annual report wound up in the recycling bin. What's more, the Des Moines Register, a Gannett affiliate, responded to the report with a giant journalistic yawn. To date, Matthes adds, the paper has run only two stories related to the city's report, and they were both lifted directly from the document without attribution or any information about how to obtain a copy of the report. A decade ago, Portland, Oregon, famously paid the Oregonian for a full-page supplement outlining the findings of its first performance evaluation because the paper wouldn't give the report any ink of its own.
Clearly contributing to the lack of press coverage is the fact that the media are naturally suspicious of data on performance that is self-published by government. To overcome such suspicion, argues Matthes, performance reports can't gloss over bad news: "You have to report everything, warts and all." Also important to credibility, notes Gary Blackmer, Portland, Oregon's auditor, is the passage of time. He says that after years of producing annual performance reports seemingly in a vacuum, at least a small handful of serious reporters now keep the report handy as a reference for pertinent stories that come up over the course of the year.
Another part of the problem when it comes to sparking interest in performance reports clearly is packaging. Most performance reports have all the design charm of a car battery. Only the hardest of hard-core public administration jocks are drawn in by the drab pages of charts, graphs and explanatory notes.
In order to break out of the flat pattern, Des Moines' Matthes says his city tried something very different for its 2005 report, released this year. The city held a competition among local artists that drew 200 entries for opening-page art for each of the report's eight sections. The buzz generated by the competition alone was worth the effort, says Matthes. The flashy, four-color, glossy pages and high-grade design have actually made the report a local best-seller. "We're down to just a couple of boxes."
There is, however, a far more substantive issue than just presentation that may be getting in the way of the broader adoption of performance reporting: The government accounting community has been split over how to go about supporting the practice ever since the concept was first conceived.
On one side is GASB, which for more than five years has been trying to ease the government accountancy world toward what's known as "service efforts and accomplishments," or "SEA" reporting -- essentially an annual look at government action and the extent to which it has amounted to anything like progress. GASB has developed and continues to refine a set of 16 criteria for what it believes adds up to a thorough and useful annual performance report (the criteria are available at www.seagov.org/sea_gasb_project/sea_guide.pdf). But it has been a long, tough slog. Loath to mandate SEA reporting, GASB has been trying to coax, cajole and coerce governments into adopting the practice and perhaps making such reports a standard addendum to CAFRs.
On the other side of the issue is the Government Finance Officers Association, which flat out opposes the GASB push on SEA reporting. "It's the wrong approach," says Jeff Esser, executive director of GFOA. His association strongly supports the concept of performance reporting, he says, and wholly endorses the value of performance reporting as the best way to communicate to citizens what government is doing. But the adoption of such a practice ought to be driven by program and policy people and not the accounting world, he argues. "Having it as part of or as a supplement to the CAFR is the wrong place for it," he says. Nor does Esser believe there should be any sort of national-level criteria outlining how to do performance reporting. He believes that it ought to be a more organic process based on the requirements and goals of individual jurisdictions.
But whether believers in annual performance reports think that GASB's standardized SEA reporting is the way to go, or that reports should be driven by home-grown needs, the notion that information -- financial and otherwise -- should be used both to connect with citizens and to influence programs and policies is now a given for many.
These articles are part of a continuing series on public performance measurement focusing on citizen involvement. Support has been provided by the Alfred P. Sloan Foundation. Although the foundation may assist some of the programs described in these articles, it had no control or influence over the editorial content, and no one at Sloan read the material prior to publication. All reporting and editing was done independently by Governing staff.