Apples to Umbrellas
Criteria and formulas for basic performance measures are so varied that comparing data from one state to another is often a fruitless exercise.
In July, governors from all but a handful of states agreed that it would be a very good thing to standardize the formulas used to calculate high school graduation and dropout rates. Right now, states do it any way they want. Some, for instance, only count as dropouts students who leave high school in their senior year. By this dubious definition, children who cease getting an education when they're juniors are never counted--as either dropouts or graduates. "Because of the inconsistent quality of state data on graduation and dropout rates," says Virginia Governor Mark Warner, "many states cannot account for the status of their students as they progress through high school and beyond."
Graduation rates aren't the only poorly defined area. Some years ago, when we started making efforts to compare states, cities and counties on a variety of issues, we were told that this was an impossible task--like the proverbial comparing of apples and oranges. We discovered the problem wasn't so much that governmental entities themselves are varieties of fruit but that the data you'd use to compare them are like apples and umbrellas.
Harry Hatry, director of the public management program at the Urban Institute, has helped lead an effort to develop common definitions for a number of performance measures used by local governments. A few years ago, his group began looking at response times for police and fire departments. You'd think this would be a snap. But the means of measurement varied from city to city, and police and fire chiefs were reluctant to alter the way they did things. "Everyone says, 'My city is different. I have different weather conditions, or different amounts of minorities in my population that affect it,'" Hatry says.
Over the years, we have compared public pension plans. The most obvious approach is to look at their unfunded liabilities. But, since cities and states make a variety of assumptions about the prospective returns for their plans, this seemingly easy approach doesn't work.
Some states are making worthwhile efforts to remedy the definition-comparison issue. In Michigan, leaders became weary of receiving a mishmash of numbers from the localities that are responsible for managing 110,000 miles of roads. The state found itself forced to use unreliable statistics provided by the locals to make funding decisions. "It had gotten to the point where the legislature got tired of everyone coming in with their own set of data--key data that proved their case," says Kirk Steudle, deputy director of the Michigan Department of Transportation. So, in 2001, a new Asset Management Council was created with state, county, city and regional representation. The idea was to get an accurate reading of the condition of all the roads, so that money could be distributed on the basis of need. The effort requires three separate measurements over time, and the state's database will soon have comparable information on about 43,000 miles of roads. Meanwhile, Michigan is training state and local officials on how best to use this newly available information to manage infrastructure.
Researchers in areas such as health care, infrastructure and the environment might wish to turn to the federal government for uniform numbers. Sadly, the feds have frequently given the states great flexibility in how they come up with the figures they report. Kansas, for example, looks like one of the worst states in the country with regard to clean water, while neighboring Nebraska looks like one of the best. This may be true. Or it may be false.
As Environmental Defense (formerly the Environmental Defense Fund) reports on "Scorecard," a Web site it runs, the holes in the state-to-state clean water comparison data are big enough to steer the Queen Mary 2 through. The criteria for determining whether a problem exists are not uniform. States include different categories of water bodies. What's more, not all bodies or watersheds are included. In fact, EPA and the states have assessed only one-third of the nation's waterways. So if a report on Clean Water Act assessments indicates that 18 percent of surface waters have problems, that statistic covers only the surface waters each state has chosen to include in its report to EPA.
Don't misunderstand. We think comparisons can be powerful tools for government managers. We're not falling into ranks with George Herbert, the 17th-century poet who famously observed that "comparisons are odious." The trick is that they have to be based on good numbers to begin with.