
The Management Challenge of Bad Data

Obstacles to producing solid, accurate numbers to measure program performance are everywhere.

State and local government officials need good numbers to make good decisions. That ought to be an obvious enough point. As much as some managers may resist using anything that's dubbed "performance measurement," they don't challenge the importance of information. When it comes to the scarcity of accurate data, a lot of them sound as thirsty as Coleridge's Ancient Mariner: "Water, water everywhere, nor any drop to drink."

And yet it surprises us how often government officials have a bad-data albatross hanging around their necks. Not long ago, for example, Florida put private companies in charge of determining the appropriate Medicaid services for clients in several of its counties. If these providers could save money and demonstrate that there was no decline in services--the problem that patient advocates most feared--the program would be expanded into more counties.

Sounds great, but when it was time to start making decisions about expansion, the state was foggy about what had really gone on. The necessary information was missing in action. As a result, regardless of the potential of this program, Florida's legislative Office of Program Policy Analysis and Government Accountability (OPPAGA) wisely suggested deferring any expansion until the evidence was in. "To date, little data is available to demonstrate that Medicaid Reform has improved access to and quality of care," OPPAGA reported. "Little data is yet available on whether Medicaid Reform has produced cost savings or is more cost-effective than traditional Medicaid."

This obstacle to assessment is a familiar one. "If you have something that is going to be evaluated," says Gary VanLandingham, OPPAGA's director, "then it would be good if the folks who were designing the programs would bring in some data folks at the beginning." As VanLandingham points out, it's difficult to retrofit management systems to capture data that a program should have been collecting from the start.

In many instances, the data does exist: It's just wrong, misguided, misleading or some combination of the three. The Texas State Auditor's office frequently reviews state performance measures and has found serious problems in some agencies. While some entities, such as the School for the Deaf and the Board of Nursing, do a solid job of keeping score, others don't. The Board of Dental Examiners, for instance, showed unreliable results for eight of its 12 key performance measures.

John Keel, the Texas auditor, believes his reviews help to improve the accuracy of the data. He explains that, for years, many Texas agencies didn't pay much attention to the quality of the information they put out. It simply wasn't a high priority.

This strikes us as remarkably shortsighted. Spending a lot of time to create too much data may be a problem. But spending any time at all to produce bad data is a total waste.

Maryland provides another example. When its Office of Legislative Audits tested 35 cases involving offenders said to have completed substance-abuse treatment in 2007, it found no documentation for 24 of them. In fact, 16 of the 24 "had not been enrolled in a substance-abuse treatment program" in the first place. That makes it kind of tough to come up with documentation.

Bruce Myers, the state's legislative auditor, says it isn't usually very difficult to get agencies to provide internal financial information. But the job becomes quite a bit tougher when it requires dealing with outside contractors, or with other levels of government or school systems. In those cases, says Myers, there's an unfortunate tendency for data on the amount and quality of work done to be "just flat out wrong or inaccurate--or there's a lack of documentation or some other factor that won't allow us to certify that it's accurate."

Obstacles to solid, accurate numbers crop up everywhere, as Tim Maniccia learned when he became director of operations for Albany County, New York, in 2007. He discovered that much of the information the county needed was controlled not by its own managers but by the state of New York. Plenty of good data existed; it just sat in systems the county couldn't reach. That problem has since been alleviated: Albany County has developed tools to tap the state-owned data. But it remains an issue elsewhere.

If good, reliable data has been scarce in the past, we fear things are only going to get worse. As Florida's Gary VanLandingham puts it, "when times are tough, data systems and data folks and evaluation units are some of the first that get cut. They're not seen as direct service-provision folks." For example, when Florida's budget called for across-the-board reductions this year, the state's Department of Children and Families cut several evaluation units. "They don't really have one now," says VanLandingham. "It's easy to cut a program like that in one year, but then it takes years to build it back up."
