It’s not a surprise that most of the people cited in this issue’s cover story by Katherine Barrett and Richard Greene on the problems states have with missing or inaccurate data are government auditors. Their daily work brings them face to face with bad data and its consequences, and what they see alarms them. As Ohio Auditor David Yost is quoted as saying, “The poor quality of government data is probably the most important emerging trend for government executives, across the board, at all levels.”
Of course, government isn’t the only place where problems with nonexistent or distorted data are found. In a recent New York Times article on science fraud, Adam Marcus and Ivan Oransky wrote that “every day, on average, a scientific paper is retracted because of misconduct,” and that “not surprisingly, the problem appears to get worse as the stakes get higher.”
Nor is the problem only a recent one, born of our current emphasis on “big data.” Back in 1994, in her book Tainted Truth: The Manipulation of Fact in America, Cynthia Crossen detailed numerous cases of fabrication, distortion, and falsification of facts and figures by both governments and private researchers: bad information that too often made its way into public policy.
Problems with data are to a certain degree inevitable, because data are inherently tied to positive and negative consequences for the people who create and report the information. We’ve largely come to understand that fact in the case of financial data. One of the public officials cited in our cover story notes that most of the data problems in government are in the management of programs, not in financial accounting.
Accounting data are better because, beginning after the Great Depression, legislation was passed requiring financial audits and related systems of internal control. Nonfinancial data need the same level of rigor. When data matter a great deal, as in the Atlanta schools’ test-cheating scandal and in many of the situations cited in Barrett and Greene’s article, then comparable legal requirements for audits and internal controls ought to be in place.
In a recent conversation about using data to improve outcomes from social services programs, Linda Triplett, a senior staffer with the Mississippi Legislature’s Joint Committee on Performance Evaluation and Expenditure Review, told me that her key concern was verifying the accuracy of data. What is needed, she said, is accuracy, analysis, and action. She’s right. Accuracy is critical, because without it the analysis will be flawed and the action ineffective. Good government requires good data.