Zachary Tumin is the Associate Director for Programs in Technology, Networks and Governance at the Ash Institute of Harvard University's John F. Kennedy School of Government. E-mail: email@example.com
Just when you thought it was safe to go in the open gov waters, suddenly there are a couple of sharks swimming in the implementation pond.
The first alarm comes from Clay Johnson over at Sunlight Labs. Johnson is taking the Obama Administration to task for a lack of progress on data.gov, essentially asking, "What gives, guys? What about the promises you made?"
Federal CIO Vivek Kundra conjured high hopes last spring, promising that many thousands of data feeds would be added to data.gov. That, after all, is data.gov's stated mission: to "increase public access to high value, machine readable datasets generated by the Executive Branch of the Federal Government."
I don't have the answer, but it would be interesting to learn what's getting in the way here, if anything. After all, it may turn out that 600 datasets is about where Data.gov should be by now. Is it?
If it's not, what are some of the obstacles? Is it attorneys poring over sensitive data, making sure everything accords with privacy rules and regs? Is the move from legacy data onto machine-readable platforms harder than expected? Is it a resource and cost question, with agencies willing to get around to it, but not to prioritize it? There are some thoughtful suggestions in the comments on Johnson's post.
This is not just idle curiosity. All eyes are on the data.gov initiative. Many jurisdictions are taking their own first steps or waiting to see how this big Federal push goes. (The UK's own version of data.gov is now in private beta.) Plus, it's the linking of state and local sets to Federal sets that has the true transformative potential, not to mention global linkages. Everybody sees their own fractal piece of climate, health, finance, safety - take your pick. Nobody sees the full picture. Getting them mashed up together is where the power and the glory lie.
And we haven't even talked about how to make sense of all this data when it comes together - and who should.
But it's a concern now if this next step, scaling up to "open gov" at data.gov, turns out to be harder than expected. With the first "spiral" behind us - proving the technical capabilities, whetting whistles with some captivating mashups, rolling into town on the Obama bandwagon, gaining internal executive sponsorship, assigning formal accountability, resourcing the push - this next scaling up may hold important lessons for rollouts and implementations across the nation.
Electronic Health Records
Another shark bit this week - in the same murky waters of implementation, but a different pond. All is not well in health data land. Or, rather, things may be going swimmingly for electronic health records - with $19 billion on the way - but the results of past investment seem tepid so far. Even if the records go electronic, translating that into improved performance seems not yet within our reach.
The New York Times reported last week on a new study from Dr. Ashish K. Jha of the Harvard School of Public Health and Catherine M. DesRoches of Massachusetts General Hospital. Looking at results from 3,000 hospitals, Jha and DesRoches assessed the impact of the move to electronic health records on the quality of health care and outcomes.
The results? Let's throw caution to the wind and call it "zero" - so far.
Here's an example of Jha and DesRoches's results, stratified across three different hospital types (those with advanced EHRs, those using some computerization, and those using paper-based records):
"In the heart failure category, for example, the hospitals with advanced electronic records met best-practice standards 87.8 percent of the time; those with basic computer records, 86.7 percent; and those without, 85.9 percent. The differences in other categories were similarly slender."
"Reducing the length of hospital stays... For hospitals with full-featured digital records, the average length of stay was 5.5 days; for those with basic computer records, 5.7 days; and those without, 5.7 days."
It's a little early to blame EHRs. But it's not too early to sound the warning klaxon: this ain't easy. Even if the data is open, translating openness into results has vast new requirements we haven't yet plumbed. The networks are going to be new; the performance metrics are going to be new; and the transformation possibilities will lie dormant until we understand them and take decisive action to realize them.
Dr. Jha may have an answer to his own unspoken question here: what's the role of leaders in assuring health outcomes? In yet another study - this one of the attitudes of 1,000 hospital Board Chairmen and Chairwomen regarding clinical quality - Dr. Jha found that fewer than half of the Chairs rated quality of care as one of their two top priorities. (It would be interesting to assess the performance impact of a Board chair who does prioritize quality of care vs. one who doesn't.)
Moving from Promise to Performance
Soon, someone is going to have to manage the transition from really good ideas to really good results. Justice Brandeis told us "sunlight is the greatest disinfectant." But sunlight alone is not enough. This notion of "let the machine work its magic" - whether it's simply writing laws or simply opening data - relies on precisely that: magic.
Producing results - whether assuring the move to open datasets, or assuring the translation of those datasets into improved performance - requires tough management, measurable performance goals, prioritized and incented managers, and hard work.
Are we ready for the change?