Lately I've been hearing more and more people in the health and human services world use the phrase "evidence-based practice." This represents real progress in a field that's been widely resistant to the concept.
Why the resistance? The argument among those in the health and human services game was always that any results they might achieve were so heavily dependent on human behavior - that is, changing human behavior - or socioeconomic factors beyond anyone's control that it simply wasn't fair to hold those administering and delivering health and human services to the same standards as the people filling potholes, processing small business tax returns or arresting bad guys.
I never had much patience with that argument. After all, if you're accepting public money to run a program that is supposed to reduce drug and alcohol addiction among your clients, or to help mentally disabled adults find work, then it seems perfectly reasonable that you ought to be held accountable to those you have agreed to help.
But that kind of resistance to data-informed policy and practice seems to be fading, and fading quickly. From juvenile justice to hospital care to children and family services, more and more people seem to be figuring out that ignoring data is a losing strategy when it comes to helping people become healthy, productive, engaged citizens.
In fact, one of the most coherent and articulate mini-disquisitions on the necessity of managing to data I have heard lately was delivered to me just last fall by Juanita Beasley, a front-line foster care supervisor in Jefferson Parish, Louisiana. This woman meant business. "In the past we invested in services and had no idea if they were making a difference," Beasley told me. "We had no quantitative way of knowing if we were getting any bang for our buck. Now we're using data to manage our investments; to see if what we're investing in is really helping kids."
Governments are getting into the evidence-based policy and program game in different ways. Some agencies are focusing on a handful of what are sometimes called "leverage" measures - single measures that actually indicate improvement in a wide range of practices. Other governments are trying to track lots of indicators as a way to assess progress. Virginia is a good example of the former approach, Massachusetts of the latter.
In Virginia, outgoing Governor Tim Kaine is handing off that state's ambitious and promising children and family services transformation effort to the incoming administration of Governor Bob McDonnell. A hallmark of the Virginia transformation initiative is that it is centered on a mere six key outcome measures. One of them is the percentage of kids in congregate or residential care. It was a smart number to focus on for two reasons. First, congregate care has been clinically proven in most cases to do more harm than good, particularly when stays extend beyond three months (it also happens to be the most expensive type of foster care any jurisdiction can provide). Second, if local social service agencies are reducing the number of kids in congregate care, it means those agencies have changed their practice models more broadly - that they are now focusing either on keeping kids out of the system in the first place, or on placing kids in more appropriate, family-like settings.
In other words, by focusing on the right data point, Virginia has not only reduced the percentage of kids in congregate care from more than 25 percent to around 18 percent - and saved the state and localities a lot of money in the process - it has also succeeded in changing the behavior of local government service providers across the system, inspiring them to emphasize more clinically therapeutic approaches to foster care. That's leverage.
Massachusetts, by contrast, is in the midst of a massive and ambitious effort to start tracking performance measures across its entire Executive Office of Health and Human Services through what it is calling "EHSResults!"
The EOHHS push to track performance includes 17 agencies that provide services ranging from helping the elderly, to managing Temporary Assistance to Needy Families, to helping the institutionalized disabled move to community-based care. Underpinning the push to improve services, EOHHS has laid out 25 major goals, 75 sub-goals and scores of specific outcome measures related to all those loftier aspirations.
Massachusetts is in the early stages of EHSResults!, so it's too early to tell whether it's going to actually refocus the system on evidence-based best practices or just overwhelm those in the trenches who ultimately have to deliver on all those mini-promises laid out in detail in the reform effort. But the leadership at EOHHS clearly understands the double-edged sword of data. "We've re-thought some measures either because they just weren't a good measure or in some areas we had too many measures," says Secretary of Health and Human Services JudyAnn Bigby, who is responsible for launching the effort. But Bigby says her agency heads have clearly gotten the message that she will be looking at the numbers, and adds that the state is already seeing some key indicators moving in the right direction.
Is one approach better than another? More measures versus fewer? My guess is that as EHSResults! evolves, Massachusetts will figure out that when it comes to performance measures, one is better than a dozen, and a dozen are better than 100.
But the Virginia and Massachusetts efforts bear watching closely because both are clear and convincing evidence that evidence-based practice is coming to a jurisdiction near you, and that excuses for why it shouldn't apply to health and human services just aren't washing anymore.