A Tipping Point on Evidence-Based Policymaking

More and more, governments are turning to data to answer a crucial question: What works?

In 2012, I wrote in this space about an initiative being piloted in a handful of states to help policymakers, through the use of rigorous evidence and benefit-cost analysis, prioritize funding to programs that are most likely to produce positive results. At the time, many state and local governments were struggling to balance their budgets and policymakers were eager for an alternative to across-the-board cuts.

Fast forward to 2015, and 19 states and four counties have collectively directed $152.1 million to evidence-based programs, with an estimated $521.3 million in return on investment -- roughly $3.40 in projected benefits for every dollar spent. These governments have adopted an innovative and rigorous approach to policymaking: create an inventory of currently funded programs; review which ones work based on research; use a customized benefit-cost model to compare programs based on their return on investment; and use the results to inform budget and policy decisions.
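To make the comparison step concrete, here is a minimal sketch of the kind of calculation a benefit-cost model performs. The program names and dollar figures below are hypothetical illustrations, not figures from any state's actual model.

# Illustrative sketch only: a toy version of the benefit-cost comparison
# described above. Program names and dollar amounts are hypothetical.

def roi(benefits: float, costs: float) -> float:
    """Estimated dollars returned per dollar of program cost."""
    return benefits / costs

# Hypothetical program inventory: (name, projected benefits, annual cost)
programs = [
    ("Cognitive behavioral therapy", 4_500_000, 1_200_000),
    ("Correctional education", 2_800_000, 1_000_000),
    ("Supervision only", 900_000, 1_100_000),
]

# Rank programs by return on investment, highest first, to inform budgeting.
ranked = sorted(programs, key=lambda p: roi(p[1], p[2]), reverse=True)

for name, benefits, costs in ranked:
    print(f"{name}: ${roi(benefits, costs):.2f} returned per $1 spent")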

Through a partnership with the Pew-MacArthur Results First Initiative, states such as Mississippi and New Mexico have joined the state of Washington -- long considered an innovator in evidence-based policymaking -- in developing policy frameworks that support effective programming and building the capacity to use analytical tools that inform the budget process.

In 2014, Mississippi passed legislation establishing evidence standards for evaluating the state's corrections, health, education and transportation programs according to their predicted effectiveness. The state's budget instructions now also require executive agencies to justify funding for any new program by identifying evidence of its effectiveness. Mississippi policymakers expect to use this information to bolster the state's reinvigorated performance-based budget system.

Since 2013, New Mexico has used its customized benefit-cost model to compare the return on investment among programs in key policy areas, including criminal and juvenile justice, early childhood, and child welfare. The state, through its budget process, directed $104.4 million to the most effective programs.

Not surprisingly, city and county governments, often the leaders in finding innovative ways to improve government performance, are becoming increasingly interested in evidence-based policymaking. Earlier this year, Bloomberg Philanthropies launched its What Works Cities initiative, offering incentives to 100 city leaders to create innovative models for using data and evidence that will improve the lives of their residents.

Four California counties are also partnering with the Results First Initiative to bring an evidence-based approach to reducing recidivism. The impetus for this work was a major restructuring of the state's criminal-justice system that shifted much more responsibility to counties. Responding to a significant increase in the number of offenders serving time in their local jails or supervised in their communities, county leaders were eager to find tools to help identify effective programs for serving these populations.

In Santa Barbara County, leaders used information from a recidivism analysis -- a critical step in developing their customized benefit-cost model -- to determine that 63 percent of high-risk offenders were re-convicted within seven years, at a cost of $63,000 per offender. The next step was to look for programs that were most likely to reduce recidivism and to compare their costs and benefits. Armed with that information from the model, the county reallocated funding to serve 75 percent of high-risk probationers through evidence-based cognitive behavioral therapy programs.

Other California counties have used a national database, which pulls information from eight national research clearinghouses on the effectiveness of more than 1,000 interventions, to identify programs shown to work and compare them with the ones currently funded in the county. For example, leaders in Santa Cruz County used the database to determine that their correctional education programs lacked key components that research has shown are necessary to be effective, and are working to better align their services with proven practices.

Kern County, in the state's agricultural interior, previously operated only a handful of inmate-focused programs -- none of which were evidence-based -- largely because offenders stayed in county facilities only for short periods of time. But when county leaders needed to find ways to better target scarce resources, they allocated funding to support five evidence-based programs that are currently operating in local jails and dedicated Probation Department staff to ensure that programs are implemented effectively and achieve the desired results.

As information on "what works" continues to grow, and as tools that make this information more accessible and understandable come into wider use, more policymakers are likely to expect and demand evidence when deciding which programs to fund and which policies to support. Meeting this demand will be challenging, but that's a good problem to have.

Executive vice president and chief program officer for the Pew Charitable Trusts