Programs Like D.A.R.E. and Scared Straight Don't Work. Why Do States Keep Funding Them?
There's a better way for governments to find and fund effective initiatives.
In early April, the fifth graders at West Side Elementary School in Worland, Wyo., celebrated their graduation from the school’s D.A.R.E. program -- short for Drug Abuse Resistance Education -- with certificates and T-shirts. County sheriff’s deputy Colleen McClain was proud of the program. “It’s giving them ways to help them not do drugs,” she said.
Efforts like McClain’s have a big fan in Attorney General Jeff Sessions, who told a D.A.R.E. conference last year that he firmly believes the program saves lives. “Your efforts work,” he said. D.A.R.E. programs date back to the 1980s, and at one point, three-fourths of the nation’s school districts had them. A generation of school kids can repeat back D.A.R.E.’s central message: Just say no.
But there’s one problem. Study after study has shown that D.A.R.E. doesn’t work. Most analyses have found that it has little to no impact on reducing drug use -- and one study even showed that use increased. “I don’t get it,” one D.A.R.E. executive director said of the findings. “It’s like kicking Santa Claus to me.”
Another favorite program, Scared Straight, which pulls at-risk kids off the streets and puts them in prison for a day, has also been found to be ineffective. The idea behind the program is to show kids what life behind bars would be like -- to, in effect, scare them straight. It seemed like such a good idea. A 1979 documentary on the program won both an Emmy and an Oscar. It spawned programs across the country, as well as a long-running television show on A&E. But despite the program’s popularity, the evidence is clear: The program did prove effective -- at producing more criminals. Efforts to scare kids straight not only failed to keep them out of jail but, in some cases, actually increased the odds they’d end up behind bars.
Still, many state governments doggedly stuck with the program until the Justice Department warned they could lose federal funding if they remained committed to something that the evidence proved was ineffective. Today, South Carolina alone still has a version of a Scared Straight program.
The federal Commission on Evidence-Based Policymaking has been campaigning to bring more -- and better -- analysis to policy decisions. It has pointed in particular to the value of randomized controlled trials, or RCTs, in which individuals are randomly assigned to either a treatment group or a control group, so that any difference in outcomes can be attributed to the program itself. RCTs produced the findings on D.A.R.E. and Scared Straight, and they’re widely considered the “gold standard” for policy analysis.
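For readers who want to see the logic of an RCT concretely, the core idea -- random assignment, then a simple comparison of outcomes between groups -- can be sketched in a few lines of Python. The numbers below are purely illustrative assumptions, not data from any real study of D.A.R.E. or Scared Straight.

```python
import random
import statistics

def simulate_rct(n=1000, base_rate=0.20, program_effect=0.0, seed=42):
    """Simulate a bare-bones RCT.

    Each of n participants is randomly assigned to the treatment group
    (the program) or the control group, then a binary outcome (1 = drug
    use) is drawn. base_rate and program_effect are made-up numbers for
    illustration only.
    """
    rng = random.Random(seed)
    treatment, control = [], []
    for _ in range(n):
        # Random assignment is what makes the two groups comparable.
        group = treatment if rng.random() < 0.5 else control
        rate = base_rate + (program_effect if group is treatment else 0.0)
        group.append(1 if rng.random() < rate else 0)
    # The headline result of an RCT: compare average outcomes by group.
    return statistics.mean(treatment), statistics.mean(control)

treat_rate, ctrl_rate = simulate_rct(program_effect=0.0)
```

With the true effect set to zero -- roughly what the studies found for D.A.R.E. -- the two groups' drug-use rates come out nearly identical, differing only by random noise. A real evaluation would add a significance test on that difference, but the structure is the same.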
But most state and local governments can’t afford them: RCTs are very expensive, in large part because they require skilled analysts. As a result, many local government officials get sucked into programs that are backed by strong constituencies but offer no evidence of effectiveness. These governments essentially end up pouring enormous amounts of money into programs that don’t work.
The GovEx center at Johns Hopkins University believes its “Roadmap for Policy Change” can help communities do better. The roadmap advances a viewpoint that’s considered heretical in some parts of the world of analysis. It says: “When you can’t get rigorously tested, experimentally verified information, it is appropriate to work with what we do have.”
The roadmap suggests communities look to others for stories about how they’ve cracked tough problems and how they’ve navigated tough local political battles. GovEx certainly doesn’t argue against sophisticated analysis. But it contends that cities have to start somewhere, that they don’t always have the time or talent for mega-studies and that they often need to act before the big guns of policy analysis have produced big findings.
For example, Kansas City, Kan., was looking for fresh ideas to address urban blight. Its local government staff fanned out across the internet and, armed with the results of Google searches, dug up efforts in cities such as Baltimore, Mobile, Ala., Memphis and New Orleans. After putting the results through a filter -- Did any of these ideas make sense for Kansas City? -- they found that Chicago’s experience suggested revisiting local vacant-building ordinances.
Randomized controlled trials can certainly help cities stage a major breakthrough, like Denver’s innovative social impact bond to attack the problem of homelessness. (See “For Money or For Good?,” page 52.) That’s proven a big success, and the results are backed up by an RCT. But when big problems challenge small staffs and when systematic policy analysis simply can’t be done, a smart roadmap -- learning from other cities and working carefully with the evidence on hand -- may help lots of communities around the country do better. It can help them avoid the D.A.R.E. trap of chasing nifty ideas that, in practice, just don’t work.