How Data Can Improve Afterschool Programs for At-Risk Youth

As cities strive to improve afterschool programs, they're using data to figure out which kids to serve and how to serve them better.
February 25, 2013

The promise of data is huge, but putting it to good use can be a challenge. That’s what local officials and community volunteers, aiming to create effective afterschool programs for at-risk students, have found as they try to harness and share all the information that’s available to them.

It starts with collecting the right kind of data. Much like regular schools, which are often trapped by a sole focus on test scores, afterschool programs can’t measure themselves by a single metric. Instead there are the ABCs, as described by Robert Balfanz, a research scientist at the Johns Hopkins University Center for Social Organization of Schools: attendance, behavior and course completion.

“You have to ask yourself: what metrics and data tell you if you’re having the impact on students that you want to have?” Balfanz said at Better Together: Building Local Systems to Improve Afterschool, an event held in Baltimore Friday and sponsored by groups including The Wallace Foundation and the National League of Cities.

So officials need to know if students are showing up at afterschool programs, if they’re behaving when they do or what factors might be affecting their behavior if they aren’t, and whether they’re finishing the work they’ve started. “That’s what they need to do to progress in school and achieve,” Balfanz said.

Then that information needs to be gathered and presented in a way that’s actionable for all the various parties who are involved in making afterschool programs work: school faculty, city officials and community volunteers.

In many ways, that can be the biggest challenge. Schools are used to collecting their own information but haven’t historically shared it with outside parties. A school district in a big city might be working with dozens of community organizations running different afterschool programs within its borders. They could all be asking for different data, or need the same information presented in different ways. The different systems also need to be able to talk to each other: one official said her school district had begun to tag all of its students who are enrolled in afterschool programs. That way, if the afterschool staff asks for all the information the district has on an individual student, it’s a simple click away. That’s a start.

But the ideal is to create an interactive early-warning system: a portal that takes in the right kind of data and processes it in a way that flags potentially at-risk students who would benefit most from afterschool activities. If a student becomes regularly truant, a flag goes up. If behavioral problems become frequent, a flag goes up. If grades drop, a flag goes up. The school district and the afterschool network would immediately know when a student needs an intervention, and they could work together, using the same data, to create an effective one -- at least that’s the hope.
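In code, that kind of rule-based flagging is simple; the hard part, as the officials note, is the coordination behind it. The sketch below is purely illustrative -- the thresholds, field names and `raise_flags` function are hypothetical, not drawn from any actual district system.

```python
# Hypothetical sketch of early-warning flag rules. All thresholds and
# field names are illustrative assumptions, not any real district's system.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    name: str
    absences_this_month: int = 0
    behavior_incidents: int = 0
    gpa_change: float = 0.0  # change since the last grading period

def raise_flags(student: StudentRecord) -> list[str]:
    """Return the warning flags that apply to one student."""
    flags = []
    if student.absences_this_month >= 4:   # regularly truant
        flags.append("attendance")
    if student.behavior_incidents >= 3:    # frequent behavioral problems
        flags.append("behavior")
    if student.gpa_change <= -0.5:         # grades dropping
        flags.append("grades")
    return flags

# The district and the afterschool network would query the same rules,
# so any intervention starts from a shared view of the data.
at_risk = StudentRecord("Student A", absences_this_month=5, gpa_change=-0.7)
print(raise_flags(at_risk))  # ['attendance', 'grades']
```

The point of keeping the rules in one shared place, rather than in each program's own spreadsheet, is exactly the "system of systems" idea: everyone sees the same flag at the same time.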

That requires a lot of coordination, though. Not only is it technically difficult, but the groups involved are used to working in their own silos. They aren’t necessarily comfortable opening up their books for outsiders. Take Philadelphia, said Tom Sheaffer, director of policy and evaluation at the deputy mayor’s office for health and opportunity. More than 100 afterschool programs are up and running there, but they all connect separately with the state department of education, he said. So Philadelphia is trying to build a “system of systems” that unites all those different data streams and offers one forum for them to be analyzed. Only a half dozen programs have bought into it so far, he said, but there is hope that more will soon come on board.

“Our experiment is if we build a strong data system and tie it to quality measurements, then people who don't quite get it yet, they'll get it,” Sheaffer said.

The last step in harnessing data is using it to create the most effective and personalized intervention possible for specific students. Standardized approaches don’t always work, the experts said. If you start a phone campaign to reach out to truant students, and only 10 percent start coming to school or afterschool more regularly because of it, you could argue it was a waste of time and energy. If a student is falling behind in general math, but you tutor them on fractions when they have a test on probability approaching, you haven’t made a big difference.

“It’s great to know which students are falling off track, but it’s not going to help you if you don’t set up interventions,” Balfanz said. “And if we can figure out which interventions work for which kids, that’s sort of the Holy Grail at the end of the day.”

But there are some early examples of what effective interventions might look like. Katie Brohawn, director of research at TASC, a group dedicated to an expanded school day through afterschool programs, shared her experience working in Baltimore, New Orleans and New York City.

At one New York City school, her team started by taking the attendance data the school had always collected and running it through an analytical model. What came out was what they called a “heartbeat chart”: regular patterns in absenteeism that school officials had never consciously noticed before. One discernible pattern was that student absences routinely spiked on the Friday before a Monday holiday, Brohawn said. So the school organized an outreach campaign to parents, encouraging them to make sure their students came to school on those Fridays.
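The underlying analysis can be sketched in a few lines: compare the absence rate on Fridays that precede a Monday holiday against ordinary Fridays. The absence rates below are fabricated for illustration; only the holiday dates are real 2013 federal holidays, and the "heartbeat" method itself is described only loosely in Brohawn's account.

```python
# Hypothetical sketch of the "heartbeat chart" idea: compare the absence
# rate on Fridays before a Monday holiday with ordinary Fridays.
# The absence-rate numbers are fabricated for illustration.
from datetime import date, timedelta

# Fraction of students absent on each school day (made-up data)
absence_rate = {
    date(2013, 1, 18): 0.22,  # Friday before MLK Day (Mon, Jan 21)
    date(2013, 1, 25): 0.09,  # ordinary Friday
    date(2013, 2, 1):  0.08,  # ordinary Friday
    date(2013, 2, 15): 0.21,  # Friday before Presidents Day (Mon, Feb 18)
}
monday_holidays = {date(2013, 1, 21), date(2013, 2, 18)}

def before_monday_holiday(d: date) -> bool:
    """True if d is a Friday and the following Monday is a holiday."""
    return d.weekday() == 4 and (d + timedelta(days=3)) in monday_holidays

pre_holiday = [r for d, r in absence_rate.items() if before_monday_holiday(d)]
ordinary = [r for d, r in absence_rate.items() if not before_monday_holiday(d)]

print(round(sum(pre_holiday) / len(pre_holiday), 3))  # 0.215 -- the spike
print(round(sum(ordinary) / len(ordinary), 3))        # 0.085 -- the baseline
```

Even with toy numbers, the pattern jumps out once the days are grouped this way -- which is the whole argument for putting routine attendance data through a model instead of leaving it in a ledger.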

That’s just one illustration of what effectively using data can look like. And it worked because the right data was collected and analyzed, it was understandable to the school officials and faculty on the ground, and it was easy to translate what the data said into an effective intervention.

“The goal is to be able to provide data for real-time decision-making, so you have to make the data easily digestible for folks. I think in the research world you can sometimes assume people will automatically understand what you're talking about and know what to do with it, but they don't always,” Brohawn said. “You need to show them it's worth their while. It has to be a mutually beneficial relationship.”