Can Education Data Build the Perfect Teacher?
How student test scores are used to evaluate teachers is at the heart of the unresolved issues behind Chicago’s first teachers’ strike in 25 years.
At the beginning of every semester, Pam Williams logs onto Georgia’s Statewide Longitudinal Data System (SLDS) to assess her incoming high school pupils. She sifts through mounds of information—from attendance records to disciplinary reports to scores on state exams—and identifies potential areas of concern. Maybe a student has a history of being frequently absent. Or maybe he or she has recently transferred to Appling County, where Williams has been a social studies and economics teacher for nearly 20 years. She will refer back to that data and collect her own throughout the year, administering pretests to pinpoint the strengths and weaknesses of her students as they begin her classes. As Williams explains, if a preceding U.S. government teacher already imparted the essential knowledge about the federalist and anti-federalist movements during the nation’s infancy, why rehash it? Better instead to delve into the principles of American democracy, a critical topic for senior students on the doorstep of adulthood.
“Anything that will help me better understand where my students are or where they’ve been will help me be more effective as a teacher,” Williams says. “I can learn more about each student, so that when I am in the classroom with the student, I’m spot-on.”
Test scores, for example, can be useful in discovering deficiencies and subject areas where students may need extra help. But Williams says “there is so much more than test scores” available to her on the SLDS dashboard, initiated in 2009 through a federal grant. If a student’s records show chronic poor attendance, Williams will make an extra effort to build a rapport with him or her in the first few weeks of the year. Or if she notices an incoming student who’s moved around a lot—one recent student had moved 12 times during four years of high school—it might be necessary to engage with his or her parents to ensure a smooth transition. (Williams’ dedication hasn’t gone unnoticed: In 2011, she was honored as the Georgia Teacher of the Year.)
At a time when the use of data in education is gaining momentum, many teachers are wary that it could be used as grounds to determine their pay or dismissal. But Williams embraces the change. In doing so, she’s become something of an ambassador between advocates pushing for more data-driven education and the educators who will be affected. In January, she traveled to Washington, D.C., for the National Data Summit convened by the Data Quality Campaign, a nonprofit funded by such philanthropies as the Bill & Melinda Gates Foundation and the Broad Foundation. Panelists at the summit included U.S. Secretary of Education Arne Duncan, Kentucky Commissioner of Education Terry Holliday and former D.C. Public Schools Chancellor Michelle Rhee. Among all the speakers, Williams was the only current teacher. “Teachers tend to trust other teachers,” Williams says. “When a teacher tells you, ‘Man, I’m using this and it’s working,’ other teachers tend to listen.”
There’s been a wide effort in recent years to incorporate data into the classroom. According to a recent report from the Data Quality Campaign, 40 states provide school principals with student longitudinal data, while 28 do so for teachers. Forty states offer feedback or growth reports to teachers based on student performance data, although only eight require teachers and principals to be data-literate. The federal government has provided its share of incentives: State applications for Race to the Top funding and No Child Left Behind waivers were required to include data elements in teacher evaluations and feedback. Through the Statewide Longitudinal Data Systems Grant Program, the U.S. Department of Education has awarded more than $500 million since 2005 to 41 states and the District of Columbia to develop those systems.
Much of the debate about student performance data has been focused on teacher evaluations. But for Williams and others like her, there’s a much larger, more fundamental question: Can administrators, policymakers and educators use objective data to create the perfect learning experience?
The first step in getting teachers to use data is making sure they know how to read it. The Oregon Direct Access to Achievement (DATA) Project is perhaps the most comprehensive statewide effort yet to educate teachers about how such objective analysis could help them hone their craft. Founded in 2007 and funded through federal grants, the project’s goal is to improve the ability of administrators and teachers to study the wealth of data pouring in from education agencies and apply it to curricula and instruction. It started with two-day training sessions across the state. After a few years, successful teachers were certified as “data coaches” for their peers. To help, the state has amassed an online warehouse of training videos and other resources on best data practices.
At the outset, “the waters were a little muddied,” says Mickey Garrison, who, as the director of data literacy at the Oregon Department of Education, oversees the project. “Teachers had to be able to analyze and make sense of the data before they could use it.” A 2011 independent analysis of the project by the University of Arkansas concluded that teachers had “made tremendous and swift progress” in increasing data-driven decision-making in their classrooms, and students “had likely benefited from these increases.”
Garrison had seen firsthand the potential of a comprehensive understanding of educational data. During her four years as an elementary school principal in Roseburg, Ore., before the data project was founded, she and her staff worked together to use the information already available to them. Patterns in student test scores helped identify gaps in instruction and students in need of specialized attention. Tutoring groups were formed, and schedules were altered to allow additional time for difficult subjects. Test scores quickly improved. Garrison was soon tapped to lead the same effort statewide.
While Oregon is targeting teachers already in the classroom, Tennessee is putting data to use before teachers ever get there. Since 2007, the state’s Higher Education Commission has released an annual report card on the effectiveness of teacher training programs within the state. Teachers who are new to the classroom (less than three years out of college) are assessed based on the performance of their students. That data is then used to evaluate the training program that produced the teacher. The 2011 report card singled out several public universities—Tennessee State University, University of Tennessee at Martin and Middle Tennessee State University, to name a few—for producing teachers with lower student achievement gains than their peers. As part of Tennessee’s Race to the Top application, which earned the state a $501 million grant, the report card was revamped in 2011: Teacher prep programs that produce high-quality educators could see an increase in state funding. Struggling programs will be given more time to improve, but they could ultimately be decertified if they fail to show progress.
Tennessee is one of only five states to tie teacher performance data with their training programs; Louisiana became the first, in 2004. “It’s a pre-emptive approach, instead of waiting until teachers are being evaluated out in the classroom,” says Richard Rhoda, executive director of the Tennessee Higher Education Commission. “This is a way of getting the information back to where those teachers were educated.”
Funding data programs like these can be a problem for some states, despite help from federal grants. That’s been the case with California’s Longitudinal Teacher Integrated Data Education System (CALTIDES). The system, approved by the state Legislature in 2006, was designed to match teachers’ credentials with the courses they teach, to ensure that each educator was placed in the appropriate classroom. But in 2010, then-Gov. Arnold Schwarzenegger vetoed six months of funding for CALTIDES as part of nearly $1 billion in cuts to address the state’s ballooning budget deficit. Last year, Gov. Jerry Brown vetoed the full $2.1 million budget for CALTIDES, saying he believed that school districts could monitor their own teacher data. The program has been indefinitely suspended, and the state was also forced to return a $6 million federal grant tied to the effort. California’s struggle to develop seamless and sustainable data systems has been cited by the U.S. Department of Education in its denial of the state’s request for Race to the Top funding.
The biggest challenge may not be fiscal. It may be the effort to sell teachers on the notion that data can help them, rather than simply cut into their pay or lead to their firing. At the National Data Summit in January, former Tennessee Gov. Phil Bredesen, who signed the higher-education data program into law, said most people still view data as “primarily punitive. That’s not very persuasive.”
Georgia’s Williams acknowledges that such sentiment exists. For many teachers, she says, “data” still means an impenetrable stack of boxes in a closet in the school office. Convincing veteran teachers that technology and the data it provides can be harnessed to make them better educators will be difficult.
It’s worth the effort, says Education Secretary Duncan. “We want teachers to feel empowered,” he said at the data summit. “We want real feedback about where we’re strong and where we’re weak. This shouldn’t be an additional burden for teachers.” The goal, he said, is simple: “We’re trying to change behaviors.”
“Data has been made to seem like the enemy,” says Williams. “In reality, every teacher in the classroom went into teaching because he or she believed they could make a difference in the lives of young people. If data can help me do that more efficiently and more effectively, then it can only be my friend.”
10 State Actions to Ensure Effective Data Use
1. Link K-12 data systems with early learning, post-secondary education, workforce, social services and other critical agencies. (Currently in place in 11 states)
2. Create stable, sustained support for robust state longitudinal data systems. (27 states)
3. Develop governance structures to guide data collection, sharing and use. (36 states)
4. Build state data repositories that integrate student, staff, financial and facility data. (44 states)
5. Implement systems to provide all stakeholders with timely access to the information they need while protecting student privacy. (2 states)
6. Create progress reports with individual student data that provide information educators, parents and students can use to improve student performance. (29 states)
7. Create reports that include longitudinal statistics on school systems and groups of students to guide improvement efforts at the school, district and state levels. (36 states)
8. Develop a purposeful research agenda and collaborate with universities, researchers and intermediary groups to explore the data for useful information. (31 states)
9. Implement policies and promote practices, including professional development and credentialing, to ensure educators know how to access, analyze and use data appropriately. (3 states)
10. Promote strategies to raise awareness of available data and ensure that all stakeholders, including state policymakers, know how to access, analyze and use data appropriately. (23 states)