PerformanceStat: A Leadership Strategy That Keeps Innovating
Building on decades of experience, public leaders are finding new ways to tap the power of this evolving form of data-driven management.
A few months after taking office as mayor of Baltimore nearly two decades ago, Martin O'Malley launched CitiStat, the leadership strategy that built on the data-driven management ideas pioneered by the New York Police Department. In the years since CitiStat went live, the broader approach -- termed "PerformanceStat" by Harvard's Robert D. Behn -- has become one of the most important concepts in public management.
At its core, PerformanceStat involves ongoing, regular meetings between executive leadership and departments or bureaus. Participants review key performance measures and diagnose performance deficits, then decide how to fix those problems. The strategy has spread to a wide range of public-sector settings, including cities, counties, states and their agencies as well as some in the federal government. Importantly, the concept keeps adapting and innovating.
What are some of those innovative approaches? Based on our experience helping run PerformanceStat-based programs (Michael Jacobson and Melissa Wavelet) and tracking their development nationally (Andrew Feldman), here are three next-generation approaches that could be used to strengthen an existing PerformanceStat program or inform the design of a new one.
• Integrating evidence: The first innovative approach is to use the PerformanceStat process to advance evidence-based policy and decision making. That could begin by identifying the programs within your agency or jurisdiction that are backed by rigorous research. Just because those programs are research-based, however, doesn't mean that your organization is implementing them with fidelity to the model. The PerformanceStat process can provide a valuable forum to find out.
For example, Colorado's Department of Human Services used its PerformanceStat approach, called C-Stat, to track the fidelity of the state's home visiting program. In doing so, it became clear that the required monthly visits between providers and new parents weren't always happening; rates varied by provider and therefore by geography. The agency used C-Stat meetings to examine providers' monthly performance, identify fixes and track the results. Soon more providers were meeting the visiting requirement.
Another way to integrate evidence into PerformanceStat is by creating an organizational learning agenda. It's a document that identifies priority research questions for the organization -- in other words, "What do we wish we knew that would help us better achieve our mission?" PerformanceStat meetings provide the ongoing time and attention to advance a learning agenda.
• Taking meetings out to departments and bureaus: Another innovation is to "flip the script" of typical PerformanceStat meetings and hold them at bureaus rather than requiring bureau leaders to come to headquarters. That's the approach being taken by King County, Wash. Having the leaders of its PerformanceStat program -- called Operations Reviews -- go to bureaus eliminates the "being called to the principal's office" feeling that these programs can engender. More broadly, the change in venue underscores that ownership of the data and responsibility for improving results rest at the bureau level.
A related innovation in King County is to use visual management. That starts with putting the data on whiteboards. When bureau leaders put key performance metrics up in the hallway outside their offices, it sends a clear signal: These metrics, including problem areas in red, are issues we all need to focus on every day. Those leaders also organize regular staff huddles around the whiteboards to review problems that have been identified and the steps being taken to fix them.
• Structuring the program in new ways: In Baltimore's CitiStat, each meeting focused on a different city agency. When O'Malley became governor and launched a state version of CitiStat, he saw that many of the state's key challenges overlapped multiple agencies. So he structured some StateStat meetings around priority goals, not departments. An example was BayStat, which focused on improving the health of the Chesapeake Bay.
Another example comes from Louisville, Ky., which runs LouieStat under Mayor Greg Fischer. When issues arise in LouieStat meetings that require focused attention from issue experts, cross-functional problem-solving teams spend a few months diagnosing and trying to solve them. One team, for instance, focused on reducing ambulance turnaround times, meaning the time from patient drop-off at the hospital to the next run. In just four months, the team's work resulted in efficiency gains worth $1.5 million a year, equivalent to two extra ambulances in service.
A final example comes from Montgomery County, Md., which as part of its CountyStat program creates performance agreements between the chief administrative officer and department heads. Those agreements include three key measures selected by the department head that are important to the department and its customers, are areas currently showing some deficiency in performance, and represent outputs or outcomes that the department can influence.
It is an encouraging sign for results-focused government that the PerformanceStat leadership strategy has spread to a variety of settings. That said, it is still unknown in much of state and local government and could be used much more widely within jurisdictions and agencies. The good news is that public leaders who want to create a PerformanceStat strategy can adapt existing next-generation approaches -- and hopefully find ways to innovate even further.