
How Stat Got Stuck -- in the Place That Made It Famous

Using data to measure government performance has caught on in much of the country. But the tactic is in trouble in Maryland.

As governor, O’Malley often ran StateStat meetings. (Flickr/MarylandGov)
Martin O’Malley’s proudest achievement in his seven years as mayor of Baltimore and eight years as governor of Maryland was developing a new way to manage state and local bureaucracy. Borrowing from CompStat, a system used by the New York Police Department, O’Malley created for both his city and state an office of analysts that collected reams of performance data from departments and applied pressure for improvement through regular meetings and public progress reports. It was CitiStat in Baltimore, and StateStat when he took it to Annapolis in 2007.

There were reports of dramatic results, from impressive reductions in chronic absenteeism among public workers to quicker turnarounds on pothole repair requests. The Stat programs played a role in driving down Baltimore’s murder rate, improving water quality in the Chesapeake Bay and clearing a backlog of unchecked state DNA samples collected from convicted criminals. The good news spread quickly. More than 20 large cities and a handful of counties now have Stat programs, as do several federal agencies. “This data-driven approach to governing, at least among effective leaders, is becoming more the norm than the exception,” O’Malley says.

But for all the accomplishments of O’Malley’s Stat initiatives, the model is in trouble. His successor in Annapolis, Gov. Larry Hogan, has discontinued the program. Baltimore’s CitiStat hasn’t fared much better, languishing in inactivity for months, if not years, at a time. The celebrated innovation that inspired a movement of Stat-like programs from Jackson, Miss., to Washington state is struggling to stay alive on its home turf.

In 2015, Hogan issued an executive order replacing StateStat with the Governor’s Office of Performance Improvement, or GOPI. On paper, the new office sounded a lot like the old one. It would “provide accurate and timely data about the efficacy and cost-effectiveness of government services.” The order called for the tracking of agencies’ progress against established strategic goals, along with regular meetings between the governor’s office and agency heads. But in practice, it looked more like a gut job than a rebranding effort. Hogan cut the office’s budget in half, reduced the staff from nine positions to four, and moved the headquarters from Annapolis to a small town 20 minutes outside the state capital. 

The GOPI website says it “publishes information on the progress that state agencies make in meeting their goals.” But under the “Track Our Progress” tab, visitors can’t track anything. The page provides a link to the state’s open data portal -- which still gets updates -- but without performance benchmarks, it’s nearly impossible to draw conclusions about agencies’ progress by the numbers alone. If GOPI staff meet regularly with agency heads, there’s no trace of it on the website. The office does not produce public agendas beforehand or written summaries afterward. (Gov. Hogan’s office did not respond to questions for this story.) “It’s heartbreaking from a data perspective,” says Beth Blauer, a former director of StateStat who now leads the Center for Government Excellence at Johns Hopkins University. “We built performance measures into basically everything we did and we talked about them very publicly because we wanted the public to hold us accountable. And it just disappeared.”

The hollowing out of StateStat isn’t a total surprise given that Hogan, a Republican, is a longtime critic of O’Malley, a Democrat. It can be explained at least in part as routine fallout in the transition of power between parties and political adversaries. But the Stat initiative has suffered setbacks in Baltimore as well. A few months before Hogan replaced StateStat, The Baltimore Sun reported that in 2014 the CitiStat office hadn’t published a single department report and had canceled a third of its meetings. The account echoed the paper’s coverage from 2012, when it found that CitiStat didn’t publish any reports in the first two years of Mayor Stephanie Rawlings-Blake’s administration. (Rawlings-Blake did not respond to interview requests.)

When local media scrutinized the program two years ago, other problems came to light: As eventually happened with StateStat, CitiStat’s staff of nine analysts had been reduced to four. Its director was splitting his time between his full-time city government job and a part-time position with a private law firm. When reporters inquired about how many CitiStat meetings the mayor or her chief of staff had attended, her spokesman couldn’t say. “My great fear in all of this is that we are losing the accountability that our constituents and we depend on,” Councilwoman Mary Pat Clarke said at an oversight hearing on CitiStat in 2015.

Even as the Stat model struggles in its home state, it continues to spread elsewhere. The Johns Hopkins Center for Government Excellence is now involved in What Works Cities, a national consortium of municipalities that draws lessons from CitiStat and incorporates them into a broader strategy around transparency, data management and evidence-based decision-making. Performance measurement of one variety or another is an increasingly common way to run government agencies, which is why what happened in Baltimore and Annapolis may offer useful lessons for other jurisdictions. Many of the Stat programs are still being run by the elected officials who established them. They have yet to undergo the test of a political transition.

In Baltimore, for all its problems, CitiStat still exists. Catherine Pugh, who won last November’s mayoral election, is the third person to inherit the program since O’Malley left office. As a candidate, Pugh pledged to keep CitiStat “as a tool of measurement and accountability,” but also as part of a broader strategy for analyzing and responding to crime trends. Whether CitiStat undergoes a revival, or drifts further into irrelevance, will speak volumes about the model’s long-term viability as a good-government tool from one administration to the next.

It’s something that O’Malley himself thinks about. He recalls discussing the sustainability of Stat programs with Bill Bratton, the former New York City police commissioner who helped institute CompStat. “Bratton said to me, ‘You know, Martin, the hardest things to institutionalize are new systems that require constant work.’ And that’s true on policing. That’s true on the environment. That’s true across any government -- city, state or federal,” O’Malley says. “Just as easily, that rock will roll back down the hill if someone’s not pushing it up.” 

During her mayoral campaign, Pugh pledged to keep CitiStat running in Baltimore. (AP)

In Baltimore, the man tasked with breathing life back into CitiStat is Sameer Sidh. His most recent post was with the city’s department of transportation, but he was a director of StateStat in the O’Malley administration. Since assuming his post in late 2015, Sidh has created a Twitter account and new website for CitiStat. He’s conducting regular meetings and publishing progress reports online. “I wanted to make sure it was clear that it was an active program,” he says. “It was a mix of getting us back to our fundamentals, holding agencies accountable and making sure we were following up on discussions that we had in meetings.” 

Perhaps Sidh’s most difficult assignment has been to cleanse CitiStat of the authoritarian and sometimes corrosive top-down management structure that rankled program managers and agency heads. “The word that we’re pushing is ‘collaboration,’” Sidh says. “I’ve tried to relax the atmosphere a little bit more. We want folks to be honest. Ultimately as city hall, we want to understand what’s going on from the agency level. You can extract more information if folks are actually comfortable talking.” 

Several of the CitiStat directors under Rawlings-Blake had made similar pledges to depart from the confrontational nature of the program in the O’Malley years. Viewers of the HBO television series “The Wire” are familiar with the dramatized version of Baltimore police meetings in which senior officials would flash numbers on a screen and publicly grill subordinates about their failure to meet benchmarks. One previous director likened CitiStat in its early days to “a Spanish inquisition.” Today, the new CitiStat, Sidh says, “is not an opportunity to browbeat middle managers, but an opportunity to get better as a city government.”

One former director, Matt Gallagher, says the old critique of CitiStat as hostile and demoralizing stems from rare confrontations that usually followed months of missed targets and poor performance. “The stuff about an adversarial relationship was overblown,” he says. “You have to remember that we were holding five CitiStat meetings a week, sometimes six. That’s 250-plus Stat meetings over the course of a year. If you’re going to convene that often, if you’re going to have robust conversations about performance, good and bad, there’s going to be disagreements.”

Bob Behn, a Harvard professor who visited dozens of CitiStat meetings in researching his book, The PerformanceStat Potential, says the Stat meetings he witnessed could make subordinates uncomfortable, but they were always civil. “Nobody swore at anybody. Nobody personally belittled people,” Behn says. “What you did was ask them questions that they couldn’t answer. And if you ask them questions that they can’t answer, that’s embarrassing. You don’t have to swear at people or raise your tone. Everybody gets it.” 

Regardless of how accurate the portrayal of CitiStat as a brutal inquisition might be, it reflects a common perception of how the program operated. And it sheds some light on why a successor would want to change it. 

In Baltimore, Sidh and Pugh plan to shift CitiStat from its focus on the performance of individual agencies to an emphasis on cross-cutting policy issues, such as homelessness or blight. In Stat meetings about the city’s homeless population, for instance, staff from the police, fire and transportation departments would attend, even though ending homelessness is not currently part of the core mission for any of them. Baltimore has tested this approach in the past with issue-specific Stat groups that focused on cleanliness, child well-being, illegal gun trafficking and domestic violence. But those were the exceptions. Now they will be the main program.

Gallagher thinks that’s what needs to happen. “You think about complex outcomes like the health of the Chesapeake Bay or producing a safe, happy, healthy child,” he says, “and it’s hard to hold one agency accountable for that outcome because so many different agencies contribute to it.”

Spreading the responsibility around seems to be in line with the way the overall field of government performance measurement is evolving. Programs inspired by O’Malley’s Stat initiatives have since developed their own twists on the original concept. One of them is in Cincinnati, where City Manager Harry Black launched CincyStat, drawing on his three years as chief financial officer in Baltimore. Black decided to place his Stat program in a larger Office of Performance and Data Analytics. Once his staff identifies a troubling trend in a Stat session, they refer the issue to another part of the office called the Innovation Lab, which uses business and process improvement techniques to address the problem. “Stat by itself is not enough,” Black says. “What are you going to do with what you discover?”

Other jurisdictions are experimenting in similar ways. In Washington state, for example, Gov. Jay Inslee sits in on monthly Stat-like “Results Review” meetings where his analysts discuss strategy and performance data with senior agency officials. But his Results Washington program also includes “Lean” process improvement training to equip state employees with the skills necessary to address problems that arise in review sessions. 

In Baltimore, meetings under the last administration had slipped to six-week intervals, raising concerns from the city council that the program could no longer respond quickly to emerging trends. Yet some Stat supporters say the programs shouldn’t be judged on the frequency of meetings. That, too, evolves over time. David Gottesman, manager of a Stat program in Montgomery County, Md., says the regularity of meetings should correspond to the general performance of an agency and the urgency of resolving a particular issue. When his predecessor created Montgomery CountyStat about 10 years ago, the office held weekly or biweekly meetings with department heads and program managers. “When CountyStat was in its infancy, it was appropriate,” Gottesman says. The meetings could last three hours and required the attendance of senior officials within a department, an expensive investment of labor and resources. “Over the years, the value in those meetings naturally diminishes because you attack the low-hanging fruit and you get things in a good state.”

The exception is when a crisis hits that requires immediate attention. When the Montgomery County Department of Liquor Control received negative press in 2015 for poor customer service, CountyStat initiated routine meetings and weekly progress reports around a few key outcomes. As the department has improved in the past year, the Stat sessions have become less frequent. 

One of the potential lessons from CitiStat and StateStat is that these programs inevitably run the risk of being sidelined during a political transition. For one thing, they usually require the ongoing engagement of the executive. Both in Baltimore and in Annapolis, O’Malley not only gave the programs his blessing, but attended the sessions and sometimes ran them. His personal involvement made people identify Stat as an O’Malley project. That association has posed problems. “Nobody succeeds somebody else and wants to prove that her or his predecessor was brilliant,” says Behn, the Harvard professor. “Most people don’t come in and want to continue their predecessor’s initiatives. The most you can hope for is that they’ll keep the substance and change the name.”

Some of the Stat programs outside Maryland have tried to account for the risks of political transition. In Louisville, Ky., LouieStat has developed two layers. Agency heads hold regular sessions with their own internal analysts. Then the mayor’s office convenes more traditional Stat meetings that focus on cross-agency issues, such as pedestrian safety and citywide customer service. The goal is to transfer partial ownership of LouieStat to department heads, so that it isn’t exclusively the mayor’s program.

The way elected officials talk about Stat programs may also affect their longevity. In her role at What Works Cities, Blauer advises municipal officials that the public relations strategy around a Stat program should focus on improved outcomes that affect citizens’ lives, not the underlying machinery that made them happen. For example, New Orleans Mayor Mitch Landrieu used the Stat model to corral his administration’s efforts around reducing blight and violent crime in the city. What the public associates with Landrieu isn’t his performance management approach, but the results derived from it. “It hasn’t been ‘Mayor Landrieu, NolaStat Mayor,’” she says. “It’s been ‘Mayor Landrieu, the mayor that’s getting hard work done in the city of New Orleans, and these are just the tools that he’s relying on.’” 

If successors see Stat as a standard tool to achieve their policy objectives, they may be more likely to keep it. That is what Sidh hopes will happen in Baltimore. “This has to be broadly respected as a good-government practice and not just the creation of one specific political figure,” he says. “Regionally and nationally, I think you’re starting to see that tide turn and people understand that the value of the program goes beyond the one person who gave the program its name.” 

J.B. Wogan is a Governing staff writer.