
Judging the Judges

Performance measures are coming to the tradition-bound world of state courts.

Last summer, a litigant at a Maricopa County courthouse in Surprise, Arizona, had a complaint. She had driven around for hours before finding the court. "The directions on MapQuest are wrong," she told court staff. It wasn't just an idle gripe: The woman was responding to a survey being conducted by the county. On the same questionnaire, she answered several other queries, such as whether she had been treated respectfully and whether, upon leaving the courthouse, she was clear about what to do next with her case. As it turns out, she was right about MapQuest. County employees contacted the online map company, and 48 hours later the directions had been corrected.

The survey was part of a broad new effort by the county to gauge how well its court system is working. It's not an entirely novel concept for Maricopa, which has historically emphasized good court management. But the county is embracing a set of new performance measures that many believe could transform the way courts across the country are run. The issues involved range from things as mundane as driving directions to sweeping concepts of justice and fairness.

It's taking place at an important moment in judicial administration. Courts around the country are starting to realize the benefits of performance measurement. They're learning to set formal standards for good court performance. And they're beginning to collect and analyze data in an effort to achieve them. Putting judges under a measurement system is not easy--courts historically haven't focused on performance standards--but jurisdictions are deciding that the investment is worthwhile for the accountability it provides.

20-YEAR PROJECT

The pace of change is picking up in part because the National Center for State Courts has released a performance measurement system expressly aimed at revolutionizing the way courts manage themselves. CourTools, unveiled last year, is a set of 10 core measures that give court managers a framework for assessing and improving performance. Unlike reviews that focus on the performance of a specific judge, CourTools is intended to evaluate the entire judicial process, from a litigant's ability to find the courthouse to a convicted felon's opinion of how he was treated at sentencing.

One metric, labeled "access and fairness," gauges a court's accessibility and its treatment of those who use it. Managers pick out a "typical" day and survey everyone who happens to be doing business with the court. Another measure looks at the percentage of court records that can be retrieved within established time standards, and how complete and accurate the files are.

Then there are yardsticks to gauge the speed and efficiency with which a court handles its caseload. One of them, the clearance rate, compares the number of outgoing cases with the number of incoming ones; others catalog how many cases are completed within a certain period of time, and how long they stay on the court docket. There are separate measures for trial date certainty, collection of monetary penalties, court employee satisfaction and cost per case.
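To make the arithmetic concrete, here is a minimal sketch of how two of these caseload yardsticks might be computed from basic case records. The record format and field names are hypothetical illustrations, not part of CourTools itself:

```python
from datetime import date

# Hypothetical case records: filing date and, if resolved, disposition date.
cases = [
    {"filed": date(2005, 1, 10), "disposed": date(2005, 4, 2)},
    {"filed": date(2005, 3, 15), "disposed": date(2005, 9, 1)},
    {"filed": date(2005, 6, 20), "disposed": None},  # still pending
]

period_start, period_end = date(2005, 1, 1), date(2005, 12, 31)

# Clearance rate: outgoing (disposed) cases divided by incoming (filed) cases.
incoming = sum(1 for c in cases if period_start <= c["filed"] <= period_end)
outgoing = sum(1 for c in cases
               if c["disposed"] and period_start <= c["disposed"] <= period_end)
clearance_rate = outgoing / incoming  # 1.0 means the court is keeping pace

# Time to disposition: share of resolved cases closed within a time standard.
STANDARD_DAYS = 180  # an assumed benchmark; real standards vary by case type
resolved = [c for c in cases if c["disposed"]]
within_standard = sum(1 for c in resolved
                      if (c["disposed"] - c["filed"]).days <= STANDARD_DAYS)
pct_within_standard = 100 * within_standard / len(resolved)

print(f"Clearance rate: {clearance_rate:.2f}")
print(f"Disposed within {STANDARD_DAYS} days: {pct_within_standard:.0f}%")
```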

The CourTools project is actually the result of nearly two decades of work. In 1987, the NCSC launched a Trial Court Performance Standards Project, a collaborative effort of trial judges, court managers and scholars. It was the first real national effort to limn performance measures for courts. The NCSC spent eight years developing and refining the measures, releasing the final set of standards in 1995.

The result was magisterial, but it was also unwieldy. It was a complex system of 22 goals in five areas of performance, with 68 different measures for evaluation. It laid out a sound conceptual framework, but it was too complicated. "There was too much information," says Ingo Keilitz, a former NCSC vice president who worked on the project. "Some of it read like a Ph.D. dissertation."

Only a smattering of courts adopted the Trial Court Performance Standards. But the NCSC, convinced that the approach was too useful and too important to discard, set about refining the measures to make them more workable. The pared-down result was CourTools, which uses the basic concepts of the earlier standards but with an emphasis on simplicity. It includes a six-step guide that walks courts through implementation.

GRADUAL PROGRESS

The effort is gaining momentum. Three county courts in Arizona are using the system, and the state is considering adoption of some of the timeliness measures. North Carolina has implemented several of the measures as well. Two county courts in California--San Joaquin and San Mateo--have instituted the full set in a pilot program that California is hoping to introduce statewide. A handful of other courts across the country--Seattle municipal court, the District of Columbia, two counties in Texas, one in Ohio, a district in Illinois--are working to put CourTools in place. Florida, New Jersey and New York are considering it. Last year, the Conference of Chief Justices and the Conference of State Court Administrators endorsed CourTools, urging state courts to adopt the system.

But Maricopa County has gone furthest. Court managers there actually began working with the NCSC a year before CourTools was released. The county put together multi-agency workgroups, ranging in size from eight people to two dozen, for each of the 10 measures. The workgroups met regularly over several months to study the measures and figure out how to capture the right data. "We've been really aggressive about this," says Marcus Reinkensmeyer, the county court administrator, "because it's got such great potential." The workgroups, he says, helped bring every court employee on board. "I don't think we could have just thrust this on the staff. You've got to make it a collaboration from the very early stages."

Part of the challenge is that, while the NCSC has strived to create simple, understandable measures, gathering the right data is still difficult. Take, for example, the measure known as "age of active pending caseload." It seems straightforward enough. A court simply computes how long its active cases have been open. But what if a defendant has fled, delaying a case for a month or more? That's not a reflection on the court's performance. "You have to ask yourself if your architecture is flexible enough to accurately measure things like timeliness," says Richard Schauffler, NCSC's research director. "The measures are simple but getting quality data is a lot of work. The data issue is the most mundane, but it's also the most challenging."
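The complication Schauffler describes, such as time a case sits idle because a defendant has absconded, can be handled by subtracting excludable periods from a case's raw age. Here is a minimal sketch under that assumption; the record fields and the notion of "excluded periods" are hypothetical, not a prescribed CourTools format:

```python
from datetime import date

def active_age_days(case, as_of):
    """Age of a pending case, minus periods (e.g., fugitive warrant
    status) that should not count against the court's performance."""
    raw_age = (as_of - case["filed"]).days
    excluded = sum(
        ((stop or as_of) - start).days  # stop=None means still ongoing
        for start, stop in case["excluded_periods"]
    )
    return raw_age - excluded

pending = {
    "filed": date(2005, 2, 1),
    # Defendant fled; case sat on warrant status for 45 days.
    "excluded_periods": [(date(2005, 5, 1), date(2005, 6, 15))],
}

print(active_age_days(pending, as_of=date(2005, 12, 31)))  # 333 - 45 = 288
```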

Maricopa is still rolling out its CourTools measures, and it's too soon to gauge their effectiveness. The most useful data, of course, will be those dealing with longer-term trends in performance: Is the court system improving or declining? That will come later. But useful information is already emerging from the "snapshot" data collected over the past year. For instance, Reinkensmeyer says he was "shocked" at the number of respondents who said they were satisfied with the court. Roughly half the people walking out of a courtroom have lost their case, but over 80 percent of the respondents in Maricopa said they were pleased with their court experience.

HOMEGROWN STANDARDS

A national set of performance measures, such as CourTools, has obvious advantages: The pre-set list of measures is relatively easy to adopt, and courts can use them to compare their record with those of other jurisdictions. But that's not the only route courts can take. Some have worked to develop their own customized set of standards. Kevin Burke, a judge in Hennepin County, Minnesota, is the national leader of that effort. Six years ago, Minnesota was discussing the possibility of full state financing for its courts. Burke, then the chief judge for the county, knew that if that happened, legislators would want to know what they were funding and how well it was working.

Burke set out to develop homegrown measures and goals for Hennepin County courts, the busiest in Minnesota. He decided to style his system on performance frameworks used in the private sector. Initially, he says, he found the effort discouraging. Courts are very different from private companies, which tend to measure themselves against competitors. But Burke adapted that concept, creating fictitious "competitors" for Hennepin County.

Today, Hennepin is a national model of court management. It has cut its time-to-disposition from three years to six months. To deal with a crushing backlog of drug-arrest cases, Burke created a special drug court in 1997, radically shortening the time it takes for those cases to be resolved. Proceedings that used to take four months are now finished in a matter of days. A defendant in a drug case now receives a chemical assessment the day he appears in court.

Hennepin also created a new court to handle the county's domestic-abuse cases, from assault to violation of protection orders, which previously had been scattered across the judicial system. The consolidation of cases reduces the number of times a victim has to tell his or her story, and it allows judges to deal with these cases more quickly.

Burke says that he supports efforts such as CourTools but that his system benefited enormously from the process of identifying its own standards and measurements. "There's something to be said for the ownership of developing the measures yourself," he says. "The effort worked so well in Hennepin County because we had developed them."

Indeed, no jurisdiction should unquestioningly adopt a performance measurement system in toto, says Keilitz, now a consultant who writes extensively about court performance measures on his blog, made2measure.blogspot.com. Still, he advises, any court considering performance measures should start by studying CourTools. Then it can develop an integrated system that incorporates the most useful measures into the entire court process, placing them on the agenda at regular meetings and assigning a specific staff person to be responsible for each measure.

"The value lies in being able to institutionalize this," says Schauffler, of NCSC. He thinks it's better for courts to engage in two measures for five years than to do 10 of them just once. "This can't be a one-time experiment," he says. "Anybody can do a fire drill. This is a commitment."
