
Measure for Measuring

Borrowing a lesson from corporate America, state and local IT agencies are using metrics to quantify how they're doing.

A large automotive plant splashes big numbers on wallboards to show how many cars were sprayed red in a week. Some McDonald's restaurants post signs listing how many billions of hamburgers have been sold to date. In Kentucky's information technology department, colorful bar charts plastered on the walls quantify how available the network was for users during a particular week.

All of these are metrics: ways of counting or quantifying important activities of a business to see whether they meet the goals and expectations of the organization. It's a performance measurement strategy. Technology officials have been working with metrics programs for some time, and many are now seeing useful results.

Kentucky, for instance, has a goal of keeping its wide area network up and running 99 percent of the time; most of the Web applications the state runs have to be available around the clock. The state has been exceeding this particular goal. But whenever it falls short of one of its metrics, it analyzes why expectations were not met. This matters because the technology department has business agreements with Cabinet heads for shared technology services, such as the wide area network, e-mail and voice telephony.

The Governor's Office for Technology has been measuring performance in some areas for a full year. When it found it was consistently meeting some of its goals, those goals were changed. For network availability, the bar is being raised to 99.5 percent, says Aldona Valicenti, Kentucky's chief information officer.
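Those availability targets translate directly into a downtime budget. As a rough illustration (simple arithmetic, not Kentucky's actual reporting), here is how the two targets compare over a year:

    # Convert an availability target into an allowed annual downtime budget.
    HOURS_PER_YEAR = 24 * 365  # 8,760 hours

    def downtime_hours(availability_pct):
        """Hours per year the network may be down and still meet the target."""
        return HOURS_PER_YEAR * (1 - availability_pct / 100)

    print(downtime_hours(99.0))  # 87.6 hours -- roughly 3.7 days a year
    print(downtime_hours(99.5))  # 43.8 hours -- raising the bar halves the budget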

That approach plays right into the first of three golden rules on metrics, as compiled by Otto Doll, commissioner of South Dakota's Bureau of Information and Telecommunications. The first rule, Doll says, is that "metrics need to change over time."

The second is "process first, IT second." By this he means that technology departments should work on processes before attaching technologies to a project. "So often, we're handed a situation where all we do is make a bad thing faster," he says. For instance, in development of systems and programs, South Dakota tracks what percentage of staff time goes toward putting in new systems versus enhancing an existing system or reacting to something that's broken. The state's goal is to put more time into working on the new and less into tinkering with the old. Some months, new development has gone as high as 39 percent and enhancements as low as 30 percent.

The third rule is to use percentages, not raw numbers. It may sound quite impressive that 1,300 state forms or processes are now online, but not if the state has 6,000 processes and has therefore automated just over 20 percent of them. Doll says there's often staff resistance to giving this truer view of what's going on.
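The arithmetic behind the rule is simple enough to automate. A minimal sketch, using the hypothetical figures above, produces the truer view:

    # Report automation progress as a percentage of the whole, not a raw count.
    forms_online = 1300
    forms_total = 6000

    pct = forms_online / forms_total * 100
    print(f"{forms_online:,} of {forms_total:,} forms online ({pct:.1f}%)")
    # -> 1,300 of 6,000 forms online (21.7%)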

In the government sector, metrics are not so much about cost as about quality, content, functionality and schedule. A government should choose its metrics in light of its strategic plan.

Jack Heine, a vice president at Gartner, an IT research firm, suggests thinking of the overall process as "a blanket that's too small to fit the corners." That is, if something gets pulled in one direction, something else has to give. If there's not enough money, a state might have to stretch out a schedule or reduce a project's content. The important point is to be able to define what's good and what's not in a universal way, "instead of asking someone, 'How are we doing today?' and getting the answer, 'Pretty good,'" Heine says. Metrics enable staff and managers alike to be more precise.

Metrics programs differ. Most simply report on what happened last month. But some use "leading metrics," collected and analyzed in a way that helps the government act on what is coming. Instead of focusing on how many calls a help desk handled last month, it may be more significant to know that, say, the average call is taking longer than it used to. That finding might suggest that unless staffing is increased, service will deteriorate.

Heine compares the first approach to the light on an automobile that comes on when the car is just about out of gas. Personally, he prefers a gauge "that tells you more about the miles per gallon."
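A miles-per-gallon-style gauge for the help desk can be as simple as comparing recent average handle time against a longer baseline. A minimal sketch, with made-up call durations:

    # Leading metric: flag rising help-desk handle times before service slips.
    # The durations (minutes per call, by month) are illustrative only.
    baseline_months = [8.2, 8.0, 8.4, 8.1, 8.3, 8.2]
    recent_months = [9.1, 9.6, 10.2]

    baseline = sum(baseline_months) / len(baseline_months)
    current = sum(recent_months) / len(recent_months)

    if current > baseline * 1.10:  # more than 10 percent above baseline
        print(f"Warning: {current:.1f} min per call vs. {baseline:.1f} min baseline")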
