When Oregon launched its statewide performance measurement initiative called "Oregon Benchmarks" back in 1989, the whole effort took a considerable amount of explaining. At the time, the prevailing management trend was to focus narrowly on improving discrete operations, whether it was making departments of motor vehicles more customer friendly or speeding up permitting processes. And here was Oregon, trying to track and improve some rather abstract aspects of quality of life. There were broad benchmarks related to economic, social and environmental "livability," measuring everything from Oregonians' educational attainment and income levels to how much exercise they were getting each week.
Supporters in Oregon -- principally, then-Governor Neil Goldschmidt -- hoped that gathering expansive data on statewide well-being might begin influencing government policy in positive ways. And the effort to focus on high-level indicators did go on to win an Innovations in American Government award. However, critics always viewed Oregon Benchmarks as a quixotic attempt to measure things that government had little control over. Oregon legislators by and large ignored the effort, and they eventually let the whole enterprise sunset.
If Oregon Benchmarks got off to a rough start, it nevertheless kindled something of a national movement toward the use of what came to be known as "livability indicators." Across the country, there are now some 170 such efforts going on at the state, regional or local level. The push is coming from community activists in some cases, and in others from the business community or from within government itself. But in all cases, the idea is to develop bird's-eye goals for quality of life, along with measures for gauging government's progress toward reaching them. And there's some evidence that these initiatives are beginning to influence government action.
Even Oregon is back in the game. The legislature resurrected Oregon Benchmarks in 1997. Since then, Oregon has continued to refine its benchmarks in ways that aim to make them less of a feel-good exercise in great expectations and more relevant to policy makers. In recent years, Oregon has linked the broad benchmarks to the state budget process -- a change that is beginning to have some impact. In response to the broad statewide goal of "workforce readiness," for example, the Department of Labor and Industry overhauled its system of apprenticeship funding. Instead of throwing money at a variety of apprenticeships, the department now analyzes future needs and targets funding for specific programs accordingly. It's a logical approach to budgeting "that really hadn't been part of the process before," says Rick Gardner, who heads up the effort in the executive budget office.
Peaks and Valleys
There's nothing new about aspiring to improve community well-being by trying to measure it. One of the first recognized attempts goes back to 1913. That's when the U.S. Department of Labor published its "Handbook of Federal Statistics on Children." The handbook brought scattered information on child welfare together into one place, in hopes that it would inform federal policy. Tracking broad measures of economic well-being became popular after the Great Depression. The 1960s, meanwhile, witnessed a surge of interest in data on general social and environmental health.
It wasn't until Oregon Benchmarks, though, that the trend really reached down to the state and local levels. Oregon's program -- and other seminal indicator efforts in places such as Boston and Seattle -- had their roots in the environmental activism of the 1960s and '70s, says Allen Lomax, who runs the Community Indicators Consortium, which represents these projects nationwide. While the movement has had "peaks and valleys," Lomax says, he's seen a significant uptick of interest in indicator efforts in the past five or six years.
Still, what those pushing indicator efforts have long hoped for -- a viable working relationship between outside activists and government officials -- has remained elusive. One problem is suspicion among public officials that failure to make progress on measures could be used as a way to criticize them for things they have limited control over, such as teen pregnancy or traffic congestion.
For their part, indicator advocates have often felt ignored by government. Part of the problem was the jargon they used: Just the word "livability," for example, can mean different things to different people. "Sometimes people on the community indicators side don't speak English, so the message is garbled," says Lyle Wray, co-author of the book, "Results That Matter: How Performance Measurement and Civic Collaboration Empower Communities."
In some places, the two sides are learning how to work together. One of those places is Jacksonville, Florida. Since 1985, the Jacksonville Community Council has been tracking the metro region's socioeconomic health and publishing an annual "Quality of Life Progress Report." Over the years, the group has sometimes felt disregarded by some of the agencies it was trying to influence. But Executive Director Ben Warner says his group's relationship with the local government is getting better, particularly when it comes to public safety.
The county sheriff's department cooperates with the community council by supplying data on everything from crime rates to response times. The council returns the favor by offering the department survey data on how citizens feel about the job law enforcement is doing. "We ask citizens questions like 'How safe do you feel walking around your neighborhood?'" says Warner. "The sheriff's office has turned around and taken those measures and used them to guide internal operations."
A recent example of that collaboration came in response to what those in Jacksonville viewed as an unacceptably high murder rate. The sheriff paid the community council to study the problem and to come up with suggestions for addressing it. In researching and publishing its report, the council essentially was connecting one of its high-level livability indicators -- public safety -- with a specific departmental measure -- murder rates.
Among the council's findings was that there was a lack of a police presence in some key crime hotspots. Although the sheriff's department had been arguing for years that it was short-staffed, the report's findings provided the department with evidence. The county board responded by beefing up overtime pay, an injection of money and man-hours that went on to have a measurably positive impact on the region's murder rate.
In Jacksonville, the relationship between indicators advocates and local government is an informal one. The depth of that relationship varies from issue to issue, and the strength of it varies over time depending upon the personalities involved. Lately, however, the trend is toward institutionalizing the indicators approach within government, in order to make it more relevant and useful.
A good example is in the Reno, Nevada, metropolitan region of Washoe County, where the regional group Truckee Meadows Tomorrow (TMT) has been working on gathering, tracking and reporting on community, social and health indicators for almost 20 years. TMT was born of state legislation that called for regional planning around Reno, Sparks and throughout the county. That legislation included a requirement to define and monitor the area's quality of life. The result was TMT, made up of a cross-section of community players, ranging from businesspeople to environmental advocates, and funded by membership dues, donations and grants.
For a while, the link between what TMT was tracking and what city and county government was doing wasn't clear. Nobody was responsible for monitoring progress -- or lack thereof -- on indicators, and certainly nobody was responsible for actually trying to improve what the numbers showed. That all changed three years ago when County Manager Katy Singlaub decided it was time to require her 34 departments to begin tracking and reporting on key performance measures. In order to help connect those measures to TMT's broader community indicators, Singlaub created 10 task forces focused on issues identified by the group, ranging from civic involvement to economic well-being. Those task forces are made up of a mix of people from within and outside government, and are chaired by county officials who now report to the county commission twice a year on what is going on in their program areas.
There's evidence that formalizing the collaboration in this way has paid off. For example, TMT's broad indicators on regional environmental quality were linked to specific county measures related to parkland cleanliness. The most recent result of that was a major cleanup drive in Keystone Canyon, a popular recreation area. Another recent effort created 387 units of moderately priced housing, an initiative that connects directly to TMT's overall goals for affordability in Washoe County. Similar initiatives certainly have been pursued in jurisdictions where there's a more casual connection between government performance measures and livability indicators, but those familiar with TMT argue that connecting the two simply makes progress that much more likely.
Even with a formalized connection between broad indicators and government action, however, collaboration still can be a challenge. In Virginia, for example, former Governor Mark Warner created the Council on Virginia's Future expressly to bring government and citizen leaders together in an effort to provide a long-term, statewide focus on high-priority issues. There have been some hiccups.
As an example, the council's executive director, Jane Kusiak, offers a key statewide indicator of intense interest to business: "workforce quality." First, she says, it's a thing that's just flat-out hard to measure. And second, there are lots of forces that influence workforce quality -- some that government can have an impact on, such as the quality of the education system, and others that it clearly can't, such as the motivation of individuals to work. And so identifying government's specific responsibility for improving such a measure is tough. Even indicators that are relatively easy to measure, such as infant mortality, can be difficult to analyze in terms of cause and effect.
The way the council has begun handling these problems is to worry less about identifying direct links of responsibility, and to focus more on agreeing which indicators are important to track. Once that's accomplished, the council lets the players involved -- whether it's the business community, nonprofits, or state or local agencies -- figure out not only what they can do to contribute to progress but also how to measure their own individual contributions. "Determining the key indicators is important," says Kusiak, "but it's equally important to build consensus around those indicators and commit to making progress on them."
Oregon has been applying many of these lessons in rebuilding Oregon Benchmarks. Once it was revived by the legislature, the Oregon Progress Board, which administers the benchmarks effort, regrouped with an eye toward developing a tighter, more concise set of measures that state government might actually latch on to. Meanwhile, a law passed in 2001 that put two legislators on the Progress Board helped to solidify buy-in from the legislature.
The key reform was linking the benchmarks to the state budget process. Both the executive budget office and the legislative fiscal office now are charged with establishing a connection between state agency spending and key performance indicators. For its part, the Progress Board has created a guide to help agencies connect their own measures to the high-level indicators.
Rick Gardner admits that it hasn't necessarily brought a revolution in budget writing. But there are examples, such as the apprenticeship funding, where it's brought a measure of rationality to spending. While the difficulties of connecting agency budgets and performance measures to higher-level statewide indicators shouldn't be underestimated, Gardner thinks that the program has promise. As state agency officials get more tuned in to key indicators and figure out how to view and use them, he says, "there seems to be real engagement and excitement."
From the Bench
[Chart: One indicator measured by Oregon Benchmarks is the percentage of Oregonians who feel they are a part of their community. Source: Oregon Progress Board]
These articles are part of a continuing series on public performance measurement focusing on citizen involvement. Support has been provided by the Alfred P. Sloan Foundation. Although the foundation may assist some of the programs described in these articles, it had no control or influence over the editorial content, and no one at Sloan read the material prior to publication. All reporting and editing was done independently by Governing staff.