The Problem With Promises

Oregon has long had a reputation as a health-conscious place, so you probably won't be surprised to learn that people there don't smoke quite as much as people in the rest of the country. As of last year, 20.7 percent of adult Oregonians were smokers, compared with the national average of 23.2 percent.

If you are an Oregon health official, you might feel that those numbers are a small confirmation of your neighbors' fundamental good sense. By the standards Oregon set for itself a decade ago, however, they represent dismal failure. The state's Benchmarks program, initiated in 1989, made a commitment to get the smoking rate down to 15 percent by the year 2000, even though no state in the country--except for heavily Mormon Utah--has been anywhere near that level in modern times.

But that's the way Oregon chose to do things in the early 1990s--promise the moon and hope for the best. In addition to declaring war on smoking, the state vowed that by 2000, the incidence of child abuse would be cut by more than half, every two-year-old would be immunized against infectious disease, and all residents would have health insurance. Those things didn't happen, either. There's been progress on immunization, but the child-abuse rate and the percentage of adults without health coverage in Oregon are actually higher than in 1990.

None of this is to say that Oregon is a bad place, or that the goals it set for itself a decade ago were foolish ones. It's merely to offer a cautionary tale about performance measurement, goal-setting and the dangers of letting your aspirations get too far ahead of reality.

The history of Oregon's performance-measurement program will soon be known in minute detail because it is being written by two of the people most familiar with it: Jeffrey Tryens of the Oregon Progress Board, the senior state official currently in charge of the program, and Howard Leichter, a political scientist at Linfield College in McMinnville, Oregon. Their research, sponsored by the Milbank Memorial Fund of New York City, focuses on one aspect of the program--the health benchmarks--and is still in progress. But even in preliminary form, it provides copious evidence of just how powerful a tool performance measurement can be--and how useless it can be when implemented without a necessary complement of common sense.

Oregon Benchmarks remains the most ambitious program of its kind ever attempted anywhere in the country. It established long lists of specific goals--in health, education, economic development, environmental policy and several other categories--and declared it the policy of state government to reach those goals within a decade. Decisions would be made and money appropriated accordingly.

Neil Goldschmidt, Oregon's governor in those days, was a man of vision and high ideals, and when he started the program, he sounded a little like John F. Kennedy launching the Apollo program. Failure was not an option. Goldschmidt took office in January 1987, created a task force to draw up a statewide strategic vision, and then, in 1989, established the Oregon Progress Board to develop the detailed "benchmarks" by which the effort to meet the targets could be measured.

The Progress Board set about its task with energy and enthusiasm. By 1991, it had come up with 158 benchmarks, ranging from the indisputably important (reduce the infant-mortality rate) to the downright ludicrous (see that 90 percent of Oregonians were doing their aerobics). The legislature approved the 158, but the Progress Board kept on drafting new ones. Within a couple of years, the list was up to 272, more than even the most dedicated public official could reasonably be expected to keep up with.

But the most serious flaw in Oregon Benchmarks wasn't the number of goals; it was the indiscriminate lumping together of the merely challenging, the very difficult and the utterly impossible.

Immunizing every two-year-old in a state is challenging. It requires a serious commitment of time, energy and public money. But the knowledge exists. Any state willing to make it a priority can get somewhere. That's why it's no surprise that Oregon did pretty well in this category. It didn't get to 90 percent by 2000, as it initially promised, but it jumped from 67 percent to 80 percent, ahead of most of the other states in the nation.

Providing universal health coverage is more than challenging. It means spending more money than a financially strapped legislature is likely to want to spend. Decisions on how to organize the coverage are all but guaranteed to start a political war among physicians, hospitals, insurance companies, employers, patients' rights advocates and anybody else with an interest in the subject. Universal health care isn't an impossible goal, but anybody who thinks it can simply be declared and then methodically carried out, like the space program in the 1960s, is a little naïve, to say the least.

Child abuse is in another conceptual category altogether. The societal knowledge to prevent it does not exist. We don't even know if the existing data on it is accurate. Announcing a target of reducing it by half, and a fixed deadline for reaching that target, is only a couple of steps removed from King Canute commanding the tides to stand still.

The framers of Oregon Benchmarks must have understood some of this. What they apparently believed--and what some professed experts on performance measurement still believe--was that it's all right to set unattainable goals. The further away the goal, the harder you can expect people to work. If you make the end zone too close, people don't give their best. Besides, what harm can a little stretching do?

But the truth is that unrealistic expectations have done considerably more harm than the original goal-setters imagined.

Ten years ago, when Oregon embarked on its benchmarking experiment, I thought the most serious danger was that failure to reach the targets would lead to a gradually worsening cynicism on the part of the public. To some extent, I think that has happened.

But what I've learned from this new research is that there's a worse problem: confusion among the public servants asked to assume responsibility for the results. One child-welfare manager told interviewers that the unreachable goal of cutting child abuse in half led more to discouragement than anything else. "When we first used the benchmarks, we were very naïve," she reported. "We thought we could change things quickly." When that turned out not to be the case, her agency stopped paying much attention to the child-abuse figures altogether.

A county health director complained that the benchmark system was placing pressure on government agencies and managers to solve problems that, in fact, could only be solved in the society at large. "To improve community wellness is a community effort," he insisted, "not a health department thing." Administrators who looked at the daunting targets, and saw little community cooperation in achieving them, couldn't help but feel overwhelmed and think about giving up.

So I derive two rather simple lessons from reading about this whole experience. One is that you don't set benchmarks based simply on idealism and good intentions. They have to be realistic, or people tune out. The other is that you don't hold an agency of government responsible for solving problems largely outside its control. All that does is create frustration.

Then there's a smaller lesson: If you get too fixated on the numbers, bureaucrats will end up gaming you. At one point in the early 1990s, then-Governor Barbara Roberts announced that in order to receive full funding for any program, agencies would have to show that the program contributed to progress on one of 17 "lead benchmarks." In the next budget cycle, the Oregon Arts Commission announced that its museums were helping to reduce teenage pregnancy. By extending their hours until 7 p.m., they were keeping teenagers out of bed in the afternoon.

You'll notice that almost all the Benchmarks horror stories date back a while. There's no question that the program's managers have learned some of its lessons and improved it in the past few years. The number of benchmarks was reduced in 1996 to a more manageable 90, and the goals for 2000 were revised so that the state actually had a chance of meeting some of them.

Some of the changes amounted to more than just minor mid-course corrections. Originally, for example, the state committed itself to reducing drug use among eighth-graders from 14 percent to 3 percent by the end of the decade. By 1996, it was clear that nothing close to that was possible--the problem was actually getting worse--so the target was revised to 15 percent. When the figures came in at 14 percent, the same number as in 1990, that counted as a victory by the new formula. It sounds a bit contrived, but the truth is that any step in the direction of realism was a victory for a program that started out as wacky as this one.

Meanwhile, Leichter and Tryens argue, Oregon Benchmarks has accomplished several significant things apart from the goals themselves. It has encouraged agencies that rarely communicated with each other to collaborate in the interest of making progress. It has identified counties and other local jurisdictions that were lagging behind the rest of the state, and in some cases embarrassed them into putting forth greater effort. And it has multiplied--many times over--the amount of detailed information available on the social and economic well-being of Oregon residents. For those reasons alone, the researchers believe, the program ought to continue. And there is every indication that it will.

But as it continues, and as policy makers from around the country look to Oregon to find out whether a program such as Benchmarks can really work, everybody concerned ought to acknowledge the single most important lesson of all: Be careful what you promise. Somebody out there may believe you.
