B&G Interview: Questions for John Turcotte
A conversation with one of the nation's top experts on program evaluations
One of the most impressive recent advances in state and local management has been growth in the use of performance auditing, and to a lesser extent, program evaluations. John Turcotte may well know more about this topic than just about anyone else in the country. With 35 years of government work under his belt, he recently took the job of heading up a new legislative program evaluation unit in North Carolina -- a program he helped develop as a consultant. Previously, he held top positions in Florida's Office of Program Policy Analysis and Government Accountability and Mississippi's Performance Evaluation and Expenditure Review. He helped Maine set up its new evaluation program, which was loosely modeled on OPPAGA and has also provided training and consulting assistance to many state audit and evaluation staffs. We've enjoyed chatting with him about a variety of topics for years now, and thought it was time to share some of his thoughts with you.
You've been in this business a long time. What lessons have you learned?
You need to ask the difficult question: Are results being achieved with public funds? Having the mechanism to ask that question in a targeted variety of ways is very important. That's what we're doing here.
What's the difference between performance auditing and program evaluation?
In performance auditing, you look more at processes. You may address effectiveness, but the focus is on process, efficiency and economy. With program evaluation, you look at return on investment and whether those results are still needed.
Program evaluation is slower and more deliberate and may conclude that the program shouldn't be done at all. There are legislatures that do program evaluation very well -- Arizona, Minnesota, Texas, and Virginia -- but most still do more performance audits than evaluations. The Washington State Institute for Public Policy approach is more in line with what our legislators want us to do here.
Why is this kind of program evaluation needed?
Often the thoughtful people that designed the programs are no longer alive and programs have morphed since then. My late father was a wildlife biologist who, like others, pushed state intervention to prevent extinction of wildlife in the early 20th century. States created game and fish commissions. They knew precisely the outcomes they wanted and a valid means to get there. But while conditions changed, old interventions and outdated premises persisted. We now have overpopulation of deer, drastic loss of habitat, and global warming. You get second- and third-generation workers who are not as zealous and are more interested in sameness and keeping their jobs. You need legislators and the governor's office to question the interventions and shake things up.
Have you seen improvements since you started in this business?
Performance measurement is infinitely better now. The discipline is more highly developed and the technology is better. Local government has made more advances than states. Citizens want garbage collection measured. Most state-level functions indirectly serve people. It's harder to measure in that environment. But I've noticed that performance measurement is persistent and not as sporadic. And we're all paying more attention to outcomes.
What are the main obstacles to what you want to do?
The scarcity of sufficient, timely and reliable data.
Do you see enough young people who want to do what you do?
No. There's a vast difference between now and the '70s, when many of us responded to President Kennedy's call. The job market isn't as favorable as it was, and we are not encouraging young people to study public administration as much as we have encouraged science and technology.
What are the qualifications that are needed?
You have to be a disciplined, critical, and creative thinker. If you're trained in logic and the scientific method, that's 90 percent of it right there. You need curiosity and a questioning mind. You may be trained in anthropology or biology. It doesn't matter. It's rare to find someone who has the critical thinking skills, is good at quantitative methods, has people skills, and has writing and communications abilities -- and who wants to work for government. But that's what we look for.
Is there an evaluation that you've done in your career that stands out for you?
The Mississippi PEER Committee evaluated school land management in 1977. Statewide management of school land was a disaster. The Supreme Court had said that these were trust lands and that their only purpose was to produce resources for the school system. But all across the state, lands were being leased for political or private gain -- some leases were for a penny an acre and renewable forever. When our report documented the problem and proposed legislation, a willing set of legislators passed a reform act. Then a new secretary of state used the new law to void those bad leases. Revenue for schools shot up tremendously, bringing in millions of dollars.
What's the biggest misconception about what you do?
That we're out to get the bad guys and we're looking for fraud. Most folks pigeonhole us. Also, there's a perception that we rely on what people tell us and base a report on a preponderance of opinions. That's hardly the case.
How would you like to see program evaluation develop in North Carolina?
We'd like to move in two simultaneous directions. We will target programs for evaluation that are important to the General Assembly and also look at all state government programs collectively to avoid legislative oversight gaps. It's similar to what we did in Florida with OPPAGA's Florida Government Accountability Report. We want to give legislators the same type of information readily available to investors. We'll profile all North Carolina state agencies and programs and track them the same way that rating houses cover stocks and bonds. There will be some elements of a balanced scorecard. We will develop unit cost measures for program activities to help the Fiscal Research Division.
So, as we continue doing ad hoc reviews on a targeted basis, we will construct a framework for comprehensive, continuous examination of North Carolina government.