The "Traps" to Successful Implementation

A guide to avoiding these five traps, and successfully implementing public initiatives.
December 15, 2010 at 11:00 AM
By Russ Linden  |  Contributor
A management consultant, educator and author

"Implementation ... is, ultimately, what government is all about," wrote Gordon Chase, a civil servant, in How to Manage in the Public Sector. But implementing programs is filled with speed bumps and landmines. How does one negotiate them successfully?

In their recent book If We Can Put a Man on the Moon: Getting Big Things Done in Government, authors Bill Eggers and John O'Leary describe a series of "traps," hurdles to implementing important public initiatives. Here are five of their most challenging traps, and some thoughts on how to avoid them.

The Tolstoy Trap

The Russian writer Leo Tolstoy believed we see things the way we wish to see them, not the way they are. This certainly holds true in government. A classic example was the Vietnam War. President Lyndon B. Johnson's view of the war was one-dimensional; he saw Communist aggression and a threat to our freedom. His inability to also see it as a civil war and a war for independence prevented him and his aides from designing diplomatic and economic policies that might have saved many lives.

How to avoid the Tolstoy trap? One approach is to engage a diverse set of stakeholders and experts to analyze the issue and form an implementation plan. When policymakers hear from such people, they're more likely to consider several options.

The Silo Trap

The silo trap is the inability to connect the dots because people work in separate, self-contained units that don't share information. Analyses of the 9/11 attacks highlighted the intelligence community's highly compartmented information systems. While sometimes necessary, this compartmentalization often prevents us from seeing the reality surrounding us.

Avoiding this trap can be done in several ways:

  • Organize people around processes that produce results for customers.
  • Challenge the workforce with a large "stretch objective" that can only be accomplished when all units work together.
  • Evaluate and reward individuals and teams based, in part, on their contribution to agency-wide goals.
  • Design information systems that all units contribute to and can easily access.

The Stargate Trap

Stargate is a term taken from a TV series and a sci-fi movie, and is the authors' term for the point at which an organization is firmly committed to implementing an idea or policy. In government, this is often the final vote by elected officials on a new program or policy. The stargate trap concerns the tension between two needs: How to shape the proposal in a way that decision makers will approve it, and how to get it approved without changing the proposal so much that it won't work in the real world.

Developers trying to get their project approved by a planning commission understand this trap well. They're often required to incorporate considerable community input, resulting in many changes before the commission's approval. What if some of those changes prove too expensive, or create construction and maintenance problems? Developers, and civil servants, must walk a fine line.

To avoid this trap, consider a quote from public administration giant Harlan Cleveland, who said that to succeed as a civil servant one must "think politically without becoming political." We need to understand the demands, constraints and motivations of the policymakers we serve while maintaining our roles as unelected managers. We must retain our integrity and the integrity of the proposal we're presenting, while giving policymakers the flexibility to shape it in a way that meets their interests. We must keep timing in mind; when is the right time to put the proposal on the public agenda? And we need to engage the public in meaningful ways, balancing their interests and concerns with our own professional knowledge.

The Overconfidence Trap and Complacency Trap

These two traps are flip sides of the same challenge: How does one move forward on a program when there is evidence it could fail? The Bush administration fell into the overconfidence trap in the run-up to the U.S. invasion of Iraq in 2003. The military plan to conquer the country was excellent; the plan to reconstruct and administer the country afterward was naïve and just plain lacking.

The complacency trap, on the other hand, springs up when we see problems occurring but don't respond to them. A tragic case was NASA's Challenger disaster. NASA engineers found defects in shuttle O-rings at least eight times before the Challenger was lost, but because no shuttle had gone down previously, these defects appeared "normal." To avoid both traps, managers need to create a climate that demands candor, so that planners and technical experts can actively debate the merits of a proposal before it is launched. That same candor is necessary once the proposal goes live: What's working well? What isn't? What small problems could grow large if not addressed? At some airlines, senior and junior pilots meet to discuss near misses, examining the causes and remedies for such problems before they become disasters.

The keys to avoiding these five traps have common themes: Gather a diverse set of stakeholders and experts to debate the plan openly before it is launched, and then facilitate similar conversations once it's operational. Take policymakers' interests seriously but also speak up when they change the plan in ways that seriously weaken it. And perhaps most importantly, take the possibility of failure seriously. Ironically, that may be one of the best ways to ensure success.