The Danger of Missed Warnings

In a high-risk environment, like a space shuttle or an oil rig, it is easy to grow complacent. What can managers do to avoid the complacency trap?
June 2, 2010

This article is adapted in part from the book "If We Can Put a Man on the Moon" (Harvard Business Press), coauthored by O'Leary and William D. Eggers, the global director of public sector research at Deloitte.

Investigators now say that BP may have ignored key warning signs in the hours before the explosion of the Deepwater Horizon rig, which caused one of the largest oil spills in history.

According to a memo (PDF) from members of Congress investigating the disaster, a test conducted hours before the explosion produced poor results, yet the well was not shut down. "BP's investigator indicated that a 'fundamental mistake' may have been made here because this was an 'indicator of a very large abnormality.'"

Warnings are easy to miss in any organization, public or private. AIG missed signs of a housing bubble, for example, while U.S. intelligence agencies failed to heed signs of an impending attack prior to 9/11.

Red flags were famously missed in NASA's twin tragedies: the 1986 Challenger and 2003 Columbia disasters. In both cases, the failures that doomed these shuttles had been cited as major risks on prior flights -- yet NASA failed to heed the warnings.

In eight shuttle flights prior to Challenger, NASA had found evidence that hot exhaust had burned through a primary O-ring seal. Since 1982 the O-rings had been designated a "Criticality 1" issue. Indeed, a cold-weather launch in January just a year earlier had shown significant burn-through of the O-rings.

The day before the Challenger launch, engineers at Morton Thiokol, a NASA contractor, raised concerns that the frigid temperatures at Cape Canaveral would cause the shuttle's rocket booster O-rings to fail -- which would mean catastrophe for the shuttle. Just hours before liftoff, Thiokol engineers recommended that the launch be delayed. After hours of discussion, NASA pressed forward with the launch anyway.

In one sense, the Challenger explosion was caused by an O-ring failing due to low temperatures at liftoff. In another sense, it was caused by a desensitization to risk.

Sally Ride, the first American woman in space, is the only person to have served on the commissions investigating both the Challenger and Columbia accidents. In both cases, she says, NASA should have known that it was flirting with disaster.

"The first time they saw [burn through], they thought, 'Oh, my gosh, we dodged the bullet here. We need to do something about this,'" explains Ride. But each time NASA saw evidence of leakage around the O-ring seal, the risk seemed less serious. "Oh, yeah, O-ring singe -- we've seen that before."

The Columbia tragedy was eerily similar. Columbia's problems began when a piece of foam insulation broke off the external tank during launch and punched a hole in the wing. "Foam had been falling off the tank since the very first shuttle flight, and NASA had long been trying to fix it," says Ride. "... But in each case, NASA decided it was okay to keep flying. Over time, this led to a significant understating or a collective ignoring of an actual risk."

Unlike Challenger, Columbia actually made it into orbit. NASA engineers were worried about the foam strike, however, and asked the Department of Defense to use high-resolution telescopes to examine the wings for damage. "The Department of Defense was processing the request to examine the shuttle when the request was cancelled by NASA," says Ride. "The Department of Defense might have been able to spot the hole in Columbia's wing, and there were actions that NASA could have taken to rescue the astronauts on board."

During reentry, the hole in the wing proved catastrophic, and all aboard perished.

It is easy to grow complacent in a high-risk environment like a space shuttle or an oil rig. So what can managers do to avoid the complacency trap?

Ask "what if?" When things are going well, there is a tendency to assume they will continue to go well; we become desensitized to deviations from the norm. To counter this human tendency, organizations need "tiger teams" that proactively create "what-if" scenarios.

Welcome bad news. A big problem in many organizations, says Ride, is the difficulty of communicating bad news. Managers don't want to hear it, and staffers don't want to deliver it. But leaders who never hear bad news can't fix problems until it is too late.

In the political world, strident voices warn of environmental doom due to global warming, economic doom due to burgeoning deficits, or military doom if we fail to deal with some looming international crisis. Discerning which warning bells to heed and which to ignore is an art -- not a science -- and one that mere humans have yet to perfect.
