Anyone who was alive to remember July 20, 1969, can still recall the sight of Neil Armstrong stepping onto the moon's surface for the first time. I was 11 years old at the time, sitting with my family around our black-and-white TV when Armstrong proclaimed, "That's one small step for man, one giant leap for mankind." I was proud to be an American.
Seventeen years later, January 28, 1986, different room, different TV, but the same feeling of exhilaration as the Space Shuttle Challenger lifted off from Kennedy Space Center. But 73 seconds into the flight, exhilaration turned to horror as the spacecraft broke apart, killing all seven crew members. Seventeen percent of all Americans were watching as the external tank, loaded with liquid hydrogen fuel and liquid oxygen oxidizer, ruptured and exploded. To this day, the Challenger disaster is a case study in engineering safety and workplace ethics.
What lessons can we learn from what went wrong that day? How could this disaster have been avoided by following a simple risk management process? To understand, we must examine the Challenger disaster using the five steps of risk management (sketched in code after the list), which are:
1. Identify Risk — NASA managers had known the design of the solid rocket boosters (SRBs) contained a potentially catastrophic flaw in the O-rings since 1977, but failed to address it properly.
2. Analyze Data — The O-rings, as well as other critical components, had no test data to support the expectation of a successful launch in such cold conditions. Engineers who worked on the Shuttle delivered a biting analysis: “We're only qualified to 40°F. No one was even thinking of 18°F. We were in no man's land.”
3. Control Risk — NASA had “Launch Fever” that morning and disregarded advice not to launch. Later, we learned that concerns about the launch never made it up the chain of command. It was these lapses in judgment that triggered this catastrophic event.
4. Transfer Risk — While some risk may be transferred to an insurance company for a satellite on board, there is no way to transfer the loss of life in this case. NASA put all shuttle launches on hold for 32 months, and some would argue the agency never recovered from the damage to its reputation.
5. Measure Results — This is where we ask the question, “What went wrong?” In fact, the Rogers Commission conducted an extensive review and found that NASA’s organizational culture and decision-making processes had been key contributing factors in the accident. NASA managers had known since 1977 that the design of the solid rocket boosters contained a potentially catastrophic flaw in the O-rings, yet failed to address it properly. They also disregarded warnings from engineers about the dangers of launching in that morning’s low temperatures (an example of “go fever”), while failing to adequately report these technical concerns to their superiors.
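To make these five steps concrete for your own operation, here is a minimal sketch of a risk register in Python. Everything in it, the class names, the 1-to-5 likelihood and severity scales, the one-point likelihood reduction per control, is an illustrative assumption, not a method drawn from NASA or any formal standard.

```python
from dataclasses import dataclass, field

# A minimal, illustrative risk register built around the five steps above.
# All names and the 1-5 scoring scale are assumptions for this sketch, not
# a standard drawn from NASA or any formal risk-management framework.

@dataclass
class Risk:
    description: str
    likelihood: int                               # 1 (rare) to 5 (almost certain)
    severity: int                                 # 1 (negligible) to 5 (catastrophic)
    controls: list = field(default_factory=list)  # mitigations applied
    transferred: bool = False                     # e.g., insured
    outcomes: list = field(default_factory=list)  # measured results

    def score(self) -> int:
        # Step 2, Analyze Data: a simple likelihood x severity rating.
        return self.likelihood * self.severity

class RiskRegister:
    def __init__(self):
        self.risks = []

    def identify(self, risk):
        # Step 1: a risk that never enters the register cannot be managed.
        self.risks.append(risk)

    def control(self, risk, mitigation):
        # Step 3: record the mitigation; assume it lowers likelihood a step.
        risk.controls.append(mitigation)
        risk.likelihood = max(1, risk.likelihood - 1)

    def transfer(self, risk):
        # Step 4: transfer (e.g., insurance) shifts cost, not consequences.
        risk.transferred = True

    def measure(self, risk, outcome):
        # Step 5: log what actually happened so the register stays honest.
        risk.outcomes.append(outcome)

    def top_risks(self):
        return sorted(self.risks, key=lambda r: r.score(), reverse=True)

# Usage: the O-ring flaw, scored the way the engineers saw it.
register = RiskRegister()
oring = Risk("SRB O-ring seal fails below qualified temperature", 4, 5)
register.identify(oring)
print(register.top_risks()[0].description, "score:", oring.score())  # 20 of 25
```

The scoring scheme matters less than the discipline: a risk that is written down, scored, and reviewed is far harder to wave off than one that lives in a hallway conversation.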
Establishing a Safety Culture
So how do you avoid this happening in your business? It’s all about establishing a safety culture. The ways in which safety is managed in the workplace often reflect the attitudes, beliefs, perceptions, and values that employees share in relation to safety. Are senior leaders hearing the “real” story of what is going on, and does it trickle down all the way to the hourly worker who came on board only a week ago?
The risks we uncover when we go looking never cease to amaze me. During a recent risk assessment at a manufacturer, we discovered a Hi-Lo (forklift) driver who had been declared legally blind; the employer felt that because he knew the factory floor so well, he could still do his job despite extremely limited eyesight. Fortunately, this optimistic viewpoint didn’t sit well with the senior manager, who reassigned the employee to a job in the warehouse, where it was unlikely he would run over a co-worker with a 2,000-lb forklift.
Another time, we were analyzing workers’ comp claim data at a trailer repair facility and noticed that several eye injuries had occurred recently because rust particles were getting past the safety glasses. We contacted a safety glasses supplier, explained the problem, and asked for recommendations. They provided samples of safety glasses designed to keep the rust particles out of workers’ eyes. The employer replaced the old glasses with the new model, and there hasn’t been an eye injury in the two years since.
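The pattern in that story, scanning loss-run data for clusters before they become trends, is simple enough to automate. Below is a minimal sketch assuming claims arrive as (body part, cause) pairs; the field names, sample records, and the “three or more” threshold are all hypothetical.

```python
from collections import Counter

# Hypothetical loss-run records, one (body part, cause) pair per claim.
# In practice these would come from your carrier's loss runs; the field
# names, sample values, and the "three or more" threshold are illustrative.
claims = [
    ("eye", "foreign body - rust"),
    ("eye", "foreign body - rust"),
    ("hand", "laceration"),
    ("eye", "foreign body - rust"),
    ("back", "strain"),
]

# Count injuries by body part; any cluster is a prompt to dig deeper,
# just as the repeated eye injuries were at the trailer repair facility.
by_part = Counter(part for part, _ in claims)
for part, count in by_part.most_common():
    flag = "  <-- investigate" if count >= 3 else ""
    print(f"{part}: {count}{flag}")
```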
Unfortunately, not every story has a happy ending. On the morning of November 5, 2003, Kristi Fries, an employee at Maverick Metal Stamping in Mancelona, Mich., reached to remove a part from a 110-ton stamping press. Her unzipped sweatshirt triggered the machine's controls, causing the press to slam down and crush her arms, both of which had to be amputated between the wrist and the elbow. Fries later said it was her first time using the machine, and that she had never been warned about the risk of wearing loose clothing while operating it, nor told that the machine had previously malfunctioned.
In fact, the potential danger of the machine was so widely known that the regular operator wore his button-down shirt backwards to make sure he had no loose clothing in front that could trigger the machine. Sadly, no one thought to bring this danger to Fries’ attention.
This is a textbook example of risk management failure. The risk was identified but never addressed. After the accident, it was discovered that the parts to fix the machine had been sitting in the maintenance department for over two years. Not only did Fries lose both her arms, but the company was hit with over $300,000 in OSHA fines and subsequently went out of business, costing 50 people their jobs.
Two Critical Mistakes
So what can we learn from the Space Shuttle tragedy, caused by an O-ring failing just over a minute after lift-off? Manufactured by Morton-Thiokol, the O-rings had worked on all 24 previous shuttle flights. But all of those launches took place at temperatures of at least 53°F. At 11:38 AM on January 28, the morning of lift-off, the outside temperature at the launch pad was 36°F.
Critical mistakes were made by at least two parties: NASA and Morton-Thiokol. Morton-Thiokol knew there was a risk that the O-rings would not seal at such a low temperature, but so did NASA, which had ultimate responsibility for the launch. The result was seven lost souls.
Wayne Hale, a former NASA flight director and Space Shuttle Program manager, cited 10 lessons that can be learned from this catastrophe:
1. It can happen to you. Nobody is smart enough to avoid all problems. A preoccupation with failure is what makes high-reliability organizations reliable.
2. Focus. “Aviation in itself is not inherently dangerous. But, it is terribly unforgiving of any carelessness, incapacity, or neglect.” –Captain A. G. Lamplugh, RAF.
3. Speak up. A foolish question is more forgivable than a mistake. Loss of respect, loss of your job, or loss of a promotion pales in comparison to program shutdowns, lifelong regret, and funerals.
4. You are not nearly as smart as you think you are. There is a reason we all have one mouth and two ears. Too many people are so busy voicing their own thoughts that they fail to hear warnings of a disaster. Don’t just listen; comprehend and take action.
5. Dissension has tremendous value. No dissension means the issue hasn’t been examined enough. Appoint devil’s advocates and don’t allow people to remain silent—draw them out.
6. Question conventional wisdom. People in groups tend to agree on courses of action that, as individuals, they know are stupid. Told that the shuttle was as safe as an airliner, we denied the shuttle crew parachutes and pressure suits, a decision patently wrong to even a casual observer.
7. Do good work. There is no room for half-hearted efforts or second best. Do it well or don’t do it at all. Don’t accept excuses from others.
8. Any analysis without numbers is only an opinion. Engineering is done with numbers. Not having all the information you need is never a satisfactory excuse for not starting an analysis. (A toy numerical example follows this list.)
9. Use your imagination. Keep vigilant and have an active imagination of possible hazards. Murphy wasn’t completely wrong when he wrote his law.
10. Nothing worthwhile is accomplished without taking risk. At some point we must leap into the unknown without knowing everything we should know. Fear, or a preoccupation with failure, cannot and should not paralyze us into inaction. Make the risk of those who put their life on the line as small as possible, then go forward.
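As a toy illustration of lesson 8, here is what even a crude numerical cut at the O-ring question might look like. The (temperature, distress) pairs below are hypothetical stand-ins, not the actual flight record, and the 65°F split is arbitrary; the point is that putting numbers on a hunch turns opinion into analysis.

```python
# A toy version of "analysis with numbers" for the O-ring question. The
# (temperature in deg F, distress flag) pairs below are hypothetical
# stand-ins, NOT the actual flight record; the method is the point.
flights = [
    (66, 0), (70, 1), (69, 0), (68, 0), (67, 0), (72, 0),
    (73, 0), (70, 0), (57, 1), (63, 1), (70, 1), (78, 0),
    (67, 0), (53, 1), (67, 0), (75, 1), (70, 0), (81, 0),
]

def distress_rate(data, lo, hi):
    """Fraction of flights launched in [lo, hi) deg F showing O-ring distress."""
    band = [d for t, d in data if lo <= t < hi]
    return sum(band) / len(band) if band else None

cold = distress_rate(flights, 50, 65)
warm = distress_rate(flights, 65, 85)
print(f"distress below 65F: {cold:.0%}; at 65F and above: {warm:.0%}")
# Even this crude split exposes the cold-weather pattern, and it shows that
# a 36F launch sits far below anything in the data: extrapolating "it will
# be fine" from here is opinion, not analysis.
```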
It’s only by identifying the risks your business faces each day, and having a plan in place to minimize them, that you will succeed as a company that truly cares not only about the bottom line but also about the human capital that makes your business run.
Randy Boss is a Certified Risk Architect at Ottawa Kent, a commercial insurance, risk management, workers' comp and employee benefits consulting business in Jenison, Mich. He designs, builds and implements risk management and insurance plans for middle market companies in the areas of human resources, property/casualty and benefits.