Last night I dreamed I was floating on a raft on a lovely pond I know in northern Connecticut. I was thrilled because the water was so clear I anticipated seeing lots of animals. But the first thing to appear was a dead whale, its snout protruding from the water like a boulder.
I turned the raft around and found myself in the middle of a hellish ocean. A large black snake broke the surface, heading towards me. Then I saw many more animals littering the water, their corpses flipped sideways, as puddles of brown oil slid by. I saw boats ahead and paddled toward them. As I drew closer I could see environmentalists being peppered with questions from reporters. I had my own questions, but when I boarded the boat, it was empty.
One need not be a psychiatrist to interpret my dream. The world shares this nightmare. The question that keeps coming to me as I read with horror about the ongoing crisis in the Gulf of Mexico is, "How many of the lessons from flying could have prevented the Deepwater Horizon oil spill?"
Why does it seem none of them were being followed in this similarly complex and high-stakes operation?
A history of safety problems, drilling technology that outpaced response procedures, a lack of redundancy, a federal agency known to be "cozy with industry," and a chief executive whose answer to a congressional inquiry on whether the company put profit before safety was, "I was not part of that decision-making process."
Earlier this week I spoke with Dr. Eduardo Salas, an expert in human factors and industrial psychology and a professor at the University of Central Florida. So when I read of the BP chief's response in this morning's paper, Dr. Salas's characterization of how to create a safe industry rang back like the morning alarm clock.
"To make change for people to comply starts at the top," he told me. "You can talk about safety and not have a culture of safety. Those who do from the number one guy to the janitor - they think about safety, they have policies and procedures, they reinforce safe behavior they create conditions under which people see this as 'important to me, to my teammate, to my company to my organization.' But not the oil industry apparently."
I'm not writing this to gripe. I'm spilling my dreams into the blog-o-sphere because I think that among you, dear readers and fellow aviation aficionados, there are answers. Through nearly two decades of work in civilian and military aviation, Dr. Salas has come to this conclusion:
"Organizations get the behaviors they measure and reinforce. If you want teamwork, measure and reinforce it. If you want safety, measure and reinforce it."
Going down the list of aviation systems - well, let's call them Flying Lessons - I count more than a dozen practices that systematize and reinforce safe behaviors: checklists, walk-arounds, crew resource management, pre-flight planning, flight debriefing, risk-benefit analysis and its bureaucratic partner, cost-benefit analysis, recurrent training, standardization. I could go on, but I'd rather hear what you have to say.
Humans are fallible, technology is evolving, and nature is unpredictable. Aviation has been largely successful in navigating these truths. This is what flying has to teach the oil industry and the mining industry and medicine and...well, as I said, I'd like to hear from you.
8 comments:
Christine:
I think you are giving the aviation industry a little too much credit. It is true that our industry appears to have safely navigated some treacherous space, but if you look more closely you will find that most of our improvements came after repeated high-profile (read: deadly) disasters. One need only look at the long string of icing-related accidents, mid-air collisions, windshear accidents, CFIT/ALAR accidents - the list goes on and on - to realize that we were very slow indeed to react effectively to many of these hazards. Certainly there were technological hurdles to be surmounted in some cases, but an acceptance of risk and competing agendas often meant that true solutions were not pursued vigorously enough.
As far as I know, the underwater blowout that BP now confronts is unprecedented in its scale of devastation. Had there been a string of such disasters (or obvious precursors), we could rightly point the finger at the oil industry and claim that they have been too slow to learn. It's tempting for the politicians and the media to jump on the bandwagon, but I wouldn't be so hasty to blame the oil industry alone for the current bad dream.
Lindsay Fenwick
Lindsay and Christine -
For years, the aviation safety mantra was "fly – crash – investigate – fix – fly", as Lindsay pointed out. And the practices Christine mentioned are all good remedial actions for specific process errors. The key, though, as highlighted early in the blog, is that safety HAS to start at the top. Safety HAS to be proactive. Safety HAS to be the primary driver in any industry. Hence the birth of Safety Management Systems, soon to be mandated in all aviation safety organizations. Isn't it time that safety culture, as the foremost lesson learned in aviation, be implemented in other industries as well?
Jim Walters
Dear Christine:
I do have a couple of thoughts to share after reading your interesting post.
Risk mitigation in an industry fraught with the chance for highly consequential mishaps is a journey - not an event. One of the reasons that the aviation community, viewed from some perspectives, seems good at risk mitigation could be that we've been at it for so long. We've come a long way, but then, we had a long way to come. Remember, pioneer aviatrix Harriet Quimby died in 1912 when she fell out of an airplane that did not have seat belts! And, of course, we remain imperfect and still have a long way to travel as we add more 9s to the probability that each flight will avoid disaster.
Our techno-regulatory structure has evolved under constant public-sector pressure and continuing technological progress. And the FARs are indeed written in blood. That's because our evolutionary process has been reactive rather than proactive. Heretofore, no one has been charged with analyzing the effects of failure modes that change as the technologies of flight evolve ("What's it doing now?"). So we wait for the smoking hole in the ground and then modify our systems and procedures to ameliorate new risks. It seems a bit backwards.
It's pleasing, therefore, to note that the ASIAS folks are shifting their focus from accidents to "pre-accident" information sources like FOQA and ASRS (and many other data sources). They have indicated that there just aren't enough air-carrier accidents any more to teach us all we need to learn. They're moving in the right direction and I wish them well.
But if we've spent a hundred years getting all this close to right under constant external pressure, other industries may not have been so fortunate. The resource extraction folks (I lump together the drillers and the diggers) have of late been paying a high price for failing to evolve their own techno-regulatory environment. It's early days, but now they may get all of the public-sector pressure all at once and need to respond with a vast step-function change in their risk response mechanisms. If so, we in aviation can help with techniques and procedures (some of which you enumerated); we've gone down that road. (They ought to talk to the nukes, as well.) But first they have to come to understand what it is they need to do and commit to doing it.
There is already some of this going on in the medical world. Dr. Tony Kern at Convergent Performance is doing some good work facilitating knowledge transfer from aviation to medicine. Others follow suit. But this is motivated by constant and growing economic and regulatory pressure to improve. As Dr. Salas points out, safety culture has to start at the top. And it can't be faked.
So, we in aviation stand by to share our hard-won lessons with the drillers and the diggers and the docs. But they first have to make the cultural commitments needed before we can make a difference. We can't do that for them.
Regards,
Frank
PS: For Lindsay: I concur that the consequences of the DWH blowout are unprecedented, but more importantly, they were not unforeseeable. I fault BP and its partners for assuming that their existing risk-mitigation measures would prove adequate to problems that might (and did) arise in the more hostile environment of this well. This was an exhibition of technical hubris, and as the Greeks knew well, hubris is inevitably followed by nemesis.
Interesting read regarding the oil disaster in the Gulf.
I echo Lindsay's sentiments regarding safety in aviation. We've come a long way, but we still have a long way to go! Just take a look at the NTSB's 'most wanted' list for a stark reminder of safety solutions and mitigations that have yet to be vigorously pursued. As an industry, we too need to be mindful of creating our own form of technical hubris! Maybe we're already there?
The market is responsible for the relatively fabulous safety record in aviation. If oil companies had to accommodate 150+ million people annually on their oil rigs, as the airlines do on their airliners, then oil platforms would be as safe as airliners. The danger is that the airlines' safety culture will further erode because of market pressures to enhance profits - and maintain jobs - to the detriment of the flying public. The government can make all the rules it wants, but rules ultimately have zero effect on the safety record or culture in the industry. You simply can't legislate safety; it has to be bought into by the industry, as observed in many other posts. In fact, my position is that in many cases FAA and NTSB actions are counterproductive, and the unintended consequences of their actions actually degrade safety.
Safety is not a destination, but a continuous journey in which efficiency and economic constraints are continuously being balanced against risks.
Consider the behavior of most drivers. No one is out to cause an accident or to violate a law, but we all bend the rules to balance other demands. So we exceed speed limits or answer our cell phones in order to meet our daily constraints (e.g., getting to our destination at a reasonable hour). Eventually, most of us have a near accident or get a speeding ticket. For some time immediately following the incident, our behavior changes. We are more cautious and more aware of the risks. But in time, the attention to risks fades and other constraints become more salient. Thus, it is not long before we are again taking the same risks that we took before the incident. This is human nature! Safety is foremost in everyone's mind for a short time following an incident, but with time other priorities will generally become more salient.
Oil companies and airlines are no different. Thus, discipline must be imposed through training, by regulation and monitoring, and by feedback (such as Accident Reporting Systems) to make sure that risks remain salient. These are all elements of creating a Safety Culture.
One element of this is regulation and penalties for violating safety rules. But, as with speeding fines, this is never sufficient. The impact of these alone will be local and will fade with each year that passes without getting a ticket.
We are dealing with dynamical systems that are very sensitive to the complex contexts in which they operate. Butterfly effects are common and it is dangerous to make any inferences or generalizations based on the logic of simple causal systems. Science is just beginning to frame a logic and language for analyzing these complex, nonlinear, dynamic systems.
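To make that sensitivity concrete, here is a minimal Python sketch of my own (an illustration under my assumptions, not something from the comment above) using the logistic map, a textbook nonlinear system: perturb the starting point by one part in a billion and within a few dozen steps the two trajectories bear no resemblance to each other, which is exactly why simple causal reasoning misleads in such systems.

```python
# Illustration of sensitivity to initial conditions (the "butterfly effect")
# using the logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic
# regime (r = 4). Hypothetical demo values, chosen only for illustration.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)   # baseline initial condition
b = logistic_trajectory(0.400000001)   # perturbed by one part in a billion

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: |a - b| = {abs(a[n] - b[n]):.6f}")
# The gap grows from ~1e-9 to order 1: tiny causes, large effects.
```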
Christine
You are absolutely right. It is unfortunate that other industries are slow to accept and adopt the safety concepts that the aviation industry has paid for with precious blood and treasure. Aviation safety management has not always been as effective as it is now. Traditionally, aviation has been reactive to incidents and accidents. In the mid-nineties there was a profound paradigm shift, from error/failure elimination to error/failure management. The false goal of error-free aviation was hard to abandon; now the preferred goal is error management. After all, there is no difference in outcome between an error that is managed so that there are no consequences and the absence of error. Today it is rare that an airline does not have some sort of CRM/error-management protocol.
That was not always the case, however. Much as it has been in medicine, a pilot's acknowledging the ability to make errors was equivalent to admitting incompetence. The dirty little secret is that ALL humans make mistakes. The effective ones recognize that errors and failures are always possible. Effective safety management systems use the truth that errors are always possible to remain engaged and focused. Only recently has it become standard protocol for a surgeon to write "This One" on the appendage to be worked on. The surgeon who thinks he cannot amputate the wrong foot or the pilot who thinks they can't land gear-up are the most likely to do just that.
The oil industry could benefit greatly from the safety practices of commercial aviation. One flight at a time, one rig at a time, be ever vigilant for that error or unexpected chain of events that, managed early, keeps a minor inconvenience from becoming a major catastrophic event.
Jim