Wednesday, February 23, 2011

Much to Learn from Lucky Landings

Working on my back deck with a friend in 2010
Last summer, while replacing some of the planks on my back deck, I distractedly turned on my cordless circular saw and, upon feeling the end of my work glove starting to twist, looked down to see I was about to saw off a couple of fingers. I had a lot to say at the time, none of which I'll repeat here. But from the safe distance of time, I will now calmly describe the episode, as they do in aviation, as a "near miss."

I was reminded of my folly this week in Geneva, while reading one news story lauding a decrease in air accidents and another suggesting an increase in cockpit automation errors may be cause for alarm.

I'm thinking this is a healthy sign. Certainly avoiding airplane crashes is desirable, but to maintain and improve on aviation's enviable level of safety, it's not only accidents that need attention but also the glove-twisting, "oh s--t"-inducing, stomach-churning, mostly unreported and often unknowable near misses.

I wrote about the theory behind my philosophy in my last blog post, about the miracle landing of Qantas Flight 32 in Singapore last November. It would take a very active imagination to come up with a longer list of things that could go wrong than what actually did go wrong on that flight.

To recap, a Rolls-Royce engine failed on the Qantas Airbus A380 shortly after takeoff from Changi Airport, sending parts into the wing and fuselage of the world's largest passenger plane. And yet, after two hours spent sorting through the error messages, the pilots did in fact bring the plane back to a safe landing. Getting it on the ground without loss of life certainly doesn't mean this happy ending is the end of the story. Far from it.

In a story on NPR last week, my friend and colleague in the Comprehensive Medical Aviation Safety Database, Capt. Patrick Veillette, described an automated approach he was making to the Salt Lake City airport last fall. Patrick, a commercial pilot and highly accomplished safety specialist, told the reporter, "What I anticipated the aircraft to do was to continue this descent. Well instead, the aircraft immediately pitched up, very abruptly and much to my surprise. And both of us reached for the yoke, going, 'What's it doing?' and there's that shock of, 'Why did it do that, what's it going to do next?'"

Pilot accounts provided to the NASA Aviation Safety Reporting System detail many frightening encounters with automated flight controls, as I reported last year in The New York Times and in several posts here on my blog. At the same time, automation has made many aspects of flying safer.

Tonight, I had a fascinating conversation with an engineer from a European aerospace laboratory who is here in Geneva to learn more about the digital transition in cockpit communication and air traffic control. The digitization of all things flight-related is a train that has left the station. That all this high-technology may sometimes err is as certain as the fallibility of low-tech humans.

That is why it has never been more important to thoroughly explore the factors that lead us to almost make a mess of it, whether that means understanding how pilots maintain the presence of mind to recover after a computer goes astray, or figuring out why we cannot stay focused when the power (and the power tool) is entirely under our own control.


Anonymous said...

Indeed. Thank you for an incisive post, Christine.

J. Blaszczak said...


Here's where you absolutely nailed the issue: "That all this high-technology may sometimes err is as certain as the fallibility of low-tech humans." The technology may malfunction; that's why there are pilots. Pilots are human, so they may also malfunction or err. It is that reality that necessitates introducing another human into the system. Those three elements, in combination, allow for the safe operation that airline customers have every right to expect.

The machine will always be the machine, where failure IS an unfortunate option. That is exactly why humans are still a mandatory element on the flight decks of passenger aircraft. A qualified pilot must always be capable of flying the airplane when the automation or technology fails; it is exactly what they are trained to do. The more common breakdown in the system occurs when the crew disengages from the machine. Whereas a machine may fail, it does not lose interest, become bored, get fatigued or become distracted.

On today's highly automated flight deck, the most common breakdown in the system is a function of the humanity of the pilots. Technology, if properly programmed and applied, will either carry out its task or malfunction. Pilots, on the other hand, may make a mistake or error, but they can also perform only a portion of their tasks, lose situational awareness, or fail to notice a critical parameter.

At least as many airplanes have been lost to human failures as have been saved by pilot action. The trick is to minimize humans' failures while exploiting their incredible capabilities for situational assessment, evaluation and action.

It was humans who saved many lives by landing in a river. It was humans who unintentionally flew 150 miles beyond their destination. As this juxtaposition shows, humans make great managers and lousy monitors. Automation is great at monitoring and performing specific tasks, but it is binary and therefore lousy when situational awareness and decision making are what's needed.

Pilots and technology need to work together in synergistic harmony, not try to take on responsibilities the other does much better.

Gene Roddenberry's Captain James T. Kirk never flew the Starship Enterprise.

Flying the Backside