Automation Complacency

Issue: 5 / 2011
By Dr Mani Sishta

From preliminary reports of the first Airbus A320 crash at Bangalore, it appeared that the accident had a lot to do with the Captain being unable to comprehend the ‘fantastic’ cockpit automation technology that was then available on that airplane. The Captain was a young family man who might well have promised, or even procured, a vaunted desktop computer for his children. In all probability, that desktop, with a ‘fast’ 486 chip, would have been the talk of the neighbourhood and a source of pride and joy within his immediate family. His kids would have mastered its usage, haltingly explaining its simplicity to an otherwise busy dad, while their mom gamely tried to figure out this modern wonder.

The dad grudgingly accepted that his airplane had numerous such computers on board and that these were somehow insulated from all faults, whether on their own or in interaction with the crew. Besides, there were backups, he was told. That generation of pilots still fills the skies, while the newer generation occupies the co-pilot’s seat.

While the older generation of pilots regards automation with a wary respect, the ‘gen-next’ pilots are far more dismissive of the ‘bugs’ and ‘glitches’ that still emerge. The former would any day prefer a tried and tested single- or dual-purpose cell phone, while the newer generation prefers to flaunt the latest iPhone or BlackBerry, not to mention the iPad.

The term “automation complacency”, which can be found in the new Directorate General of Civil Aviation (DGCA) syllabus on ‘human performance and limitations’, has a certain ‘wake-up’ ring to it. While one cannot put a label on its implications, one can instantly identify with its psychological connotations.

How Automatic Is That?

Cockpit automation has provided numerous benefits and has extended system functionality well beyond human capabilities. Yet the question arises: can there be too much of it?

Many years ago, the term ‘manager’ was applied to the Captain of an aircraft, indicating that his role was much more than that of a simple hands-on pilot. The manager had to deal with the related ‘resources’: human, organisational, regulatory and technical. The following accidents highlight the problems that arise at the interface between cockpit automation and the humans who operate it.

In 1983, a South Korean airliner was shot down south of Sakhalin, USSR. It had strayed into that hostile airspace owing to a probable navigational input error: incorrect coordinates had been keyed in. A lack of effective communication also played a role in the disaster.

In 1987, a Northwest Airlines MD-80 crashed during takeoff because its flaps and slats were improperly configured. A major factor was the failure of the automated takeoff configuration warning system, on which the crew had come to rely. Not realising that the airplane was improperly configured, the crew neglected to carry out a physical check. Their dependence on automation to do the monitoring had made them complacent, leaving them unaware of how the warning system was functioning and of the critical flight parameters they expected it to monitor.