
The aircraft automation paradox: more automation should mean more training


Shortly after a Smartlynx Estonia Airbus A320 took off on February 28, 2018, all four of the aircraft's flight control computers stopped working. Each shut itself down as designed after detecting a (false) fault. The problem, discovered later, was oil that was too viscous. A design meant to prevent a problem had created one. Only the skill of the instructor pilot on board prevented a serious accident.

Now, as the Boeing 737 MAX returns to the skies around the world after being grounded for 21 months, flight training and design have reached a crossroads. To ensure a safe future for aviation, an entirely new approach to automation design is needed, one based on methods from systems theory, but aircraft technology is planned 10 to 15 years in advance. For now, we need to train pilots to respond better to the inevitable quirks of automation.

In researching the MAX, Air France 447 and other accidents, we spoke with hundreds of pilots and experts from major regulatory agencies, manufacturers and aviation universities. They agree that the best way to prevent accidents in the short term is to teach pilots to manage surprises more creatively.

The slow pace of reform in pilot training and design is a long-standing problem. In 2016, seven years after Air France 447 crashed into the South Atlantic, airlines around the world began retraining pilots with a new approach to handling high-altitude aerodynamic stalls. Boeing had convinced regulators that simulator training was not necessary for 737 MAX crews; that training began only after the second MAX crash, in 2019.

These remedies address only those two specific scenarios. There could be hundreds of other automation surprises that could not have been foreseen using traditional risk analysis methods, such as a computer blocking reverse thrust because it did not "think" the aircraft had landed, as has happened in the past. An effective solution must go beyond aircraft designers' inevitably imperfect attempts to anticipate every failure. As Captain Chesley Sullenberger has noted, automation will never be a panacea for unforeseen new situations, which is why training matters.

Paradoxically, as Sullenberger rightly put it in a recent interview with us, "it requires much more training and experience, not less, to fly highly automated aircraft." Pilots must hold a mental model of the aircraft and its primary systems, as well as of how the flight automation operates.

Contrary to popular belief, pilot error is not the cause of most accidents. That belief is a manifestation of hindsight bias and a false belief in linear causality. It is more accurate to say that pilots sometimes find themselves in overwhelming scenarios. Adding more automation can create entirely new kinds of overwhelming scenarios. That may be why the rate of serious accidents per million large commercial aircraft flights in 2020 will turn out to be higher than in 2019.

Today, pilot training is scripted and built around the most common and well-understood scenarios. Unfortunately, in many recent accidents, experienced pilots had no systems or simulator training for the unexpected challenges they encountered. Why can't designers predict anomalies like the one that almost brought down the Smartlynx plane? Because they use outdated models created before the advent of flight computers, and that approach to predicting which scenarios could put a flight at risk is limited. Currently, the only available method that examines novel situations like these is Systems-Theoretic Process Analysis (STPA), created by Nancy Leveson at MIT.
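To give a rough sense of the kind of reasoning STPA involves, here is a minimal sketch in Python. The control action, contexts and wording below are invented for illustration only; they are not drawn from any real STPA study of the 737 MAX or any other aircraft. The sketch simply enumerates candidate "unsafe control actions" using STPA's four standard guidewords, which an analyst would then assess one by one.

```python
# Minimal illustrative sketch of STPA's "unsafe control action" (UCA) step.
# The control action, contexts and hazard framing are invented examples.

from itertools import product

# A control action issued by an automated controller to the controlled process.
control_action = "command nose-down stabilizer trim"

# STPA examines each control action against four standard guidewords.
guidewords = [
    "not provided when needed",
    "provided when not needed",
    "provided too early, too late, or out of sequence",
    "stopped too soon or applied too long",
]

# Contexts (process states) in which the action could be issued.
contexts = [
    "angle-of-attack sensor reading is valid",
    "angle-of-attack sensor reading is erroneous",
]

# Enumerate candidate unsafe control actions for an analyst to assess.
for guideword, context in product(guidewords, contexts):
    print(f"UCA candidate: '{control_action}' {guideword}, while {context}")
```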

Modern jet aircraft developed using these classical methods harbor hazardous scenarios that lie dormant until just the right combination of events occurs. Unlike older aircraft built from basic electrical and mechanical components, the automation in modern aircraft relies on a complex set of conditions to "decide" how to behave.

In most modern aircraft, the software that governs how the controls respond behaves differently depending on airspeed, whether the plane is on the ground or in flight, whether the flaps are up and whether the landing gear is retracted. Each mode applies different rules, and each can produce unexpected results if the software does not receive accurate information.

A pilot who understands these nuances might, for example, choose to avoid a mode change by not retracting the flaps. In the MAX accidents, the pilots found themselves in confusing situations in which the automation worked exactly as designed but not as they expected: the software was being fed bad information.
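To make this concrete, here is a deliberately simplified sketch of mode-dependent control logic, written in Python. The mode names, thresholds and trim values are invented and do not describe any real flight-control system. The point is that the logic follows its own rules exactly, yet a single bad sensor value produces an unwanted command.

```python
# Hypothetical sketch (not real flight software): mode-based control logic
# that behaves "correctly" by its own rules yet acts wrongly on bad data.

from dataclasses import dataclass

@dataclass
class SensorState:
    airspeed_kts: float
    on_ground: bool
    flaps_up: bool
    gear_up: bool
    angle_of_attack_deg: float  # a single faulty vane can corrupt this value

def select_control_mode(s: SensorState) -> str:
    """Pick a control mode from sensed state; each mode applies different rules."""
    if s.on_ground:
        return "GROUND"
    if not s.flaps_up or not s.gear_up:
        return "TAKEOFF_LANDING"
    return "CRUISE"

def pitch_trim_command(s: SensorState) -> float:
    """Return a nose-down trim command (degrees) under invented per-mode rules."""
    mode = select_control_mode(s)
    if mode == "CRUISE" and s.angle_of_attack_deg > 15.0:
        # Internally consistent rule: high angle of attack in cruise, trim nose down.
        # If the angle-of-attack input is wrong, the command is wrong too.
        return -2.5
    return 0.0

# A stuck or miscalibrated sensor reports 25 degrees in level flight:
faulty = SensorState(airspeed_kts=280, on_ground=False, flaps_up=True,
                     gear_up=True, angle_of_attack_deg=25.0)
print(select_control_mode(faulty))   # CRUISE
print(pitch_trim_command(faulty))    # -2.5: the logic "works as designed" on bad data
```

Note that nothing in this sketch fails in the software's own terms; the hazard comes from the mismatch between the sensed state and the real one, which is exactly the kind of scenario pilots need training to recognize.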
