Robots can fly – but …

Image: winfuture.de

 

Robots can fly – but they cannot do this at the same level of safety as manned aviation. Moreover, it is more than questionable whether automated flight systems/robo-pilots will be more efficient or effective than manned systems in the foreseeable future. This was the conclusion reached by the participants of the 22nd symposium of the Research Network for Pilot Training (FHP), which took place at the end of September under the title “Are Robots Learning to Fly? – How will the current hype about artificial intelligence (AI) affect work in the cockpit as well as the training and further education of pilots?”.

A guest contribution by Max Scheck, Lufthansa Captain A320, Master of Aeronautical Science and member of the board of the Research Network for Commercial Pilot Training (FHP).

Over two days, representatives from research, academia and flight operations discussed various aspects of AI in commercial aviation.

Manfred Müller, Lufthansa captain (A330/350/340) and flight safety expert, spoke in his lecture “Artificial Intelligence in Flight Guidance” about safety in unmanned systems. He emphasized that a valid assessment of a safety level must always consider the total probability, which results from analysis of the individual probabilities of the system components. In the area of unmanned traffic systems, achieving an overall failure probability below 10⁻⁵ currently poses major problems for the industry. Even relatively advanced projects, such as “self-driving cars”, still lag far behind the classical (manned) systems: driving a “Google Car” in California, for example, is 15 times more dangerous than driving the same car with an average driver. For commercial aviation, a safety level of 10⁻⁵ would mean one total loss per day.
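
Taken at face value, this last figure can be checked with a short back-of-envelope calculation. The sketch below is not from Müller’s lecture; the assumed figure of roughly 100,000 commercial flights per day and the component values are illustrative only.

```python
# Back-of-envelope check of the figures quoted above.
# Assumption (not from the lecture): roughly 100,000 commercial flights per day worldwide.

FLIGHTS_PER_DAY = 100_000            # assumed global traffic figure, illustrative only
ACCIDENT_PROB_PER_FLIGHT = 1e-5      # the safety level discussed in the lecture

expected_losses_per_day = FLIGHTS_PER_DAY * ACCIDENT_PROB_PER_FLIGHT
print(f"Expected total losses per day: {expected_losses_per_day:.1f}")   # -> 1.0

# The "total probability" is built up from the individual component probabilities.
# For independent components that must all work, the system failure probability is
# 1 minus the product of the individual success probabilities:
component_failure_probs = [1e-6, 2e-6, 5e-7]   # purely illustrative values

p_system_ok = 1.0
for p_fail in component_failure_probs:
    p_system_ok *= (1.0 - p_fail)
print(f"Combined system failure probability: {1.0 - p_system_ok:.2e}")
```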

Possible attacks on networked systems

In addition to flight safety, aviation security in connection with AI is also important, since a fast, reliable and secure data connection (protected from unauthorized access) is a basic requirement of a highly automated system. Mathias Gärtner (Dipl.-Ing., certified public expert on IT and network technology) showed this in his lecture “General possibilities for attacking networked systems”. Mr. Gärtner pointed out that the effort involved (in terms of hardware and software as well as energy consumption) grows exponentially with the quality demanded of the above criteria (speed, reliability and security of the data connection). There will, however, never be one hundred percent security: sooner or later, vulnerabilities will be discovered and often exploited. To counteract this, the human being in the system should not merely be an “interface” but should always also have a kind of “gateway function” (controlling the transition from the sensor network to the control network). The more far-reaching the possible effects of system weaknesses (hardware and/or software) or of attacks on the system, the more important this gateway function becomes.
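
One way to picture this “gateway function” is a human-confirmed hand-over point between the sensor network and the control network. The sketch below is purely illustrative and not taken from Gärtner’s lecture; the class and function names are assumptions.

```python
# Minimal sketch of a human "gateway" between the sensor network and the control network.
# Purely illustrative; the class and function names are assumptions, not from the lecture.

from dataclasses import dataclass
from typing import Callable

@dataclass
class SensorReport:
    source: str
    proposed_command: str   # e.g. a command derived from sensor data

def human_confirms(report: SensorReport) -> bool:
    """Stand-in for the human operator reviewing a proposed command."""
    answer = input(f"{report.source} proposes '{report.proposed_command}'. Forward? [y/N] ")
    return answer.strip().lower() == "y"

def gateway(report: SensorReport, send_to_control_network: Callable[[str], None]) -> None:
    """Commands cross from the sensor side to the control side only via the human."""
    if human_confirms(report):
        send_to_control_network(report.proposed_command)
    else:
        print("Command blocked at the gateway.")

if __name__ == "__main__":
    gateway(SensorReport("traffic sensor", "CLIMB 2000 ft"),
            send_to_control_network=lambda cmd: print(f"Control network executes: {cmd}"))
```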

Human versus Artificial Intelligence in the Cockpit

Prof. Gerd Faber dealt with the energy “balance sheet” of automated systems in his lecture “Human vs. Artificial Intelligence in the Cockpit”. He wondered whether many end users of modern digital infrastructure are actually aware of how much energy it requires. Bitcoin, for example, now consumes more energy per year than the whole of Switzerland, and over the same period the data centers of the financial service providers in Frankfurt consume more energy than Frankfurt Airport. Even supposedly “small things”, such as a single query to the Google search engine, require as much energy as an 11-watt energy-saving lamp consumes in an hour. A correspondingly secure and efficient infrastructure for a “pilotless cockpit” would certainly require a great deal of energy (see above), and Prof. Faber doubts that its energy “balance sheet” will ultimately be positive compared with a manned cockpit.
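
Taking the lamp comparison at face value, the quoted per-query figure and a simple scaling work out as follows; this is a rough illustration only, and the one-million-query scaling is an assumption rather than a figure from the lecture.

```python
# Rough illustration of the lamp comparison quoted above.
# The one-million-query scaling is an assumption for illustration, not from the lecture.

LAMP_POWER_W = 11      # 11-watt energy-saving lamp
HOURS = 1              # burning for one hour

energy_per_query_wh = LAMP_POWER_W * HOURS                       # 11 Wh per search query, as quoted
energy_per_million_queries_kwh = energy_per_query_wh * 1_000_000 / 1_000

print(f"Per query: {energy_per_query_wh} Wh")
print(f"Per one million queries: {energy_per_million_queries_kwh:,.0f} kWh")   # 11,000 kWh
```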

What to do with fear?

Dr. med. Silke Darlington took up these thoughts in her lecture “Who owns the learning process and what to do with fear”. Dr. Darlington had already pointed out in the past (e.g. at the 21st FHP Symposium) that efforts to “automate” people out of the cockpit are questionable from a medical and, above all, from a human point of view. On the one hand, the human brain is probably still the most powerful processor on our planet in terms of overall efficiency (energy expenditure for complex performance); on the other hand, it can be assumed that automation – if not ergonomically designed and properly introduced – can have a lasting negative effect on the mental and somatic health of the employees affected. Many people associate automation and AI with certain fears (e.g. of bondage, surveillance and enslavement), which in the work situation can act as significant stressors (e.g. loss of free will, loss of trust, loss of self-confidence). These stressors may in turn trigger human “primal fears” such as fear of loss of control, of failure and of rejection by the community/colleagues (e.g. through exclusion or dismissal).

Dr. Darlington questioned whether these evolutionarily anchored human factors are sufficiently taken into account in current development processes and sees a risk of increasing social isolation, not least in the cockpit. Against this background, she criticized the very real risk that human employees will in future be withdrawn from areas of human strength and instead deployed in areas of human weakness. She expressed the hope that the development process, and the learning processes associated with it, will continue to belong to people as learners and will not prematurely take on a life of its own, detached from people. Following E. Hollnagel, the strengths and weaknesses of the human being should therefore remain in the foreground and serve as a guide for the further development of automation in a role that supports the human – in order to achieve the best possible overall result for everyone involved.

Robots have no feelings

Prof. Gabriele Heitmann seamlessly continued this line of thought in her lecture “What psychological and physical effects could the increased use of AI have on pilots?”. Automation and AI are actually supposed to make people’s work easier – especially because unpleasant tasks are taken over by technology. The technology does indeed do this, but whether the strain, especially the mental strain, has been reduced is more than questionable. In contrast to humans, technical systems/robots have no feelings (this is explicitly postulated as an advantage over social systems) – yet we humans tend to ascribe a role to our interaction with such systems, especially highly developed ones, and this role definitely contains emotional components. This role attribution depends strongly on the attitude of the human being, which interestingly shows great cultural differences. The following basic attitudes towards automation/robots can be distinguished:

In the USA, the dominant role image of automation/robots is that of a “servant”, in China that of a “colleague”, in Japan that of a “friend”, and in Europe rather that of an “enemy/competitor”. Irrespective of any evaluation of these different role attributions, they result in different forms of interaction, which in turn are reflected in system architecture and programming. According to Prof. Heitmann, much more research should be done here and, if necessary, training should be adapted accordingly. Learning about automation, and learning while making increased use of automation, can only work if this understanding of roles is adequately taken into account. At the very least, users of modern, highly automated systems/robots must be made aware of the possible stressors. Findings from andragogy show that adults learn best when they are intrinsically motivated, and such intrinsic motivators should therefore be at the very core of training.

The perspective of modern brain research

Prof. Gertraud Teuchert-Noodt confirmed this from the perspective of modern brain research in her lecture “Can the commercial pilot be replaced by autonomous flying? Answers from brain research”. Learning processes in the human brain are complex electrochemical processes that require a minimum amount of time and certain sensory stimuli. If the cooperation between human and machine does not take this sufficiently into account, there is a risk of psycho-physical dependence, which impairs concentration and flexibility. This can give rise to the stressors mentioned above, with psychosomatic consequences such as sleep disorders and poor concentration, and can ultimately lead to burnout syndrome.

AI – a great challenge for air traffic controllers

Edwin Stamm (ATM Specialist and Head of Compliance, Training Concepts, Support at the German Air Navigation Services, DFS) and Herbert Hoffmann (Ph.D., DFS Senior Expert Training Concepts) showed in their presentation “DFS in the air traffic system” that the above topics are also highly relevant in the field of air navigation services. On the one hand, the air traffic controller’s workplace is becoming increasingly automated (including remote-tower concepts); on the other hand, the requirements for the services to be provided are changing, e.g. due to the increased use of drones, especially in lower airspace. At the same time, some air traffic controllers of the younger generations would like the option of a college-level degree program linked to air traffic controller training. Doing justice to all these factors is a great challenge for the DFS Academy, particularly since economic considerations naturally also play a role and, given the current shortage of trained air traffic controllers, there is considerable pressure to provide training as quickly and efficiently as possible. While Stamm and Hoffmann see definite possibilities in the increased use of automation and AI, both in training and at the controller’s future workplace, they do not consider it an adequate substitute for the human controller.

Andreas Klein (FO, B.Eng.) and Christopher Lohrey (FO, M.Eng., MBA) confirmed in their respective presentations “The Pandora’s Box of Aviation Training – New Ways to Sustainable eLearning Content” and “Market Research in Aviation” that economic aspects are an important factor. Particularly in “commercial” aviation, managers cannot escape economic constraints, and serious market research can make a decisive contribution to success or failure. From Klein and Lohrey’s point of view, the above-mentioned findings from medicine, psychology and brain research must be fed into such market research. Modern training and further education of pilots (and of other aviation professions such as air traffic controllers, mechanics, dispatchers and flight attendants) must keep pace with the real working environment – both in terms of content and of the media used (e.g. eLearning). The concepts must be designed in such a way that they result in the best possible human-machine (system) combination, taking the respective strengths and weaknesses into account. Automation and AI should not be an end in themselves.

This was also the tenor of the subsequent discussions. It was emphasized in particular that there is, unfortunately, currently a tendency for automation/AI to increasingly take over execution while humans “only” monitor it. Humans, however, are rather poorly suited to purely monitoring tasks. It would make more sense – from a psychological, neurological and medical point of view as well as from many years of operational experience – for humans to perform (execute) and for the automation/AI to support them in doing so (i.e. to monitor and, if necessary, intervene to help). If the overall system takes the respective strengths and weaknesses of humans and machines into account and combines them in a meaningful way, the result is ultimately an optimum that is both efficient and effective (and thus also economically sound).
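
The role allocation argued for here – the human executes, the automation monitors and only intervenes when necessary – can be pictured with a minimal sketch. The structure, names and envelope values below are illustrative assumptions, not a description of any existing cockpit system.

```python
# Minimal sketch of "human executes, automation monitors and assists".
# Purely illustrative; the envelope values and names are assumptions.

SAFE_PITCH_RANGE = (-15.0, 20.0)   # degrees, illustrative protection envelope only

def automation_assist(pitch_command_deg: float) -> float:
    """Watch the human's input and intervene only when it leaves the envelope."""
    low, high = SAFE_PITCH_RANGE
    if low <= pitch_command_deg <= high:
        return pitch_command_deg                       # human stays fully in control
    limited = max(low, min(high, pitch_command_deg))   # gentle protective intervention
    print(f"Automation assist: limiting pitch command from {pitch_command_deg} to {limited} deg")
    return limited

# The human executes; the automation supervises each command.
for human_command_deg in [5.0, 12.0, 28.0]:
    print(f"Applied pitch command: {automation_assist(human_command_deg)} deg")
```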

FHP welcomes feedback – in particular if there is a need for further information on one or more of the above topics. Details on FHP: https://www.fhp-aviation.com
