Understanding Human Factors to Increase Safety in Business Aviation

Discussion

The purpose of this paper is to identify and understand the human factors that can increase safety in business aviation. The essay also examines the variations and unusual developments that impinge on safe aircraft operation.

Air France Flight 447 was a transatlantic passenger flight from Rio de Janeiro to Paris. The aircraft crashed into the Atlantic Ocean on 1 June 2009, killing everyone on board. The crash is believed to have been triggered by temporary discrepancies in airspeed measurements induced by ice crystals blocking the pitot probes (Kharoufah et al. 2018). The autopilot then disconnected, and the crew responded incorrectly, causing the aircraft to enter an aerodynamic stall from which it never recovered.

The accident was caused by a mix of factors relating to the aircraft's technology and the crew's training. Poor flight director indications, a lack of visual information, confusing stall alerts, an illegible display of speed and altitude measurements, and poor feedback loops were among the technological flaws (David and Schraagen 2018). Owing to training deficiencies, the crew failed to respond to the stall alert, had not been trained to deal with icing of the pitot probes, and lacked practice in manually handling the aeroplane. Furthermore, the co-pilots' ability to share tasks was hampered by a lack of situational understanding and poor emotional management.

This disaster has brought to light a number of issues with aviation's human–automation interface. While automated flight-control features can reduce some of the risks associated with aviation, they also alter the operators' skill levels, situation awareness, workload, and activities in ways that can create problems. The first issue raised by this disaster is the crew's transition from controller to supervisor: modern flight automation places the crew in a passive supervisory role rather than an active operational one. One consequence is a loss of vigilance, which is amplified when a system is extremely trustworthy (Azık Özkan, Kazemiafshar and Özkan 2019). Such crashes, however, are not simply the result of human error; rather, they stem from flaws in the design of the automation systems. Quite relevantly, the accident of Flight 447 was partly blamed on a lack of situational awareness, which could have arisen because the pilots were forced into a passive supervisory role. Surveillance duties can degrade awareness of the aircraft's current aerial state and of the predictions needed for prospective actions (David and Schraagen 2018).

Difficult and complicated automation can also contribute to a loss of situational awareness, for instance when a complex flight-automation structure causes pilot confusion through poor design. According to one report on Flight 447, a poor human–machine interface played a significant role in the crash. A number of factors contributed, including an inaccurate flight director display which, following the erroneous airspeed data, prompted the vast majority of the incorrect pitch-up inputs (Şenol and Acar 2020). Computer-detected inconsistencies in airspeed were not prominently displayed.

Failure messages were presented, yet they showed only the immediate consequences, not the cause of the problem. Nothing on the flight displays indicated a blocked pitot probe. In addition, there was no display of angle-of-attack data, which is critical for detecting and preventing stalls; this data was available to the on-board computers, but there were no displays to present it (Oliver, Calvard and Potočnik 2017). Furthermore, as the sophistication and complexity of automation rise, so does the level of skill and experience required to recover from a malfunction or unforeseen circumstance, because the operator has less time to detect and resolve developing issues. On Flight 447, for instance, the crew had less than three minutes to recognise and assess the problem (Azık Özkan, Kazemiafshar and Özkan 2019).
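
The design point here, that feedback should name the cause rather than only the consequence, can be made concrete with a small illustration. The following is a minimal Python sketch, entirely hypothetical and not the A330's actual monitoring logic (the channel names and tolerance are invented), of a voter that compares redundant airspeed sources and reports which channel disagrees instead of silently discarding it:

# Hypothetical illustration of cause-oriented feedback for redundant
# airspeed sources; it does not reproduce any certified avionics logic.
from statistics import median

def check_airspeed(readings, tolerance_kt=15.0):
    """Compare redundant airspeed readings (knots) against their median
    and report any channel that disagrees, naming the likely cause."""
    mid = median(readings.values())
    faults = {ch: val for ch, val in readings.items()
              if abs(val - mid) > tolerance_kt}
    if not faults:
        return "AIRSPEED: all channels agree at %.0f kt" % mid
    # Name the disagreeing source so the crew sees the cause,
    # not just a generic "unreliable airspeed" consequence.
    detail = ", ".join("%s reads %.0f kt vs median %.0f kt" % (ch, v, mid)
                       for ch, v in faults.items())
    return "AIRSPEED DISAGREE: " + detail

print(check_airspeed({"ADR1": 272.0, "ADR2": 275.0, "ADR3": 180.0}))

The point of the sketch is the shape of the message: a display built this way tells the crew which source is suspect, rather than leaving them to infer a blocked probe from its downstream effects.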

An aircraft's capacity to recover from an unexpected or failed situation also depends on the manual flying skills of the crew (Oliver, Calvard and Potočnik 2017). Yet as aircraft become more automated, pilots' manual flying skills deteriorate. Airline policies and aviation guidelines on automation frequently leave little opportunity for practice, which is likely to result in pilot passivity and the degradation of flying ability (Brown 2016). In manual flying tests, crews who had relied most heavily on flight-deck automation performed worse than others. This has repercussions in the event of an unusual circumstance in which the automation fails without warning, forcing the crew to depend on their manual flying abilities. Moreover, automation tends to maintain stability until it no longer can, at which point the aircraft may already be close to losing control as the crew takes over, implying that crews must remain skilled in manual flying (Mohrmann and Stoop 2019).

Another issue is that automation can increase cognitive workload during times of high demand. The problem worsens when events requiring additional cognitive effort arise during a period of already high workload. When the crew is under great pressure, an emerging automation failure is likely to become a dangerous issue (Mohrmann and Stoop 2019). For instance, if there is damage or a control-system failure, the flight-control system's guidance is frequently misleading or inaccurate, and flight crews can be overwhelmed with a plethora of details and alerts, making it difficult to isolate the problem.

The crew of Flight 447, for instance, was confronted with over fifty simultaneous warnings. The cockpit monitors were illuminated by alarm after alarm. The autopilot, the automatic engine control system, and the flight displays all shut down at the same time. As a result, the crew was unable to comprehend or diagnose the issue before it became a major crisis, resulting in catastrophic failure (Rocha and Lima 2018). Part of the problem is that automation can behave like an untrained and ineffective member of the crew. Crews and automation systems frequently interact poorly, despite the need for multisensory feedback to crews. To achieve a safe degree of shared situation awareness, the automated system should become a member of the crew; to maintain that shared awareness, it must communicate its adjustments (Parra and Laguardia 2018).
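
To illustrate why fifty simultaneous alarms are harder to act on than one well-chosen message, here is a minimal Python sketch, purely illustrative and not drawn from any real avionics suite (the fault names and dependency map are invented), of an alert manager that suppresses alarms that are known consequences of an already-active root fault:

# Hypothetical sketch: suppress consequence alarms once their root
# cause is active, so the crew sees one message instead of a flood.
CONSEQUENCES = {
    "PITOT_ICING": {"IAS_DISAGREE", "AP_DISCONNECT", "ALT_LAW"},
}

def prioritise(active_alarms):
    """Return root-cause alarms first, hiding their known consequences."""
    suppressed = set()
    for root, downstream in CONSEQUENCES.items():
        if root in active_alarms:
            suppressed |= downstream & active_alarms
    return sorted(active_alarms - suppressed), sorted(suppressed)

roots, hidden = prioritise({"PITOT_ICING", "IAS_DISAGREE",
                            "AP_DISCONNECT", "ALT_LAW"})
print("SHOW:", roots)    # ['PITOT_ICING']
print("HIDE:", hidden)   # consequences folded under the root cause

Under this (assumed) dependency map, four alarms collapse into a single actionable one; the crew diagnoses the probe, not the cascade.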

Because they lack awareness of the bigger picture, current automated systems may register changes on a dial or monitor yet rarely draw attention to them. Clear communication can prevent accidents (Pruchnicki 2016). The chain of events would not have continued, for instance, had there been clear communication that the pitot probes on Flight 447 were icing over. It has therefore been proposed that aircraft automation become a better team player. A human–automation team is a dynamic, interdependent combination of one or more human operators and one or more automated systems that must collaborate and coordinate to complete a mission successfully. Current automation systems are ineffective team members when failures or unusual events occur, leaving human operators or crews unprepared (Parra and Laguardia 2018). To enhance human–machine interaction, systems should be able to exchange and negotiate control, so that engaging with the system feels more like engaging with a co-worker.

Human–automation teams will be used in future systems, such as Free Flight, to share and trade tasks as situational demands change. In such dynamic situations, human–automation teams can coordinate implicitly on a nearly entirely cognitive basis, and automation systems will be able to collaborate with one another. Furthermore, effective team players make their actions visible to their teammates and are easy to direct (Kyriakidis et al. 2019). To be visible, automation activities must be presented in ways that capitalise on human skills. Changes and events, for example, should be highlighted in event-based representations. Human operators in dynamic systems need help anticipating change, knowing what to expect, and knowing where to look next. With pattern-based representations, operators should be able to scan displays quickly for potential anomalies while avoiding difficult cognitive work; automation can thereby reduce difficult mental tasks to simple perceptual ones (Catchpole 2017).
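
As a concrete, deliberately simplified illustration of turning a cognitive task into a perceptual one, the Python sketch below maps raw parameter values onto a one-line status pattern whose anomalies stand out at a glance. The parameters and normal ranges are invented for the example:

# Hypothetical sketch of a pattern-based display: each parameter is
# reduced to a glyph, so an off-nominal value is spotted by scanning
# a pattern rather than by reading and comparing numbers.
LIMITS = {  # parameter: (low, high) normal range, invented values
    "pitch_deg": (-5.0, 10.0),
    "ias_kt":    (250.0, 290.0),
    "aoa_deg":   (0.0, 8.0),
}

def glyph(name, value):
    low, high = LIMITS[name]
    return "." if low <= value <= high else "X"

def pattern(sample):
    return " ".join("%s[%s]" % (n, glyph(n, v)) for n, v in sample.items())

# A stalled state produces a visibly different pattern:
print(pattern({"pitch_deg": 2.0,  "ias_kt": 272.0, "aoa_deg": 3.0}))
print(pattern({"pitch_deg": 15.0, "ias_kt": 120.0, "aoa_deg": 35.0}))

Reading the second line requires no arithmetic or recall of limits: the cluster of X glyphs is the stall signature, perceived rather than computed.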

Over the last three decades, many crashes, such as that of Flight 447, have been caused by unexpected behaviour, automation failures, degraded operator skills, reduced situation awareness, and changes in workload. As a consequence, manual recovery is frequently jeopardised when the automation fails. These problems may have been made worse by the system's tight coupling. Tight coupling makes it difficult to recover from minor failures before they become major ones (Orlady, Orlady and Lauber 2017). Tighter coupling between parts accelerates the propagation of disturbances across the system, meaning that problems have more serious and complex consequences and can spread quickly. Meeting these demands becomes harder when automated partners are clumsy, silent, strong, and difficult to direct; the result is coordination failures and new types of system failure (Kyriakidis et al. 2019). Aircraft systems, it is currently argued, are only moderately tightly coupled. Airlines, however, are advocating for flight crews to be reduced from three (engineer, co-pilot and pilot) to two (co-pilot and pilot) because computers and other devices have reduced the engineering load. More automation and fewer controllers will produce much tighter coupling, leaving fewer resources for incident recovery (Lapesa Barrera 2022).
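
The effect of coupling on failure propagation can be shown with a toy model. In the Python sketch below, invented purely for illustration, a fault in one part spreads to a neighbour with a probability that stands in for the degree of coupling; tightening the coupling turns a local fault into a system-wide one:

import random

def cascade(n_parts, coupling, seed=1):
    """Toy model: part 0 fails; each failed part then takes down a
    neighbour with probability `coupling`. Returns total failures."""
    random.seed(seed)
    failed, frontier = {0}, [0]
    while frontier:
        part = frontier.pop()
        for nb in (part - 1, part + 1):
            if 0 <= nb < n_parts and nb not in failed:
                if random.random() < coupling:
                    failed.add(nb)
                    frontier.append(nb)
    return len(failed)

for c in (0.2, 0.5, 0.9):   # looser -> tighter coupling
    print("coupling %.1f -> %d of 50 parts failed" % (c, cascade(50, c)))

The qualitative behaviour, not the numbers, is the point: above a certain coupling, a single fault reliably consumes the whole chain before any recovery step can intervene.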

Having recognised the issues with Flight 447's automation, it is essential to understand how safety models contributed to knowledge of the accident, and what the repercussions are for future safety management, in order to prevent such events from recurring. According to the traditional safety model, breakdowns and errors arise from technological, human and organisational factors, with humans regarded as the primary hazard (Katerinakis 2019). When something goes wrong, this safety management principle states that the source of the accident should be investigated and identified before trying to minimise the causative factors or strengthen the barriers. Safety is thereby defined as a state in which the number of negative outcomes is minimised. Many accident models have expressed these principles, with the Swiss cheese model being the most well-known (de Wit and Cruz 2019).

According to this model, accidents are caused by a combination of factors; when these factors align, an accident can occur (O'Brien 2019). Latent conditions, including flaws in the organisation's design or management, exist before an incident occurs. Active failures are human operator errors that, when combined with latent failures, lead to an incident. The model implies that no single human or technological failure can cause an accident. Rather, an accident results from an unlikely and frequently unforeseeable combination of contributing factors originating at various levels of the structure (Azık Özkan, Kazemiafshar and Özkan 2019).
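
The claim that no single failure is sufficient can be expressed numerically. A minimal Python sketch, with entirely invented probabilities, treats each defensive layer as an independent barrier; an accident requires the holes in every layer to line up, which is why the combined probability is far smaller than any single layer's failure rate:

# Hypothetical numbers: probability that each defensive layer fails
# to stop the hazard on a given flight. An accident needs all layers
# to fail together (holes aligned), so the probabilities multiply.
layers = {
    "equipment (pitot probes)": 1e-3,
    "crew procedures/training": 1e-2,
    "warnings and displays":    5e-2,
}

p_accident = 1.0
for name, p_fail in layers.items():
    p_accident *= p_fail
    print("%-26s fails with p = %g" % (name, p_fail))

print("all layers aligned: p = %g per flight" % p_accident)
# Any one barrier alone would have blocked the sequence; only the
# rare combination of latent and active failures lets it through.

The multiplication assumes the layers fail independently, which real latent conditions often violate; that caveat is itself the argument for hunting latent flaws rather than blaming the last active error.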

Applied to Flight 447, the model enables identification of each contributing factor. The human–computer interface, the pitot probes, the unlinked side-stick controls between the pilots, and the misleading stall alerts are all technical flaws (de Wit and Cruz 2019). Human errors include the captain leaving the flight deck, poor communication, poor management of the startle effect, and the co-pilot pulling back on the stick. Inadequate training, the delayed installation of new pitot probes, and poor HCI design are examples of organisational flaws. Taken together, all of these factors played a key role in the accident. Searching for human errors after an event is a safe bet, because they can always be discovered in retrospect (O'Brien 2019). Finding and identifying human errors makes it simpler to determine who must be held responsible and where preventive actions should be concentrated. When the identified cause is human error, however, such actions are usually ineffective: incidents arise from several interacting factors, and by blaming the individual, people often come to believe that the system is safe as long as the bad apples are removed (Read et al. 2020).

In contrast, a proactive safety model has recently been proposed. Proactive safety management contends that concentrating on incidents of malfunction does not show how to enhance safety; instead of focusing on what goes wrong, a focus on what goes right is required to understand how success occurs. After an incident, several organisational flaws are usually revealed, such as deviations from rules and regulations that are then detected and investigated (Haslbeck and Hoermann 2016). Deviating from a prescribed rule, however, is not always a contributing factor to an incident or an unusual occurrence; such adaptations are more often than not the rule rather than the exception.

It must be recognised that the everyday performance variability required to respond to a changing environment is the reason things go well. Humans are therefore viewed as a resource critical to system resilience and flexibility, and the safety management structure must always be on the lookout for new developments and events (Katerinakis 2019). Rather than searching for specific causes that only explain the failure, we should start by working out how things usually go right. According to this strategy, accidents are emergent rather than resultant (Azık Özkan, Kazemiafshar and Özkan 2019).

Understanding the cause of an accident thus necessitates determining why the control structure failed, and a control structure capable of enforcing the necessary constraints must be developed to prevent future accidents (Oliver, Calvard and Potočnik 2017). In systems theory, systems are viewed as hierarchical structures, with each level constraining the activity of the level below; constraints, or the lack thereof, at a higher level enable or control behaviour at a lower level. An accident is thought to result from a lack of constraints caused by insufficient enforcement of behavioural constraints at each level of a socio-technical system.
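
This systems-theoretic view can be sketched as a hierarchy in which each level is responsible for constraining the one below it. The Python fragment below is an illustrative toy, loosely inspired by systems-theoretic accident models rather than any specific tool; the levels and constraints are invented for the example:

# Toy model of a socio-technical control hierarchy: each level must
# enforce constraints on the level below. A missing or unenforced
# constraint is exactly the gap through which accidents emerge.
HIERARCHY = [
    ("regulator",   ["certify pitot probes", "mandate stall training"]),
    ("airline",     ["schedule manual-flying practice"]),
    ("flight crew", ["monitor airspeed", "apply stall recovery"]),
]

def audit(enforced):
    """Report constraints that exist in the design but are not enforced."""
    for level, constraints in HIERARCHY:
        for c in constraints:
            status = "OK" if enforced.get(c, False) else "GAP"
            print("%-11s | %-32s | %s" % (level, c, status))

audit({"certify pitot probes": True,
       "monitor airspeed": True})   # every other constraint is a gap

Read this way, prevention is not about removing an erring individual but about closing the GAP rows: restoring enforcement of a constraint at whichever level it lapsed.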

Conclusion

In general, pilots are part of a complex human–automation system that can either increase or decrease the likelihood of an incident. Cockpit procedures, automation systems and training can all be modified to prevent specific errors from recurring. However, because humans are unpredictable, an accident is always possible. On the other hand, transforming automation systems into effective team players has the potential to transform aviation and prevent avoidable disasters. Furthermore, safety management strategies should emphasise the importance of being proactive in order to recognise possible incidents before they occur, and of appreciating how adjustments and variability are part of what goes well in everyday performance, potentially preventing incidents.

References

Azık Özkan, D., Kazemiafshar, A. and Özkan, T., 2019. Root Cause Analysis of Air France Flight 447 Accident (1 June 2009).

Brown, J.P., 2016. The effect of automation on human factors in aviation. The Journal of Instrumentation, Automation and Systems, 3(2), pp.31-46.

Catchpole, K., 2017. Surgery Through a Human Factors and Ergonomics Lens. In Surgical Patient Care (pp. 39-50). Springer, Cham.

David, L.Z. and Schraagen, J.M., 2018. Analysing communication dynamics at the transaction level: the case of Air France Flight 447. Cognition, Technology & Work, 20(4), pp.637-649.

de Wit, P.A. and Cruz, R.M., 2019. Learning from AF447: Human-machine interaction. Safety science, 112, pp.48-56.

Haslbeck, A. and Hoermann, H.J., 2016. Flying the needles: flight deck automation erodes fine-motor flying skills among airline pilots. Human factors, 58(4), pp.533-545.

Katerinakis, T., 2019. Communication and Human Factors Phenomena in Aviation Transmit Knowledge. In The Social Construction of Knowledge in Mission-Critical Environments (pp. 17-36). Springer, Cham.

Kharoufah, H., Murray, J., Baxter, G. and Wild, G., 2018. A review of human factors causations in commercial air transport accidents and incidents: From 2000 to 2016. Progress in Aerospace Sciences, 99, pp.1-13.

Kyriakidis, M., de Winter, J.C., Stanton, N., Bellet, T., van Arem, B., Brookhuis, K., Martens, M.H., Bengler, K., Andersson, J., Merat, N. and Reed, N., 2019. A human factors perspective on automated driving. Theoretical Issues in Ergonomics Science, 20(3), pp.223-249.

Lapesa Barrera, D., 2022. Human Factors. In Aircraft Maintenance Programs (pp. 265-281). Springer, Cham.

Mohrmann, F. and Stoop, J., 2019. Airmanship 2.0: Innovating aviation human factors forensics to necessarily proactive role. In International Society of Aviation Safety Investigators (ISASI). Annual Seminar.

O’Brien, J., 2019. Mystery over the Atlantic: the tragic fate of Air France Flight 447. The CASE Journal.

Oliver, N., Calvard, T. and Potočnik, K., 2017. Cognition, technology, and organizational limits: Lessons from the Air France 447 disaster. Organization Science, 28(4), pp.729-743.

Orlady, H.W., Orlady, L.M. and Lauber, J.K., 2017. Human factors in multi-crew flight operations. Routledge.

Parra, R. and Laguardia, J., 2018. Errors from engineering in flight safety. In 31st Congress of the International Council of the Aeronautical Sciences (ICAS). Belo Horizonte. The International Council of the Aeronautical Sciences. Amsterdam: Kamer van Koophandel (pp. 4-5).

Pruchnicki, S.A., 2016. Mid Morning Concurrent Sessions: Human Factors: Human Error and Cockpit Automation: Presentation: Cognitive Processes and Challenges during Surprise on the Flight Deck.

Read, G.J., O’Brien, A., Stanton, N.A. and Salmon, P.M., 2020, December. What is going on? Contributory factors to automation-related aviation incidents and accidents. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 64, No. 1, pp. 1697-1701). Sage CA: Los Angeles, CA: SAGE Publications.
