EASA updates CM-SA-002 on “Flight Crew Human Factors Assumptions in Aircraft and System Safety Assessments”

This Certification Memorandum (CM) stresses the importance of considering human factors (HF) in aircraft and system safety assessments for large aeroplanes, in particular for the classification of failure conditions identified in the functional hazard assessments (FHAs) of aircraft and system functions. It provides applicants with a structured HF process that may be used to confirm the assumptions made about expected flight crew behaviours.

This CM focusses on flight crew HF aspects and more specifically on:

  • identifying and defining elements to complement AMC 25.1309, including the cognitive aspects underlying failure condition recognition, the elaboration of a diagnosis of the situation, and the flight crew response and post-failure management,
  • establishing the criteria driving the level of scrutiny required to demonstrate the validity of the assumptions,
  • providing guidance for the selection of methods and means to be used to show compliance with the applicable certification specifications.
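For context on what is at stake in that classification, AMC 25.1309 pairs each FHA severity class with a quantitative objective expressed as an average probability per flight hour. The snippet below is only a minimal illustration of that pairing (it is not taken from the CM, and the figures are the commonly quoted AMC 25.1309 objectives):

    # Illustrative pairing of FHA severity classes with the commonly quoted
    # AMC 25.1309 objectives (average probability per flight hour).
    OBJECTIVES = {
        "Minor":        1e-3,   # Probable
        "Major":        1e-5,   # Remote
        "Hazardous":    1e-7,   # Extremely Remote
        "Catastrophic": 1e-9,   # Extremely Improbable
    }

    def meets_objective(severity: str, probability_per_fh: float) -> bool:
        """Check a predicted failure-condition probability against the
        objective tied to its FHA severity classification."""
        return probability_per_fh <= OBJECTIVES[severity]

    # Example: a failure condition classified Hazardous, predicted at 3e-8 per flight hour.
    print(meets_objective("Hazardous", 3e-8))   # True

The CM’s concern is the left-hand side of that comparison: if the flight crew behaviour assumptions behind a severity classification do not hold, the probability is being checked against the wrong objective, which is why a structured HF process to confirm those assumptions matters.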

See Final Certification Memorandum ref. CM-SA-002 Issue 02 on “Flight Crew Human Factors Assumptions in Aircraft and System Safety Assessments” – Applicable to Large Aeroplanes | EASA (europa.eu)

GNSS Interference

Signals from the Global Navigation Satellite System (GNSS) are one of the main inputs used for aircraft positioning or as a time reference for the Communication, Navigation and Surveillance functions on board most Airbus aircraft.

Operators report an increasing number of events related to the loss of GNSS signals due to Radio Frequency Interference (RFI) during operations in some areas of the world.

This article explains the causes of RFI and its effects on aircraft systems, and provides recommendations for flight and maintenance crews.

See GNSS Interference | Safety First (airbus.com)

NPRM impacting FAR 25.1309

The FAA proposes to amend certain airworthiness regulations to standardize the criteria for conducting safety assessments for systems, including flight controls and powerplants, installed on transport category airplanes. With this action, the FAA seeks to reduce risk associated with airplane accidents and incidents that have occurred in service, and reduce risk associated with new technology in flight control systems. The intended effect of this proposed action is to improve aviation safety by making system safety assessment (SSA) certification requirements more comprehensive and consistent.

See FAA Proposes Overhaul Of Airliner Certification – AVweb and Federal Register :: System Safety Assessments

FAA pushes Boeing to review safety documents on new 737 MAX model

See FAA Pushes Boeing to Review Safety Documents on New 737 MAX Model – WSJ

Federal air-safety regulators have asked Boeing Co. to launch a review of its safety paperwork for the 737 MAX 7, another setback for the plane maker’s push to win approval for the jet before a year-end legal deadline.

The Federal Aviation Administration is unable to review the company’s submissions “due to missing and incomplete information” related to cockpit crews’ potential reactions to catastrophic hazards, according to an Oct. 12 agency letter viewed by The Wall Street Journal. Plane makers must meet such hurdles before regulators clear jets to carry passengers.

The FAA’s request for a review covers system safety assessments for the 737 MAX 7, which is the shortest in Boeing’s family of single-aisle MAX jets and is awaiting regulatory approval to carry passengers. It comes after the agency recently said the aircraft was at risk of not being certified by a December deadline set by Congress following two fatal crashes of the 737 MAX 8, an earlier version of the jet.

A focus of air-safety legislation passed by Congress in 2020, which included the deadline, is so-called human factors engineering, which deals with how pilots respond to cockpit emergencies. The fatal 737 MAX 8 accidents involved a flawed Boeing assumption about how pilots would respond to a flight-control system’s misfire. The law would require MAX jets certified after the end of the year to receive a potentially costly and time-consuming cockpit overhaul.

Boeing said safety remains the driving factor in its effort to meet all regulatory requirements in certifying the 737 MAX 7. The company said being thorough and transparent with the FAA will continue to be a priority.

The FAA said the letter speaks for itself. Acting FAA Administrator Billy Nolen said at a press conference earlier this month that the agency wouldn’t approve the 737 MAX 7 and another MAX model for passenger service until it was satisfied.

“When we’ve got all the information we need, and not until then, we’ll certificate the airplane,” Mr. Nolen said. “We are working through it very purposefully, and when we get there, we get there.”

Southwest Airlines Co. is a major buyer of the 737 MAX 7 and has been planning to add the fuel-efficient jet to its fleet and retire older planes.

Boeing Chief Executive David Calhoun said on Sept. 15 that he expected the 737 MAX 7 would be certified by the year-end deadline. FAA officials later signaled the 737 MAX 7 was at risk of not meeting the year-end deadline.

Boeing has also been working to get the longer model of the jet, the 737 MAX 10, certified by the end of the year. Mr. Calhoun has said Boeing may have to consider canceling that model without a congressional extension. United Airlines Holdings Inc. and Delta Air Lines Inc. are among that model’s buyers.

The Oct. 12 FAA letter regarding the MAX 7 was signed by Ian Won, acting manager of the agency’s Boeing oversight office. It cites examples that, he wrote, show Boeing inadequately addressing pilots’ roles in certain cockpit emergencies, such as avoiding ignition of the plane’s fuel tanks.

What makes an outstanding system safety professional?

See What Makes an Outstanding System Safety Professional? – Blog of System Safety (jsystemsafety.com)

Most employment ads for system safety positions will list education, areas of expertise and years of experience as requirements. They may also require certain capabilities, such as strong communication skills (written and spoken), and an ability to navigate standard desktop tools such as word processing software. Some may even have the insight to ask for specific analytical skills or the ability to systematically address specific systems or processes. Advertisements for senior or management positions may add organizational or administrative skills to the list. Descriptions of openings for top-level positions may call for promotional skills that seem more appropriate for a “company cheerleader” than for the manager of a serious technical or analytical effort.

What makes an outstanding system safety professional goes beyond a desire to do our best and the possession of the kinds of technical knowledge and skills cited in the employment ads. There is a range of personal qualities that contribute to a higher and broader level of performance. These qualities, which make up our “System Safety Character,” are an important part of everything we do and must come to the forefront in crisis situations and in the making of key risk decisions. These include:

  1. The ability to recognize potential risks and safety issues:
  • A perspective and an imagination that identify hazards, supported by an inventiveness that aids in the formulation of solutions
  • The ability and enough healthy skepticism to recognize issues with proposed solutions to safety issues and false closure logic
  • A thorough understanding of our risk analysis tools and the ability to apply them to real-life situations (which may require real-time solutions)
  • A clarity and depth of vision of the safety aspects of the total operation, understanding the program as a whole and the interrelationships of the individual components

“What makes an outstanding system safety professional goes beyond a desire to do our best and the possession of the kinds of technical knowledge and skills cited in the employment ads.”

2. The ability to identify an issue must be coupled with a willingness to speak out. For example, the safety personnel present at critical meetings while Columbia circled the earth during the STS-107 mission were dedicated, and they knew the related safety assessments. Yet the Columbia Accident Investigation Board (CAIB) Report criticized their performance, noting,

“… safety personnel were present but passive and did not serve as a channel for the voicing of concerns of dissenting views.” “Safety representatives attended meetings of the Debris Assessment Team, Mission Evaluation Room, and Mission Management Team, but were merely party to the analysis process and conclusions instead of an independent source of questions and challenges.”

[CAIB Report, vol. I, p. 170]

Space Shuttle Columbia Final Launch

The CAIB also drew discomforting parallels to the “silent” role of a previous generation of safety professionals noted in the Rogers Commission report on the Challenger accident in 1986. Part of the willingness to speak up is the acceptance that this may require taking an unpopular stand, even to the point of nonconcurrence with a majority opinion.

3. Every outstanding practitioner exhibits certain leadership qualities:

  • The skill to “win over” others to their position, including the ability to present a position and defend it
  • A sense of teamwork that encourages inputs from all parties involved
  • The ability to focus on the issue and the search for the best solution
  • A sense of fairness, honesty and respect for opposing positions

4. A sense of responsibility that acknowledges the expectations of the customer (developer and/or user of the product):

  • Relentless pursuit of resolution of issues
  • Meticulous system analysis (including hazard identification and resolution)
  • Commitment to the role of safety advocate

5. The most overlooked quality in our system safety character is the ability to critically review our own performance. Successful self-assessment requires the application of all of our knowledge and skills. It requires an assessment of both the quality of the system safety effort (products and services) and how the effort is utilized. The CAIB Report observed that,

“Structure and process places Shuttle safety programs in the unenviable position of having to choose between rubber-stamping engineering analyses, technical efforts, and Shuttle program decisions, or trying to carry the day during a committee meeting in which the other side almost always has more information and analytic capability.”

[CAIB Report, vol. I, p. 187]

Clearly, this is not the kind of situation that leads to the best products or the most effective contribution to a program.

In short, we would submit that it takes more than dedication, knowledge, experience, special skills and even knowledge of the latest safety fight song. We would add system safety character, which includes a little common sense and a lot of true grit.


The authors, John Livingston and Chad Thrasher, are officers in the Tennessee Valley Chapter of the System Safety Society.

Trial for AF 447 crash 13 years ago

See Air France and Airbus on trial 13 years after Atlantic jet disaster | Reuters

More than 13 years after an Air France jet plunged into the Atlantic, killing all 228 people on board, the French carrier and Airbus go on trial in a Paris court next week.

After a two-year search for the A330’s black boxes, French investigators found pilots had mishandled the temporary loss of data from iced-up sensors and pushed the 205-tonne jet into an aerodynamic stall or freefall, without responding to alerts.

But the BEA accident agency also disclosed that Air France had expressed concerns about increased icing incidents before the crash and had started receiving improved speed probes. Experts say the relative roles of pilot or sensor error, as well as erratic displays or fatigue, will be key to the historic trial.

Monday’s opening hearing will mark the first time French companies, rather than individuals, have been placed directly on trial for “involuntary manslaughter” following an air crash.

While corporate reputations and a long-awaited catharsis for families are at stake, the nine-week trial is not expected to lead to significant financial penalties. However, experts say larger sums have been paid in compensation or civil settlements.

The maximum fine for either company, if convicted of involuntary manslaughter, is just 225,000 euros ($220,612), five times the maximum monetary penalty for an individual, who unlike a company can also face jail, according to French legal experts.

EASA Certification Memo: HF in the FHA

EASA issues CM ref CM-SA-002 Issue 01 on “Human Factors Considerations in Aircraft and System Functional Hazard Assessments”

This Certification Memorandum (CM) stresses the importance of considering Human Factors in Aircraft and System Functional Hazard Assessments for Large Aeroplanes. It provides applicants with a structured Human Factors methodology to validate the assumptions made about the expected flight crew behaviours in the aircraft and system Functional Hazard Assessments (FHAs).
This Certification Memorandum focusses on flight crew aspects and more specifically on:

  • identifying and defining the elements missing in the existing guidance material, incl. cognitive aspects underlying the failure condition recognition and the elaboration of the diagnosis of the situation,
  • establishing the criteria driving the level of scrutiny required to demonstrate the validity of these assumptions,
  • providing guidance in terms of acceptable methods and means to be developed for compliance with the regulations.

This CM thus impacts your Means of Compliance with CS 25.1309(b) and CS 25.1309(c).

PRA: New 5G Frequencies Could Jam Critical Flight Instruments

A new hazard for our Particular Risk Analyses: interference with Radio Altimeter (RA) operations can affect the following (see the sketch after this list):
1. Autoland functions: particularly critical in low-visibility automatic approaches such as CAT II or CAT III conditions. Pilots cannot conduct CAT II and CAT III approaches if the RA is malfunctioning.
2. EICAS/ECAM: nuisance warnings after take-off or during approach that distract the crew from the tasks at hand and degrade operational safety levels.
3. False or missing GPWS alerts: anywhere in proximity to the ground, interference could inhibit some of the reactive modes of the TAWS (Terrain Awareness and Warning System), removing a safety net against CFIT (Controlled Flight Into Terrain). Spurious “too low gear”, “too low flaps”, “don’t sink” and “terrain, pull up” alerts add further distraction for the crew. A major concern is the GPWS failing to trigger an alert when it should because of interference, which could result in a CFIT event.
4. Unreliable instrument indications: errors in automatic altitude indications and voice announcements could contribute to an increased number of hard landings.
5. Abnormal behaviours in automatic flight systems:
a. Autoland system
b. Flight control laws (e.g. failure to transition to flare law, resulting in a higher-than-expected pitch in the flare; the retard function, etc.)
c. Auto-throttle automatic stall protection
d. Automatic speedbrake deployment
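What makes this a particular risk rather than a set of unrelated failures is that one external event reaches many otherwise independent functions at once. The sketch below is a minimal illustration of that idea only, with hypothetical function names rather than any specific aircraft architecture: it simply traces which consumers of RA height a single interference event would touch.

    # Hypothetical dependency map: functions that consume radio altimeter (RA)
    # height data, used to trace the reach of one common-cause threat.
    CONSUMERS_OF = {
        "radio_altimeter": [
            "autoland", "flare_law", "retard_function",
            "gpws_reactive_modes", "auto_callouts", "eicas_ecam_logic",
        ],
    }

    def affected_functions(threat_target: str) -> list[str]:
        """Return every function fed by the threatened data source."""
        return CONSUMERS_OF.get(threat_target, [])

    # A single external threat (5G interference in the RA band) touches them all:
    print(affected_functions("radio_altimeter"))

In a real Particular Risk Analysis the same question is asked system by system and zone by zone, to show that no single external event defeats the independence claims made elsewhere in the safety assessment.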

For information, see:
ICAO Problem Statement.
US 5G roll out ignores concerns for Air Transport safety

Free SSA books

A useful list of free handbooks, guides, and textbooks covering all of the tools of system safety and probabilistic risk assessment is available here: https://functionalsafetyengineer.com/safety-and-pra-resources/

Failure Modes & Effects Analysis (FMEA)

Fault Tree Analysis (FTA)

Probabilistic Risk Assessment (PRA)

System Safety

Software Safety

Bayesian Analysis

Bayesian Networks
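To make one of the listed tools concrete, here is a minimal fault tree calculation under the usual assumption of independent basic events: an AND gate multiplies its input probabilities, and an OR gate takes the complement of all inputs not occurring.

    # Minimal fault tree evaluation, assuming independent basic events.
    from math import prod

    def p_and(*probs: float) -> float:
        """AND gate: all inputs must fail."""
        return prod(probs)

    def p_or(*probs: float) -> float:
        """OR gate: at least one input fails."""
        return 1.0 - prod(1.0 - p for p in probs)

    # Example top event: loss of function if (A AND B) fail, or C fails.
    p_top = p_or(p_and(1e-4, 1e-3), 1e-7)
    print(f"{p_top:.3e}")   # approximately 2.0e-07

The handbooks on the list build on these same two primitives for full trees, minimal cut sets and importance measures; this is just the starting point.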

A 25.1309(c) issue: FAA flags potential safety problem in layout of controls on Boeing 767 and 757 planes

An article by Dominic Gates

The Federal Aviation Administration has issued a safety alert to all operators of Boeing 767 and 757 airplanes flagging a potential problem that led to the 2019 crash in Texas of an Amazon Air cargo plane and the deaths of the three pilots onboard.

Although the first officer flying the plane was faulted in the investigation into the crash, the alert points to a potential flaw in the way the pilot controls are laid out in the flight deck that initiated the chain of events. Crash investigators believe that the first officer inadvertently hit a switch that was too close to a handle he was holding, then reacted incorrectly to the plane’s sudden change in the flight mode. Just 32 seconds after the inadvertent activation of that switch, the plane slammed into the ground, killing the captain, the first officer and a third pilot who was hitching a ride in the jump seat.

On Feb. 23, 2019, Atlas Air Flight 3591 — a Boeing 767 cargo flight operated for and in the colors of Amazon Air — was en route from Miami to Houston when it crashed into a shallow marsh near Trinity Bay, Texas. On board were Captain Ricky Blakely, 60, of Indiana; First Officer Conrad Jules Aska, 44, of Antigua; and Mesa Airlines Captain Sean Archuleta, 36, of Houston, who was traveling home before beginning new-hire pilot training with United Airlines. The flight data recorder showed that as the plane descended from 6,000 feet toward a planned 3,000 foot level on the approach to Houston airspace, the pilot flipped a switch that shifted the plane to “Go-Around” mode. This is the mode used when a pilot close to the ground and slowing down on approach decides abruptly that it’s unsafe to land. The go-around signal immediately increases the engine thrust so that the plane can climb away from the runway.
The altitude and trajectory of Flight 3591 at that moment were “inconsistent with any scenario in which a pilot would intentionally select go-around mode,” the National Transportation Safety Board concluded after its investigation.
And neither the captain nor the first officer announced a go-around, as they would have if it were an intentional activation.
What happened next doomed the plane. The sudden acceleration from the engine thrust would have pushed the first officer’s body back into his seat. If there are limited visual cues to the contrary, this can make a pilot think a plane is pitching up, a recognized phenomenon known as a “somatogravic illusion.” In fact, the plane was already on a downward slope. Investigators believe that under the influence of that illusion, the first officer pushed the controls forward to point the nose further down. That “forced the airplane into a steep dive from which the crew did not recover,” the NTSB report states.

Crash investigators re-creating what happened in a simulator observed that, when the first officer flying in the right seat kept his left hand on the speedbrake lever during the descent, as is normal procedure, “his left hand and wrist could be under the thrust levers and close to the left go-around switch.” They concluded that this was the likely cause of the unintentional go-around activation. “The NTSB demonstrated in a full flight simulator, that light turbulence could reasonably cause a pilot flying that is holding the speedbrake lever to move his or her arm enough to hit the go-around switch inadvertently,” the FAA stated.

The FAA issued the safety alert to make sure pilots of both the 767 and the 757, which has a similarly configured flight deck, are aware of this potential hazard.
Boeing declined to comment.
The Flight 3591 crash investigators separately raised questions about the first officer’s competence.