Compass Course

A pair of air disasters exposes engineering mistakes and catalyzes real-world lessons for future professionals.

They might not imagine it now, but among today’s engineering undergraduates are probably a few talented, hardworking leaders like Dennis Muilenburg—destined both for success and a harsh public spotlight.

Hoping to become “the world’s best airplane designer,” as he did, they may follow a similar path: While an aerospace engineering major at Iowa State University, Muilenburg secured a summer internship at the Boeing Company in 1985. After graduation the next year, he joined the firm full time while pursuing a master’s degree at the University of Washington. Rising through the company’s ranks to lead its defense division and then become chief executive in 2015, he is credited with “taking on tough challenges and delivering results,” as a Boeing director put it.

How prepared would these students be for what followed: a series of pivotal steps by the company that investigators suspect contributed to two 737 MAX crashes that killed 346 people? These actions included omitting mention in the pilots’ operating manual of a major software addition to the flight control system; acceptance of a single sensor to measure the jet’s angle of attack—a so-called single point of failure; flawed assumptions about pilots’ reaction time; and signs of “undue pressure” on production lines—and on Boeing engineers delegated by the Federal Aviation Administration to certify the 737 MAX.

And what would brace students for the aftermath of the tragedies: multiple investigations, including a Justice Department criminal probe, litigation, decline of market share and a wobbly stock price, and the need both to keep the confidence of Boeing’s board of directors and to assure a worried flying public that when the now-grounded 737 MAX returns to service, it will be, as Muilenburg promised, “one of the safest airplanes ever to fly”? Would they maintain Muilenburg’s stoic calm through two days of congressional grilling, deflecting calls for him to quit, while behind him relatives held up pictures of passengers who had perished? Would their responses be satisfactory or reflect what a House committee and subcommittee chair labeled, in reference to Boeing officials, “a culture of concealment and opaqueness”? Could they face victims’ families, as Muilenburg did, and listen to what he called their “heartbreaking,” unforgettable stories?

Fatal Consequences

Like the BP Deepwater Horizon oil spill, the Volkswagen emissions scandal, and the General Motors ignition “switch from hell” (November 2014 Prism cover story), which was linked to more than 100 deaths and forced the recall of millions of vehicles, the 737 MAX tragedies have exposed engineering mistakes, flawed judgments, and hesitation to challenge the status quo.

While investigations into the 737 MAX continue, abundant documentation has emerged from congressional committees and examinations by the Indonesian Komite Nasional Keselamatan Transportasi, which conducted an official investigation into the Lion Air crash, and an international panel of experts known as the Joint Authorities Technical Review, which reported to the FAA. Muilenburg himself told Congress that “the process of learning from failure, and even from tragedies like these, has been essential to the advances in airplane safety since the industry began roughly a century ago. In the months since the accidents, there has been much criticism of Boeing and its culture. We understand and deserve this scrutiny.”

Standing in the Engineers’ Shoes

Some educators have begun thinking about adapting the 737 MAX to the classroom. Michael Gorman, a social psychologist and ethicist at the University of Virginia’s engineering school, is working with two colleagues on a case study—to be ready possibly by spring but certainly by the fall of 2020. He wants to “try to put students in the virtual shoes of the designers of the 737 MAX at the time. My starting assumption is that these designers were motivated both by safety and by the recognition that their aircraft had to compete for sales with Airbus and other airlines.” Among the questions he would ask students to address: “What were the designers’ mental models at the time of how the new systems were going to work? Why were pilots not trained on the simulator before they flew the plane with the automated system?” He will suggest that “pilots did not have the same mental model as the designers.” On the ill-fated flights, pilots appeared to fight with the software for control of the plane, according to reports. The study’s goal, of course, is for students to be able to apply what they learn to other situations, Gorman says.

Brock Barry, a professor of engineering education at the United States Military Academy at West Point, says: “The circumstance of Boeing’s 737 MAX 8s has the potential to be a highly illustrative case history. It is high-profile and remains in the news, making for a solid connection with students.” Everyone in the classroom has the common experience of air travel, making this “a strong potential case for classroom discussion,” says Barry, who serves in the Civil and Mechanical Engineering Department and is senior associate editor of the Journal of Civil Engineering Education. One of the cases already extensively used in engineering ethics instruction is a faulty cargo door design on the DC-10, he says. That design was associated with a 1974 crash near Paris that killed 346 people (coincidentally the combined death toll from the Lion Air and Ethiopian Airlines crashes) and a 1979 accident that resulted in 191 deaths.

“It’s such a great case for teaching. It has all the components,” says Marianne Jennings, professor emerita of business ethics at Arizona State University and author of The Seven Signs of Ethical Collapse. Having taught engineers both at the college level and as a business consultant (including for Boeing), she brought up the 737 MAX recently as part of a corporate leadership training course for engineers.

For some, the Boeing case is an impetus for research. Citing both the 1986 Challenger space shuttle disaster and the 737 MAX, three academics—Elif Miskioglu, an assistant professor of chemical engineering at Bucknell University; Kaela Martin, an assistant professor of aerospace engineering at Embry-Riddle Aeronautical University’s Prescott, Ariz., campus; and Adam Carberry, an associate professor in the Fulton Schools of Engineering at Arizona State University—have received $200,000 from the National Science Foundation for a project to study how students develop “engineering intuition.” This is a “highly desirable but vague and abstract essential engineering skill,” a key feature of which is the ability to “assess and/or predict outcomes.” Noting that in both the Challenger and 737 MAX cases, “the vehicle was cleared to fly by an engineer,” they argue that avoiding such accidents “requires not only technical knowledge but also the ability to rapidly assess whether a solution is feasible and appropriate.”

An ambitious academic project related to the 737 MAX crashes has been launched by Mary “Missy” Cummings, an engineering professor at Duke University who specializes in autonomous aerospace systems and was one of the Navy’s first female fighter pilots. Using a $100,000 planning grant, she and her team are laying the groundwork to propose the first NSF-funded engineering research center (ERC) devoted to aerospace. The potential $50 million, multi-institution Aerospace Autonomy Center of Excellence, led by Duke, would conduct basic research on autonomous systems, develop software and certification protocols, train undergraduates and graduate students for the expanding field of autonomous flight, and, in the process, improve aerospace safety.

The 737 MAX crashes “highlight the complexities surrounding the insertion of advanced autonomy in aerospace systems,” Cummings writes in her proposal. In an e-mail exchange with Prism, she explains that while the 737 MAX aircraft relied on “plain-vanilla automation,” not autonomy, “if we can’t get basic automation testing correct, what chance do we have with complex autonomy?” Her ERC project aims to explore the intersection of computer science, air vehicles, aerospace systems, policy, and regulation. It would produce “a road map for aerospace autonomy research and education” in which engineering students and faculty would collaborate with counterparts in law and policy. “Boeing is actively participating in this, so they are very concerned about these issues,” Cummings says.

Fixing Flaws

Before it was grounded in March, the 737 MAX was the best-selling plane produced by Boeing, the world’s biggest commercial aircraft manufacturer and the nation’s largest manufacturing exporter. Boeing employs about 50,000 engineers in a workforce of 150,000. Its iconic, innovative planes include the B-52 long-range bomber, designed under intense time pressure and still serving the U.S. Air Force after six decades. (See Prism’s December 2013 cover story.) The company also builds helicopters and rockets, and is developing the CST-100 Starliner, a capsule for commercial space travel.

In the classroom, educators may want to separate examples of system failure in the 737 MAX from questions of ethics and who was responsible. Testifying before the House, Muilenburg cited three mistakes that the company is correcting: implementation of the angle-of-attack disagree alert, which would let pilots know if the sensors indicating the plane’s inclination don’t match where its nose is actually pointed; the design of the Maneuvering Characteristics Augmentation System (MCAS), which could erroneously kick in to stabilize the plane’s pitch; and “communication, documentation across all of the stakeholders.”

The 737 MAX’s reliance on a single sensor has been widely criticized. That MCAS apparently was vulnerable to a single point of failure “should never have been permitted,” says Guy Gratton, an aeronautical engineer and visiting professor at Cranfield University in the United Kingdom. “Redundancy is a core idea, especially in a complicated system where lives are at stake,” adds Karl Smith, cooperative learning professor of engineering education at Purdue University. Boeing now says the flight control system will compare data from both AOA sensors. If the sensors disagree by a significant amount, MCAS will not activate.
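
For students, the redundancy fix can be made concrete in a few lines of logic. The following is a minimal teaching sketch, not Boeing’s code; the sensor names and the 5.5-degree disagreement threshold are assumptions chosen for illustration. It shows the core idea of refusing to act on data that two redundant sensors cannot reconcile:

    # Minimal teaching sketch of a two-sensor cross-check (Python).
    # Not Boeing's implementation; names and threshold are hypothetical.
    AOA_DISAGREE_LIMIT_DEG = 5.5  # assumed disagreement limit, in degrees

    def mcas_may_activate(aoa_left_deg, aoa_right_deg):
        """Permit automated pitch trim only when both angle-of-attack
        sensors roughly agree; otherwise treat the data as suspect."""
        return abs(aoa_left_deg - aoa_right_deg) <= AOA_DISAGREE_LIMIT_DEG

    print(mcas_may_activate(12.0, 11.8))  # True: readings agree
    print(mcas_may_activate(12.0, 74.5))  # False: disagreement, MCAS withheld

Under such a check, a single bad sensor can no longer trigger the system, which is precisely the single-point-of-failure objection raised by Gratton and Smith.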

On the Lion Air flight, the MCAS kicked in repeatedly based on flawed information, and pilots were helpless in trying to control it. Now, Boeing says, pilots will always have the ability to override the MCAS and manually control the aircraft.
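
The override principle lends itself to the same classroom treatment. In this hypothetical sketch (again an illustration, not the actual flight control law), any manual trim input takes precedence over the automated request, and automation acts only when the sensors agree:

    # Hypothetical sketch: pilot input always outranks automation.
    def trim_command(pilot_trim, mcas_trim, sensors_agree):
        """Resolve competing trim commands, giving the pilot final say."""
        if pilot_trim != 0.0:
            return pilot_trim   # any manual input overrides automation
        if sensors_agree:
            return mcas_trim    # automation allowed only on trusted data
        return 0.0              # fail safe: no automated trim at all

    print(trim_command(0.0, -2.5, True))  # -2.5: automation acts
    print(trim_command(1.0, -2.5, True))  #  1.0: pilot overrides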

As for communication, there are indications that FAA regulators weren’t fully informed about the MCAS. The Joint Authorities Technical Review, for example, raised the question of “whether inadequate communications played a role in the failure to address potential unintended consequences” arising from the system. The Associated Press quotes JATR chairman Christopher Hart as saying the MCAS evolved “from a relatively benign system to a not-so-benign system without adequate knowledge by the FAA.” Investigators said the MCAS should have been treated as a novel feature and clearly highlighted to the FAA technical staff.

Pilots, moreover, were totally in the dark, something that infuriated Dennis Tajer, spokesman for the Allied Pilots Association, the professional pilots’ union. “Before the Lion Air crash, MCAS meant nothing to us,” he told the BBC. “The system is swift, violent, and terrifying. And the fact that you didn’t inform me? That’s beyond ‘shame on you.’ ” Shem Malmquist, a working Boeing captain and visiting professor at Florida Institute of Technology’s College of Aeronautics, says that “giving pilots information is always a good thing to do.” He worries, however, “that sometimes we provide it, and now we think we have passed responsibility to the pilots. It’s easy to transfer responsibility downwards and not fix the main problem. Providing this information also requires training into what the pilots are seeing.”

On a more fundamental level, one thing that can make a big difference in preventing similar disasters “is for engineers to document the assumptions they make in their designs and provide them to the people actually writing training manuals,” Malmquist says. “Engineers may be pretty surprised at what is not being trained and what they just assumed pilots knew, even when it comes to the basics of aerodynamics.”

By not informing pilots about MCAS, Boeing may have violated the principle of informed consent, “one of the most fundamental principles of applied ethics,” says Martin Peterson, a professor of the history and ethics of professional engineering at Texas A&M University. “It’s not always respected in engineering ethics, but I do believe it was relevant here. Perhaps Boeing was too optimistic and never thought the system would be activated, affecting their decision to inform their customers, but one could argue they had moral obligations to not have overly optimistic beliefs about the safety of the systems they were designing.” West Point’s Barry suggests that instructors using the Boeing case in class start off by assigning Peterson’s April 2019 blog post, “The Ethical Failures Behind the Boeing Disasters,” on the American Philosophical Association’s website.

Preventing Mistakes, Exercising Leadership

Chris Swan, dean of undergraduate education at Tufts University’s school of engineering, would “put students in the role of practicing engineers at multiple decision points in the process of developing an aircraft,” such as those who worked on the sensors and the software taking readings from the sensors, as well as those who looked at the overall design. “This way, you can look at how multiple ethical dilemmas played a role in how this ultimately led to a tragedy,” and probe “where their values should be and what they should be doing,” says Swan.

Other educators stressed the need to equip engineering students with the tools to prevent or correct mistakes they see occurring inside a company. Diana Bairaktarova, an assistant professor of engineering education at Virginia Tech, says students need to be good listeners and communicators, “open minded about what other professionals have to say” and “responsible for the impact their work has on the whole system.” Students also must be prepared to admit their own mistakes, she says. “Leadership—in the end it comes to that.”

An industry veteran herself, Bairaktarova recognizes the imperative of keeping cost and production schedules in mind. Karl Stephan, a professor of engineering at Texas State University who publishes the Engineering Ethics blog, notes that engineering requires trade-offs: “If you’re not compromising between two somethings, you’re not doing engineering.” Company culture, he adds, “is a very powerful thing. . . . What I tell people if they’re asked to do something they know is wrong, is, ‘First thing: Confront your boss.’ ”

Business ethicist Jennings seeks to get students to study a particular case and “see how it could happen to them.” There are occasions when employees are “persuaded to go along with something. They had concerns about it but let it ride,” she says. “The idea is to get them to understand the psychology of this process.” If students see a major problem developing and want to prevent it, she advises, “make the business case for what you’re concerned about. . . . You’ve got calculations? Use them. Outline the risk.”

That may be a particular challenge in aerospace, where “the modern airliner is about the most complex machine the human race has created,” says Cranfield’s Gratton. “When dealing with machines of such enormous complexity, there will be mistakes made.” The risks and consequences of mistakes may only increase with the growing use of pilotless aircraft, including future air taxis.

Boeing has invested heavily in education, including putting $11 million into a partnership with NSF to accelerate training in critical skill areas and increase diversity in STEM fields. About three weeks after the Lion Air crash, Boeing committed $6 million to Muilenburg’s alma mater, Iowa State, much of it to support a new Student Innovation Center now under construction. Part of the money will also support engineering students seeking to broaden and enhance their educational experience by participating in undergraduate research, a university foundation announced. “Boeing is committed to inspiring the next generation of innovators and equipping them with the skills they need to excel in the modern workforce,” Muilenburg said at the time.

The commitment was in keeping with a forward-looking company tradition. Will the aerospace industry’s next generation of engineers also look backward and learn from past mistakes?

This eGFI Teachers blog post is adapted from ASEE Prism magazine’s December 2019 cover story. It was written by Prism editor Mark Matthews and Charles Q. Choi, a New York-based freelance writer and frequent contributor, with additional reporting from associate editor Jennifer Pocock.
