Investigators probing two deadly crashes of Boeing Co. 737 MAX airliners are grappling with a hybrid of old and new technology. A complex piece of software controls hydraulic pumps and motors similar to those used when Lyndon Johnson was president.
The plane, first designed in the 1960s and modernized three times, is caked with successive generations of technology superimposed on each other. Digital retrofits to older equipment like the 737 MAX’s anti-stall system—known as MCAS and suspected of having contributed to the crashes that together claimed 347 lives—are increasingly common.

From smart-home devices that control oil-burning furnaces to mainframe computers that oversee decades-old power grids, digital controls are popping up everywhere in the mechanical world. Software has underpinned the internet’s virtual world from its inception, of course, and has shown both its potential and its vulnerability. Now, with the cost and size of digital sensors plunging and the capacity to transmit data ballooning, more physical objects than ever are being linked through software.

Even before the Internet of Things becomes a pervasive reality, tech experts and public-safety professionals are fretting over the intersection of the virtual and the physical in what they call cyber-physical security.
Forensics investigators and recovery teams collected items from the site of the Ethiopian Airlines crash, which killed 157. PHOTO: JEMAL COUNTESS/GETTY IMAGES
The worry is that engineers are putting mechanical systems under the command of computers and algorithms without fully understanding the consequences. Problems include confusion about how controls work, software bugs that lead to physical accidents, and, most worryingly, cyberattacks on infrastructure such as power stations or chemical plants that could cause catastrophes.
Cyber-physical systems are “embedded in virtually all aspects of our lives,” said Christos Papadopoulos, program manager of cyber-physical systems at the Science and Technology Directorate of the Department of Homeland Security. DHS has since 2013 sought to spot and address potential weaknesses in cyber-physical systems, initially in cars, medical equipment, building controls, and power grids, working in broad collaboration with academic institutions and research institutes.
Federal investigators have also pursued wrongdoers who exploited retrofit weaknesses, from Russian hackers who targeted U.S. utilities to Volkswagen AG engineers who fraudulently passed U.S. emissions tests by doctoring control software that had been added to diesel-engine designs.
Cybercrime and malware have long plagued the virtual world, though data breaches, theft, and extortion rarely cause direct physical harm. Software bugs can also arise in physical equipment designed from scratch with digital controls, like electric cars, medical equipment, and drones. But creators of those systems link hardware and software from the outset, and engineers test the products with both in mind. Retrofitted equipment, experts say, is rarely vetted so thoroughly.
“There’s a bigger temptation not to test things when you’re just making a little change by adding automation,” said Justin Cappos, a professor of computer science at New York University’s Tandon School of Engineering. With every adaptation, the potential for problems might accumulate without anyone noticing. “You’re boiling the frog,” he said.
Mr. Papadopoulos said since adding security to older systems is often impossible, DHS is assessing technologies to protect their communication links, “to create an isolation layer and intercept attacks before they reach vulnerable devices.”
Prof. Cappos, who participates in the automotive strand of the DHS project, said carmakers awoke to their vulnerability several years ago after high-profile hackings took control of newly connected vehicles.
“A lot of other industries haven’t had the same wake-up call,” he said.
The aviation industry was an early adopter of computers to control physical equipment, always proceeding slowly and under intense scrutiny. Investigators are now probing whether Boeing’s integration of the automated Maneuvering Characteristics Augmentation System, or MCAS—which in certain circumstances pushed the plane’s nose down and confused pilots—met industry standards.
Officials inspect an engine recovered from the Lion Air jet that crashed in Indonesia on Oct. 29. PHOTO: AP
Automation offers enormous benefits, even for decades-old mechanical equipment. Computers can run most machinery faster, more precisely, and more efficiently than humans. While automation can cost workers jobs, it can also eliminate toil and danger. But integrating computers into “dumb” machines poses challenges.
Britt Storkson, a designer of electronic controls for industrial pumping equipment in The Dalles, Ore., says careless computer retrofitting of mechanical gear has become “a serious problem. You see them in the industry all the time.” He has seen computer processors lock up, stopping heating and cooling equipment, conveyor belts, and industrial-process systems.
Boeing Making Changes to Stall-Prevention System on 737 MAX
Boeing announced it is changing how the stall-prevention system works on its new 737 MAX aircraft — the same model jet involved in the Ethiopian Airlines crash. WSJ’s Jason Bellini reports.
“It’s not that the software doesn’t work; it’s that it doesn’t work in all conditions,” Mr. Storkson said. And because software development is divorced from equipment design, “software developers don’t know, if I type in this command, what’s going to be the impact down the line.”
Accidents can cause expensive damage, but not nearly on the scale of hacking. Martyn Thomas, an emeritus professor of IT at Gresham College in London who specializes in safety-critical systems, notes that safety plans in the physical world have traditionally been based on the assumption that things fail by chance. If two elements must fail independently before conditions get dangerous, the probability of a catastrophic accident shrinks to the product of the two failure probabilities.
“But malware is designed to make everything fail at once,” Prof. Thomas noted. “The insecurity of cyberspace changes everything.”
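Prof. Thomas’s argument can be made concrete with a back-of-the-envelope calculation. The failure rates below are invented for illustration: if each of two safeguards fails by accident once in a thousand operations, an accident needing both is a one-in-a-million event, but a single intrusion that disables both at once erases that margin.

```python
# Back-of-the-envelope illustration of independent vs. correlated
# failures. All probabilities here are invented for illustration.

p1 = p2 = 1e-3  # chance each of two safeguards fails by accident

# Random, independent faults: danger requires both to fail at once,
# so the small probabilities multiply.
p_independent = p1 * p2

# Malware designed to make everything fail at once: one successful
# intrusion (assume the same 1e-3 rate) disables both safeguards
# together, so there is no multiplication of small probabilities.
p_correlated = 1e-3

print(f"independent faults: {p_independent:.0e}")  # 1e-06
print(f"coordinated attack: {p_correlated:.0e}")   # 1e-03
```

The thousandfold gap between the two results is exactly the defense-in-depth margin that a coordinated attack removes.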
South Korean officials monitor possible cyberattacks from malware.
Over recent years, malware attacks such as WannaCry and NotPetya have hit medical scanners in the U.K., A.P. Moller-Maersk A/S shipping facilities worldwide, and the manufacturing, research, and sales operations of pharmaceuticals giant Merck & Co. Aviation hasn’t faced notable hacks because planes use specialized software with extensive security.
While hackers can attack both new and retrofitted digital equipment, systems with network links added years later are harder to protect, said David Grout, chief technology officer for Europe at cybersecurity specialist firm FireEye Inc.
According to FireEye, malware dubbed Triton in 2017 almost disabled the industrial-safety software in a Saudi Arabian petrochemicals plant, potentially allowing hackers to control the facility and release toxic chemicals.
Triton was discovered only when a plant manager had to reboot equipment three times and wondered why. FireEye suspects state-backed hackers, likely from Russia, were behind the attack.
“Their objective was to show the world that they are there and can take action if they want to,” said Mr. Grout.