On Refusing
On the surface, engineering catastrophes look like technical failures. In reality, a single mistake rarely produces catastrophe on its own; institutional conditions often determine whether it escalates into disaster.
One of the quintessential case studies in engineering ethics is the Challenger disaster. Flight data from earlier shuttle missions had shown O-ring erosion and blow-by (hot gas leaking past a damaged seal), but the risk came to be treated as acceptable. For the January 1986 launch, the forecast called for temperatures far colder than on any previous flight, and engineers at the booster contractor argued they had no evidence the O-rings would seal reliably in that cold. During a teleconference the night before launch, a senior manager told the contractor’s vice president of engineering to “take off his engineering hat and put on his management hat,” and the recommendation to delay was reversed after internal discussion. Seventy-three seconds after liftoff, the vehicle broke apart, killing all seven crew members.
The Chernobyl accident is another case study. The Soviet nuclear program enforced strict hierarchical deference and party loyalty. Under schedule pressure, operators ran a long-delayed safety test late at night. When the test kept tripping automatic safety systems, the operators bypassed those protections and ran the reactor outside its established limits. The RBMK design had known instability at low power; institutional secrecy kept the nature of these flaws from the people operating it. Violations of safety procedure went unchallenged, in part because challenging the Party carried severe personal and professional risk. Thirty people died in the immediate aftermath; estimates attribute thousands of excess cancer deaths to the accident, and hundreds of thousands of people were forced from their homes.
In more recent history, the Boeing 737 MAX illustrates these dynamics in software. The MAX fitted larger engines onto the existing 737 airframe, preserving the design to reduce cost and speed regulatory approval. The new engines altered the aircraft’s handling, introducing a tendency for the nose to pitch up near stall. Boeing compensated with software: the Maneuvering Characteristics Augmentation System (MCAS), which automatically trimmed the nose down. MCAS relied on a single angle-of-attack sensor, could repeatedly push the nose down even after pilot correction, and was not clearly described in manuals or training. Under the FAA’s delegated certification system, Boeing employees were authorized to make certain compliance findings on behalf of the regulator. During development, MCAS was given greater authority over the aircraft’s controls, but its hazard classification was never updated to match, which lowered the required level of redundancy and scrutiny. In October 2018, a faulty sensor triggered MCAS on Lion Air Flight 610; the aircraft crashed, killing all 189 people aboard. While Boeing quietly began work on a software fix, its public messaging emphasized pilot and maintenance error. In March 2019, Ethiopian Airlines Flight 302 crashed under similar circumstances, killing 157 more people.
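To make the failure mode concrete, here is a minimal, hypothetical sketch in Python. Nothing in it is Boeing’s actual code; the function names, thresholds, and trim values are invented for illustration. It contrasts a controller that trusts a single angle-of-attack sensor with one that cross-checks redundant sensors and stands down when they disagree.

```python
# Hypothetical sketch, not Boeing's code: every name and threshold here is
# invented to illustrate the failure mode, not drawn from MCAS itself.

STALL_AOA_DEG = 15.0          # illustrative stall threshold
DISAGREEMENT_LIMIT_DEG = 5.0  # illustrative cross-check tolerance
NOSE_DOWN_TRIM = -1.0         # illustrative trim increment

def single_sensor_trim(aoa_deg: float) -> float:
    """Trusts one angle-of-attack reading: a stuck-high sensor
    produces a nose-down command on every activation."""
    return NOSE_DOWN_TRIM if aoa_deg > STALL_AOA_DEG else 0.0

def cross_checked_trim(aoa_readings: list[float]) -> float | None:
    """Compares redundant readings; on disagreement it returns None,
    meaning: disable automatic trim and alert the crew."""
    if max(aoa_readings) - min(aoa_readings) > DISAGREEMENT_LIMIT_DEG:
        return None  # sensors disagree: fail safe instead of acting
    median = sorted(aoa_readings)[len(aoa_readings) // 2]
    return NOSE_DOWN_TRIM if median > STALL_AOA_DEG else 0.0

# One stuck-high sensor alongside two healthy ones:
readings = [74.5, 6.0, 6.2]
print(single_sensor_trim(readings[0]))  # -1.0: erroneous nose-down command
print(cross_checked_trim(readings))     # None: automation stands down
```

The design point is the second function’s None branch: when its inputs conflict, a safety-critical system should surrender authority and alert a human rather than act on a reading it cannot trust.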
In complex systems, technical safeguards are necessary but not sufficient. Safety also depends on information flow and escalation: on whether engineers can raise risks and assert ethical constraints without fear of retaliation, and on whether institutions respect those boundaries or override them through pressure or force. As technology advances, so does the potential scale of engineering catastrophe. Objection, dissent, and sometimes refusal are ethical duties of engineers: duties that employers or governments may resist. The same holds at the organizational level: an engineering firm has an ethical duty to object, or to refuse, when a government makes demands that conflict with the firm’s safety and ethical commitments.
The engineer’s duty to dissent is critical to public safety, but it is a last line of defense and not a reliable one. When it works, we never hear about it; when it fails, we get explosions, headlines, and investigations. That is why institutional conditions matter: organizations must protect good-faith objection rather than punish it, and coercion to silence engineers should be treated as a red flag demanding scrutiny before catastrophe takes shape.
As engineers, we must decide in advance where we will draw the line, what we will do when asked to cross it, and what consequences we are willing to accept. This right is ours to exercise, this duty ours to uphold. We also need to defend that right for one another, even when we draw our ethical lines in different places.