Opening the Hood
On the gap between how things look and how things work.
The Premise
Every system has two layers: the dashboard and the engine. The dashboard is what you see — the signals, the confident presentation, the consensus, the status indicators, the people in charge nodding. The engine is what actually runs — the mechanism, the failure modes, the dependencies, the thing that will either hold or break under real conditions.
Most of the time, reading the dashboard is good enough. It is cognitively efficient. Evolution shaped us to use social signals — authority, consensus, confidence — as proxies for truth, because individually verifying every mechanism we depend on would be paralyzing. The shortcut works. Until it doesn't.
The pattern that follows — across centuries, across domains, across every level of human organization — is almost always the same: the mechanistic information existed, someone had it, and a combination of authority gradient, schedule pressure, financial incentive, and the social cost of speaking up ensured it did not reach the decision with enough weight to matter.
Opening the hood is not contrarianism. It is asking: do I understand how this actually works, or do I understand how it is being presented to me? Those are different questions. The answer changes what you do next.
The Same Idea, at Different Altitudes
Ordered from simplest to most complex. Same observation throughout.
Age 4
Sometimes things look okay but aren't. A toy can look fine but have a broken part inside. It's always okay to ask "but does it work?" Even grownups forget to check.
Age 7 (or so)
Sometimes things look fine right up until they aren't. The check engine light doesn't come on until something's already wrong. The people in charge sound confident right up until the thing fails. Nobody's lying — they just never looked inside. Most of us never do. We trust how things look and how people sound, and usually that's good enough. But sometimes it isn't, and you only find out after. It's worth occasionally asking: does anyone here actually know how this works? Not to cause trouble. Just to know.
Adult (general)
Most of us learned to trust confidence, credentials, and consensus — and honestly, that works more often than not. But those are signals about a thing, not the thing itself. A car dashboard can show green while the engine overheats. A room full of nodding experts can be wrong in the same direction. At some point it's worth asking: do I understand how this actually works, or do I understand how it's being presented to me? Those are different questions. The answer changes what you do next.
Professional / analytical
The dashboard can look perfect while the engine quietly fails. Most of us were never taught to open the hood — we learned to read confidence, authority, and consensus as proxies for truth, because most of the time, in low-stakes situations, that works fine. It's not laziness or stupidity; it's an efficient heuristic running exactly as evolved. The trouble comes when the stakes rise and the heuristic doesn't notice. Some people feel an almost physical discomfort when a decision rests on signals rather than mechanism — when nobody in the room can describe how the thing actually works, how it fails, or who fixes it at 2 a.m. That discomfort is information. Not everyone feels it, and that's worth sitting with for a moment: not as a judgment, but as a question. When did you last look under the hood of something you were certain about?
High epistemic altitude
We are prediction machines that learned to outsource verification to social consensus, and mostly it serves us well — the cognitive load of mechanistic scrutiny applied universally would be paralyzing. But every epistemic shortcut has a failure mode, and the failure mode of signal-trust is systematic: it scales with stakes, it compounds across hierarchies, and it is invisible to the people inside it because the signals feel indistinguishable from knowledge. The discomfort some feel in the presence of unexamined certainty is not arrogance or anxiety — it is a calibrated response to genuine risk. The brain noticing that the map is being mistaken for the territory. What makes this hard is not intelligence; plenty of very smart people are exquisite signal-readers who never develop the habit of asking what the signal is actually pointing at. What makes it hard is that looking under the hood is socially costly, often unwelcome, and requires tolerating the ambiguity of not-yet-knowing in rooms that reward the performance of certainty. The people who do it anyway are not contrarians. They're just more afraid of being wrong than of being uncomfortable.
The Record
What follows is a list of events where mechanistic knowledge existed, was accessible, and was overridden by signal. In almost every case, the outcome was preventable. Links point to primary sources, official investigations, and Wikipedia summaries where available. Controversies in sources are noted where known.
— The Vasa
The Swedish warship Vasa sank twenty minutes into its maiden voyage in Stockholm harbor. A stability test conducted before launch — men running side to side across the deck — made the ship rock so badly that the test was stopped. King Gustav II Adolf had ordered the ship built with an extra gun deck against the advice of the original designer. Nobody with authority wanted to be the one to say it was not seaworthy. The hood had been opened; acting on what it showed was the problem.
— Tacoma Narrows Bridge
The original Tacoma Narrows Bridge in Washington State collapsed four months after opening, oscillating itself apart in a 42 mph wind. Concerns about the bridge's slender, aesthetically elegant design had been raised before construction. The signal — it looked beautiful and modern — won over structural caution. The failure mode, aeroelastic flutter (popularly, if imprecisely, described as resonance), was in hindsight within reach of the aerodynamics known at the time. Film footage of the collapse is among the most-watched engineering failure documentation in history.
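A toy illustration of the idea, under loudly stated assumptions: treat the deck as a single oscillator whose effective damping falls as wind speed rises. The coefficients below are invented for illustration, not a model of the actual bridge, but they show the qualitative behavior: below a critical wind speed a disturbance dies out; above it, the same disturbance grows with no periodic forcing at all.

```python
# Toy one-degree-of-freedom oscillator: an aerodynamic term reduces effective
# damping as wind speed rises. All coefficients are illustrative only.
def peak_amplitude(wind_speed: float, steps: int = 60000, dt: float = 0.001) -> float:
    k, m = 40.0, 1.0                     # stiffness and mass, arbitrary units
    structural_damping = 0.30
    aero_gain = 0.02                     # wind-driven input per unit wind speed
    c_eff = structural_damping - aero_gain * wind_speed
    x, v = 0.01, 0.0                     # small initial disturbance
    peak = abs(x)
    for _ in range(steps):
        a = (-k * x - c_eff * v) / m     # restoring force plus effective damping
        v += a * dt                      # semi-implicit Euler integration
        x += v * dt
        peak = max(peak, abs(x))
    return peak

# Below the critical speed (15 in these units) the disturbance stays small;
# above it, amplitude grows on its own.
for wind in (5, 15, 25):
    print(f"wind {wind}: peak amplitude {peak_amplitude(wind):.3f}")
```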
— Milgram Obedience Experiments
Not a disaster, but a controlled laboratory demonstration of the mechanism itself. Stanley Milgram at Yale showed that ordinary people would administer what they believed were lethal electric shocks to another person, simply because an authority figure in a lab coat said to continue. The signal — authority, procedure, institutional context — overrode the mechanism — a person apparently in severe pain and distress. Approximately 65% of participants went to the maximum voltage level. The experiment has been replicated in multiple countries with broadly similar results.
— Thalidomide
Thalidomide was marketed in Europe and elsewhere as a safe sedative and treatment for morning sickness. Frances Kelsey at the US FDA refused to approve it for the American market because she found the safety data inadequate — particularly regarding peripheral neurotoxicity and the absence of reproductive safety data. Every signal said approve: European approval, manufacturer confidence, professional consensus, market success. She read the mechanism. The US was largely spared. An estimated 10,000 children were born with severe limb malformations in countries where it was approved.
— McDonnell Douglas DC-10 Cargo Door
In 1972, a DC-10 cargo door failed in flight over Windsor, Ontario. The aircraft landed safely. An internal memo — the "Applegate memorandum" — written by a Convair engineer described the exact failure mode in detail and predicted it would eventually cause a crash. Instead of a mandatory FAA Airworthiness Directive, the fix was handled through an informal agreement between the FAA and McDonnell Douglas as service bulletins, making compliance effectively voluntary. In 1974, a Turkish Airlines DC-10 suffered the identical failure shortly after takeoff from Paris. All 346 people aboard died. It remains one of the deadliest aviation accidents in history.
— Ford Pinto
Ford's internal cost-benefit analysis — later made public in litigation — explicitly weighed the cost of fixing a fuel tank design flaw against the projected cost of lawsuits from deaths and injuries. The fix was estimated at $11 per vehicle. The analysis concluded it was cheaper not to fix it. (Later scholarship disputes how directly that memo figured in Pinto-specific design decisions; it was prepared in the context of federal rulemaking.) This is the rare case where the hood was opened, the mechanism was understood, and the decision was made to close it again on financial grounds. Multiple people died in rear-impact fires.
— Therac-25
The Therac-25 was a radiation therapy machine that killed at least three people and seriously injured others through massive radiation overdoses. The failure was a software race condition — a bug that only manifested under specific rapid-input sequences. Previous Therac models had hardware safety interlocks; confidence in the new software-only design meant those interlocks were removed. When patients and operators reported problems, the manufacturer's initial response was that the machine was safe. The mechanism — a timing-dependent software defect — was invisible behind the signal of institutional confidence and prior successful operation.
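A minimal sketch of the class of defect involved, assuming a toy console with one slow hardware-setup step. The names and structure are invented; this is not a reconstruction of the actual Therac-25 software, only the shape of a check-then-act timing window, reproduced deterministically rather than left to chance.

```python
# Hypothetical sketch: an edit that lands while hardware setup is in flight is
# never re-validated, so the software's view and the hardware's state diverge.
import threading
import time

class Console:
    def __init__(self):
        self.mode = "XRAY"                   # X-ray mode requires the attenuating target
        self.configured_mode = None          # what the hardware actually set up for
        self.setup_started = threading.Event()
        self.setup_done = False

    def hardware_setup(self):
        snapshot = self.mode                 # configuration read once, at the start
        self.setup_started.set()
        time.sleep(0.5)                      # hardware takes time to move into position
        self.configured_mode = snapshot      # reflects the pre-edit value
        self.setup_done = True

    def operator_edit(self, new_mode: str):
        self.mode = new_mode                 # a fast edit during the setup window

    def fire(self):
        if self.setup_done and self.configured_mode != self.mode:
            print(f"HAZARD: firing in {self.mode} mode, hardware set up for {self.configured_mode}")
        else:
            print("configuration consistent")

console = Console()
setup = threading.Thread(target=console.hardware_setup)
setup.start()
console.setup_started.wait()                 # setup is now in flight
console.operator_edit("ELECTRON")            # edit arrives inside the setup window
setup.join()
console.fire()                               # nothing re-checks the edited value
```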
— Space Shuttle Challenger
Engineers at Morton Thiokol — the manufacturer of the solid rocket boosters — explicitly and formally recommended against launch. The O-ring seals had a known temperature sensitivity, and the overnight temperature at Cape Canaveral had been below freezing. NASA management overruled the engineers. The signal — schedule pressure, prior successful launches, institutional momentum — overrode the mechanism. All seven crew members died 73 seconds after launch. Richard Feynman's appendix to the Rogers Commission report is essential reading: he found NASA's internal risk estimates for engine failure were orders of magnitude more optimistic than engineers' own private estimates, and demonstrated the O-ring failure simply by dropping a piece of the material into a glass of ice water at a press conference.
— Chernobyl
The explosion at Reactor No. 4 of the Chernobyl Nuclear Power Plant occurred during a safety test — specifically a test of whether the spinning-down turbine could power the cooling pumps during the gap before backup diesel generators came online. The test was run by operators who did not fully understand the reactor's physics, particularly a design characteristic called a positive void coefficient that made the reactor unstable at low power. Soviet institutional culture made stopping or questioning the test politically unacceptable. The signals — procedure, schedule, authority — overrode the mechanism. The long-term death toll from radiation exposure remains contested, ranging from tens to potentially hundreds of thousands depending on methodology.
— Bhopal
The Union Carbide pesticide plant in Bhopal, India, released approximately 40 tonnes of methyl isocyanate gas into a densely populated area. Safety systems at the plant had been documented as failing for months. Cost-cutting had disabled or degraded multiple backup systems. Local management knew. The signal — profitable operation, schedule, cost reduction — drowned out the mechanism. Immediate death toll estimates range from 3,800 to over 15,000. Long-term health impacts continue to affect the surrounding population.
— Barings Bank
Nick Leeson was simultaneously executing trades and reporting on them from Barings' Singapore office — a fundamental separation-of-duties failure. He was also generating implausibly consistent profits that should have raised immediate questions about how. Nobody in London's management asked mechanistic questions. The dashboard — growing profits, a star trader — was too comfortable to interrogate. Barings lost £827 million, more than its entire capital, and collapsed in 1995. It was Britain's oldest merchant bank, established 1762.
— Three Mile Island
The partial meltdown at Three Mile Island Unit 2 in Pennsylvania began with a stuck valve that operators believed — based on an indicator light — was closed. The light indicated the valve had been commanded to close, not that it had actually closed. Operators received contradictory instrument readings and a warning light was physically obscured by a maintenance tag. They interpreted the situation through their existing mental model rather than the actual state of the system. The map overrode the territory for hours until the situation became undeniable.
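The design flaw is small enough to state in code. A hypothetical sketch, with invented names, of an indicator wired to the command rather than to the valve itself:

```python
# Hypothetical sketch of a command-wired indicator versus a sensor-wired one.
from dataclasses import dataclass

@dataclass
class ReliefValve:
    commanded_closed: bool = False
    actually_closed: bool = False            # what a position sensor would report

    def command_close(self):
        self.commanded_closed = True
        # In the failure scenario the valve sticks open: actually_closed never changes.

def panel_light_from_command(valve: ReliefValve) -> str:
    # Dashboard wired to the command signal.
    return "CLOSED" if valve.commanded_closed else "OPEN"

def panel_light_from_sensor(valve: ReliefValve) -> str:
    # Dashboard wired to a position sensor on the valve itself.
    return "CLOSED" if valve.actually_closed else "OPEN"

valve = ReliefValve()
valve.command_close()
print(panel_light_from_command(valve))       # CLOSED, what the operators saw
print(panel_light_from_sensor(valve))        # OPEN, what was actually true
```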
— Enron
Bethany McLean at Fortune published "Is Enron Overpriced?" in March 2001, asking a simple mechanistic question: how exactly does this company make its money? She could not get a clear answer from management. Most analysts continued to rate Enron a strong buy. The signals — charismatic leadership, soaring stock price, analyst consensus, Fortune's "Most Innovative Company" award six consecutive years — were overwhelming. Enron filed for bankruptcy in December 2001. Thousands of employees lost their retirement savings. It was, at the time, the largest corporate bankruptcy in US history.
— Space Shuttle Columbia
A piece of foam insulation struck Columbia's left wing during launch on 16 January 2003. Engineers at NASA requested satellite imaging to assess the potential damage. Management declined, concluding — partly based on the fact that previous foam strikes had caused no fatal damage — that it was probably fine. On re-entry, hot gases penetrated the damaged wing. All seven crew members died. The Columbia Accident Investigation Board was explicit: NASA's organizational culture, which discouraged bad news traveling upward, was as much a cause of the accident as the physical damage to the heat shield.
— Grenfell Tower
Residents of Grenfell Tower in London had been raising fire safety concerns for years through their tenant association blog, the Grenfell Action Group. The cladding installed during a recent renovation — aluminium composite panels with a polyethylene core — was known within the construction industry to present fire risk. Internal manufacturer documents acknowledged this. Seventy-two people died. Every warning had been read as complaint rather than as mechanistic signal. The subsequent public inquiry ran for five years.
— Boeing 737 MAX
The 737 MAX's engines were repositioned forward and upward to accommodate their larger diameter, changing the aircraft's handling characteristics. Boeing introduced MCAS — Maneuvering Characteristics Augmentation System — to compensate, but did not fully disclose its existence to pilots or regulators during certification. MCAS relied on a single angle-of-attack sensor with no redundancy, and could override pilot inputs repeatedly. Lion Air Flight 610 crashed in October 2018, killing 189. Ethiopian Airlines Flight 302 crashed in March 2019, killing 157. The signal: certified by the FAA, trusted manufacturer, familiar aircraft type. The mechanism: a control system whose failure mode had not been adequately disclosed to anyone flying the aircraft.
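A schematic sketch of the single-sensor fragility, with invented thresholds, names, and voting rule rather than Boeing's actual MCAS logic:

```python
# Hypothetical sketch: acting on one sensor versus cross-checking two.
def single_sensor_trim(aoa: float, stall_threshold: float = 15.0) -> bool:
    # Trusts one angle-of-attack vane: a failed vane reading 25 degrees
    # commands nose-down trim even in normal flight.
    return aoa > stall_threshold

def cross_checked_trim(aoa_left: float, aoa_right: float,
                       stall_threshold: float = 15.0,
                       max_disagreement: float = 5.0) -> bool:
    # Refuses to act automatically when the two sensors disagree.
    if abs(aoa_left - aoa_right) > max_disagreement:
        return False                          # disagree: alert the crew, take no action
    return min(aoa_left, aoa_right) > stall_threshold

# A failed left vane reports 25 degrees while the aircraft is actually at 3:
print(single_sensor_trim(25.0))               # True, repeated nose-down commands
print(cross_checked_trim(25.0, 3.0))          # False, no automatic trim
```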
— Wirecard
Wirecard was a German fintech company listed on the DAX index, valued at over €24 billion at its peak. In June 2020 it was revealed that €1.9 billion supposedly held in escrow accounts in the Philippines did not exist. Auditors at EY had accepted confirmations from a third-party trustee without independently verifying the accounts. The Financial Times, specifically reporter Dan McCrum, had spent years publishing investigative stories about accounting irregularities. Short sellers who looked at the actual cash flows were investigated by German regulators and accused of market manipulation. The signal — DAX-listed, government-endorsed, growing, officially audited — was so strong it was used as a shield against the people reading the mechanism.
— Fukushima Daiichi
The tsunami risk to the Fukushima Daiichi plant had been assessed and raised internally. The seawall height was based on historical records that excluded the most extreme events in the geological record. Prior survivals — previous smaller earthquakes and tsunamis — had created confidence. When the 2011 Tōhoku earthquake generated a tsunami far exceeding design parameters, the seawall was overtopped, backup diesel generators flooded, and three reactor cores melted down. The knowledge that the design basis tsunami was too conservative had been present in the system. It did not change the decision.
— Deepwater Horizon
In the hours before the blowout, a negative pressure test — designed to confirm well integrity — produced anomalous results. Workers on the rig interpreted the anomalous readings as acceptable, under conditions of significant schedule pressure (the rig was 43 days behind and costing $533,000 per day). The mechanism — a compromised well casing — was trying to communicate. Eleven people died. The resulting oil spill — the largest accidental marine oil spill in history — released approximately 4.9 million barrels of oil into the Gulf of Mexico.
— Global Financial Crisis
Credit rating agencies assigned AAA ratings to mortgage-backed securities whose underlying default correlations had not been correctly modeled. The models assumed housing prices would not fall nationally and simultaneously — an assumption that was a historical artifact rather than a physical law. Investors who looked at the actual constituent mortgages — Michael Burry, Steve Eisman, and a small number of others — saw the mechanism clearly and shorted the market. The signal — AAA rating, trusted institutions, the longest housing bull market in US history — was so overwhelming that the people reading the mechanism were treated as eccentrics until they were proven right. The resulting crisis cost an estimated $22 trillion in lost economic output and household wealth in the United States alone.
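A toy simulation of the assumption at issue, with illustrative parameters rather than anything calibrated to real portfolios. The point is narrow: when defaults share a common factor, the average loss of a pool barely moves, but its tail fattens dramatically, which is exactly where the senior tranches live.

```python
# Toy one-factor default model (Gaussian-copula style). Illustrative only.
import random
import statistics

N_LOANS, N_TRIALS = 500, 2000
DEFAULT_CUTOFF = -1.645                       # ~5% unconditional default probability

def pool_loss(factor_weight: float) -> float:
    market = random.gauss(0, 1)               # shared "national housing" shock
    defaults = 0
    for _ in range(N_LOANS):
        own = random.gauss(0, 1)              # loan-specific shock
        latent = factor_weight * market + (1 - factor_weight ** 2) ** 0.5 * own
        if latent < DEFAULT_CUTOFF:
            defaults += 1
    return defaults / N_LOANS

for w in (0.0, 0.6):                          # independent defaults vs. correlated
    losses = sorted(pool_loss(w) for _ in range(N_TRIALS))
    print(f"factor weight {w}: mean loss {statistics.mean(losses):.3f}, "
          f"99th percentile {losses[int(0.99 * N_TRIALS)]:.3f}")
```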
— PG&E California Wildfires
Pacific Gas and Electric's internal inspections had flagged aging transmission infrastructure across northern California for years. The financial signal — deferring maintenance is cheaper than performing it — consistently won. PG&E equipment was responsible for igniting multiple catastrophic wildfires, including the 2018 Camp Fire that destroyed the town of Paradise and killed 85 people. PG&E filed for bankruptcy in January 2019, the first utility in US history to do so citing wildfire liability. The mechanism — aging equipment in a drying climate — had been visible in inspection reports for years.
— Nokia
Internal research at Nokia showed that touchscreen smartphones were the future and that Symbian, Nokia's operating system, was falling critically behind. Middle management filtered bad news upward because the organizational culture punished the messenger. Leadership made strategic decisions based on a dashboard curated by fear. Nokia went from controlling approximately 40% of the global mobile phone market in 2007 to selling its handset division to Microsoft in 2013 for €5.44 billion — a fraction of its former value. A peer-reviewed study based on 76 interviews with Nokia managers documented the psychological safety failure explicitly.
— Knight Capital
Knight Capital Group deployed new trading software on the morning of 1 August 2012. A misconfiguration left a legacy system — containing an obsolete order-routing function — running alongside the new code. Nobody had a complete map of what the deployed production system actually contained. Within 45 minutes, Knight had executed millions of unintended trades and lost $440 million — more than the cash it had on hand. The company survived only through an emergency capital infusion arranged within days, which handed control to outside investors; it was absorbed by a rival the following year. Structural knowledge of the system had not kept pace with the signal that the software was working and ready.
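A hypothetical sketch of the flag-reuse hazard, with invented names and a toy order format rather than Knight's actual system. The same message means two different things to two versions of the code, and the fleet behaves as if it were one system when it is not:

```python
# Hypothetical sketch: a repurposed flag interpreted differently by old and new code.
def route_order_new(order: dict) -> str:
    # Current code: the repurposed flag selects the new routing behavior.
    if order.get("reused_flag"):
        return "route via new liquidity program"
    return "route normally"

def route_order_stale(order: dict) -> str:
    # Old code still deployed on one server: the same flag activates an
    # obsolete test routine that keeps generating child orders.
    if order.get("reused_flag"):
        return "ACTIVATE obsolete order generator (never stops on fills)"
    return "route normally"

order = {"symbol": "XYZ", "reused_flag": True}
fleet = [route_order_new] * 7 + [route_order_stale]   # one server missed the deploy
for handler in fleet:
    print(handler(order))
```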
— Global Warming
This is the largest and most well-documented case of this pattern in human history, and it has layers the others do not.
The mechanism has been understood since Eunice Newton Foote described the heat-trapping properties of CO₂ in 1856. Svante Arrhenius calculated a rough climate sensitivity number in 1896 that is not embarrassingly wrong by modern standards. Exxon's own internal research in the late 1970s and early 1980s correctly modeled what was coming — and then the company spent decades and hundreds of millions of dollars manufacturing signal noise to prevent that finding from competing with the dashboard.
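The core calculation is small enough to repeat on the back of an envelope. The sketch below uses the standard modern logarithmic approximation for CO₂ forcing and an assumed, illustrative mid-range sensitivity; the numbers are indicative, not a prediction:

```python
# Back-of-envelope forcing and warming. The 5.35 * ln(C/C0) W/m^2 approximation
# is the standard modern one; the sensitivity value is an illustrative assumption.
import math

def forcing(co2_ppm: float, baseline_ppm: float = 280.0) -> float:
    return 5.35 * math.log(co2_ppm / baseline_ppm)    # watts per square metre

SENSITIVITY = 0.8   # assumed degrees C of equilibrium warming per W/m^2 (illustrative)

for ppm in (280, 420, 560):
    f = forcing(ppm)
    print(f"{ppm} ppm CO2: forcing {f:.2f} W/m^2, "
          f"equilibrium warming ~{f * SENSITIVITY:.1f} C")
```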
That is what distinguishes this from most other entries on this list. In Challenger, the signals organically drowned out the mechanism through authority gradient and schedule pressure. In global warming, the signal pollution was industrially manufactured and deliberately targeted at the epistemic infrastructure itself — funding think tanks, seeding doubt in journalism, lobbying scientific institutions, and later, optimizing social media engagement around polarization. The hood was not just ignored. A well-funded industry built a second, fake hood and pointed at that instead.
We now have over 150 years of mechanistic data — temperature records, ice cores, sea level measurements, ocean acidification, permafrost methane release, coral bleaching — all confirming the original model, while the political dashboard in many countries still reads "disputed" or "too expensive to fix." The engine and the dashboard have almost completely decoupled.
What makes it the hardest case is the timescale. Every other entry on this list had a moment of obvious, concentrated failure. Global warming's feedback loops operate on decades to centuries, which means the people generating the signals bear almost none of the cost of being wrong, and the people who will bear the cost are not yet born and have no vote.
The hood has been open for 170 years. We keep looking at the dashboard.
— Commercial Real Estate / Regional Banks
Unresolved — pattern recognition, not prediction
Office vacancy rates in major US cities are at historic highs following the shift to hybrid and remote work. Regional bank loan books carry heavy exposure to commercial real estate. Refinancing walls — large volumes of loans coming due simultaneously — are approaching. The signals look manageable: no systemic panic yet, loan modifications extend and defer recognition, accounting rules do not force marking loans to market. The engine — actual occupancy rates, actual cash flows, actual collateral values — tells a more complicated story. Multiple economists have noted the structural similarity to the 2006–2007 period before the residential mortgage crisis became visible.
— AI Capability and Safety Evaluation
Unresolved — pattern recognition, not prediction
We are deploying AI systems whose internal mechanisms are largely opaque. We evaluate them primarily with benchmarks that the training process may have optimized toward, and we make safety claims based on behavioral signals rather than mechanistic understanding. The people building these systems will tell you openly that they do not fully understand why they work. The dashboard — benchmark scores, impressive demonstrations, no catastrophe yet — looks fine. Nobody can fully open this hood yet, which is itself the most important finding. We are, as an industry, making the classic error: treating absence of observed failure as evidence of safety.
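The statistics of that error can be made explicit. The sketch below uses the "rule of three", which assumes independent trials, an assumption real deployments rarely satisfy, so if anything the bound is optimistic:

```python
# Rule of three: zero failures in n independent trials still leaves a 95%
# upper bound of roughly 3/n on the per-trial failure probability.
def failure_rate_upper_bound(n_clean_trials: int) -> float:
    return 3.0 / n_clean_trials

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} clean trials: failure rate could still be "
          f"~{failure_rate_upper_bound(n):.2g} per trial")
```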
— Antibiotic Resistance
Unresolved — slow-moving, well-documented
The signal is that antibiotics work, because they still mostly do. The mechanism is a global commons problem: agricultural overuse, incomplete treatment courses, lack of financial incentive for pharmaceutical companies to develop new antibiotics (you cure the patient; you do not create a lifetime customer), and inadequate global surveillance of resistance patterns. The WHO has been describing this as a slow-motion catastrophe for decades. Resistance to last-resort antibiotics is rising. The dashboard has not changed for most people yet because the problem is geographically and economically uneven — it is worse where antibiotics are cheap and unregulated.
— Public Pension Fund Solvency
Unresolved — actuarial arithmetic
Actuarial assumptions baked into public pension funds across the United States, United Kingdom, Japan, and most of Europe assume investment returns that require either significant market outperformance or accounting adjustments that defer recognition of shortfalls. The signal is that pensions exist and people are receiving them. The mechanism is a demographic and return-assumption problem that is arithmetic, not speculative: fewer workers supporting more retirees, for longer, with return assumptions set during a period of falling interest rates that no longer obtains. The people who will bear the cost are predominantly not yet retired.
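A stylized version of the arithmetic, using hypothetical round numbers rather than any particular fund's figures:

```python
# Stylized funding calculation: assets needed today for a fixed future promise
# are acutely sensitive to the assumed rate of return. Numbers are illustrative.
def assets_needed_today(promised_benefit: float, assumed_return: float,
                        years: int = 30) -> float:
    # Present value of a single fixed payment due `years` from now.
    return promised_benefit / (1 + assumed_return) ** years

PROMISE = 1_000_000_000     # benefits owed 30 years from now
for r in (0.075, 0.06, 0.045):
    print(f"assumed return {r:.1%}: assets needed today "
          f"${assets_needed_today(PROMISE, r):,.0f}")
# Moving the assumption from 7.5% to 4.5% more than doubles the assets
# required today for the same promised benefit.
```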
— Microplastics
Unresolved — mechanism poorly understood at relevant timescales
Microplastics have now been found in human blood, placentas, lung tissue, and cardiac tissue. The signal is that plastics have been "safe" for decades — we built the modern world with them. The mechanism — what chronic low-level exposure to endocrine-disrupting compounds and micro-scale particulates does across a human lifetime, at a population scale — is almost entirely unstudied at the relevant timescales. We ran the experiment before we understood it. We are still running it. The results will not be fully visible for decades.
— Social Media and Adolescent Mental Health
Unresolved — causation debated, correlation strong
The dashboard signal for social media platforms is engagement, growth, and user satisfaction metrics. The mechanism — what algorithmically-optimized social comparison and variable reward schedules do to developing brains — is increasingly visible in population-level mental health data, particularly for adolescent girls. Rates of anxiety, depression, and self-harm in this demographic rose sharply and in synchrony with smartphone adoption across multiple countries. The platforms have internal research — some of it leaked — showing awareness of the harm. This has not meaningfully changed platform architecture. The tobacco parallel is not subtle: a product whose harm is visible at the population level, whose mechanism is understood internally, and whose financial model depends on continued use.
The Pattern
The through-line across all of these: the information to prevent the outcome usually existed. The mechanism was knowable. What failed was the social and institutional infrastructure for that knowledge to reach decision-making with enough weight to compete with the signals.
That is not primarily a technical problem. It is a status, incentive, and psychological safety problem — which is why it keeps happening regardless of how sophisticated the domain becomes.
Opening the hood is not about being smarter than the room. It is about being more afraid of being wrong than of being uncomfortable.