
Human Risk: Military Edition


When I discovered that one of my favourite podcasts, This American Life, had released an episode entitled Human Error In Volatile Situations, I thought it might be interesting. After all, it sounds remarkably like a story of Human Risk. Which it is. In an episode that is nothing short of astonishing, I learned more than I ever wanted to about risk in the US military.


The episode covers two separate issues. The first is how a very minor error caused an explosion that could’ve detonated a nuclear weapon. The second is sleep deprivation in the US Navy (caused in part by what might best be described as a macho culture, but equally by a lack of resources) and the impact it is having on safety.

What’s remarkable about the episode is that it reveals quite how fragile things are. Near misses, in an environment where the consequence of an actual miss would be immense, seem to be commonplace.


This is worrying, to say the least. One of the “givens” we all live our lives by is that someone, somewhere has taken responsibility for managing safety. It’s what allows us to nonchalantly get on planes, because we know that air traffic control exists, or buy food at the supermarket, because we know that food safety standards mean what we eat is fit for human consumption. In most cases we probably don’t know what the relevant authority is called or how it goes about its business. But we know that it exists. Or at least we work on the presumption that it does.


You’d think that the military would have even tighter controls around their risks: partly because the risks are bigger when you’re dealing with weaponry that can cause real damage, and partly because the knock-on consequences of an error are so serious. Firing off a nuclear missile by mistake would be bad enough. But the potential response, whether you’ve accidentally fired it at another country or simply set it off in your own, could be even more devastating. It just doesn’t bear thinking about.


It’s one of the reasons a lot of veterans choose risk management as a career and do very well at it: the military understands real risk, and it’s ingrained in their culture. What’s apparent from this podcast is that, at least in these two examples, they don’t always manage risk as well as we’d expect.


You can find the episode right here. If it’s piqued your interest in any way, then I’d also thoroughly recommend a book called On The Psychology Of Military Incompetence by Professor Norman Dixon. First published in 1976, it remains virtually compulsory reading at many Military Staff Colleges, a testament to its continuing relevance. Though I’m not sure whether those responsible for the situations so vividly depicted in the podcast have had the benefit of reading it.


In his book, Dixon covers similar ground, focussing on famous events from history. Here’s the summary from the book’s cover:


“The Crimea, the Boer War, the Somme, Tobruk, Singapore, Pearl Harbour, Arnhem, the Bay of Pigs: just some of the milestones in a century of military incompetence, of costly mishaps and tragic blunders.


Are such blunders simple accidents – as the ‘bloody fool’ theory has it – or are they an inexorable result of the requirements of the military system?


In this superb and controversial book, Professor Dixon examines these and other mistakes and relates them to the social psychology of military organization and to the personalities of some eminent military commanders. His conclusions are both startling and disturbing.”


Both the book and the podcast are highly engaging and thoroughly recommended. Perhaps unsurprisingly, given the nature of what the armed forces do and the pressures they’re under, their experiences are right up there as Human Risk stories go.
