What we can learn about Human Risk from my new favourite TV show

Eight Years Ago…

…I got on a plane from London to the Ukrainian capital of Kyiv for a city break with a difference. I was on my way to visit Chernobyl which, 25 years earlier, had been the scene of one of the world’s worst man-made disasters.

It remains one of the most memorable trips I’ve ever been on. I say that as someone who is naturally a bit of a Dark Tourist. I went because I thought it would be an incredible experience and because there were lots of questions I wanted answers to.

If you find that hard to understand, watch this 2-minute interview with comedian and travel writer Dom Joly in which he explains why some of us feel the need. Equally, if the idea intrigues you, then do read Dom’s book The Dark Tourist, which inspired me to visit a few other places.

I knew it would be a great chance to take some atmospheric photos of the reactor and Pripyat, the now abandoned town that housed the people who worked in the nuclear plant:

The experience didn’t disappoint. It was mind-blowing on so many levels. Typically when something is dangerous, there’s a clue; fire feels hot, ice feels cold and we can see animals or people that are about to attack us. Radiation, on the other hand, is lethal and entirely invisible.

This is brought home to you upon entering and leaving the 30km exclusion zone around Chernobyl, when you are screened for radioactivity. On the way in, there’s novelty value, and at that point, you should be safe. But once you’ve spent time in the zone, there’s a risk of being exposed to so much radiation that, as a carrier, you become a danger to the outside world. If that happens, you’re not going home.

It’s admittedly a small risk, and I’ve not heard of any daytrippers not making it out. I also still harbour sneaking suspicions that the machines used on visitors (see below) were probably not that accurate and perhaps calibrated above normal limits. Detaining Dark Tourists isn’t great for encouraging visitors. But that didn’t make the process any less nerve-wracking.

(c) Human Risk Limited

The visit answered a lot of my questions, but it also posed many more. Like other Dark Tourism sites I’ve visited, Chernobyl is impossible to forget. What happened there has continued to fascinate me since my visit, particularly given the 2011 Fukushima Daiichi accident in Japan. So I was intrigued to see a recent rush of books that take another look at what happened. All of which I recommend:

Then in May, Sky and HBO, both no strangers to extremely high-quality television, began showing a jointly-produced five-part mini-series called, you guessed it:

Poster for Chernobyl TV Series (c) Sky/HBO

If you’ve not yet got around to watching the show, then I strongly recommend you do. It’s some of the most uncomfortable, yet utterly compelling drama I’ve ever seen. I liked it so much that I’m currently working my way through it again. And I fully expect to watch it a few more times.

Don’t take my word for it. The show is already the highest-rated show of all time, beating blockbuster multi-season hits like Game of Thrones and Breaking Bad. For the uninitiated, here’s the official trailer:

(c) Sky/HBO

You may be wondering why my travel and TV viewing experiences are featuring in a blog about Human Risk. The answer is that the story of the Chernobyl accident (rather than my visit) is the single best example of Human Risk I’ve ever come across.

The books and the TV show have rekindled my fascination with the topic so much that I feel compelled to blog about it. Welcome, then, to the Chernobyl Edition of the Human Risk Blog.

A Case Study in Human Risk

I’m going to explore the topic through the lens of the TV show. In part, because it is so visually memorable, but also because the show has won praise for its authenticity.

Don’t worry; I won’t spoil the plot for those of you yet to watch. Though arguably, you broadly know how this one unfolds. My challenge here is that there is such a wealth of things I could cover. There’s probably an entire Chernobyl: Human Risk In Action book that I could write; something I might get round to after I’ve finished the one I’m currently writing, so no stealing my idea!

I’m going to pick a few scenes from the show to illustrate some of the Human Risk dynamics from the story. In doing so, I recognise that I’ve missed out hundreds of others that would be equally worthy of study.

There are two aspects of human behaviour that stand out from the Chernobyl story:

On the one hand, there is a world of coverups, lies, abdication of responsibility and wilful blindness. On the other, one of limitless bravery by people who knowingly put their lives on the line for the greater good. I’m personally very grateful to the latter; if they hadn’t done what they did, then it could have had major implications for the whole of Europe.

As the writers of the TV show have kindly released the scripts online, with permission, I can share dialogue in its original format. I’ve chosen scenes from the early episodes to avoid material spoilers.

Here then are four scenes that I think can teach us a few things about Human Risk:

I have a list of individuals we believe are accountable

The first scene I want to cover is when Boris Shcherbina, a senior politician who has been given responsibility for the Chernobyl cleanup, comes to visit the site and is met by Nikolai Fomin, the Chief Engineer and Viktor Bryukhanov, the Plant Manager. There aren’t many words, but there’s a lot hiding behind them:

What I find remarkable about this isn’t the servile attitude of the Plant staff towards the senior official. Nor am I surprised about the overly positive way in which they depict the situation to him. What Bryukhanov says is fundamentally dishonest: you don’t need to be overly familiar with the story to know that the progress being made on containing the damage was anything but “excellent“.

It’s the comment about having a list of accountable individuals that I find the most interesting. Of all the information they might choose to share with the man who is effectively their boss, it’s this that is uppermost in their minds. Instead of updating him with details of the cleanup operation, they’re pointing the finger of blame. We instinctively know what they’re doing here: framing the narrative and getting their stories in order to ensure they are not held accountable.

This isn’t just something that happens in incidents of this severity. In many organisations, one of the first questions people are asked, either orally or on the inevitable incident reporting form, is for the name of the responsible individual. Note “individual”, not “individuals”. All too often, there’s an understandable, yet unhelpful desire to pin everything on one single person. After all, it makes life much easier if there’s a simple explanation. Blame a single “bad apple”, rather than admit that there may be something wrong with the barrel they’re in.

In complex environments, there is a greater likelihood of incidents being caused by several factors. An individual might have pushed a proverbial or literal button and unleashed something. But that doesn’t mean they are necessarily entirely to blame. Our behaviours are heavily influenced by the environment in which we find ourselves and the people around us. Blaming an individual is an easy narrative. Sometimes it’s right. But more often it’s not that simple.

The problem with a focus on who is to blame is that it detracts from the ability to fix things. The expectation of a “blame game” is not only likely to deter people from coming forward and reporting problems, but it will impact the resolution and subsequent investigation of events. If the priority is establishing responsibility, then those potentially in the firing line are heavily incentivised to ensure they are out of it.

I’m not suggesting that individuals shouldn’t be held accountable for their actions. They absolutely should. But that is best done afterwards, in the clear light of day, not in the heat of the moment. It is hard to correctly attribute responsibility if you’ve already pre-determined where the blame lies without being in full possession of the facts.

I prefer my opinion to yours

The second scene I want to highlight features Deputy Secretary Garanin, a local politician, and Ulana Khomyuk, a nuclear physicist. While Garanin was a real person, Khomyuk is a fictional composite character who represents the contribution of the many scientists involved in the aftermath of the incident.

What you’re about to read is a conversation that pits Khomyuk’s scientific zeal to discover the truth against Garanin’s inadequacy and political desire to keep things secret. At this point in the story, Khomyuk, who is based in Minsk, has detected a spike in radiation levels and has worked out that the most likely cause is a leak from Chernobyl. Given the 400km distance between the two locations, it’s unlikely to be something small. When she phones Chernobyl to get some answers, no-one picks up the phone and her suspicions are confirmed. Unsurprisingly, the plot requires Khomyuk to head to the site herself, where she meets with Garanin and has this extremely confrontational discussion:

What stood out for me here is this line:

“I prefer my opinion to yours”

By describing her expert judgement as a mere “opinion“, Garanin can dismiss it as an alternative view which he doesn’t need to accept. It’s a false equivalence of the highest order, one we can see replicated nowadays in the denigration of subject-matter expertise and the dismissal of inconvenient truths as “Fake News”. What Garanin calls “spreading fear“ is what most of us would see as “telling the truth“.

Of course, we know why Garanin is showing so much disdain to the far more qualified Khomyuk. He’s clearly out of his depth when it comes to the situation he’s been charged with managing. His experience in a shoe factory makes him woefully under-qualified to manage a nuclear plant, let alone deal with a disaster.

However, he does display precisely the skills that we can presume got him to that position: unquestioning loyalty to the state and Wilful Blindness towards what he knows is an extremely dire situation. There’s a real contrast between Khomyuk’s language (“I know“) and Garanin’s (“I have been assured there is no problem“). One of them knows what they are talking about; the other has to rely on third parties to even have an opinion.

Ultimately Garanin has to resort to pulling rank:

“Yes. I worked in a shoe factory. And now I’m in charge“.

By any reasonable standards, his authority has been totally and utterly undermined. In this world, however, he outranks Khomyuk whatever her expertise, and he resorts to ending the meeting by quoting Karl Marx and downing a vodka. He’s ended the conversation because he is incapable of continuing it, without further compromising his position.

Appointing unqualified people to positions of responsibility is an excellent way to induce Human Risk. Sadly it happens more often than we’d like to admit. Even more prevalent is the “nothing to see here” approach adopted by those who feel the risk of facing reality is not worth it. That can apply to all levels of an organisation: whether it is senior people choosing to remain aloof and unaware of what is happening, or junior people who feel unable to raise issues and remain silent.

We all know Garanins who say nothing and discourage others from raising issues. Although in this case, Khomyuk is obviously the more qualified of the two, in reality, the risk is that the people raising concerns might appear to be less qualified than the person they’re raising them with. Traditional logic, after all, dictates that seniority confers wisdom due to experience. It’s evident that a nuclear scientist knows more about nuclear science than a shoe factory worker.

But what about social trends? The CEO who has been in their industry for decades might actually be less equipped to deal with contemporary social and technological challenges than the graduate they’ve just hired. In a fast-moving world, it is more essential than ever to hear alternative views; yesterday’s certainties can become tomorrow’s uncertainties. The words of Andy Stanley strike me as being particularly pertinent:

“Leaders who refuse to listen will eventually be surrounded by people who have nothing significant to say“

This doesn’t just mean listening to the echo chamber of the inner circle, whose views may well, intentionally or unintentionally, simply reflect those of the leader. To be successful, 21st-century risk management requires a diversity of data and inputs; including listening to ideas from people we might not traditionally have turned to.

Garanin’s rejection of expert opinion also reflects a dangerous societal trend. We often individually ignore big issues like climate change and equality because solving them seems beyond us. Ignorance, as they say, is bliss…

Leave Matters of State to the State

The third scene I want to cover is a packed meeting of officials taking place in the first few hours after the explosion. Emotions run high, and there’s a palpable sense of panic. The proverbial Elephant in the room is that things aren’t under control at all and they really should evacuate the local population who are in real danger.

The scene features Bryukhanov, whom we’ve already met: the Chernobyl plant manager who referenced the list of those responsible for the disaster. Also present is Anatoly Dyatlov, the Deputy Chief Engineer. The mood in the room risks spilling into anarchy. Which is when Zharkov, a Pripyat Executive Committee Member, jumps in to politicise the discussion:

The first thing to notice is the justifications used by the two plant officials. Bryukhanov says that it can’t be that dangerous because his wife is in Pripyat. The implication is that he wouldn’t dream of placing her in harm’s way. But of course, we know that there’s probably no way he could have got her out even if he wanted to. He’s deluding himself. As his colleague makes abundantly clear in NSFW colourful language.

Dyatlov adopts a different approach, citing the Cherenkov Effect, a scientific phenomenon that he uses to explain why things might look terrible but actually aren’t. We know he’s clutching at straws, but the beauty of this act of self-delusion is that he’s deploying his subject matter expertise. That makes it much harder for people to argue with. But it’s just as weak an argument as his colleague’s.

Both of them are suffering from Cognitive Dissonance: the discomfort that arises when our actions don’t match our beliefs. Each man knows that if things were as bad as they seem, then they ought to do something. But they can’t. Or they think they can’t. So to deal with the dilemma, they come up with reasons to justify their (in)action. That way, they can live with themselves. It’s something we all do when faced with inherent contradictions: we try to find a way to square the moral circle.

As the noise rises in the room and the weaknesses of the arguments made by Bryukhanov and Dyatlov become ever more apparent, someone needs to take control. Otherwise, there’s a genuine risk that the group might decide to face reality, and the Party might lose control. People in a crisis can become what German speakers know as unberechenbar. The word is hard to translate but literally means unreckonable; for which also read unpredictable, erratic and irrational. Or, as I prefer it, human.

Hence the intervention of Zharkov, both as a plot device and as the type of reaction that would have been all too common in the USSR. He’s a seasoned political operator who recognises that they need a new narrative. They will limit the damage by turning disaster into a political triumph. Of course, that damage limitation involves placing innocent people in mortal danger and not telling them what is going on. The “misinformation” he worries will reach the population is something we call “the truth“. Zharkov’s plan is to create an alternative reality by lying to the people and making them question what is going on.

As I look at many of the dynamics of the modern world, I find myself increasingly drawn to the works of George Orwell. Zharkov’s plan echoes this quote from Orwell’s 1984:

“The party told you to reject the evidence of your eyes and ears. It was their final, most essential command”

I think this is one of the most evocative scenes in Chernobyl, neatly encapsulating the Human Risk dynamic that is unfolding. The people who know choose not to know, because it is easier for them than dealing with the truth. The people who don’t know are relying on those who do. In many cases, fatally.

Sometimes we count it in lives

The final scene I want to look at features Valery Legasov, the chemist brought in to manage the cleanup exercise and the central character of the TV series. He and Khomyuk, who have by now teamed up, are meeting with Mikhail Gorbachev, the Soviet leader, to brief him on developments and obtain approval for his plan. Legasov has some terrible news to share and a plan, one requiring human sacrifice, that needs Gorbachev’s approval:

Gorbachev has gone down in history as a game-changing leader whose Perestroika reforms helped bring about the end of the Cold War, which they did. Though arguably he was merely responding to circumstances that would’ve seen the collapse of the Soviet Union regardless of his actions. Chernobyl, of course, was a major factor in that collapse. However, the Gorbachev we see here is much more of a Party apparatchik. This shouldn’t surprise us; after all, you don’t get to become Leader without being a Party man.

Ultimately what he does in this scene is to permit the operation that will cost human lives, but prevent an unthinkable disaster. Interestingly, he only does so after consulting with a General, and the authorisation he gives comes in the form of an exhortation to the cause. At no point does he actually give permission, though his message is clear.

It is arguably easier to see something through the lens of a bigger picture than face the reality of what you’re doing. Gorbachev looks at human lives in the same way that he does roubles. To that extent, he’s no different to Zharkov in the earlier scene.

This “throwing bodies at the problem” approach goes to the heart of how the Chernobyl accident was brought under control. Given the scale of the disaster, there wasn’t a technological solution available. The Soviet solution was to “ask” their people to do what was necessary. It’s an approach that would have been close to impossible in the West, where governments wouldn’t have felt able to ask for that kind of sacrifice and there wouldn’t have been a stream of volunteers to go on suicide missions of this kind.

From a Risk Management perspective, the reactor design was substandard. Largely because the Soviet Union neither could afford to, nor chose to, prioritise human safety in the design in the same way that other countries did. This made it more likely that an incident would occur and that, if one did, the impact would be that much greater.

What we see in Gorbachev’s response is an implicit acceptance of this reality: the loss of human life is an inevitability, rather than merely undesirable. His calculated language reflects this (“sometimes we count it in lives“). It’s a classic tactic that people use to distance themselves from reality. It is far easier to talk euphemistically about “collateral damage” than “deaths“. By de-sensitizing the language that we use, we can distance ourselves from reality and take decisions that we wouldn’t otherwise.

The choice of language can, therefore, be a critical enabler (or mitigant) of Human Risk. One of the reasons I think we have specific words to differentiate meat from the animal it comes from (e.g. pork vs pig, beef vs cow, mutton vs sheep) is so that we can square the act of eating meat with being animal lovers. Though for some reason, that clearly doesn’t apply to animals that can swim or fly.

Challenging the words people use can be a powerful way of making them think differently about their actions.

In Conclusion

In an era where Misinformation and “Fake News” are highly relevant topics, the timing of this series couldn’t be more apposite. In a further twist, Russian State TV announced that it would be making its own “more accurate” version of events that would highlight CIA involvement. Old habits, it seems, die hard.

They say that Pripyat won’t be habitable for another 200,000 years. It’s a fair bet, if we’re still around as a species by then, that Human Risk will continue to exist in some form.

Once you’ve seen the TV show, I also highly recommend the Official HBO Podcast that accompanies the series. Structured as an episode-by-episode commentary from one of the writers, it reveals a lot about the story and provides some fascinating Human Risk insights.

The Official Podcast to accompany the TV series

Also worth a listen is Risktory, a podcast that looks at History through the lens of Risk. Check out the episode on Chernobyl, subscribe and in doing so discover an interview with yours truly.

Finally, I recommend this Guardian article entitled The truth about Chernobyl? I saw it with my own eyes… which, terrifyingly, explains that what is depicted in the show is accurate, but that the reality was much, much worse. I’ll bear that in mind as I watch it all again.

If you’ve enjoyed this blog, do subscribe to my Human Risk Newsletter which looks at BeSci through a Risk lens. You’ll learn about how we can better manage Risk if we understand the real drivers of human behaviour.
