
Failed by design

I recently wrote about an Indian nuclear submarine that was put out of action for 10 months because someone failed to properly close a hatch. Or, more accurately, because nobody had built in a mechanism that would prevent someone from unknowingly submerging it with an open hatch.


Far more seriously, but on a similar note, last weekend in Hawaii residents and visitors received texts telling them that the islands were about to come under missile attack and that they should take cover. It turned out that a government employee had accidentally hit the wrong button, thereby triggering an emergency communications protocol that warned of impending danger. It took a full 38 minutes to notify the population that this had been a false alarm.


During that time people did all they could to prepare. That included a father having to decide which of his children he would go and rescue, on the basis that he didn’t have time to reach all of them. A heartbreaking story with potential long-term consequences, as the “abandoned” kids ask themselves why they weren’t the ones he chose to rescue.


Even more worryingly, the text message was a small part of a wider warning system that set off a series of communications protocols. In a world of heightened tension with North Korea and Russia, it’s really not good to give people who are in a position to take retaliatory action the impression that the US is under attack.


This NYTimes article on how a South Korean passenger jet came to be mistakenly shot down by the Soviets in the 1980s makes chilling reading. It’s really not that far-fetched to imagine that someone might think they were doing the right thing by unleashing a “fire and fury” response to something that has all the hallmarks of an act of war.


The idea that someone could accidentally press a button with such serious consequences is disturbing. So I did a little research: surely there must be some form of failsafe mechanism?

It is actually much, much worse than you might think. This tweet, from what I understand to be a well-respected investigative news source (interestingly, one owned by the founder of eBay), shows the interface used to set off the alert:


Yes, you read that right. There isn’t a button. It’s a webpage with hyperlinks. The one the operator was supposed to use had the word DRILL in front of it. The one he used didn’t. It’s that simple.


There are so many things to criticise here. It’s a master class in utterly appalling design.

I had initially questioned why it took 38 minutes to give the “all clear”. But having seen the interface used to set off the alert, I’m willing to believe that resetting it is probably not as easy as it ought to be.


If we allow people to take decisions that have serious consequences, then we need to think about the manner in which we allow them to take those decisions. It’s very easy to blame the individual who made the wrong call; but if it is too easy to do the wrong thing, then the architect of the system that allowed the poor decision to be made so easily should arguably bear some responsibility too. How can it be harder to take money out of an ATM than it is to unleash an emergency alert?


That’s slightly unfair: it turns out there was an “are you sure?” prompt, to which the operator clicked “yes”. Which isn’t really that surprising. After all, you’ve already selected the option you think is right, so being asked to confirm it is unlikely to yield a different result, unless the consequences of that choice are made crystal clear. Which I suspect didn’t happen.
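
To illustrate the point, a confirmation step is far more useful when it spells out the consequence and makes the operator acknowledge it explicitly, rather than offering a generic yes/no. This is purely a sketch of that idea in Python, not the actual Hawaii system:

# Hypothetical example only: not the real Hawaii EMA interface.
def confirm_live_alert() -> bool:
    """Make the consequence explicit and require it to be typed back."""
    print("WARNING: this will send a LIVE missile alert to every phone in Hawaii.")
    print("This is NOT a drill.")
    typed = input('Type "SEND LIVE ALERT" to continue, anything else to cancel: ')
    return typed.strip() == "SEND LIVE ALERT"

if confirm_live_alert():
    print("(live alert would be sent here)")
else:
    print("Cancelled. No alert sent.")

The friction is deliberate: clicking a generic “yes” costs nothing, whereas typing the consequence back forces the operator to actually read it.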

Fortunately, it seems as though lessons are being learned from this. As this article explains, the responsible individual is being protected and has been reassigned, and new controls are being introduced. In short, they are treating it as an opportunity to learn from the incident:


“Looking at the nature and cause of the error that led to those events, the deeper problem is not that someone made a mistake; it is that we made it too easy for a simple mistake to have very serious consequences,” Mr. Miyagi wrote. “The system should have been more robust, and I will not let an individual pay for a systemic problem.”


All too often we hear that “human error” is the cause of something going wrong. When we hear that term, it’s important to consider where the fault really lies. Don Norman, a user interface design expert, rightly points to the part that poor design plays (whilst plugging his rather good book!).



Norman coined the term “user-centred design”, the idea that when you design something, you design it around the end user. If something merely looks good, or is technically brilliant but terrible to use, you’re going to get bad results and it won’t achieve its ultimate aim. Or to put it another way: if you want people to do the right thing, make it easy for them.
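
In software terms that often means making the safe path the default and the dangerous one deliberately explicit. A minimal, entirely hypothetical sketch:

# Hypothetical sketch: the easy call is the safe one by default,
# and the dangerous path must be requested explicitly by name.
def send_alert(message: str, live: bool = False) -> None:
    if live:
        print(f"[LIVE ALERT] {message}")
    else:
        print(f"[DRILL ONLY] {message}")

send_alert("Ballistic missile threat inbound. Seek shelter.")             # defaults to a drill
send_alert("Ballistic missile threat inbound. Seek shelter.", live=True)  # live path must be spelled out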


