When software kills due to incomplete requirements

If you are lucky, your software has not been responsible for the death of anyone to date. If you are unlucky, then you know it.

When an analyst gathers requirements for a piece of software, there is a tendency to focus on the happy path and ignore the surrounding paths that can lead to disaster. Unfortunately, it can take real events to expose the missing requirements, and sometimes death is the result.

To be fair, we humans can still kill ourselves without software, whether with a loaded gun or a speeding car taking a bend too fast. However, software sometimes seems to give people a false sense of security that is not warranted. In other cases it gives them the power to do something that would not have been possible if they were directly engaged with the physical controls, and that power leads to disaster.

The article below refers to two cases where software enabled a pilot to do something they should not have been allowed to do, with death as the end result.

Lessons from SpaceShipTwo's crash

In the above article, the situation was different from my previous article about the lack of tactile feedback. In both cases the pilots knew what they were doing; they just did it at the wrong time, or too frequently, for the specific vehicle to survive.

As an analyst, be it a systems analyst or a business analyst, it is not enough to think of just the happy path. Whenever you are gathering requirements, you also need to think about what will keep us on the happy path. Whenever there is an interaction or a key data point, ask yourself whether the event that causes it can be triggered at the wrong time or occur too many times.
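To make that concrete, here is a minimal sketch of such a check. Everything in it is hypothetical: the `VehicleState` fields, the Mach threshold, and the attempt limit are illustrative stand-ins, not the actual SpaceShipTwo logic. The point is that the command itself is legitimate; the guard only rejects it when it arrives at the wrong time or too many times.

```python
from dataclasses import dataclass

# Hypothetical values for illustration only; not the real vehicle's numbers.
MIN_UNLOCK_MACH = 1.4    # below this speed, unlocking is assumed unsafe
MAX_UNLOCK_ATTEMPTS = 3  # repeated commands suggest confusion or a fault

@dataclass
class VehicleState:
    mach: float
    feather_unlocked: bool = False
    unlock_attempts: int = 0

def request_feather_unlock(state: VehicleState) -> str:
    """A 'wall' around an operator command: the action is valid in itself,
    but only within a safe window of the flight envelope."""
    state.unlock_attempts += 1
    if state.unlock_attempts > MAX_UNLOCK_ATTEMPTS:
        return "REJECTED: too many unlock attempts, alerting crew"
    if state.mach < MIN_UNLOCK_MACH:
        return f"REJECTED: unlock not permitted below Mach {MIN_UNLOCK_MACH}"
    state.feather_unlocked = True
    return "UNLOCKED"
```

With such a guard in place, the same command issued at Mach 0.9 is rejected instead of silently succeeding, while at Mach 1.5 it goes through.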

Look for the ways one can step off the path, and see whether you can build either a metaphorical wall that keeps us on the path or a way to get us back on it before any damage is done.
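Continuing the same hypothetical sketch, a recovery path covers the case where the wall fails or is bypassed: a monitor that notices we have left the safe envelope and steers us back before damage is done.

```python
def monitor_and_recover(state: VehicleState) -> str:
    """A way back onto the path: if the system somehow reaches an unsafe
    combination of states, undo it rather than waiting for disaster."""
    if state.feather_unlocked and state.mach < MIN_UNLOCK_MACH:
        state.feather_unlocked = False
        return "RECOVERED: feather re-locked, crew alerted"
    return "NOMINAL"
```

The wall and the recovery path are complementary requirements: the first stops us stepping off the path, and the second limits the damage when we do anyway.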