Self-driving trucks increase the risk of shortages of food and other consumables

March 23, 2019

As technological advances in the automotive industry bring us closer to fully self-driving vehicles on the roads, governments would be wise to consider lessons learned from the airline industry. Business analysts, for their part, should advise their business partners on mitigating the risks.

The recent issue with the Boeing 737 Max 8 highlights how a suspected software problem can kill people and cause a fleet of planes to be grounded. From a business perspective, the airlines were lucky to have other types of planes to fly while the problem is worked on. Still, this was a tragedy we all wish had not happened, and we should use it as a reminder to make sure we understand the risks of any software we work on.

Jump forward to a future where the delivery trucks on the road are all self-driving. What will we do if the self-driving software is found to be flawed, or is hacked, and the trucks have to be taken off the road? It might be days or weeks before the fleet is operational again. Food would be left rotting in warehouses and at docks, unable to reach its final destination, and that could lead to mass panic and civil unrest.

As a forewarning of the impact we might see, the cyber attacks of 2017 showed what happens when the information systems that organize the flow of goods are knocked out. Shipping containers could not be moved to their destinations because the data required to manage them was unavailable, and this led to some temporary food shortages.

Self-driving trucks, however, take us into a world where the physical transport itself is at risk of being disabled. Even if we had a piece of paper showing all the destination information for a shipping container, we would have no means of moving it. No manual workaround. To mitigate this risk, self-driving trucks should at least have the following features (a rough sketch of the override idea follows the list):

  • The ability to disconnect the self-driving “brain” from the truck.
  • Mechanisms that allow humans to take direct, manual control of the truck.
  • Physical security that lets authorized humans drive manually, rather than software security that may fail and block the manual override.
  • Multiple vendors, on the theory that they won’t all fail at the same time.
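
As a rough illustration only, the sketch below (in Python) shows what giving manual control priority over the self-driving brain might look like. Every name in it (ControlArbiter, Command, disconnect_autonomy and so on) is hypothetical; it is a thought experiment about the design, not any vendor's actual system.

    # Hypothetical sketch: a control arbiter in which manual input always
    # wins over the autonomy module, and the autonomy module can be
    # disconnected outright. Illustrative only.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Command:
        steering: float   # -1.0 (full left) .. 1.0 (full right)
        throttle: float   # 0.0 .. 1.0
        brake: float      # 0.0 .. 1.0

    class ControlArbiter:
        def __init__(self, autonomy_module):
            self.autonomy = autonomy_module
            self.autonomy_connected = True   # state of a physical disconnect switch
            self.manual_engaged = False      # a driver has taken the wheel

        def disconnect_autonomy(self):
            # The "kill switch": ignore the self-driving brain entirely.
            self.autonomy_connected = False

        def engage_manual(self):
            # The driver takes direct control; no software check can refuse this.
            self.manual_engaged = True

        def next_command(self, manual_input: Optional[Command]) -> Command:
            # Manual input always wins, even if the autonomy software misbehaves.
            if self.manual_engaged and manual_input is not None:
                return manual_input
            if self.autonomy_connected:
                return self.autonomy.plan_next_command()
            # Neither source available: fail safe by braking to a stop.
            return Command(steering=0.0, throttle=0.0, brake=1.0)

The point of the sketch is the ordering of the checks: the human override sits above the software, not behind it, so a failure in the self-driving code cannot lock the human out.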

If we want a historical example of vehicles grinding to a halt, and of how the problem was handled for the future, we can look at the OPEC oil embargo of 1973. The restriction of oil shipments to the USA meant fuel was hard to come by, bringing traffic partially to a halt. The long-term solution to avoid a repeat was for the USA to keep its own strategic reserve of oil. One could argue for a similar workaround here: keeping sets of manually driven trucks on standby, spread throughout the country.

As business analysts we should encourage our clients and business partners to weigh up the risks of their investments in new technology and, at the same time, help them consider backup solutions. The old idiom “Don’t put all your eggs in one basket” remains wise as software continues to replace ever more manual processes. In some places it may be better to have several different solutions so that there is an alternative should one fail; having a variety of planes is what has allowed the airlines to keep flying.

The more our technology solutions integrate with the infrastructure of the society we live in, the greater the need for a backup should a piece of technology fail. As business analysts we should not forget this.

VW: a lesson in marketing versus regulations

By now you will be very aware of the VW diesel scandal, in which the software on the car detected when the car was being emissions-tested and controlled the exhaust emissions so as to pass the test.

Anyone who works in gathering requirements can easily see the problem here: there were two competing sets of requirements, marketing and regulatory, and in the end the marketing side won out.

Big business is a game of cat and mouse. Laws are in place for a lot of things, but for business the calculation is the risk and cost of being caught versus the benefit of not following the law. If a law is not enforced 100% of the time, business will start to treat it as optional, and there are numerous cases of settlements between car companies and the US government or consumers. The Titanic is a classic example of the letter of the law being met while its intent was missed: the intent was to have enough lifeboats to save lives, but the law had not been written in a way that forced the number of lifeboats to match the number of passengers.

When gathering requirements for a solution, care must be taken to understand the implications of giving one set of requirements higher priority than another. Risk analysis is supposed to ensure that situations like VW and the Titanic never occur today. However, profit is a powerful master, and it can blind people to the obvious.

Double-check any requirements that fly in the face of morals, to make sure you are not ignoring something that will later make you a headline.

 

When software kills due to incomplete requirements

If you are lucky, your software has not been responsible for anyone's death to date. If you are unlucky, then you already know it.

When an analyst gathers requirements for a piece of software, there is a tendency to focus on the happy path and ignore the surrounding paths that can lead to disaster. Unfortunately, it sometimes takes real events to reveal the missing requirements, and sometimes death is the result.

To be fair, we humans can still kill ourselves without software, whether with a loaded gun or a speeding car taking a bend too fast. In some cases, however, software seems to give people a false sense of security that is not justified. In other cases it gives them the power to do something that would not have been possible had they been directly engaged with the physical controls, and that leads to disaster.

The article below refers to two cases where software enabled a pilot to do something they should not have been allowed to do, with death as the end result.

Lessons from SpaceShipTwo's crash

In the above article, the situation was different from my previous article about the lack of tactile feedback. In both cases the pilots knew what they were doing; they just did it at the wrong time, or too frequently, for the specific vehicle to survive.

As an analyst, be it a systems analyst or a business analyst, it is not enough to think of just the happy path. Whenever you are gathering requirements, you also need to think about what will keep us on the happy path. Wherever there is an interaction or a key data point, ask yourself whether the event behind it can be triggered at the wrong time or occur too many times.

Look for the ways one can step off the path, and see if you can build either a metaphorical wall to keep us on the path or ways to get us back on the path before any damage is done.
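
To make that concrete, here is a minimal, hypothetical sketch (in Python) of such a wall: a guard that rejects a command when it arrives at the wrong time or too many times. The class name, speed threshold and rate limit are invented for illustration; they are not taken from any real vehicle or flight system.

    # Hypothetical sketch: reject a command that arrives at the wrong time
    # (preconditions not met) or too many times (rate limit exceeded).
    import time
    from typing import List, Optional

    class CommandGuard:
        def __init__(self, min_speed: float, max_uses_per_minute: int):
            self.min_speed = min_speed
            self.max_uses = max_uses_per_minute
            self.recent_uses: List[float] = []   # timestamps of accepted commands

        def allow(self, current_speed: float, now: Optional[float] = None) -> bool:
            now = time.monotonic() if now is None else now

            # Wrong time: the vehicle is not yet in a state where this action is safe.
            if current_speed < self.min_speed:
                return False

            # Too many times: refuse requests beyond the allowed rate.
            self.recent_uses = [t for t in self.recent_uses if now - t < 60]
            if len(self.recent_uses) >= self.max_uses:
                return False

            self.recent_uses.append(now)
            return True

    # Example: only permit the action above a threshold speed, at most twice a minute.
    guard = CommandGuard(min_speed=250.0, max_uses_per_minute=2)
    if guard.allow(current_speed=180.0):
        print("action performed")
    else:
        print("command rejected: wrong time or too many attempts")   # this branch runs

Guards like this are cheap to specify once the question has been asked; the expensive part is remembering, while gathering requirements, to ask the question at all.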