Two months ago, I wrote about an incident where a fault in an airbag system was responsible for the death of a child. (Actually, as a court ruled later, it was not the airbag system that was responsible, but rather an engineer who kept quiet about a bug that he had discovered earlier.)
Now, it seems like we have another case of a software-based disaster: Volkswagen recently admitted that they used software in their cars that detects official test situations and then reconfigures the engine to reduce pollution — just to get the figures right. When not in this “cheat mode”, that is, under real driving conditions, Volkswagen’s NOx emission rates are up to 40 times higher.
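To make concrete what such a “defeat device” amounts to in code, here is a minimal, purely hypothetical sketch in C. The sensor inputs, thresholds, and function names are my own invention for illustration, not Volkswagen’s actual implementation; reports merely suggest the real software inferred a test situation from signals such as steering angle and wheel speeds.

```c
#include <stdbool.h>
#include <stdio.h>

/* Purely hypothetical sketch of a "defeat device". The sensor inputs,
 * thresholds, and function names are invented for illustration and are
 * NOT Volkswagen's actual code. */

typedef struct {
    double steering_angle_deg;  /* on a test stand nobody steers            */
    double rear_wheel_speed;    /* on a dynamometer only the driven axle    */
    double front_wheel_speed;   /*   turns; the rear wheels stand still     */
} SensorState;

/* Guess whether the car is running an official emissions test. */
static bool looks_like_test_cycle(const SensorState *s)
{
    return s->steering_angle_deg < 1.0     /* wheels kept straight          */
        && s->rear_wheel_speed   < 0.1     /* rear axle not moving...       */
        && s->front_wheel_speed  > 5.0;    /* ...while the car is "driving" */
}

/* Stand-ins for the real engine-control routines. */
static void enable_full_exhaust_treatment(void)    { puts("cheat mode: low NOx"); }
static void enable_reduced_exhaust_treatment(void) { puts("road mode: high NOx"); }

/* The deliberate "feature": full exhaust treatment only while the
 * software believes it is being watched. */
void select_engine_map(const SensorState *s)
{
    if (looks_like_test_cycle(s))
        enable_full_exhaust_treatment();
    else
        enable_reduced_exhaust_treatment();
}

int main(void)
{
    SensorState on_dyno = {  0.0,  0.0, 50.0 };  /* looks like a test stand */
    SensorState on_road = { 12.0, 48.0, 50.0 };  /* normal driving          */
    select_engine_map(&on_dyno);
    select_engine_map(&on_road);
    return 0;
}
```

The point of the sketch is not the details but the structure: a deliberate branch that behaves well only under inspection.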
I think this cunning piece of German engineering has the potential to smash Volkswagen. Not to destroy it entirely, of course, but Volkswagen’s stock value might drop so far that it could easily be taken over by a competitor. In any case, it will cost the company, whose bosses probably thought it was too big to fail, dearly: billions of dollars, thousands of jobs, and an immeasurable amount of reputation and credibility.
What makes this story so exceptionally shocking is that we are not talking about a bug (as in the failing airbag case) but a feature. This software was put in deliberately, so it is a sin of commission rather than a sin of omission.
While it would be interesting to know what made people act in such a criminal way, what really matters is this: the whole mess could probably have been avoided if someone along the chain of command had stood up and said “No!”.
I will use this sad story as an opportunity to introduce to you the “Software Engineering Code of Ethics and Professional Practice”, a joint effort by the IEEE and the ACM. As for the Volkswagen scandal, it looks as if at least the following principles have been grossly violated:
Software Engineers shall:
1.03. Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment. The ultimate effect of the work should be to the public good.
1.04. Disclose to appropriate persons or authorities any actual or potential danger to the user, the public, or the environment, that they reasonably believe to be associated with software or related documents.
1.06. Be fair and avoid deception in all statements, particularly public ones, concerning software or related documents, methods and tools.
2.06. Identify, document, collect evidence and report to the client or the employer promptly if, in their opinion, a project is likely to fail, to prove too expensive, to violate intellectual property law, or otherwise to be problematic.
3.03. Identify, define and address ethical, economic, cultural, legal and environmental issues related to work projects.
5.11. Not ask a software engineer to do anything inconsistent with this Code.
6.13. Report significant violations of this Code to appropriate authorities when it is clear that consultation with people involved in these significant violations is impossible, counter-productive or dangerous.
In a world in which an ever-increasing part of our lives is driven by software, “The Code” should be considered the “Hippocratic Oath” of software engineers, something we are obliged to swear before we can call ourselves software professionals, before we are unleashed on an unsuspecting world. I wonder how many more software catastrophes it will take until we finally get there.