Bugs Kill
Software has been eating the world. On one hand this is really powerful: the dream of a hardware product being continuously improved with "just" software updates. This has clearly worked for computers and smartphones. Only to some extent, of course, as there always comes a point when a new update is no longer offered for "old" hardware. But generally everyone expects that whatever they buy today will be a better product tomorrow. And in a year or two.
The other side of the coin is products which should simply work as-is, with nobody expecting any updates to come. Yet these products are very often software-driven too. And when this embedded software has bugs, the products may suddenly fail.
There are many stories like that. Some examples:
- SanDisk Extreme SSDs abruptly failing (and the follow-up story)
- Sonos Arc soundbar "pop of death"
- Mazda FM radio bricked
Of course these are just examples of what can happen (and does happen). The SanDisk issue is particularly nasty, as it is not just the product that fails: it loses the priceless data it was meant to protect. So the damage goes far beyond the product itself.
These and other similar stories indicate that the manufacturers do not test sufficiently. And then do not sufficiently analyze the incident reports.
The problem becomes far more dangerous when there is a software bug in a car. A bug which can cause a serious accident, harming or even killing people. It appears one such issue is the unintended acceleration of Teslas (currently under investigation by California's attorney general).
There were many reports of unintended acceleration events in Teslas before, and all of them were dismissed because "the logs showed the accelerator pedal was pressed by the driver". Which turns out to be a lie. Yes, the logs showed the pedal was pressed, and the car behaved as if it was pressed - it accelerated and crashed. But the fact that the logs showed that did not mean it was the case, because the input to the logs (and to the car's logic) was incorrect. What likely happened - and it is a much more plausible explanation than putting the blame on drivers - was a voltage spike occurring while the pedal sensor was recalibrating itself.
There is what I call a defensive approach to system architecture design which should always be applied, especially in safety-critical systems like car controls. Skipping it on the Boeing 737 MAX killed hundreds of people. The unprotected single low-voltage signal monitoring a car's pedals is no different. It happened. It has been happening. And it will happen.
Progress - unfortunately - may make things worse. Now all the talk is about AI and its application in self-driving cars. Forget voltage spikes: how do we make sure the goals of a self-driving car's AI will be aligned, all the time, with what we want? AI experts say the so-called "hallucinations" are in principle not fixable. Do we still want to go down that path of chasing corporate profits and being able to sit in the back seat watching Netflix?