[SystemSafety] Difference between software reliability and astrology
Steve Tockey
steve.tockey at construx.com
Thu Aug 22 06:50:44 CEST 2024
Derek,
“To me 6-10 years is not Extremely improbable.”
Maybe it depends on how you choose to look at it? From the perspective of a single atom of a radioactive substance like, say, uranium-235 (half-life of roughly 700 million years), the probability that it will decay in any one second is infinitesimally small. On the other hand, a 1 kg block of pure U-235 contains so darn many atoms that tens of millions of them decay every second. It's "extremely improbable" from the perspective of any one single atom and yet essentially certain for the block as a whole.
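Back-of-the-envelope, in Python (the constants are round numbers from memory, so treat the outputs as order-of-magnitude only):

    import math

    half_life_s = 7.0e8 * 3.156e7        # ~700 million years, in seconds
    lam = math.log(2) / half_life_s      # per-atom decay probability per second
    atoms = (1000 / 235) * 6.022e23      # atoms in 1 kg of U-235

    print(f"P(one atom decays in 1 s) ~ {lam:.0e}")           # ~3e-17
    print(f"expected decays/s in 1 kg ~ {lam * atoms:.0e}")   # ~8e+07

The same arithmetic applies to an aircraft fleet: an event at 1 X 10^-9 per flight hour is extremely improbable on any one flight, yet close to certain across enough fleet flight hours.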
“Perhaps the reliability figures were chosen when there were an order of magnitude fewer aircraft.”
That could have something to do with it. But that said, when you consider what the FAA (and EASA, can’t forget them) means by Type Certification, even though the total number of Boeing 737s manufactured so far stands at a bit under 12,000 (https://en.wikipedia.org/wiki/Boeing_737), they are definitely not all of the same “Type” to the certification authorities. Each variant has its own type certificate: the original 737-100, the -200, the -300, the -400, and so on through the 737 MAX 8, the 737 MAX 9, and the 737 MAX 10 (should it ever get certified).
The 737 is by far an outlier on the high-quantity side. At the other end of the spectrum is, for example, the Convair 990 (https://en.wikipedia.org/wiki/Convair_990_Coronado), of which only 37 were ever built. Only 250 Lockheed L-1011s were ever built. Three versions of the DC-10 (the -10, -30, and -40) totaled 321 deliveries to airlines. The total production run for the A380 was 251.
“Multiplying these values by lots of orders of magnitude implies that self-driving car incidents are going to be routine.”
Several responses to this:
1) Incidents involving non-self-driving cars are already quite “routine”. Worldwide, around 1.35 million people are killed each year in car accidents (https://www.ddlawtampa.com/resources/car-accident-statistics-you-need-to-know-in-2021/). That’s about 3,700 deaths every single day.
2) The average number of commercial airline passengers killed in a year is somewhere around 500 (https://worldmetrics.org/commercial-plane-crash-statistics/).
3) So, on average, cars kill about 3 X 10^3 times as many people as airplanes in any one year; it takes around 7 years of commercial airplane fatalities to match the fatalities from a single day in cars. All of those car deaths never seem to make the news, and yet a single deadly commercial airplane accident is instant worldwide news regardless of cause (design, maintenance, pilot error, …). We seem perfectly willing to accept all those car deaths as literally “the cost of doing business”, and yet far, far fewer deaths in a single commercial airplane accident provoke quite the public outcry. Something doesn’t add up here, IMHO (see the quick arithmetic sketch after this list).
4) Some have proposed that self-driving cars could be less likely to be involved in an incident than non-self-driving cars. Personally, I think that remains to be seen.
5) For now, I personally believe that all self-driving vehicles on public roads need to provide a clearly visible indicator (360 degrees, visible from at least 0.5 km away) when they are in self-driving mode, so that others (like me) can steer clear of them.
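A quick sanity check on the arithmetic in points 1-3, using only the figures cited above:

    car_deaths_per_year = 1.35e6    # figure cited in point 1
    plane_deaths_per_year = 500     # figure cited in point 2

    # ~2,700, i.e. roughly 3 X 10^3
    print(car_deaths_per_year / plane_deaths_per_year)

    # ~7.4: years of airline fatalities that equal one single day of car fatalities
    print((car_deaths_per_year / 365) / plane_deaths_per_year)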
Maybe, just maybe, if self-driving car software were built to DO-178C criteria, even Level C, the probability of getting involved in a self-driving incident might go way, way down? The NTSB report on the Uber self-driving car accident in Tempe, AZ makes it painfully obvious that the coders (decidedly NOT software engineers by any stretch of the imagination) were pretty much as dumb as bricks.
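For what it’s worth, here is roughly how per-flight-hour probabilities map onto fleet-level event rates, i.e. the kind of calculation behind the figures quoted below. The worldwide fleet of 25,000 airliners is my assumption; it is merely consistent with the “double or triple” numbers below:

    FLEET = 25_000                            # assumed worldwide airliner fleet

    for hours_per_day in (10, 15):
        fleet_hours = FLEET * hours_per_day   # fleet flight hours per calendar day
        for p in (1e-5, 1e-7, 1e-9):
            per_day = p * fleet_hours         # expected events per calendar day
            if per_day >= 1:
                print(f"{p:g}/flight-hour: ~{per_day:.2f} events per day")
            elif per_day >= 1 / 365:
                print(f"{p:g}/flight-hour: one event every ~{1 / per_day:.0f} days")
            else:
                print(f"{p:g}/flight-hour: one event every ~{1 / per_day / 365:.1f} years")

With those assumptions, 1 X 10^-5 comes out at roughly 2.5 to 3.75 events per day and 1 X 10^-9 at one event every 7 to 11 years, in the same ballpark as the figures quoted below; the differences simply reflect the assumed fleet size.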
And, speaking of “software engineer”, I’m awaiting your response to Les Chambers’ two questions on the SWEBOK Guide.
Cheers,
— steve
On Aug 21, 2024, at 4:26 PM, Derek M Jones <derek at knosof.co.uk> wrote:
> Steve,
>
> Thanks for the numbers update.
>
>> 5 hours per day is way too low. Airplanes are very expensive, airlines are low profit margin businesses (which is why they are so interested in other, more highly profitable side business like credit cards), and airplanes only earn revenue when they are in the air.
>
> I was not sure whether there was a long tail of less frequently used aircraft.
>
>> So if you double or triple your numbers below to account for 10-15 flight hours per day instead of the 5 you used, you get:
>> — 1 X 10^-5 equates to 2.5 to 3.75 Abnormal procedures per day
>> — 1 X 10^-7 equates to one Emergency procedure or Airplane damage every 30 to 45 days
>> — 1 X 10^-9 equates to one Catastrophic Accident every 6 to 10 years
>
> To me 6-10 years is not Extremely improbable.
>
> Perhaps the reliability figures were chosen when there were an order of magnitude fewer aircraft.
>
> Multiplying these values by lots of orders of magnitude implies that self-driving car incidents are going to be routine.
>
> --
> Derek M. Jones          Evidence-based software engineering
> blog: https://shape-of-code.com