[SystemSafety] Autonomously Driven Car Kills Pedestrian
Robin Cook
robincook107 at gmail.com
Thu Mar 22 20:21:16 CET 2018
Folks,
Don't discount the automotive industry's capability to get its test mileage
up. I had a customer back in the late '80s providing small parts to a
vehicle manufacturer. When I suggested that reliability testing would be
ineffective given the high reliability of the part, I was told that they
were quite used to putting 5,000 parts on test for several months. The
automotive industry is a high-volume industry, unlike the aircraft and
defence industries.
The bigger question, where opinions differ, is how much credence to give
such results from a safety perspective, and what the test coverage claims
mean in terms of the percentage of dangerous circumstances encountered
(where the total may well be a known unknown).
It's good to see someone using confidence levels from time- or
event-terminated tests rather than point estimates, as long as the result
is not applied to an aspect of the system behaviour that wasn't tested.
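As a minimal sketch of that kind of calculation (in Python, with made-up
numbers rather than my customer's actual figures), the one-sided upper
bound on a constant failure rate demonstrated by a zero-failure,
time-terminated test is -ln(1 - C) / T for total test time T:

    import math

    def zero_failure_rate_bound(total_unit_hours, confidence=0.95):
        # Upper confidence bound on a constant failure rate (per hour)
        # demonstrated by a time-terminated test with zero failures.
        return -math.log(1.0 - confidence) / total_unit_hours

    # Hypothetical example: 5,000 parts, each on test for ~2,000 hours
    print(zero_failure_rate_bound(5000 * 2000))  # ~3e-7 failures/hour at 95% confidence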
Robin Cook
Thales Cyber and Consulting
-----Original Message-----
From: systemsafety <systemsafety-bounces at lists.techfak.uni-bielefeld.de> On
Behalf Of Smith, Brian E. (ARC-TH)
Sent: 22 March 2018 13:46
To: Mario Gleirscher <mario.gleirscher at tum.de>;
systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Autonomously Driven Car Kills Pedestrian
Mario,
I hadn't come across the RAND report. Many, many thanks for passing it on.
I found this quote from page 3 kind of makes the point:
"To demonstrate that fully autonomous vehicles have a fatality rate of
1.09 fatalities per 100 million miles (R = 99.9999989%) with a C = 95%
confidence level, the vehicles would have to be driven 275 million
failure-free miles. With a fleet of 100 autonomous vehicles being
test-driven 24 hours a day, 365 days a year at an average speed of 25
miles per hour, this would take about 12.5 years."
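The arithmetic is easy to check (a quick sketch using only the figures in
the quote; -ln(1 - C) / rate gives the failure-free mileage required):

    import math

    rate = 1.09e-8        # fatalities per mile (1.09 per 100 million miles)
    confidence = 0.95

    miles_needed = -math.log(1.0 - confidence) / rate
    print(miles_needed / 1e6)  # ~275 (million failure-free miles)

    # Fleet of 100 vehicles, 24 h/day, 365 days/year, 25 mph average
    fleet_miles_per_year = 100 * 24 * 365 * 25
    print(miles_needed / fleet_miles_per_year)  # ~12.5 years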
What I love about this social media platform for system safety is that so
many smart folks chime in, share information and reports, and make our
community greater than the sum of its parts. There's also a nice spirit
among the group.
Brian
On 3/22/18, 2:39 AM, "systemsafety on behalf of Mario Gleirscher"
<systemsafety-bounces at lists.techfak.uni-bielefeld.de on behalf of
mario.gleirscher at tum.de> wrote:
>Nice estimate (it even matches my gut feeling :). Surely we are
>talking about 1.18E-8 for driving under any conceivable adverse
>condition: off-road, backwards, without lights, with broken-down cars,
>through roadworks, with non-cooperative road users, whatever ...
>right? All stuff for which (by public consensus) responsibility is in
>the hands of the driver.
>
>Question 1: Just curious, did you compare it with what they used in
>
>Kalra, N. & Paddock, S. M. Driving to Safety: How Many Miles of Driving
>Would It Take to Demonstrate Autonomous Vehicle Reliability? RAND
>Corporation, 2016
>
>?
>
>Question 2: Does anybody among us know exactly what the testing
>conditions are _in the field_? By exactly, I mean _exactly_.
>
>I suggest those conditions do not even remotely match what is found on
>non-lab/non-instrumented streets. To my knowledge, no one seems to
>discuss that properly; instead, governments seem rather to be declining
>regulatory opportunities (I am not talking about ISO 26262 v2, which is
>of course not supposed to play a relevant role in AV regulation).
>
>I am insisting on _exactly_ because, once AVs are sold, testing on
>public land might grant the public the legal right to know all about
>the testing conditions (I am not just talking about the Waymo safety
>report). This would ultimately mean that we would have the legal right
>to look into the implementation of each and every AV vendor's field
>test procedures.
>
>Happy to hear whether that makes sense.
>
>Mario
>
>
>On 22.03.2018 08:44, Peter Bishop wrote:
>> Based on the data, we could reject the hypothesis that Uber is as
>> safe as human-driven vehicles (1.18E-8 fatalities per mile) with
>> 99.998% confidence.
>>
>> And we could reject the hypothesis that Uber is better than 1
>> fatality in a million miles with 91% confidence.
>>
>> So they have quite a way to go.
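>>
>> A minimal sketch of one standard way to frame such a test, treating
>> fatalities as a Poisson process (the 3 million miles below is only an
>> assumed placeholder for the fleet's total exposure, not a figure from
>> this thread, so the printed confidences will track whatever mileage is
>> plugged in rather than the numbers above):
>>
>>     import math
>>
>>     assumed_miles = 3_000_000   # placeholder exposure, not a confirmed figure
>>
>>     def rejection_confidence(hypothesised_rate, miles):
>>         # Confidence with which observing >= 1 fatality in `miles`
>>         # rejects the hypothesis "true rate <= hypothesised_rate".
>>         p_value = 1.0 - math.exp(-hypothesised_rate * miles)
>>         return 1.0 - p_value
>>
>>     print(rejection_confidence(1.18e-8, assumed_miles))  # vs. the human-driven rate
>>     print(rejection_confidence(1e-6, assumed_miles))     # vs. 1 per million miles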
>>
>> Peter
>>
>> On 21/03/2018 23:29, Smith, Brian E. (ARC-TH) wrote:
>>> Not sure if such a comparison would pass muster statistically. As
>>> of 2015, for human-driven passenger cars here in the U.S., there
>>> were about 1.18 fatalities per 100 million Vehicle Miles Traveled
>>> (VMT). Both the numerator and denominator are large enough to make
>>> the ratio reliable.
>>>
>>> In 2017, driverless cars accumulated only about 485,000 miles of
>>> testing here in California. If the single Arizona accident had
>>> happened in my state, CA, then the rate would be 1 fatal accident
>>> every 485,000 miles for "autonomous" vehicles, or ~200 times greater
>>> than for human drivers. But the numerator is too small to be
>>> statistically reliable - basically, fatalities are too rare at this
>>> time. Yes/no?
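>>>
>>> (As a quick check of that ratio, using only the figures above:
>>>
>>>     print((1 / 485_000) / 1.18e-8)  # ~175
>>>
>>> so "~200 times" is the right order of magnitude.)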
>>
>
_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE