[SystemSafety] Internal Concerns at Uber Before March Fatal Crash
Matthew Squair
mattsquair at gmail.com
Mon Dec 24 11:00:23 CET 2018
The road ahead may get bumpier yet for self-driving vehicles.
Phil Koopman points out that if the arrival rate of ‘surprise events’ is heavy-tailed, then when you go into production and scale the number of vehicles up by an order of magnitude, all those very rare tail events will start to become visible.
That’s one of several reasons why I’m skeptical about the brute-force approach to ML in the context of autonomous vehicles operating in the real world.
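To make the heavy-tail point concrete, here’s a minimal Python sketch (my own illustration, not anything from Koopman’s work; the Pareto shape parameter and event counts are arbitrary assumptions). It compares the worst ‘surprise’ seen by a small pilot fleet against a fleet with ten times the exposure, for a light-tailed and a heavy-tailed severity distribution:

# Sketch: if surprise severities are heavy-tailed, a 10x increase in
# fleet exposure surfaces qualitatively worse events, not just 10x
# more of the same ones. Parameters here are illustrative only.
import numpy as np

rng = np.random.default_rng(42)

def worst_surprise(n_events, tail="heavy"):
    """Severity of the worst event seen across n_events encounters."""
    if tail == "heavy":
        # Pareto with shape alpha = 1.1: heavy tail, barely finite mean.
        samples = rng.pareto(1.1, size=n_events)
    else:
        # Exponential: light tail, for comparison.
        samples = rng.exponential(1.0, size=n_events)
    return samples.max()

pilot = 10_000        # hypothetical events seen by a small test fleet
production = 100_000  # the same programme scaled up by 10x

for tail in ("light", "heavy"):
    print(f"{tail}-tailed surprises:")
    print(f"  worst at pilot scale:      {worst_surprise(pilot, tail):12.1f}")
    print(f"  worst at production scale: {worst_surprise(production, tail):12.1f}")

The intuition: the maximum of n light-tailed draws grows only like log n, whereas for a Pareto tail with shape alpha it grows like n^(1/alpha), so an order-of-magnitude scale-up can surface events well outside anything the pilot fleet ever experienced.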
Matthew Squair
MIEAust, CPEng
Mob: +61 488770655
Email: mattsquair at gmail.com
Web: http://criticaluncertainties.com
> On 21 Dec 2018, at 7:24 am, Peter Bernard Ladkin <ladkin at causalis.com> wrote:
>
> Gleaned from Risks-30.97:
>
> Ars Technica has an article on an internal memo at Uber, written a short while before the March
> fatal accident in Tempe, specifying what the writer suggested were inappropriate organisational
> safety practices in the self-driving car program.
>
> https://arstechnica.com/tech-policy/2018/12/uber-exec-warned-of-rampant-safety-problems-days-before-fatal-crash/
>
> PBL
>
> Prof. Peter Bernard Ladkin, Bielefeld, Germany
> MoreInCommon
> Je suis Charlie
> Tel+msg +49 (0)521 880 7319 www.rvs-bi.de