[SystemSafety] How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?
Peter Bernard Ladkin
ladkin at rvs.uni-bielefeld.de
Mon Jun 13 09:18:42 CEST 2016
On 2016-06-13 00:43 , Mike Ellims wrote:
> The other two appear to be where the vehicle requested the driver take control and before they
> could do so effectively, the car drove into the back of stationary vehicles ...
>
> Tesla’s current take on this appears to be that Autopilot isn’t an autonomous system and the driver
> should always be ready to take control.
This sounds like a replay of experience in commercial aviation.
"Human supervisory control" is a term associated with the lab of Thomas Sheridan at MIT (who is now
in his late 80's). The difficulty of reversion can be stated briefly: humans make poor monitors.
When the automation is doing its thing, a human monitor is often cognitively "outside the loop". In
a typical design, when the automation is unable to resolve a situation, it hands control over to a
human, who suddenly needs to be "in the loop" again to resolve a situation which the automation has
found too tricky to handle. This cognitive switch can be difficult and has traditionally been
regarded as unreliable.
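To put a rough number on the cost of that switch, here is a minimal back-of-the-envelope sketch.
The speed and reaction-time figures are my illustrative assumptions, not measured values; the point
is only how quickly out-of-the-loop latency turns into distance travelled:

    # Illustrative calculation: distance covered while an "out of the loop"
    # driver re-engages after a takeover request.
    # All latency figures below are assumed for illustration, not measured data.

    def takeover_distance_m(speed_kmh: float, reaction_s: float) -> float:
        """Distance in metres travelled during the driver's reaction time."""
        speed_ms = speed_kmh / 3.6  # convert km/h to m/s
        return speed_ms * reaction_s

    for reaction_s in (1.0, 3.0, 10.0):  # assumed in-loop vs. out-of-loop latencies
        d = takeover_distance_m(speed_kmh=120.0, reaction_s=reaction_s)
        print(f"{reaction_s:>5.1f} s to re-engage at 120 km/h -> {d:6.1f} m travelled")

At motorway speed, even a few seconds of re-engagement means the vehicle covers a hundred metres or
more before the human is effectively in control again.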
Consistent with this understanding, there have been some recent commercial-aviation accidents in
which a supposedly well-trained and experienced cockpit crew did not understand the situation in
which a highly-automated aircraft found itself, reacted (or failed to react) inappropriately, and
then crashed. It's not a huge number in absolute terms (thankfully true of commercial-aviation
accidents generally), but it has people in commercial-aviation safety thinking about the apparent
dependence of modern crews on the aircraft's automation and what should be done to give commercial
pilots more experience with "hand flying".
On the other hand, in road traffic, it seems that human monitors can remain well "in the loop"
without controlling the vehicle directly, as the phenomenon of "back-seat drivers" shows. This may
be because the data needed to make appropriate decisions are cognitively easy to access, and
continuously so, for those with driving experience. You can see all of "what's happening" and thus
believe you "know" what you would do.
This may to some extent be an illusion: what you would do is often wrong. My general reaction to an
unanticipated traffic conflict on my bicycle, when I have time for one, is to slam on the brakes.
This has often left me stopped in the path of the vehicle I was trying to avoid.
I can't seem to find a good reference on the WWW that isn't behind a paywall. I suspect Sheridan's
chapter on Supervisory Control in Salvendy's Handbook of Human Factors and Ergonomics would be one
good place to look (I seem to remember it from twenty years ago in the uni library), but the
Salvendy book is behind a paywall at Wiley. Other survey articles by Sheridan are behind the
paywall at Springer.
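As a side note on the question in the subject line, the standard argument runs on the "rule of
three": with zero fatal events in n miles, the 95% upper confidence bound on the per-mile fatality
rate is roughly 3/n. A minimal sketch follows; the benchmark figure of about one fatality per 100
million miles is an illustrative assumption, not a number from this thread:

    import math

    # Sketch of the "rule of three" argument. Assumed, illustrative inputs:
    # a human-driver benchmark of roughly one fatality per 100 million miles,
    # and a 95% confidence level.
    confidence = 0.95
    benchmark_rate = 1.0 / 100_000_000  # fatalities per mile (assumed benchmark)

    # With zero fatal events in n miles, the upper confidence bound on the
    # per-mile rate is the p satisfying (1 - p)^n = 1 - confidence, so
    # n = -ln(1 - confidence) / p for small p (ln(20) is about 3).
    miles_needed = -math.log(1.0 - confidence) / benchmark_rate

    print(f"~{miles_needed:,.0f} failure-free miles")  # ~300 million miles

On those assumptions, demonstrating parity with human drivers by driving alone would take on the
order of 300 million failure-free miles, which is the nub of the subject-line question.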
PBL
Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
Je suis Charlie
Tel+msg +49 (0)521 880 7319 www.rvs.uni-bielefeld.de