[SystemSafety] Cognitive bias, safety and disruptive technology
Bruce Hunter
brucer.hunter at gmail.com
Wed Mar 13 02:06:51 CET 2019
Despite the irony of today's Twitter statement by Donald Trump on the 737 Max
8 crash - "often old and simpler is far better" - I never thought I'd say he
may be partly correct when it comes to addressing novelty in design.
It's been interesting following the news and social media opinions about
the Boeing 737 Max 8 crashes. We are often torn between "knee-jerk"
reactions and methodical investigation with prudent correction of system
faults. With this crash we even have flight information being made public
before the investigations have begun (flightradar24.com,
https://t.co/Uyvfp1x9Xb). "Correlation [still] does not mean causation".
Similarly, with the NHTSA claim on the safety correlation of Tesla software,
even authorities can fall into cognitive biases that cloud their thinking. I
think cognitive bias is a real issue with disruptive technologies such as
autonomous vehicles, IoT and others that are starting to be used in
safety-related systems.
System safety does address these concerns in the IEC 61508 requirements
covering novelty: competence (part 1, 6.2.14e); safety verification
(7.18.2.3); independence of the safety justification (part 2, 7.4.10.5); and
software verification (part 3, 7.9.2.3).
In the hope of starting a discussion thread on the softer, human side of
system safety, I'd like to offer a few examples of cognitive biases that
impact system safety, especially with novel and disruptive technology (see
this useful infographic:
https://www.businessinsider.com.au/cognitive-biases-that-affect-decisions-2015-8)
- Confirmation bias:
- Definition - We tend to take notice of only information that
confirms our preconceptions
- Example - This is very pertinent to media discussion on Boeing 737
Max 8 crashes and NHTSA claim on safety correlation of Tesla SW
- Compensating heuristic - Don't jump to conclusions; wait until the facts
are known and tested.
- Bandwagon bias (Groupthink):
- Definition - The probability of one person adopting a belief increases
with the number of people who already hold that belief
- Example - The rubber seal deficiencies that were ignored, leading to the
ill-informed decision to launch the Space Shuttle Challenger
- Compensating heuristic - Independence, and acknowledging and dealing with
all concerns despite other imperatives (saying no is sometimes very hard)
- Availability heuristic bias:
- Definition - Humans overestimate the importance of information
available to them
- Example - Donald Rumsfeld's known-knowns and news/social media
assumptions
- Compensating heuristic - Ensuring completeness in reviews and risk
analysis
- Clustering illusion bias:
- Definition - The tendency to see patterns in random events
- Example - See Confirmation bias
- Compensating heuristic - Correlation does not mean causation; see
Confirmation bias
- Pro-innovation bias:
- Definition - When a proponent of an innovation tends to overvalue its
usefulness and undervalue its limitations
- Example - This seems to be the case with the NHTSA claim and the whole
spectrum of disruptive technology. Coolness conquers caution (caution
constrains convenience, and convenience causes coolness...). Hopefully
this did not impact the 737 Max 8 design.
- Compensating heuristic - What could go wrong is more important than what
could go right - test to fail, not necessarily to pass
- Conservatism bias:
- Definition - Favouring prior evidence over new evidence that has emerged
- Example - "We have always done it that way" may close off many
opportunities for better solutions that may also be safer
- Compensating heuristic - Don't close off new options; manage them with
facts and verification
- and many more...
Why are we so quick to drop the rigour and the existing standards that have
been built up over time to prevent our judgements from being blinded by
biases? Does anyone else have good advice on compensating for cognitive
biases and preventing bad safety decisions?
Bruce
(old and simpler 😉)