[SystemSafety] Fwd: Contextualizing & Confirmation Bias
Tracy White
tracyinoz at mac.com
Fri Feb 7 02:41:52 CET 2014
[PBL wrote]
First, it seems there is a lack of clarity as to what a safety case is (necessarily so, for there are many different notions, so the suggestion has to be evaluated separately for each notion, and the answers might well be different for different notions).
I think this point probably explains a lot of the discord in this discussion. As somebody who has written a number of safety cases, I can accept that you start out with the intent to argue the safety of a system (well, would you even begin if you didn’t have that expectation?), but is that confirmation bias? Clearly you will seek evidence that supports a claim that the system is acceptably safe (defining and defending what ‘acceptable’ means along the way), but you can only make a claim that the evidence supports.

Case in point: when putting together a safety case argument for the emergency door release (EDR) system fitted to a passenger train, yes, the initial perception (confirmation bias?) was that the EDR was a safety improvement for the train; after all, the EDR was a recommendation out of the Special Commission of Enquiry into the Waterfall rail accident. The evidence, however, demonstrated the opposite, i.e. that there was an increased risk with the EDR fitted and that removing or disabling the system would lead to an overall risk reduction. Confirmation bias or not, the evidence did not support the claim I set out to make. (Yes, you can ignore unfavourable evidence when formulating your safety case, but equally you can ignore unfavourable test results in a prescriptive environment; that is just dishonesty in both cases.)
[Nancy wrote]
I don't see what having a safety plan has to do with a safety case regime.
Arguably this should have mentioned engineering management plans as well as the safety plan(s), but the plans identify the activities which generate the evidence, and it is the evidence which is required to support the argument (the safety claim); plans are a vital element of any ‘safety case regime’. I think I am quoting Tim Kelly correctly here: “an argument without evidence is unsupported and evidence without an argument is unexplained”. The 882 regime that Nancy prefers would arguably fall into the “unexplained” bucket: I have carried out all these tasks, I have all this evidence, but I am not going to explain how that has delivered a safe product. I believe that many prescriptive safety requirements were met on Piper Alpha: they had a deluge system, fire-resistant living quarters, lifeboats and emergency shut-off capabilities. Lots of ‘evidence’, but perhaps when Lord Cullen called for a ‘safety case’ he was asking for somebody to explain how that collective prescription amounted to a ‘safe system’.
Regards, Tracy
> On 6 Feb 2014, at 20:25, Peter Bernard Ladkin <ladkin at rvs.uni-bielefeld.de> wrote:
>
>> On 2/5/14 11:57 PM, Derek M Jones wrote:
>> and confirmation bias does not disappear just because a person's
>> belief is found to agree with reality.
>>
>> All of the turkeys suffered from confirmation bias, including the
>> one that is pardoned.
>
> It does, actually. When the turkeys all survive to a grand old age because the farmer likes them and
> has no intention of killing them we call it a successful empirical induction.
>
> We are discussing this topic because there has been a suggestion that confirmation bias (CB) is
> somehow associated with system safety assessed via a safety-case regime.
>
> Now, such a phenomenon would be important and worrying if true, which is why we are discussing it.
> However, on closer inspection I find it unlikely that such a suggestion can be unpacked in any
> ultimately meaningful way, for various reasons.
>
> First, it seems there is a lack of clarity as to what a safety case is (necessarily so, for there
> are many different notions, so the suggestion has to be evaluated separately for each notion, and
> the answers might well be different for different notions).
>
> Second, CB is a psychological phenomenon which is notoriously tricky to handle conceptually, as my
> example of the vicar, and yours of the turkeys (when you take my and Matthew's observations into
> account) show.
>
> Third, since CB is a psychological phenomenon, it has to be shown to occur when individual assessors
> are faced with a safety case. Now, what is it that assessors are actually faced with? They are faced
> with documents consisting of rigorous and semi-rigorous arguments and supporting evidence, as well
> as probably a couple of people who produced the argument in the documents. That is so whether they
> are assessing those documents under IEC 61508 (in which case they are part of a safety case) or
> under civil aerospace certification (in which case, according to some, they are not part of a safety
> case). If any psychological phenomenon is to occur here, and lots of them do, it is going to occur
> equally in both circumstances. I can't see that there could be a systematic difference on the
> psychological level.
>
> Fourth, I imagine I am the only person on this list who has been regularly teaching logical
> reasoning to generations of students for decades (both formal logic and so-called informal logic). I
> think back - have I seen any systematic influence of confirmation bias in the ability of people to
> assess arguments? Not really, once they've learnt what we try to teach them. This impression is
> reinforced by the study I cited, which is the only one I found that seems to have addressed the issue.
>
> Fifth, I work closely with people who assess safety-critical systems for a living, a half-dozen to a
> dozen of them. I respect their capabilities greatly. They turn out mostly to work in a safety-case
> environment. I don't see any difference in the way they approach assessment, under a safety-case
> regime, from the way in which such assessments in, say, civil aerospace are approached. Indeed, I
> know companies, and people in companies, who present the same cases for the same kit in both
> civil-aerospace certification proceedings and safety-case proceedings. (I mean, why would they be
> different?) I don't see any indication of anything I could call a systematic bias.
>
> Sixth, I do see regulation-induced biases on the level at which psychological phenomena operate, and
> think I know what they look like. IEC 61508-3:2010 includes a "Route 2S" for the assessment of
> previously-used SW which is claimed for a new use as "proven in use". The requirements are weak -
> they say in effect "adequate documentation to show that the likelihood of any systematic dangerous
> faults is low enough......" as well as that the proposed operational environment is "sufficiently
> close to" that of the previous use from which data on (lack of) occurrence of systematic dangerous
> faults is taken. Now, the problem here for some of us is that most people don't know what "adequate
> documentation" looks like, or how "close" is "sufficient", and when they do find out they say
> "that's completely unreasonable! Nobody has that kind of material!" Assessors are put under pressure
> to accept kit under weaker conditions than those necessary in the current state of the art, and
> providers are faced with rejection of what they had supposed were adequate arguments on grounds
> which they do not understand. Those are both regulation-induced biases. The fix is, of course, to
> spell out what constitutes adequate evidence, in a generally-comprehensible manner. Which is what my
> colleagues and I have been working on for nearly four years now. Now, we'd have to do that whether
> the situation was a safety-case regime or not. Whether that documentation is part of a safety case
> or part of written-reasons-why-this-system-is-acceptable-but-not-part-of-a-formal-safety-case (if
> there is such a creature) plays no role whatsoever.
>
> So, if you don't mind, I'll file this utterance with those other sayings, such as "formal methods
> don't work" and "IEC 61508 is dangerous" that sound good to some when issuing out of important
> mouths but whose grand meanings evaporate when you unpack them down to the daily grind.
>
> PBL
>
> Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
> Tel+msg +49 (0)521 880 7319 www.rvs.uni-bielefeld.de