[SystemSafety] "Lack of Imagination" greatest problem with hazard analysis
Olwen Morgan
olwen at phaedsys.com
Fri Sep 6 16:28:26 CEST 2019
The underlying problem is the difficulty of ensuring that our
requirements are complete. Fortunately there are some proven practical
ways of addressing this problem, one of which is the scandalously
under-exploited practice of pseudorandom test generation. Pseudorandom
stress testing of compilers has a long record of exposing long-dormant
bugs (mostly in code generators). It works by generating convoluted
programs that a human tester would probably never think of.
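
As a toy illustration of the idea in Python, one can differentially compare
a trusted evaluator against an implementation under test; the expression
grammar and both evaluators below are invented stand-ins, since a real
harness would target an actual compiler:

import random

def random_expr(rng, depth=0):
    """Generate a convoluted arithmetic expression no human tester would write."""
    if depth > 4 or rng.random() < 0.3:
        return str(rng.randint(-100, 100))
    op = rng.choice(["+", "-", "*"])
    return f"({random_expr(rng, depth + 1)} {op} {random_expr(rng, depth + 1)})"

def reference_eval(expr):
    return eval(expr)   # trusted oracle (stand-in)

def optimised_eval(expr):
    return eval(expr)   # implementation under test (stand-in)

rng = random.Random(1)  # fixed seed, so any failure is reproducible
for i in range(1_000):
    expr = random_expr(rng)
    if reference_eval(expr) != optimised_eval(expr):
        print(f"Mismatch on case {i}: {expr}")
        break
else:
    print("No mismatches in 1,000 pseudorandom cases")
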
The principle is readily adaptable to hazard analysis. One simply models
the environment in which the hazards arise and uses pseudorandom methods
to generate unlikely scenarios to see how a system would cope with them.
This can be done for code and, at least in principle, for any
machine-processable form of specification.
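
A minimal sketch of what that might look like, assuming a toy environment
model; the event names, the coping rule and the system stub below are all
invented for illustration:

import random

# Invented environmental events, for illustration only.
EVENTS = ["sensor_dropout", "late_message", "duplicate_message",
          "power_brownout", "operator_override", "clock_skew"]

def generate_scenario(rng, max_events=8):
    """Build one pseudorandom sequence of environmental events."""
    return [rng.choice(EVENTS) for _ in range(rng.randint(1, max_events))]

def system_copes_with(scenario):
    """Stand-in for running the scenario against a model, a simulation,
    or the code itself; the rule below is purely illustrative."""
    return not ("power_brownout" in scenario and "sensor_dropout" in scenario)

rng = random.Random(42)   # fixed seed, so hazardous scenarios can be replayed
for i in range(10_000):
    scenario = generate_scenario(rng)
    if not system_copes_with(scenario):
        print(f"Hazardous scenario #{i}: {scenario}")
        break

The point is simply that the machine, not the analyst, proposes the awkward
combinations.
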
This issue reminds me of a lady I once worked with in my first proper
programming job. Her name was Fay and she had got into data processing
via accountancy. Fay was a natural tester. The first two tests she
applied to any batch program were: (1) a test case in which input files
were missing, and (2) a case in which one or more of them were present
but empty. She regularly crashed the efforts of COBOL lumpenprogrammers
(but NEVER mine, because I ALWAYS considered vacuous cases as a matter
of routine). What she understood was that tests should always include
DEVIANCE cases to check for robust behaviour in unspecified circumstances.
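
In modern terms, Fay's first two tests might look something like this;
process_batch() and the file names are hypothetical stand-ins for the batch
program and its input:

import os
import tempfile
import unittest

def process_batch(path):
    """Placeholder batch job: it must fail cleanly, never crash."""
    if not os.path.exists(path):
        raise ValueError(f"input file missing: {path}")
    with open(path) as f:
        records = f.read().splitlines()
    if not records:
        raise ValueError(f"input file empty: {path}")
    return len(records)

class DevianceCases(unittest.TestCase):
    def test_missing_input_file(self):
        with self.assertRaises(ValueError):
            process_batch("/no/such/file.dat")

    def test_empty_input_file(self):
        with tempfile.NamedTemporaryFile("w", delete=False) as f:
            empty = f.name
        try:
            with self.assertRaises(ValueError):
                process_batch(empty)
        finally:
            os.unlink(empty)

if __name__ == "__main__":
    unittest.main()
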
This is where pseudorandom test generation comes into its own. It's not
a panacea, but it is straightforward to do and a great deal better than
relying on flaky humans.
Olwen
On 06/09/2019 14:28, Peter Bernard Ladkin wrote:
> Well, just to keep us on our toes, here is another quote from Risks-31.40
>
> Apparently those of us who perform hazard analysis are guilty of lacking imagination. Of a solution
> to this issue (perhaps micro-doses of LSD?) there is no suggestion. However, there is some rather
> implausible analysis of some airplane accidents with the root cause identified as ....... lack of
> imagination. I'll post the URL when it comes up on the Risks Forum WWW site.
>
>
>> Date: Tue, 3 Sep 2019 13:28:17 -0400
>> From: "R. G. Newbury" <newbury at mandamus.org>
>> Subject: Frequency-sensitive trains and the lack of failure-mode analysis
>> (Re: RISKS-31.39)
>>
>>> Identifying all these failure modes in advance obviously takes more
>>> expertise and foresight -- but is that really too much to ask of the
>>> relevant experts?
>> It is a lack of imagination. The 'relevant experts' are often what Nassim
>> Taleb calls Intelligent Yet Idiot. The experts transgress beyond their
>> expertise and wrongly (and disastrously) believe that NOTHING CAN GO WRONG,
>> beyond what they have considered. They lack the imagination to see other
>> scenarios. In Taleb's words, they cannot see black swans, therefore no black
>> swan can exist.
>>
>> What is actually needed in the planning/design stage is to present the
>> unexpected scenario to people who face the real situation every day, and ask
>> them ``X has just failed. What can happen next? What do you do? What can
>> happen then?'' And present it to *lots of people in the relevant
>> field*. Some one of them will likely have experienced it, or recognized it
>> lurking just out of sight, and *not gone there*.
>
> PBL
>
> Prof. Peter Bernard Ladkin, Bielefeld, Germany
> MoreInCommon
> Je suis Charlie
> Tel+msg +49 (0)521 880 7319 www.rvs-bi.de
>
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
> Manage your subscription: https://lists.techfak.uni-bielefeld.de/mailman/listinfo/systemsafety