[SystemSafety] NYTimes: The Next Accident Awaits
Nancy Leveson
leveson.nancy8 at gmail.com
Sun Feb 2 13:03:19 CET 2014
One more thing. The European regulator I mentioned in the previous
message was with the Swiss Rail Agency. I spoke to him before their very
serious recent accident.
Nancy
On Sun, Feb 2, 2014 at 7:01 AM, Nancy Leveson <leveson.nancy8 at gmail.com> wrote:
> Drew, as usual, makes much sense.
>
> I would like to point out, however, that it seems like rail does not have
> an exceptionally low accident rate. At least in the past year, I have heard
> about a lot of very serious rail accidents in North America, Europe, and
> Asia.
>
> Nancy
>
>
> On Sun, Feb 2, 2014 at 6:52 AM, Andrew Rae <andrew.rae at york.ac.uk> wrote:
>
>> This may be an appropriate time to mention a paper John McDermid and I
>> wrote in 2012, "Goal Based Safety Standards: Promises and Pitfalls". The
>> title was written and promised before the paper, so it doesn't quite
>> capture the fact that it is mainly about the question of whether it is
>> possible, even in principle, to empirically determine which form of
>> regulation works best.
>>
>> Whilst the authors are from the "other side" of the goal-based debate, the
>> paper makes the same epistemological point Nancy was making: we shouldn't
>> be making sweeping claims about what works unless we have evidence to back
>> them up. At the level of national (or even industry-by-industry) regulation,
>> the complexity of confounding factors is ridiculous. It is a hard enough
>> empirical problem to determine who is actually safer, let alone why.
>>
>> I think the confirmation bias issue (and review in general) is an area
>> where we could do some effective experimental work.
>> What types of errors are reviewers good or bad at identifying?
>> Does making the safety argument explicit help or hinder a reviewer in
>> finding weaknesses?
>> To what extent can you "prime" a reviewer to believe a system is
>> probably safe, and how does this change review performance?
>> Which types of evidence actually help us tell the difference between a
>> safe and unsafe system?
>>
>> There's a management science paper, "Resolving scientific disputes by the
>> joint design of crucial experiments by the antagonists: Application to the
>> Erez-Latham dispute regarding participation in goal setting", which suggests
>> that where you have entrenched scientific disputes, one way forward is to
>> design an experiment together.
>> Whilst some of the really big questions, like "which form of regulation
>> works best, and when", are beyond our resources to answer, it is probably
>> worth thinking about what experiments (or, more likely, non-experimental
>> empirical studies) would reveal the answers to disputes in system safety.
>> If nothing else, it might help us find smaller questions (e.g. those I
>> listed above) where we could realistically reach agreement based on evidence.
>>
>> [My own provisional view on safety cases: if they are done properly,
>> there is no reason to think that they shouldn't be better than prescriptive
>> regulation, because even when using prescription you still have to address
>> the problem of the suitability and applicability of the regulation, and there
>> is no mechanism to capture this. HOWEVER, I am not at all convinced that
>> any industry is consistently using safety cases properly (rail is the one
>> possible exception, but there is a very heavy historical and regulatory
>> background to the type of argument and evidence used inside the safety case
>> framework). If my car is super-safe when driven under 40 mph, but everyone
>> always drives at 50 mph, there's a point where I have to stop insisting it
>> is a safe car.]
>>
>>
>> My system safety podcast: http://disastercast.co.uk
>> My phone number: +44 (0) 7783 446 814
>> University of York disclaimer:
>> http://www.york.ac.uk/docs/disclaimer/email.htm
>>
>>
>> On 2 February 2014 11:30, Nancy Leveson <leveson.nancy8 at gmail.com> wrote:
>>
>>> I served as an expert consultant to the Presidential Oil Spill
>>> Commission after Deepwater Horizon and helped write the report. Many people
>>> at that time were suggesting that all our troubles would be solved by
>>> adopting safety cases. As a result, I started studying this topic in depth,
>>> read everything I could find written on it, and in the end wrote a paper
>>> against the use of a safety case regulatory regime in the U.S. Here are
>>> some of my arguments (see the entire paper for details):
>>>
>>> 1. Confirmation Bias: Confirmation bias (a well-established
>>> psychological principle) leads to incorrect safety cases, as most of the
>>> published safety cases I have seen have been. And reviewers suffer from
>>> the same type of confirmation bias as those making up the cases. Without
>>> some way of combating confirmation bias, letting people make up arguments
>>> for safety and requiring certifiers to evaluate each argument individually
>>> is not going to be as effective as prescriptive regulation based on
>>> historical precedent.
>>>
>>> 2. Impractical Expertise Requirements on the Part of Regulators: It is
>>> impossible for regulators to be experts on every type of argument and
>>> analysis method that could possibly be used by an applicant. It is much
>>> more difficult and more error-prone to have to evaluate whatever argument
>>> an applicant chooses to give. Where will such experts come from? If they
>>> exist, will they really want to work for government wages (at least in the
>>> US)? I don't know many people who could do this job well, including myself.
>>> So are arguments simply accepted because they sound good?
>>>
>>> At a meeting last year, I spoke informally with a European regulator who
>>> argued that he could not regulate without the use of PRA. His argument was
>>> that the systems in his industry were becoming so complex that the
>>> regulators could not possibly understand the details of the systems they
>>> were certifying. So they accepted probabilistic arguments by applicants
>>> that performance targets would be met. I asked him how the regulators could
>>> possibly know whether the PRA results were correct or even reasonable if
>>> they did not understand the designs that were being analyzed. He had no
>>> answer to this question. In aviation, for example, it would be impossible
>>> for any regulator to understand the details of the design of the entire
>>> plane in order to follow an argument for why that design is safe. In
>>> addition, most of these details are proprietary, and therefore safety cases
>>> could not be open to the public or to any independent evaluation.
>>>
>>> 3. Impractical Resource Requirements: The safety case approach
>>> requires not only more expertise on the part of regulators, but also more
>>> resources. The government resources required to apply such a regulatory
>>> regime adequately are far greater than would be practical in many
>>> countries, including the U.S. For example, in offshore oil drilling, the UK
>>> and Norway employ a large number of highly educated personnel and technical
>>> specialists to perform audits and inspections and to review required
>>> documents. The UK has about as many offshore oil rig inspectors as it has
>>> offshore oil rigs. In Norway, the PSA has approximately 160 employees, of
>>> whom approximately 100 perform compliance and audit-related tasks
>>> regulating 105 offshore installations. Each of these 100 employees has a
>>> postgraduate (Master's degree) or equivalent level of training in one or
>>> more areas of expertise, including drilling, petroleum engineering,
>>> structural engineering, and reliability engineering. In contrast, in the
>>> U.S., the Bureau of Safety and Environmental Enforcement (BSEE) and the
>>> U.S. Coast Guard share approximately 60 billeted offshore inspectors over
>>> 3,500 offshore installations. We would never be able to hire the number of
>>> people or put in the resources that the British and Norwegians do. The
>>> result would simply be a lack of adequate regulatory oversight by U.S.
>>> agencies due to a lack of adequate personnel. Personnel requirements are
>>> lower for prescriptive regulation.
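>>>
>>> As a rough back-of-the-envelope illustration of that staffing gap, the
>>> figures above work out to roughly one compliance/audit employee per
>>> installation in Norway versus roughly one inspector per sixty installations
>>> in the U.S. A minimal sketch of the arithmetic (illustrative only; the
>>> variable names are mine, not from any official source):
>>>
>>> norway_per_installation = 100 / 105   # PSA compliance/audit staff per installation (~0.95)
>>> us_per_installation = 60 / 3500       # BSEE + Coast Guard inspectors per installation (~0.017)
>>> gap = norway_per_installation / us_per_installation
>>> print(round(gap))                     # prints 56: ~56 times more oversight per installation in Norway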
>>>
>>> 4. Does it work? Is it better? There have been few objective studies
>>> of the impact of the safety case regulatory approach on safety performance
>>> compared with other approaches. It would be nice if, before we engage in
>>> "proof by vigorous handwaving and strong advocacy", people would collect
>>> scientific evidence of the superiority of the safety case approach over
>>> others. Proponents have not done so. Note, however, that the industries
>>> with the best accident statistics (such as civil aviation) do not use
>>> safety cases but rather use prescriptive regulation. So a scientific,
>>> comparative evaluation should be made by those advocating this approach,
>>> along with proposals for overcoming the three practical difficulties listed
>>> above. Just because it sounds good or is different from what we do now is
>>> not enough.
>>>
>>> Nancy
>>>
>>> On Sun, Feb 2, 2014 at 4:53 AM, Tracy White <tracyinoz at mac.com> wrote:
>>>
>>>> I have found through personal experience that people with a
>>>> 'certification' pedigree struggle with the concept of a safety case ...
>>>> this is particularly true in defence. Where people have come from the
>>>> prescriptive world, which calls for completion of tasks x, y, z, their
>>>> safety case is then: it's safe because we did x, y, z. This approach
>>>> completely fails to justify or explain why x, y, z is appropriate or
>>>> sufficient for their particular project.
>>>>
>>>> I do not believe that 'safety cases' provide a free-for-all, as, in the
>>>> absence of a suitable alternative, the same prescriptive sources will
>>>> feature as technical safety measures. But what the safety case should bring
>>>> to the table is a requirement to satisfy a claim as to why these measures
>>>> (or any others) are sufficient, appropriate, applicable, relevant, etc.,
>>>> which is something that prescription fails to do.
>>>>
>>>> Regards, Tracy
>>>>
>>>> On 2 Feb 2014, at 19:05, Nancy Leveson <leveson.nancy8 at gmail.com>
>>>> wrote:
>>>>
>>>> I don't think that anyone is implying that the safety case "replaces
>>>> some form of regulation". But it does imply a particular form of
>>>> regulation, usually performance-based rather than prescriptive. Thus ARP
>>>> 4751 in aviation and MIL-STD-882 in defense are not safety case regimes,
>>>> because there are specific procedures that must be followed to be
>>>> certified. The applicant does not get to determine what type of argument
>>>> they make.
>>>>
>>>> Nancy
>>>>
>>>>
>>>> On Sat, Feb 1, 2014 at 7:43 PM, Tracy White <tracyinoz at mac.com> wrote:
>>>>
>>>>> I am slightly confused and a little perturbed by an argument that a
>>>>> 'safety case' in some way replaces any regulatory control (or government
>>>>> interference), and even more so by the suggestion that a safety case
>>>>> would not include a subclaim to have conducted a 'rigorous hazard
>>>>> analysis' program ... or to have applied appropriate 'procedures and
>>>>> standards'.
>>>>>
>>>>> Anybody who thinks that 'safety cases' in any way replace some form of
>>>>> regulation is ignorant of their purpose. I work in a regulatory
>>>>> environment, and the 'safety case' is the primary communications medium
>>>>> with that regulator; elements of it will address hazard identification and
>>>>> compliance with standards and codes considered representative of
>>>>> engineering 'good practice'. I would agree that there are good and bad
>>>>> safety cases, and I think that industries that do not 'have a good
>>>>> historical culture in terms of safety' are as ignorant of the purpose of
>>>>> safety cases as they are of the need for safety in general.
>>>>>
>>>>> Regards, Tracy
>>>>>
>>>>> On Feb 01, 2014, at 12:48 AM, Nancy Leveson <leveson.nancy8 at gmail.com>
>>>>> wrote:
>>>>>
>>>>> It is very difficult to characterize the U.S. In general, the country
>>>>> is so physically large that there are extreme differences in culture and
>>>>> politics (generally, but not always, geographically bounded). Much of the
>>>>> central government in both the US and Europe seems to be moving toward
>>>>> libertarianism, but I am probably mischaracterizing Europe based on biased
>>>>> news reports. The individual U.S. states show extreme differences. At the
>>>>> extremes, Texas and California may as well be in different worlds, let
>>>>> alone different countries, when it comes to safety regulations (and lots
>>>>> of other things irrelevant to this list). There are also such different
>>>>> cultures in different industries that it is difficult to make general
>>>>> statements. Mining and civil aviation are examples of such extremes.
>>>>>
>>>>> But I will make one general statement based only on my personal
>>>>> experience. Because of my paper arguing against safety cases, I am getting
>>>>> many calls from government employees and company lawyers as well as
>>>>> individual engineers. Some of the companies pushing the "safety case" in
>>>>> the U.S. are those who don't want any government interference and see the
>>>>> safety case as a way to get around the rigorous procedural standards that
>>>>> now exist here in many industries. They seem to feel that they will be
>>>>> able to get rid of the procedures and standards that exist now, write
>>>>> anything they want in a safety case, and thereby save the money and time
>>>>> of the rigorous hazard analysis now widely required, while using any
>>>>> design features they want. These are primarily in industries that do not
>>>>> have a good historical culture in terms of safety.
>>>>>
>>>>> Nancy.
>>>>>
>>>>>
>>>>> On Fri, Jan 31, 2014 at 4:08 AM, RICQUE Bertrand (SAGEM DEFENSE
>>>>> SECURITE) <bertrand.ricque at sagem.com> wrote:
>>>>>
>>>>>> Hi Nancy,
>>>>>>
>>>>>>
>>>>>>
>>>>>> Concerning France you are right, and in that case I think that the
>>>>>> cultural aspect dominates. There is no safety culture in the population
>>>>>> as there is in the UK, as was acknowledged after the AZF accident. The
>>>>>> risk stops at the fence of the plant and you can safely build your house
>>>>>> on the other side ... The regulations have changed since, but not the
>>>>>> cultures. The safety engineers concerned by the new regulations live a
>>>>>> nightmare, as the choices are more or less dismantle the plant versus
>>>>>> dismantle the town ... I think that safety cultures have more impact on
>>>>>> the final result than the competence of the safety community.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Bertrand Ricque
>>>>>>
>>>>>> Program Manager
>>>>>>
>>>>>> Optronics and Defence Division
>>>>>>
>>>>>> Sights Program
>>>>>>
>>>>>> Mob : +33 6 87 47 84 64
>>>>>>
>>>>>> Tel : +33 1 59 11 96 82
>>>>>>
>>>>>> Bertrand.ricque at sagem.com
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> From: systemsafety-bounces at lists.techfak.uni-bielefeld.de [mailto:
>>>>>> systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Nancy
>>>>>> Leveson
>>>>>> Sent: Thursday, January 30, 2014 8:59 PM
>>>>>> To: systemsafety at lists.techfak.uni-bielefeld.de
>>>>>> Subject: Re: [SystemSafety] NYTimes: The Next Accident Awaits
>>>>>>
>>>>>>
>>>>>>
>>>>>> It would be nice to actually introduce some data into the discussions
>>>>>> on this list. First, although it is very true that the U.K. has excellent
>>>>>> comparative occupational safety statistics, this exceptional performance
>>>>>> predated safety cases by at least 100 years and is as much a cultural
>>>>>> artifact of the U.K. as a result of any current practices. While the rest
>>>>>> of the world was suffering the results of steam engine explosions in the
>>>>>> late 1800s, for example, Great Britain was the first to implement measures
>>>>>> to reduce them. (I wrote a paper on this once, if anyone is interested.)
>>>>>> Although the British citizens on this list know more about the history of
>>>>>> the UK HSE, I believe the UK was the first country to require companies to
>>>>>> have safety policies, etc., after the Flixborough explosion. Safety cases,
>>>>>> I believe, came into being only after the more recent Piper Alpha explosion.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Trying to tie accident rates in different countries to particular
>>>>>> ways of regulating safety is dicey at best. First, there are significant
>>>>>> differences between the engineering, agricultural, industrial, and service
>>>>>> accident rates within countries, often related to technical differences.
>>>>>> Some have high agricultural accident rates but low service accident rates.
>>>>>> For example, accident rates are going to be very different in a country
>>>>>> with high-tech agricultural techniques compared to one still plowing
>>>>>> fields with a pair of oxen. Politics plays an even more important role.
>>>>>> For example, western countries often put very dangerous processes and
>>>>>> plants in third-world countries, or governments in those countries do not
>>>>>> have laws that require manufacturers to use even minimal safety practices
>>>>>> in manufacturing, and they will not as long as they need the revenue and
>>>>>> jobs. The safety culture in these countries will not change magically by
>>>>>> using one type of regulatory regime.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Note also that there are vast differences between industries. Those with
>>>>>> the very safest records, such as the U.S. SUBSAFE program, do not use
>>>>>> safety cases. (And they have managed to have an incredible safety record
>>>>>> despite being in the U.S. :-)). If we want to compare the effectiveness of
>>>>>> different regulatory regimes, then we need to provide scientific
>>>>>> evaluations and not just misuse statistics (which may involve factors that
>>>>>> have nothing to do with the actual regulatory regime used).
>>>>>>
>>>>>>
>>>>>>
>>>>>> Also, as Michael Holloway noted, cultural differences will make
>>>>>> different types of regulation more or less effective in different
>>>>>> countries and industries.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Finally, I would like to point out to those who are making some
>>>>>> national comparisons and putting down the U.S. in comparison with France,
>>>>>> for example, that the fatal occupational accident rate in the U.S. is less
>>>>>> than that of France. Perhaps we can avoid mixing politics and chauvinism
>>>>>> with science on this list.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Nancy
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Thu, Jan 30, 2014 at 8:50 AM, Martyn Thomas <
>>>>>> martyn at thomas-associates.co.uk> wrote:
>>>>>>
>>>>>> I'm a non-exec Director at the UK's Health and Safety Laboratory (
>>>>>> www.hsl.gov.uk). We carry out the basic research that underpins the
>>>>>> UK's regulation of occupational health and safety, ranging from reducing
>>>>>> accidents on construction sites and improving the tethering of loads on
>>>>>> lorries, through to reproducing and analysing major explosions (such as
>>>>>> Buncefield - http://www.buncefieldinvestigation.gov.uk/) and
>>>>>> destruction-testing the physical integrity of tankers and rolling-stock.
>>>>>>
>>>>>> We also undertake commercial work that uses our unusual experimental
>>>>>> and analysis capabilities and very strong science base.
>>>>>>
>>>>>> The UK is unusual in having a goal-based, safety-case regulatory
>>>>>> regime and a regulator (HSE) with its own expert research establishment
>>>>>> (HSL). We are getting an increasing number of approaches from Governments
>>>>>> in the Far and Middle East who see the UK's good performance in
>>>>>> occupational Health and Safety and who want to investigate setting up
>>>>>> similar goal-based regulation.
>>>>>>
>>>>>> Maybe there is something in the HSE/HSL approach that the US chemical
>>>>>> industry could benefit from.
>>>>>>
>>>>>> Regards
>>>>>>
>>>>>> Martyn
>>>>>> Martyn Thomas CBE FREng
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On 29/01/2014 22:05, Peter Bernard Ladkin wrote:
>>>>>>
>>>>>> A worthy opinion piece from the Chair of the US Chemical Safety Board. Note his suggestion that identifying hazards and mitigations is just well-established best practice. I can say from experience that this is not yet the case in all European industries with safety aspects, even though he holds Europe up as having a factor of three fewer chemical accidents than the US.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>>
>>
>
>
>
--
Prof. Nancy Leveson
Aeronautics and Astronautics and Engineering Systems
MIT, Room 33-334
77 Massachusetts Ave.
Cambridge, MA 02142
Telephone: 617-258-0505
Email: leveson at mit.edu
URL: http://sunnyday.mit.edu