[SystemSafety] Australian System Safety Conference 2018, May 23 to 25, Melbourne
Peter Bernard Ladkin
ladkin at causalis.com
Fri Dec 8 09:58:28 CET 2017
On 2017-12-08 01:22, Les Chambers wrote:
> I think I've got a good point. ......
I imagine you probably do :-)
Actually, I think you do have a good point on the "gun culture" contrast between the US and
Switzerland, which has been cited a lot over the decades, for example
http://world.time.com/2012/12/20/the-swiss-difference-a-gun-culture-that-works/
http://www.bbc.com/news/magazine-21379912
https://www.npr.org/2013/03/19/174758723/facing-switzerland-gun-culture
https://www.swissinfo.ch/eng/society/bearing-arms_how-gun-loving-switzerland-regulates-its-firearms/43573832
https://www.csgv.org/the-truth-about-guns-in-switzerland/
but I think you are wrong to generalise from the possession of weaponry to engineered systems in
general. The reliable functioning of firearms has almost nothing to do with the number of civilian
deaths by firearm, and that makes firearms unusual amongst engineered-system safety issues.
A similar issue which is now current in Britain concerns knife crime. No one thinks the reliability
of knives has much to do with "knife safety".
> Safety is a cultural issue.
1. Engineered-system safety is partly, sometimes, a cultural issue.
> The standards and procedures are a hygiene factor only.
2. Standards, procedures and laws are essential.
> You only achieve safety through cultural change.
3. You achieve safety through engineering, procedural, organisational, legal change as well as
changes in standards.
Whether the braking system on my bicycle is dependable is prima facie a technical engineering
issue. It has two aspects: (a) whether the design and implementation of the system makes it
effective and highly reliable; (b) whether I maintain it appropriately.
(a) is not at all cultural.
(b) is partly a matter of culture (am I motivated to do it), partly a matter of appropriate law (are
there legal sanctions if I don't), partly a matter of appropriate procedures (do I follow the
manufacturer's handbook and common engineering knowledge when performing maintenance).
The enormous advances in road safety in Western Europe (for example) in the last fifty years are
almost entirely due to technical advances: seat belts, airbags, crushable bodies with a relatively
uncrushable person-space, ABS, ESP in vehicles; road design. Some of it is cultural: cracking down
on DUI. Very little of it is due to any cultural changes in driving behaviour.
By far the majority of fatal accidents have overspeed as a factor, yet speed limits are much less
well enforced nowadays (except for trucks with data recorders) than they were when I was a kid. The
exception is France: when Nicolas Sarkozy became Interior Minister in 2002 and decreed the
enforcement of speed limits on motorways, the fatal accident rate dropped by a third within a month
and stayed down.
The phrase "safety culture" was invented by either the US military (in particular the parts
concerned with nuclear weapons) or the atomic power industry, as far as I know. Without the name,
the airline industry in the West had it too. The US military doesn't necessarily make its figures
public, but I think there is little doubt (Boeing has documented it) that the enormous improvement
in airline safety over the decades has come about through technical improvement: the rising tide
which lifts all boats.
I have been to three workshops this year in which the question of how to translate organisational
safety culture to a comparable "cybersecurity culture" has figured prominently, and I shall be going
to a fourth in just over a week.
The main takeaway from organisational attention to "X culture" in Anglo-Saxon countries seems to be
that "X" must be represented at board level, that is, there must be a member of the board whose main
priority is "X". People often seem to forget that this is highly dependent upon the Anglo-Saxon
concept of firm, in which the Board of Directors is the governing entity. Having somebody on a
German-company "Aufsichtsrat" (assuming it is a publicly-traded company) with the same priority
would not necessarily have the same consequences, for the Aufsichtsrat is not the governing entity.
The governing entity is the (co-)CEO(s), Geschäftsführer.
That said, there is still an astonishing (to me) lack of concern with, for example, human factors
(HF) in some safety-critical industries. In IEC 61508:2010, HF is almost completely ignored. Carl
Sandom tried to rectify this with the IEC working group on Human Factors in Functional Safety
(IEC SC65A WG17), of which he was the first Convenor in 2013, but the project has fallen well
behind schedule since he withdrew. I can testify that, decades after the introduction of no-blame
procedures in some process
industries, it is still common to hear the view that "HF is simple. If there is an incident, it
means someone has not followed procedures. You find out who it is and sanction them. Done." There is
over the last half-century a massive literature on (amongst other things) why this approach does not
improve your HF performance (except in the trivial sense that, if you fire anyone who has been found
responsible for an accident, you can boast continually that there is no one on your staff who has
ever been responsible for an accident).
PBL
Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319 www.rvs-bi.de