[SystemSafety] Safety Culture redux (David Green)
Todd Carpenter
todd.carpenter at adventiumlabs.com
Fri Feb 23 02:40:41 CET 2018
> Can I put an opposing view?
Sure, but I'm not sure the view is actually opposing.
Some of how it is stated is, and that could be a cultural difference.
> That in eliminating the word "bug" you will not eliminate the problem.
I'm not sure anyone claimed that. In fact, anyone who claims any one
single thing will eliminate the problem is confused, or is intentionally
reducing the discussion to a squabble.
> The objective here is to change people's behaviour.
I fully agree. I will add that it is a cultural change. Those are *hard*
to accomplish in a directed fashion (hey, you say that below. See, we
agree!), and to be successful, we need to wield many tools.
> The transition you want is careless to careful.
Agreed.
> Bug is a metaphor. People are influenced by metaphors. If you
> don't like bug think of a better metaphor.
Many of the other terms people bandied about are better metaphors,
including "error" and "defect." As is your "murder suspect." So yeah, I
agree.
> Further refinement of our dictionaries is not the solution.
There I must beg to differ slightly. I agree it is not "the solution."
It is, however, "a useful tool." One of the reasons the life- and
mission-critical systems with which I have had the honor to be involved
are so successful is that on all of them, we started by defining the
terms we used in our requirements. We developed a shared understanding
of those terms on our teams, and with our regulators. It's a tough,
thankless job. A good dictionary is not *sufficient* to develop a safe
and secure system, but in my experience, it is *necessary.*
Will these terms evolve over time? Of course! The terms we developed and
used in the early 1990s were great, until I discovered the Laprie
taxonomy. Is that perfect? No, and there are people on this list who
have specific problems with the definitions of some of the core terms. But
I have used that exact paper to educate and influence some
safety-critical cultures at some large companies.
Peter's list is a further refinement that I can use.
One thing I probably need to point out: I'm an engineer. My job is to
pick up the imperfect tools and resources available to me, and make
something work better, cheaper, and/or faster, than those who have gone
before. In my world, perfect is a pipe dream, but I can *always* figure
out how to improve things. This list helps me with that - I am always on
the lookout for treasures that I can use.
It took me 30 years in the industry to realize that not everyone has the
same world view. Most of my colleagues are classically trained
scientists. They look at life a different way: identify one
counterexample and disprove the whole theory.
This type of reasoning is great for the pursuit of truth and perfection.
It's particularly comfortable for academics who always like to win
arguments. It is also useful to find the deadly flaws in engineering
solutions. But those of us in the real world still have to make things
work, and we can't afford to be stuck in analysis paralysis seeking
perfection. As we pointed out earlier this month, look at commercial
passenger aviation (by flag carriers). We've done remarkably well, and
we've replicated it across both countries and cultures. We have achieved
incredible levels of safety. However, you will never be able to point at
it and say, "This _one_ thing is why it is so safe." That one thing
might be necessary, but it will not be sufficient. So let's not
construct our arguments that way, shall we? At least when we are talking
about things in the real world.
So, back to the above two statements:
> 1. That in eliminating the word "bug" you will not eliminate the problem.
> 2. Further refinement of our dictionaries is not the solution.
To a scientist seeking a universal perfect solution, these might be
meaningful statements. To an engineer, "Of course not, no single thing
is a complete solution." Will they help me influence positive change
within my sphere of influence? You betcha they will. As will "More
compelling metaphors." Bring them on, help me influence that psyche.
> In my experience, by far the most effective tool for culture
> change in code quality is the code review.
Ha! I'll raise you one on that: If you catch it in a code review, it's
too late. Where was the design review? The requirements review? (BTW,
you can't write meaningful requirements without some shared
understanding of the terms) The systemic problems start at those higher
levels, and a code review is almost necessarily a peephole optimization
problem.
But yes, code reviews are *necessary* (not sufficient - again, no single
thing is sufficient). Code reviews need to be performed long before
integration and test, and they are definitely more important than some
after-the-fact penetration testing. Certainly more useful and productive
than a compliance checklist. Yes, they will definitely make people
uncomfortable, and that's not always a good thing - it depends on the
culture:
I was lucky enough to be raised by an expert safety team (several of
whom are members of this list). They were completely blunt in their code
(and documentation, design, and requirements) reviews. Nothing personal;
if they neglected to say something about an issue, someone *would* get
hurt. (As they say, I grew up with much love, mentoring, and many many
beatings.)
Guess how difficult it is now to bring up Millennials in that same
culture? Especially when the majority of "computer scientists" are
merely programmers, and don't have either safety or security as part of
their required training? Egads, pointing out mistakes in their code has
resulted in stares as if I've kicked their puppy. Talk about
uncomfortable. Sometimes I even worry that their parents might call me.
Unfortunately, if the culture protects too many tender individuals like
this, the reviewers will be the ones who leave, and it requires an
external influence to shuffle the deck.
So yeah, long winded way of saying that I agree, culture change is hard,
and metaphors are useful tools to help influence change.
-TC
On 2/22/2018 6:17 PM, Les Chambers wrote:
>
> Todd
>
> Can I put an opposing view?
>
> That in eliminating the word "bug" you will not eliminate the problem.
> The objective here is to change people's behaviour. The transition you
> want is careless to careful.
>
> Bug is a metaphor. People are influenced by metaphors. If you don't
> like bug think of a better metaphor. How about: "murder suspect". Has
> it occurred to anyone that we've had all these definitions for decades
> yet we still have "bugs". Further refinement of our dictionaries is
> not the solution. More compelling metaphors are the key.
>
>
>
> I digress: back in the day I wrote a control task that operators would
> use infrequently to recover from an anomalous situation in a chemical
> reactor. Because they didn't use it often they tended to forget its
> name. So I renamed it "sex". That got their attention. "Got a problem?
> You need sex! ... damn right "
>
>
>
> Culture change is hard. Left alone people will take the least
> disruptive option. Discomfort is necessary. We have to be forced into
> it. A dictionary has never caused me any discomfort. The clear and
> present danger that my negligence might destroy property and kill
> people did. In my experience, by far the most effective tool for
> culture change in code quality is the code review. Not only does it
> focus on improving the quality of the work product but it instils
> cultural norms in all the participants. The current thinking is that
> culture change occurs by people working together. So you need to give
> them a reason to work together.
>
>
>
> Don't get me wrong, I'm not against using precise terms in engineering
> reports. But recognise that we're not discussing the beauty of an
> engineering report here. We're dealing with preventive action on the
> human psyche. It's a random and messy place where influence happens
> through metaphor backed up by action. Start reading here:
>
>
>
> http://shu.bg/tadmin/upload/storage/161.pdf
>
>
>
> Les
>
>
>
> *From:*systemsafety
> [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] *On
> Behalf Of *Todd Carpenter
> *Sent:* Friday, February 23, 2018 3:03 AM
> *To:* systemsafety at lists.techfak.uni-bielefeld.de
> *Subject:* Re: [SystemSafety] Safety Culture redux (David Green)
>
>
>
> +1 for the 2018 plan to eliminate the word "bug" from our collective
> vocabularies. I'm in.
>
>
> Can we fix these issues easily? Sure. I recommend, as usual, the definitions in
> https://causalis.com/90-publications/99-downloads/DefinitionsForSafetyEngineering.pdf
>
> PBL
>
>
> Why have I not seen this paper before now? I've been using (and
> mandating on my programs) the Laprie taxonomy since it came out. Your
> definitions are what it missed. One thing that is particularly nice
> about how you framed these terms is that I can easily map them into
> the vernacular used in avionics, medical device, and industrial
> control domains. Sure, there will be collisions against many of the
> standards, but those standards are often imprecise, as you pointed out.
>
> I can start discussions with, "is this what you really mean by that
> term you just used?"
>
> [sound of typing as I distribute it to several teams...]
>
> Thank you!
> -TC
>