[SystemSafety] Short and profound

Les Chambers les at chambers.com.au
Thu May 28 02:11:47 CEST 2015


Drew

I don't think it's a question of agreeing or disagreeing. You can project
the idea presented in this sentence onto your own life in your own way, as
you have done. And the act of doing so helps you to reflect on your own
belief systems. 

I heard Okri recite his sentence on the radio. He has a beautiful and
somewhat hypnotic African speaking voice, one that stops you in your tracks
if you're walking through the room. It turns out that he derived this
sentence from three pages of his writing, reducing, reducing and reducing
until he found its essence (would that we all had the time to do this). The
result, I think, is perfectly balanced. "Take it," he says. "Use it as you
wish. Abstract it further or project it onto your own experience and learn
something about yourself." My choice would be to abstract it further to one
word: maturity.

 

No matter how advanced we think we are, we will always be immature in some
respect, and only "later on" will it "become clear" - on further reflection
- how immature we were and what a bad idea *that* was at the time. My
teenage adventures with alcohol and motorcycles are a good case study but
I'd rather not go there.

 

Instead, I turn on my radio this morning and hear Stuart Russell, Professor
of Computer Science at the University of California, Berkeley, speaking about
lethal autonomous weapons systems (LAWS). He has just published an article
in the journal Nature. LAWS are the antithesis of functional safety,
designed to kill people rather than protect them. He says the Israelis
already have a LAW trained to loiter in the vicinity of the enemy, lock onto
radar signatures and take out the installation without further involvement
from a human being (even if it's set up in a preschool). Of course he'd like
to see LAWS banned.

 

Applying Okri's sentence, I'd offer that engineering involvement in LAWS
development is a classic bad idea. Further, the profession should have a
code of ethics that prohibits it. The pace of advancement in AI is
accelerating to the point where legislators, the guardians of our collective
morality, can't keep up. At some point the engineering profession, the one
that knows what's happening before anyone else, will have to take ethical
responsibility for its work products and just say, "NO". I have no doubt
that this will "... become clear later on," when the world is shocked
enough by some uncommanded apocalyptic event. Though I hope it will be
sooner than that.

 

I WAS walking through the room when I heard Okri's voice. I had plenty to do
but couldn't resist hearing him out. Then he said,

 

 "That which we move towards, changes us."

 

Analyse that!

 

Cheers

Les

 

From: Drew Rae [mailto:d.rae at griffith.edu.au] 
Sent: Wednesday, May 27, 2015 10:11 AM
To: Les Chambers
Cc: systemsafety at techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Short and profound

 

Les,

I don't think you're alone; I think it is a common sentiment.

I'm going to disagree slightly.

 

Some things only exist with hindsight. 

 

Saying that things "become clear" might imply that they could have been seen
if we looked harder. In safety work (and those of us in research, myself
included, are the worst offenders) there is a tendency to admit that
hindsight bias exists, but to think that we can overcome it by trying really
hard. 

 

I'm heading towards the conclusion myself that safety as measured
in-the-moment or in advance is a totally different construct to safety
measured in its absence by accidents and injuries. It's not that safety is
blurry in advance but crystal clear after an outcome is reached; it's that
they are two different phenomena being observed and talked about. 

 

To use the most glaring example: accident investigation. We take as an
article of faith that investigation is worthwhile, and that it reveals
truths about physical systems and systems of work. Why should it be the case
that a workplace which:

  -  is emotionally charged,

  -  is highly atypical of the normal business environment,

  -  has the threat of negative consequences twisting every recollection,
statement and interpretation, and

  -  has the knowledge of the actual outcome twisting every recollection,
statement and interpretation

 

would provide a good environment for collecting knowledge?

 

As a researcher, the only benefit of this environment is that legal
proceedings reveal documents that I wouldn't otherwise get to see. I can
only make sense of these documents with a context that I get from past
experience as a practitioner (i.e. information that is personal, subjective,
and not coming from the accident).

 

There is definitely a clear story that emerges from an accident. The idea
that it is clearer than the stories available before the accident is
seductive, but dangerously untrue. They are different stories, created by
different social forces. You can ask which story is more useful - that's an
open question subject to current debate - but it's a category error to ask
which story is clearer or more objective.

 

Regards,

Drew

 

 

* This message is from my work email

* I can also be contacted on andrew at ajrae.com

* My mobile number is 0450 161 361

* My desk phone is 07 37359764

* My safety podcast is DisasterCast.co.uk

 

On 27/05/2015, at 9:34 AM, Les Chambers wrote:

Hi

I came upon this sentence recently. I thought it was profound.

 

" Some things only become clear,  later on."

Source: Ben Okri, Booker Prize winner for The Famished Road

 

I see many applications for this sentiment in safety work.

Am I alone?

 

Cheers

Les

 

 

_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE

 
