[SystemSafety] Safety Culture redux
Peter Bernard Ladkin
ladkin at causalis.com
Fri Feb 23 05:59:49 CET 2018
It is a little odd to see Les arguing for the relative pointlessness of words and dictionaries while
suggesting at the same time that code review is a most effective engineering procedure.
Code, in the sense in which we speak of it in "code review", is a series of assertions in a formal
language. A sort of non-fiction book (of instructions or declarations, whichever is your style).
When we review that book, we interpret its statements according to what we think is their meaning.
Dictionaries are devices which say what individual words mean. The only reason code review can be
successful at all is that binding of word and phrase to meaning.
Actually, fixing the meanings of individual words and phrases in this formal language, binding words
and phrases to short, clear meanings in an exceptionless way, turns out to be one of the most
effective methods in the engineering of reliable programs. That was shown originally by Algol 60 and
Pascal, as well as by the language REFINE, now sadly defunct, in which I implemented my thesis work,
an algebraic structure for real-calendrical-time period calculations, and more recently by decades
of experience with SPARK. Conversely, not fixing them is known to be a source of considerable
vulnerability: witness, at the beginning of the Internet era and the establishment of US CERT in the
1990s, the 80%-90% of security vulnerabilities which could have been ruled out simply by using
technology that had already existed for thirty years, namely making your data types behave
according to the way you thought about them (aka strong typing).
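To make the typing point concrete, here is a minimal sketch in Haskell (the names UserInput,
Sanitised, sanitise and runQuery are illustrative assumptions, not any particular library's API):
wrapping raw and checked text in distinct types makes confusing the two a compile-time error rather
than something a reviewer has to catch.

    -- Illustrative sketch only: the types and functions below are invented
    -- for this example, not taken from any real library.
    module Main where

    -- Raw text received from outside the trust boundary.
    newtype UserInput = UserInput String

    -- Text that has passed through sanitisation.
    newtype Sanitised = Sanitised String

    -- The only way to obtain a Sanitised value is via this function.
    sanitise :: UserInput -> Sanitised
    sanitise (UserInput s) = Sanitised (filter (`notElem` "';-") s)

    -- The query layer accepts only sanitised text; the type checker,
    -- not a reviewer, enforces that raw input never reaches it.
    runQuery :: Sanitised -> IO ()
    runQuery (Sanitised s) = putStrLn ("querying with: " ++ s)

    main :: IO ()
    main = do
      let raw = UserInput "alice'; DROP TABLE users"
      runQuery (sanitise raw)
      -- runQuery raw    -- rejected at compile time: wrong type

The point is not the particular example but that the binding of names to exceptionless meanings is
checked by the compiler rather than left to the reader.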
One may speak of words and dictionaries, but it is probably more efficacious to speak of concepts
and how they hang together.
Solving a problem, ameliorating an issue, inevitably involves conceptualising it in such a way that
a solution can be seen to be one. And if it can be seen to be one, but doesn't turn out to be one,
it likely means that you are missing part of the issue, that your conceptualisation turned out to be
inadequate. If you don't like the word "conceptualise" here, please replace it by the word
"understand", and I think you will see that this is almost a banal statement. So, whatever you might
prefer to call it, conceptual analysis, otherwise known as "understanding the problem", is a
necessary part of solving many problems. And the best tool for conceptual analysis is generally a
set of clean and clear concepts, rather than obscure and exception-laden concepts (do I need to
argue this?).
Anyway, the original issue raised by Chris is more about memes than just about words. Chris
pointed out that the meme associated with "error" contains a deprecatory social value-judgement.
Software people say all software contains bugs. And nothing follows from that, for most people. If
software people said instead that all software contains errors, then, given the plethora of
regulations and even laws saying who is responsible for damage arising from design errors in
commercial products, it is at least possible that someone might start trying to apply them.
PBL
Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319 www.rvs-bi.de