[SystemSafety] "Safe" and "Unsafe" Languages
Littlewood, Bev
Bev.Littlewood.1 at city.ac.uk
Mon Jan 23 12:42:06 CET 2017
Hi Matthew
You are right, of course.
But I think ‘uncertainty’ is a better word than ‘randomness’ to describe what we are dealing with - loosely, decision-making in the presence of uncertainty. One of the great achievements of the last century was the development of the Bayesian formalisation that allowed your ‘measure of belief’ to be treated probabilistically with proper rigour.
Bev
On 23 Jan 2017, at 07:06, Matthew Squair <mattsquair at gmail.com> wrote:
My vote goes to … ‘Probability'.
The original use of the word, as a measure of belief in an opinion (the sense retained in legal usage), predates its association with the ideas of 'chance' or 'randomness' in classical risk theory.
On 23 Jan 2017, at 5:52 pm, Drew Rae <d.rae at griffith.edu.au> wrote:
Peter,
That’s a good summary of the problem - and yes, it is a problem.
Computer science is a field where (in my opinion) there is strong justification for the invention of jargon. The field has a large number of very precise concepts, and those concepts need to be communicated.
Unfortunately, people are reluctant to invent totally new terms, so they repurpose "plain English" terms, giving them precise meanings.
The problem comes when people (usually at least two people working together, but I’m sure it happens inside some single heads as well) confuse the precise term for the general term. The classic one is the one you illustrated:
Meeting safety properties is not exactly the same thing as meeting safety requirements
Meeting safety requirements is not exactly the same thing as being safe
(Yes, someone can redefine any of those terms to create temporary equivalence, but only in a context where the redefinitions hold true)
Individually, the most important thing is to remember that if we want to narrow the definition of a term, we’re divorcing it from the general meaning. It doesn’t work in the opposite direction. I can insist until I’m blue in the face that a “hazard” is a subset of a state space, but all I’m doing is explaining MY definition, not successfully narrowing what hazard means to other people.
If I want to use my precise definition, I need to explain that I’m using a particular definition and why I’m doing it. I then need to put up with the fact that other people will project their previous understanding of the term onto what I’m saying.
And yes, this is why work that depends on reliable communication of precise concepts should use formal notation at every step. It’s also why doing so is necessary but not sufficient for safety.
* This message is from my work email
* I can also be contacted on andrew at ajrae.com
* My mobile number is 0450 161 361
* My desk phone is 07 37359764
* My safety podcast is DisasterCast.co.uk
On 23 Jan. 2017, at 4:23 pm, mario.gleirscher at tum.de wrote:
Dear Peter,
Thanks for this nice, encyclopedically concise summary. I came across similar terminology problems when doing the literature research for my thesis, although in the end I decided to stick with a simple variant of Pnueli's safety/progress notions when formalising my own work.
Concerning your question: pragmatically, I'd start by improving the respective Wikipedia articles, a few of which are not satisfactorily interlinked; maybe a category page would suffice. More idealistically, it would help to put a good terminology chapter into the next edition of one of the more popular course books used in the system safety community. Moreover, I don't think that a standard part or addendum is the right place to disseminate such important basic knowledge.
Best,
Mario
From: Peter Bernard Ladkin <ladkin at causalis.com>
Sent: Monday, 23 January 2017 06:43
To: The System Safety List <systemsafety at techfak.uni-bielefeld.de>
Subject: [SystemSafety] "Safe" and "Unsafe" Languages
There is an article in ACM TechNews from last week that talks about the programming language Nim,
which apparently has the language Rust's "safe-by-default" behaviour (it will be item 3 in Risks
30.10 when this edition goes up on the Risks Forum Digest WWW site). I think it was about a decade
or so ago that I read a short note from Andy Wellings which talked about a "safe" programming language.
Apparently, it is common parlance amongst the programming language community to talk about "safe"
languages. Is that right?
Here is an example from Rust. https://doc.rust-lang.org/nomicon/meet-safe-and-unsafe.html
They seem to mean something like this: write your programs in Rust and they will not have
run-time errors. I suspect a much weaker claim is justified, namely that there are different kinds of
"X-safe", for example type-safe and memory-safe, and Rust is type-safe and maybe memory-safe. Pascal
and Modula were type-safe; I don't know what memory-safe means but presumably it includes absence of
memory leaks. Note that it doesn't make complete sense to talk about a programming language as being
type-safe simpliciter: whether a concrete program is type-safe also depends on whether your compiler
reliably preserves the semantics of types.
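For readers who haven't followed the link, here is a minimal sketch of my own (not taken from the Rust documentation) of what the safe/unsafe split amounts to in source code:

    fn main() {
        let xs = [10, 20, 30];

        // "Safe" Rust: array indexing is bounds-checked and the type system
        // rules out dereferencing arbitrary pointers, so this either yields a
        // value or fails in a defined way for an out-of-range index.
        println!("third element: {}", xs[2]);

        // A raw-pointer dereference is accepted only inside an `unsafe` block;
        // the compiler permits it, but the obligation to show it is sound
        // falls on the programmer, for exactly this block.
        let p = xs.as_ptr();
        let first = unsafe { *p };
        println!("first element: {}", first);
    }

So "safe" here is a claim about which operations the compiler accepts without an explicit escape hatch, not a claim about the system in which the program runs.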
It is not clear to me what calling a language "safe" means in terms of concrete scientific
assertions about the language. For example, I doubt it is practically possible to have a
general-purpose language, all valid programs in which are guaranteed to be free of run-time error.
E.g., you'd have to guarantee that any expression used as the denominator in a division is non-zero
in its digital representation, and although there are industrially practical ways of doing that for
specific programs using Hoare logic, I don't know of any way of ruling the error out a priori in a
language definition for a practical language.
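As a sketch of what such a guard looks like when it is pushed into the program rather than into the language definition (Rust again, purely illustrative; checked_div is a standard-library operation):

    // The language cannot establish a priori that the denominator is non-zero;
    // what it can do is force the failure into a defined, checkable form.
    fn divide(num: i32, den: i32) -> Option<i32> {
        num.checked_div(den) // None if den == 0 (or on i32::MIN / -1 overflow)
    }

    fn main() {
        assert_eq!(divide(10, 2), Some(5));
        assert_eq!(divide(10, 0), None); // the run-time error becomes an ordinary value
    }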
I see no problems with the use of terms such as type-safe or (if it has a concrete meaning)
memory-safe. They clearly refer to specific properties and there is little or no chance of confusion
with the general term "safe" as used in functional safety (although many people such as myself do
not use that term unqualified; I prefer to speak of "adequately safe" rather than "safe"
simpliciter, because safety is a matter of degree, not a binary property which a system either has or
does not have).
There is another use of the term "safety", in "safety property" in concurrency theory. Here, there
is a theorem that any program property may be expressed as a conjunction of a "safety property" with
a "liveness property". A safety property is an invariant, "always X"; a liveness property is a
task-progression assertion "always eventually Y". They are often expressed as saying "nothing bad
[that would be NOT-X] ever happens" and "something good (Y) always eventually happens". Here, the
relevant term is "safety property", and people tend to talk about "fulfilling a safety property"
rather than saying "safe".
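A stock illustration, in my wording rather than anyone's formal model: for a two-process mutual-exclusion protocol, the requirement splits along exactly these lines,

    \text{Safety (nothing bad):}\quad \Box\,\neg(\mathit{cs}_1 \wedge \mathit{cs}_2)
    \text{Liveness (something good):}\quad \Box\,(\mathit{try}_i \rightarrow \Diamond\,\mathit{cs}_i),\ i = 1,2

where cs_i means process i is in its critical section and try_i that it is requesting entry.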
I have occasionally but regularly come across confusing use of the term "safe" by
concurrency/algorithm/logical-program-reliability specialists where the correctness of a program is
called "safety" if the program has some important function in some way. I often object that people
are talking about the reliability of a program, its ability to do what is specified of it, but not
safety, which is a system property partially and essentially defined by the environment in which the
program runs. Sometimes this is acknowledged graciously; sometimes not. The people who use the term
in this way are often distinguished computer scientists, so I do see it as important to say
something, and am disappointed when the response is a metaphorical shrug.
Standardised functional-safety conceptions don't necessarily help avoid this confusion between
safety and reliability. Functional safety is assured in IEC 61508 through reliability conditions on
specific functions introduced to mitigate a hazard with unacceptable risk.
The phenomenon is probably socially difficult to avoid. Twenty years ago, in CERT's early days
when it was possible to count vulnerabilities, some 80% of vulnerabilities were said to be due to
buffer overflows, which are avoided if source code is written in strongly-typed languages and the
compiler enforces strong typing correctly. Calling such languages "safe" comes from knowing that
programs written in the language and compiled with a semantics-preserving compiler are *free from
such unwanted behaviour*. For functional-safety people, the "unwanted behaviour" is that causing
possible "harm", and "harm" is damage to people, property, environment, what have you, outside the
vocabulary of the system itself.
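To make that contrast concrete, one more small sketch (again Rust, again only illustrative) of how a bounds-checked language turns the classic overflow into a defined failure:

    fn main() {
        let mut buf = [0u8; 8];
        let input = b"AAAAAAAAAAAAAAAA"; // 16 bytes, larger than buf

        // In C, an unchecked memcpy(buf, input, 16) would silently overrun buf
        // and corrupt whatever sits next to it. Here the copy is performed only
        // if the destination really is large enough; otherwise we get a defined,
        // reportable failure rather than an exploitable one.
        if input.len() <= buf.len() {
            buf[..input.len()].copy_from_slice(input);
        } else {
            eprintln!("input of {} bytes rejected: buffer holds {}", input.len(), buf.len());
        }
    }

That removes one class of unwanted behaviour at the level of the program text; it says nothing, of course, about whether the behaviour that remains causes harm in the system's environment.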
Do people see such memes, cases in which a word with an important technical meaning has three or
four such meanings, as well as a colloquial meaning, as problematic? If so, what is it reasonable to
do about it?
PBL
Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319 www.rvs-bi.de
_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE
_______________________________________________
Bev Littlewood
Professor of Software Engineering
Centre for Software Reliability
City, University of London
EC1V 0HB
Phone: +44 (0)20 7040 8420 Fax: +44 (0)20 7040 8585
Email: b.littlewood at csr.city.ac.uk
http://www.csr.city.ac.uk/