[SystemSafety] Critical systems Linux
Roderick Chapman
roderick.chapman at googlemail.com
Mon Nov 26 16:34:51 CET 2018
On 23/11/2018 18:01, Olwen Morgan wrote:
> The original Pascal standard had the categories of:
>
> 1. Implementation-defined - meaning not defined by the standard but
> defined in every implementation.
>
> 2. Implementation-dependent - meaning not defined by the standard
> and not necessarily defined in every implementation.
>
> 3. Errors.
>
> AFAI recall, Ada does something similar (?????)
>
Nearly... Ada has the following categories of undefinedness...
1. "Implementation-defined" - not defined by the standard, but must be
defined and documented by an implementation. The choice is not context
dependent, so you can rely on it not changing. Example: the range of the
predefined types like Integer. These values can be configured so that a
static verification tool "knows" them. Both SPARK and CodePeer have such
a mechanism. This is important, since things like the presence or
absence of arithmetic overflow depend on these values.
2. "Unspecified" (Ada 95 onwards) or "Implementation-dependent" (Ada 83
only). Roughly the same meaning as "unspecified" in C. There's an
enumerable set of possible behaviours, but the set might be very large,
and an implementation is under no obligation to document its behaviour
or to be consistent at all. Can also be context-sensitive, so a change
to one bit of code can impact the behaviour of another. Also sensitive
to things like optimization settings. Obvious example: expression
evaluation order for binary operators (see the sketch after this list).
3. "Erroneous" (which has much the same meaning as "undefined behaviour"
in C). The worst of the worst: anything goes. The compiler can do whatever
it likes, and once your program's behaviour is undefined, it stays that way.
Compilers can also make any assumption they like about undefined stuff,
and then make further optimizations depending on that possibly-wrong
assumption.
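To make categories 1 and 2 concrete, here is a minimal, hedged Ada sketch
(the names are made up for illustration, not from any real project). The
value of Diff depends on the unspecified order in which the operands of
"-" are evaluated, while the actual range of Integer is
implementation-defined but must be documented and stable for a given
compiler and target:

   procedure Demo is
      Counter : Integer := 0;
      Diff    : Integer;

      --  Side-effecting function: legal Ada, but exactly the kind of
      --  thing SPARK forbids (see below).
      function Next_Ticket return Integer is
      begin
         Counter := Counter + 1;
         return Counter;
      end Next_Ticket;

   begin
      --  Yields either 1 or -1, depending on which operand the
      --  compiler chooses to evaluate first (an unspecified choice).
      Diff := Next_Ticket - Next_Ticket;

      --  Implementation-defined, but documented: the language only
      --  guarantees that Integer covers at least -(2**15 - 1) .. 2**15 - 1.
      pragma Assert (Integer'Last >= 2**15 - 1);
   end Demo;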
In SPARK, these things are dealt with by a combination of techniques
working together. Specifically:
1. Configuration of the tool to deal with Implementation-defined features.
2. Subsetting, to get rid of an offending language feature altogether (see the sketch after this list).
3. Static analysis to render unspecified features harmless, where
"harmless" means you get the same behaviour no matter what choice any
compiler might make.
4. Static analysis (which must be sound) to eliminate undefined behaviour.
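As a small illustration of subsetting (point 2), again with made-up
names: Ada 2012 allows functions with "in out" parameters, but SPARK
excludes them, because a function call buried inside an expression must
not have side effects. The SPARK-friendly shape splits the query from
the update (package body omitted):

   package Stacks
     with SPARK_Mode => On
   is
      type Stack is private;

      --  Legal Ada 2012, but rejected by the SPARK subset rules,
      --  since a function must be free of side effects:
      --
      --     function Pop (S : in out Stack) return Integer;

      --  SPARK-friendly alternative: a side-effect-free query plus a
      --  procedure that performs the update.
      function Top (S : Stack) return Integer;
      procedure Pop (S : in out Stack);

   private
      type Stack is record
         Count : Natural := 0;
      end record;
   end Stacks;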
Take, for example, expression evaluation order...we want to make sure
that expressions always evaluate to the same result no matter what
evaluation order is chosen by any compiler. This relies on a combination
of the above techniques:
a. Functions have no "in out" or "out" parameters (subsetting).
b. Functions do not write to non-local state (information-flow analysis).
c. Expressions do not overflow or raise any other exceptions (VC
generation and theorem proving).
d. The prover knows the base ranges of the predefined types (configuration).
and so on... (a small SPARK sketch pulling several of these together follows after point f below).
Pedants will also point out that you need to do
e. Static analysis of worst-case stack usage to make sure that an
expression evaluation doesn't blow the stack. This is tractable using
tools like GNATStack. Some users choose to disallow recursion to make
this analysis simpler.
f. Some sort of analysis to be sure that loops within functions really
do terminate. In SPARK 2005 and earlier, the verification system was
strictly partial, so we assumed loop termination. In SPARK 2014,
verification of loop termination can be attempted using an explicit loop
variant contract.
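To pull (a)-(d) and (f) together, here is a small SPARK 2014 sketch. It
is hypothetical, and assumes a target configuration in which Integer is
at least 32 bits, which is what lets the overflow argument go through:

   --  Side-effect-free function: no "out" or "in out" parameters (a),
   --  and no globals at all, which is stronger than just "no writes
   --  to non-local state" (b).
   function Sum_To (N : Natural) return Natural
     with Global => null,
          Pre    => N <= 10_000,  -- keeps every intermediate result in range (c, d)
          Post   => Sum_To'Result = N * (N + 1) / 2
   is
      Total : Natural := 0;
      I     : Natural := 0;
   begin
      while I < N loop
         pragma Loop_Invariant (I <= N and then Total = I * (I + 1) / 2);
         pragma Loop_Variant (Increases => I);  -- termination argument (f)
         I     := I + 1;
         Total := Total + I;
      end loop;
      return Total;
   end Sum_To;

The intent is that the proof obligations for overflow, range checks, the
loop invariant and the loop variant can all be discharged, so the result
is the same no matter what choices any conforming compiler makes.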
- Rod