<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<p>Thanks to Rod for this. Now two questions:</p>
<p>AFAI recall, recursion is banned in the SPARK subset. But,
considering that a compiler for a functional language may convert
tail recursion to iteration, I ask whether anything in the Ada
standard explicitly forbids an Ada compiler from implementing
iteration by recursion? Not that I'd expect any sensible compiler
to do it, but failure to prohibit it would be an example of the
possible lacunae to which I have previously alluded. An analysis
tool assuming an iterative implementation might well fail to
predict correctly the behaviour of a program in which iteration is
implemented recursively.<br>
</p>
<p>Also, when did the Ada standard start using the
implementation-defined/unspecified/erroneous classification that
blights the C standard? Did it not start out with the same
classification as Pascal, namely
implementation-defined/implementation-dependent/error?<br>
</p>
<p><br>
</p>
<p>Olwen</p>
<div class="moz-cite-prefix">On 02/07/2020 12:29, Roderick Chapman
wrote:<br>
</div>
<blockquote type="cite"
cite="mid:5891fc95-e673-8d54-a140-501db96488cb@proteancode.com">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<div class="moz-cite-prefix">On 01/07/2020 17:10, Olwen Morgan
wrote:<br>
</div>
<blockquote type="cite"
cite="mid:abc6ce72-eab1-7d28-e367-ce2ca3aa5ef8@phaedsys.com">I'll
defer to those more familiar with the Ada RM but I'd be very
surprised if the language specification were entirely free from
lacunae related to implementation-defined features that could,
at least in principle, lead to similar problems. Does the RM tie
down the compiler strictly enough to preclude different
instances of implementation-defined constructs behaving
differently in different places?</blockquote>
<p><font size="+1">The Ada RM defines three levels-of-evil when it
comes to ambiguous features. These pose different challenges for
verification tools. The levels are:</font></p>
<p><font size="+1">1. "Implementation-defined". This is something
that is left to an implementation to choose, but it must be
documented and a compiler must make a single choice that never
changes - essentially it's always the same for all versions of
the same compiler, and will not change with optimization and
so on. Obvious example: the range of values of the predefined
type "Integer". Things like this are unlikely to change for
the lifetime of a project, unless you change (say) the target
machine ISA and the compiler. Static analysis tools can be
"configured" (hopefully just once) to "know" these values.</font></p>
<p><font size="+1">2. "Unspecified" - The compiler has a choice from
a set of behaviours, but the set might be very large, and the
compiler is under no obligation to document its choice.
Compilers are expected to produce binary equivalence for the
same program, but behaviour might change given an arbitrarily
small change in a program. These things can also be
context-dependent, so changing line N of a program can change
the behaviour of line N+1. Obvious examples in Ada: expression
evaluation order, and the parameter passing mechanism for "in out"
mode record types.</font></p>
<p><font size="+1">3. "Erroneous" (This is equivalent to
"undefined behaviour" in C). Anything goes, and all bets are
off! Unrecoverable, in that once your program becomes
erroneous then you can't recover to a well-defined state.
Example in Ada: accessing an array outside its bounds if you
have chosen to compile with runtime checks disabled.</font></p>
<p><font size="+1">The approach taken by SPARK is essentially:</font></p>
<p><font size="+1">1. Implementation-defined behaviours are known
to the toolset by default, or can be overridden through a
simple configuration file.</font></p>
<p><font size="+1">Some care is needed here to get these things
right, but once you've got it right, they are very unlikely to
change for the lifetime of a project.</font></p>
<p><font size="+1">2. Unspecified features are either eliminated
by simple subsetting, or are rendered harmless by some
combination of subsetting and analysis. For example - any
expression evaluation order is allowed and valid in SPARK (so
a compiler can do anything it likes), but several other rules
in the language work together to ensure that the expression
always evaluates to the same value, so it doesn't matter.</font></p>
<p><font size="+1">3. Erroneous behaviour. Eliminated by
subsetting and/or various analyses. For example: unchecked
buffer overflow is eliminated by data-flow analysis (to ensure
variables are initialized) then theorem-proving (to show that
index expressions are in range) and so on.</font></p>
<p> - Rod</p>
<p><br>
</p>
<br>
<fieldset class="mimeAttachmentHeader"></fieldset>
<pre class="moz-quote-pre" wrap="">_______________________________________________
The System Safety Mailing List
<a class="moz-txt-link-abbreviated" href="mailto:systemsafety@TechFak.Uni-Bielefeld.DE">systemsafety@TechFak.Uni-Bielefeld.DE</a>
Manage your subscription: <a class="moz-txt-link-freetext" href="https://lists.techfak.uni-bielefeld.de/mailman/listinfo/systemsafety">https://lists.techfak.uni-bielefeld.de/mailman/listinfo/systemsafety</a></pre>
</blockquote>
</body>
</html>