[SystemSafety] A Common Programming Language for the Department of Defense
Steve Tockey
Steve.Tockey at construx.com
Tue May 2 23:05:36 CEST 2017
Paul Bennett wrote:
"I find most of my cost goes into testing. Then, I start testing on day one
of receiving requirements because I first test the requirements handed to
me."
Very interesting. For me, the majority of the cost goes into requirements.
On my projects, requirements is 60% of the total technical (i.e.,
requirements + design + construction + developer test) effort. Coding +
developer testing is only 10%. Testing (I mean, specifically, non-developer
test) also turns out to be a much smaller part of my projects than in
typical projects.
"The six C's guidelines hold sway in this territory and all six C's have
to be in place for a requirement to be usable. [I have told this group
what the six C's are in previous posts but will list them if asked.]"
I prefer five C's and one P: Complete, Consistent, Clear, Correct,
Concise, and Precise ("Correct" in the sense of validated by the
stakeholders).
"Also how honest we are allowed to be about our estimations of time
required to accomplish a proper design. Impossibly short time-scales
should be resisted with extreme vigour."
I'd go so far as to call it an ethical violation to NOT resist
(reference: IEEE-CS/ACM Joint Task Force on Software Engineering Ethics
and Professional Practices, Software Engineering Code of Ethics and
Professional Practice, IEEE & ACM, 1999. Available at
http://www.computer.org/tab/seprof/code.htm).
"Now I find that an interesting statement. I have written code that has
run on several processors without modification, regardless of one being
16 bit, the next being 32 bit, and then the final one being 64 bit. The
same software has also seen use on 18 bit and 21 bit processors too.
The reason for me being able to accomplish the above is because I only
have to design for a simple abstract processor which has been implemented
on almost every processor produced (clue in the sig)."
Boeing's 767 Engine Simulator Automated Test Equipment (ATE) was written
in C for HP/UX 9. 777 ATE is in C++ on HP/UX 10. 787 ATE is C# .NET. 87
lines of code got copied from 767 Engine Sim ATE to 777 ATE; the remaining
130k lines in 777 ATE were new. Zero lines of code survived from 777 ATE
to 787 ATE. All of this despite the fact that the fundamentals of how to
test a Boeing airplane are remarkably the same: Test, Block, Step,
Pass/Fail, QA Buyoff, the different kinds of steps, . . . The knowledge of
how to test an airplane exists completely external to computing; THAT is
what's really reusable. De-coupling "domain knowledge" from technology is
the real key, IMHO.
"After seeing some requirements documents I even considered that there
should be a McCabe Cyclomatic Complexity measure on the requirements
document and an ideal number for it to fall below as an aggregate before
it was deemed acceptable to an implementer. I note that there are
requirements specification tools around but are they anywhere near a
solution to the complexity problem or do they help exacerbate the
problem?"
I get your point, but it could never be a single number. A set of numbers
("Software complexity is not a number, it's a vector" - Meilir
Page-Jones), yes. But the other critical issue is that trying to write
requirements in any natural language like English is a massive waste of
time.
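As a rough sketch of that vector idea (the clause names and graph mapping below are purely illustrative, not from any of the projects discussed), McCabe's V(G) = E - N + 2P is easy to compute for any flow graph, so one such number per structural property of a requirements set could be collected into exactly the kind of vector Page-Jones describes:

```python
# Illustrative sketch: McCabe's cyclomatic complexity V(G) = E - N + 2P,
# computed over a directed graph. Here the graph is imagined as
# requirement clauses (nodes) linked by then/else/next relations (edges).

def cyclomatic_complexity(edges, num_components=1):
    """V(G) = E - N + 2P for a graph given as (src, dst) pairs."""
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2 * num_components

# A straight-line chain of clauses has V(G) = 1...
chain = [("R1", "R2"), ("R2", "R3"), ("R3", "R4")]
print(cyclomatic_complexity(chain))                   # 1

# ...and each added decision path raises it by one.
print(cyclomatic_complexity(chain + [("R2", "R4")]))  # 2
```

One number like this per attribute (size, coupling, fan-out, ...) would give the set of numbers, rather than a single aggregate.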
Cheers,
- steve
-----Original Message-----
From: "paul_e.bennett at topmail.co.uk" <paul_e.bennett at topmail.co.uk>
Date: Monday, May 1, 2017 at 3:43 PM
To: Steve Tockey <Steve.Tockey at construx.com>, Les Chambers
<les at chambers.com.au>, Haim Kuper <h3k at 012.net.il>
Cc: "systemsafety at lists.techfak.uni-bielefeld.de"
<systemsafety at lists.techfak.uni-bielefeld.de>
Subject: Re: [SystemSafety] A Common Programming Language for the
Department of Defense
On 01/05/2017 at 6:57 PM, "Steve Tockey" <Steve.Tockey at construx.com> wrote:
>
>It's interesting to me that most people even think that the
>problems outlined in Section A.1. could ever be solved by a
>programming language (IMHO):
Some of the problems are more system-wide than alluded to, so you are
correct in your thinking.
>*) Responsiveness - this is rooted in crappy requirements and
>uninformed, amateur design practices (e.g., lack of application of
>fundamental design principles)
This raises the arguments about distributed networked systems versus
centrally controlled systems. There are standards in existence that specify
the maximum time a user should wait for a confirmation that something is
happening at the user interface. Sadly, much of the software on our own
PCs will fall foul of those standards. Sometimes, a networked solution
can be faster than a centralised control solution as much of the work can
be hived out to the individual controllers across the network. Of course,
getting decent requirements is key to permitting such thinking to be
considered.
>*) Reliability - this is caused by not paying attention to code
>semantics (e.g., lack of design-by-contract)
I have found a "Component Oriented" method works well here. Each
function (a sub-routine performing a single simple task) can be
considered as a component with surfaces hard enough that CofC can be
applied just once, on the proviso that the component rules are followed
(think about the way we deal with mechanical components and do the same
for software).
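A minimal sketch of that component idea in design-by-contract terms (the function and its contract are hypothetical, chosen only to show the "hard surface"): the precondition is the caller's obligation, the postcondition is the component's, and conformance is checked once, at the boundary.

```python
import math

# Hypothetical component with a "hard surface": an explicit contract
# checked at the boundary, so conforming callers can rely on the result.

def integrate_trapezoid(samples, dt):
    """Trapezoidal integral of evenly spaced samples.

    Precondition (caller's obligation):    >= 2 samples, dt > 0.
    Postcondition (component's obligation): result is finite.
    """
    assert len(samples) >= 2, "need at least two samples"
    assert dt > 0, "sample interval must be positive"

    area = sum((a + b) * dt / 2 for a, b in zip(samples, samples[1:]))

    assert math.isfinite(area), "result must be finite"
    return area

print(integrate_trapezoid([0.0, 1.0, 2.0], 0.5))  # 1.0
```

In a language with native contract support the asserts become declared pre/post clauses, but the division of obligations is the same.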
>*) Cost - my data shows that about 60% of the cost of a typical
>software project is reworking mistakes made earlier (caused by
>crappy requirements and design and inattention to up-front quality)
I find most of my cost goes into testing. Then, I start testing on day one
of receiving requirements because I first test the requirements handed to
me.
The six C's guidelines hold sway in this territory and all six C's have
to be in place for a requirement to be usable. [I have told this group
what the six C's are in previous posts but will list them if asked.]
>*) Modifiability - again, lack of application of fundamental design
>principles
This is from developing decent architectural frameworks that will allow
significant structural changes without weakening the structures. That is
some of the reason I tend towards distributed solutions.
>*) Timeliness - again, 60% rework caused by crappy requirements and
>design drive most of this
Also how honest we are allowed to be about our estimations of time
required to accomplish a proper design. Impossibly short time-scales
should be resisted with extreme vigour.
>*) Transferability - we have to finally admit that code is
>inherently un-reusable. Nearly 70 years and we still haven't
>solved it? It's time to look for alternate solutions. . .
Now I find that an interesting statement. I have written code that has
run on several processors without modification, regardless of one being
16 bit, the next being 32 bit, and then the final one being 64 bit. The
same software has also seen use on 18 bit and 21 bit processors too.
The reason for me being able to accomplish the above is because I only
have to design for a simple abstract processor which has been implemented
on almost every processor produced (clue in the sig).
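A toy illustration of that abstract-processor idea (the opcode set here is made up, and far simpler than a real Forth virtual machine): the application is written once against the abstract operations, and only the small interpreter has to be re-implemented on each real processor, whatever its word size.

```python
# Toy stack machine: the portable application targets only these abstract
# operations; only this interpreter needs porting to each real processor.

def run(program):
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "dup":
            stack.append(stack[-1])
        else:
            raise ValueError("unknown op: " + op)
    return stack

# (3 + 4) * 2, written once, independent of the host's word size
print(run([("push", 3), ("push", 4), ("add",), ("push", 2), ("mul",)]))  # [14]
```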
>*) Efficiency - this will always and forever be a problem. Moore's
>Law helps on the supply side, but the customers continue to demand
>more complex applications that they would not have even dreamt of
>10 years ago. Demand for high performance will always outstrip
>supply
There is probably too much lazy programming going on. Why does it take
so much memory to update an app I already have on my phone? I can still
do amazing controls in under 64k. Yet, my phone app update will be 30 MB
or more, and it didn't seem to be that complicated an app in the first
place.
If we demanded 100% fully certified software, where its operation was a
guaranteed success, then we would halt the majority of the industry.
>These are not problems that were ever caused by one programming
>language vs. another. With the exception of the last one, these
>have always been - and will always be - methodological issues (HOW you
>do requirements work, HOW you do design work, HOW you qualify
>people to be doing work in these areas in the first place, . . .).
>Until we fundamentally re-think HOW we should develop software in
>the first place, none of these problems will ever be solved.
>Thinking they can be solved by one magic programming language is
>pretty darned naïve.
I can already produce robust code for very high integrity solutions in
the controls world. Where most of the difficulty lies is in the
requirements end, and I have re-written more than my fair share after
asking clients what they were really aiming for. After seeing some
requirements documents I even considered that there should be a McCabe
Cyclomatic Complexity measure on the requirements document and an ideal
number for it to fall below as an aggregate before it was deemed
acceptable to an implementer. I note that there are requirements
specification tools around but are they anywhere near a solution to the
complexity problem or do they help exacerbate the problem?
Regards
Paul E. Bennett IEng MIET
Systems Engineer
--
********************************************************************
Paul E. Bennett IEng MIET.....<email://Paul_E.Bennett@topmail.co.uk>
Forth based HIDECS Consultancy.............<http://www.hidecs.co.uk>
Mob: +44 (0)7811-639972
Tel: +44 (0)1392-426688
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************