[SystemSafety] A General Theory of Love
Peter Bernard Ladkin
ladkin at causalis.com
Sat May 13 07:55:56 CEST 2017
On 2017-05-12 22:39, Les Chambers wrote:
> I think the concept of "boundaries" is losing its utility.
What nonsense! It's one of the essential notions in system building and supply.
You can't write a contract for supplying components of a critical system unless it is clear which
bits you are responsible for and which you are not. Contracts explicitly set boundaries. All systems
and components come with such a contract (if you buy something off the shelf in a store, the
contract may be implicitly specified by giving the protocols to which the device is said to conform,
such as "IEEE 802.11ac" or "USB 3.0").
Before you can begin any of the activities necessary to design and assess safety-critical systems,
you need to identify what is to count as "system" and what as "environment" - that is, to set the
boundary. In order to define and encapsulate components and subsystems of systems, you need to do
the same. Modularisation and scoping in complex computer programs depend on explicitly setting
boundaries; scoping is the name for how you do it. Such structured programming and modularisation
has been essential to software-based system assurance for almost 50 years now.
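By way of a small illustration (the names are mine and purely illustrative, not taken from any
particular system), here is how scoping draws such a boundary in C: the functions declared up front
are the module's interface to the rest of the system, and everything marked static - state and
helper alike - stays inside the boundary.

/* Minimal sketch: scoping as a module boundary.  Hypothetical names. */

#include <stdbool.h>
#include <stdio.h>

/* --- interface: the explicit boundary of a (hypothetical) valve module --- */
bool valve_command_open(void);        /* returns false if the command is refused */
bool valve_is_commanded_open(void);

/* --- internals: invisible outside this translation unit --- */
static bool commanded_open = false;   /* internal state only */

static bool interlock_permits_opening(void)   /* internal helper */
{
    return true;                      /* stand-in for a real interlock check */
}

bool valve_command_open(void)
{
    if (!interlock_permits_opening())
        return false;
    commanded_open = true;
    return true;
}

bool valve_is_commanded_open(void)
{
    return commanded_open;
}

int main(void)
{
    if (valve_command_open())
        printf("commanded open: %d\n", valve_is_commanded_open());
    return 0;
}

Anything not declared in the interface simply cannot be reached from outside, which is exactly the
encapsulation that boundary-setting is meant to provide.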
Some high-profile accidents where boundary-setting went wrong include:
* Ariane 501 (the values coming from a sensor were outside the subsystem's nominal bounds, although
they were exactly right for the planned and actual behaviour of the vehicle)
* Three Mile Island (The indicator showing whether a PORV, an "electromatic relief valve", was open
or shut in fact indicated the state of an electrical component: a solenoid that closed the circuit
actuating the valve. Something which should have indicated a state external to the electrical
control system, namely the position of the valve, was in fact indicating something internal to it;
see the sketch after this list. Perrow mentions this early on in Normal Accidents, pp. 21-2 of my
first edition, and Michael Jackson on p. 164 of his Problem Frames book. The terms "external" and
"internal" implicitly reference a subsystem boundary.)
* Fukushima (flood water descends into basements under gravity, and the basement is where the
electrics were. Lochbaum and Perrow had explicitly pointed out this hazard up to two decades
previously; it's in Perrow's 2004 book)
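For the Three Mile Island item, here is a hedged sketch of the point (all names are hypothetical,
not drawn from the plant documentation): an indicator driven from the internal command state of the
control electrics can disagree with one driven from the externally sensed valve position, which is
exactly the confusion of internal and external described above.

/* Sketch: command state (internal) vs. sensed plant state (external). */

#include <stdbool.h>
#include <stdio.h>

typedef struct {
    bool solenoid_energised;   /* internal to the electrical control system */
    bool valve_open_sensed;    /* external: position sensed at the valve itself */
} relief_valve_status_t;

/* Wrong boundary: reports something internal to the control system. */
static bool indicator_from_command(const relief_valve_status_t *s)
{
    return s->solenoid_energised;
}

/* Intended boundary: reports the state of the plant outside the controller. */
static bool indicator_from_plant(const relief_valve_status_t *s)
{
    return s->valve_open_sensed;
}

int main(void)
{
    /* The TMI-like failure case: solenoid de-energised, valve stuck open. */
    relief_valve_status_t s = { .solenoid_energised = false,
                                .valve_open_sensed  = true };

    printf("indicator (command-based): %s\n",
           indicator_from_command(&s) ? "OPEN" : "SHUT");
    printf("indicator (plant-based):   %s\n",
           indicator_from_plant(&s) ? "OPEN" : "SHUT");
    return 0;
}

With the values above, the command-based indicator reports SHUT while the valve is in fact open:
the indicator sits on the wrong side of the subsystem boundary.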
Martyn's report http://www.raeng.org.uk/publications/reports/global-navigation-space-systems notes
situations in which people claimed their systems had no GPS dependence, yet those systems were
rendered inoperable when GPS signals in the vicinity were jammed. Simply put, they got their system
boundary wrong.
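To make that concrete with a hypothetical sketch (nothing here is taken from the report): a module
whose declared interface never mentions GPS can still depend on it transitively, for instance
through a platform time service that is GPS-disciplined. The documented boundary then omits a real
dependency.

/* Sketch: a "GPS-free" module with a hidden, transitive GPS dependence. */

#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Stand-in for a platform time service that, on the real system, would be
 * disciplined by GPS.  Here it just wraps the C library clock so the sketch
 * compiles and runs. */
static uint64_t platform_utc_seconds(void)
{
    return (uint64_t)time(NULL);
}

/* A logging module claimed to have no GPS dependence -- yet every record
 * carries a timestamp obtained, on the real system, from a GPS-disciplined
 * source.  Jam GPS and the timestamps (and anything ordered or correlated
 * by them) degrade, even though GPS appears nowhere in this module's
 * declared interface. */
typedef struct {
    uint64_t timestamp_s;
    int      event_code;
} log_record_t;

static log_record_t make_record(int event_code)
{
    log_record_t r = { platform_utc_seconds(), event_code };
    return r;
}

int main(void)
{
    log_record_t r = make_record(42);
    printf("event %d at %llu\n", r.event_code,
           (unsigned long long)r.timestamp_s);
    return 0;
}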
PBL
Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319 www.rvs-bi.de