[SystemSafety] The Intertwining of Safety and Security
Chris Johnson
christopher.johnson at glasgow.ac.uk
Mon Nov 7 16:54:37 CET 2016
For sure, but we are regularly seeing sophisticated, planned attacks on critical infrastructures, developed over months using human and digital intelligence sources - I don't think you would see anything approximating this behaviour from Murphy alone.
C.W. Johnson,
Professor and Head of Computing, University of Glasgow
> On 7 Nov 2016, at 15:44, Driscoll, Kevin R <kevin.driscoll at honeywell.com> wrote:
>
> > Murphy would have to be very persistent and energetic?
> Certainly can happen at 10^-7 or lower requirements. For example, I’ve seen 3-way Byzantine failures happening at several per minute. And, I don't have anywhere near 10^7 hours of hands-on experience.
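> To put rough numbers on that (a back-of-the-envelope sketch, assuming a constant per-hour event rate; the 10^4-hour exposure figure is purely illustrative):
>
>     rate_per_hour = 1e-7     # requirement-level event rate
>     exposure_hours = 1e4     # illustrative hands-on exposure, far short of 10^7 hours
>     expected_events = rate_per_hour * exposure_hours
>     print(expected_events)   # ~0.001: you would not expect to witness even one such event
>
> Yet such failures were being observed at several per minute.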
>
> From: Chris Johnson [mailto:christopher.johnson at glasgow.ac.uk]
> Sent: Monday, November 07, 2016 08:43
> To: Driscoll, Kevin R
> Cc: paul cleary; Peter Bernard Ladkin; The System Safety List
> Subject: Re: [SystemSafety] The Intertwining of Safety and Security
>
> Coordinated attacks on independent components are just one distinguishing case.
>
> Also repeated attacks using strategic planning to systematically identify weaknesses? Murphy would have to be very persistent and energetic?
>
> C.W. Johnson,
> Professor and Head of Computing, University of Glasgow
>
> On 7 Nov 2016, at 14:36, Driscoll, Kevin R <kevin.driscoll at honeywell.com> wrote:
>
> > software can be designed to never fail, yet without robust security, the software can easily be compromised
> Contradictory?? Otherwise, what does “never fail” mean?
> In comparing Murphy vs Satan (natural failures vs human threats, respectively) at 10^-7 or lower requirements, Murphy is indistinguishable from Satan, except for coordinated attacks against independent components. That is, the worst possible human adversary attack also could be produced by Murphy with help from Mother Nature. Thus, a system correctly designed for safety includes coverage for the safety-relevant security threats, with the exception of coordinated attacks against independent components. If such a system is vulnerable to safety-relevant security threats, its claims for safety are not valid, even in the absence of security threats.
>
> P.S.
> The restriction to “safety-relevant security threats” is to avoid the safety vs. security contradiction with respect to the Bell–LaPadula model, which is a whole other can of worms.
>
> From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of paul cleary
> Sent: Monday, November 07, 2016 05:31
> To: Peter Bernard Ladkin
> Cc: The System Safety List
> Subject: Re: [SystemSafety] The Intertwining of Safety and Security
>
> Really it's only just become a hot topic now!! Wow
>
> There can be no safety without security. I'm still amazed that discussions on this forum and others like it continue to deliberate on subjects such as (notional) software reliability, and on applying safety efforts to reduce the probability of failures occurring within software- and hardware-based systems, yet with no discussion of security. A system and its software can be designed to never fail, yet without robust security the software can easily be compromised and changed, rendering any notions of system safety completely irrelevant!!
>
> Paul Cleary BSc, MSc, CEng, EUR ING
> RailAssuranceConsulting
>
>
> On Nov 7, 2016, at 6:24 PM, Peter Bernard Ladkin <ladkin at causalis.com> wrote:
>
> A very hot topic nowadays. But I encounter a lot of people who think you can actually handle system
> safety and system security in IACS systems separately. I encounter others who think that ensuring
> safety means you need to make sure your safety functions are not compromised.
>
> Not so. Your safety functions may be perfect, remain uncompromised, and still be insufficient to
> inhibit an unacceptable risk due to intruder activity. The argument is straightforward.
>
> https://abnormaldistribution.org/index.php/2016/11/07/an-observation-on-the-intertwining-of-safety-and-security/
>
> PBL
>
> Prof. Peter Bernard Ladkin, Bielefeld, Germany
> MoreInCommon
> Je suis Charlie
> Tel+msg +49 (0)521 880 7319 www.rvs-bi.de
>
>
>
>
>
>
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE