[SystemSafety] Educating the Postmodern Systems Engineer
Les Chambers
les at chambers.com.au
Thu Aug 15 09:32:44 CEST 2024
Steve
It occurs to me that many of the knowledge areas addressed in the SWEBOK
belong in the Philosophy category. My suggestion is that there should be a
section entitled "Philosophy of Software Engineering"
My justification:
The scope of the SWEBOK should cover not only what an engineer should know but
also what kind of person he or she has become upon graduation. A university
should provide a transformational experience for the candidate's knowledge and
character. Philosophy deals with character transformation; a Jesuit novice's
experience is a good metaphor (a Jedi even better - we are trusting them with
"the force", and they need to wield it with compassion). The task is moulding
belief systems. There is a difference between knowing about a code of ethics
and having the courage to apply it at the risk of your career. Refer to the
"Creating" level of Bloom's Taxonomy.
The reason I labour this point is that the root cause of the safety incidents
with the largest body counts can often be traced to a lack of character in
senior decision-makers. Would we have had 346 deaths in the Boeing 737 MAX
crashes if Dennis Muilenburg had been educated at a Jesuit university?
I illustrate my point with the following courtesy of ChatGPT4:
Prompt:
I am an electrical engineer proposing to add a section entitled "Philosophy of
Software Engineering" to the IEEE's Software Engineering Body of Knowledge
(SWEBOK). Provide a table of contents for this section that addresses
elements of Philosophy relevant to software engineering.
ChatGPT4 Response:
Here's a proposed table of contents for the section "Philosophy of Software
Engineering" in the IEEE Software Engineering Body of Knowledge (SWEBOK). This
section aims to explore various philosophical elements that are relevant to
the practice and theory of software engineering.
### Table of Contents: Philosophy of Software Engineering
1. **Introduction to the Philosophy of Software Engineering**
1.1 Definition and Scope
1.2 Importance of Philosophy in Software Engineering
2. **Historical Context**
2.1 Evolution of Software Engineering as a Discipline
2.2 Key Philosophical Movements Influencing Software Engineering
3. **Epistemology in Software Engineering**
3.1 Knowledge and Understanding in Software Development
3.2 The Role of Experience and Intuition
3.3 Verification and Validation of Knowledge
4. **Metaphysics of Software Systems**
4.1 Nature of Software as a Construct
4.2 Ontological Questions in Software Engineering
4.3 The Reality of Software Artifacts
5. **Ethics in Software Engineering**
5.1 Professional Ethics and Responsibilities
5.2 Ethical Decision-Making in Software Development
5.3 Social Implications of Software Engineering
6. **Axiology: Values in Software Engineering**
6.1 Importance of Value-driven Design
6.2 Stakeholder Perspectives and Value Assessment
6.3 Balancing Competing Values in Design Decisions
7. **The Philosophy of Software Design**
7.1 Design Patterns and Philosophical Underpinnings
7.2 The Role of Abstraction and Simplification
7.3 Aesthetic and Pragmatic Considerations in Design
8. **The Scientific Method and Software Engineering**
8.1 Application of Scientific Principles in Software Development
8.2 Empirical Software Engineering Practices
8.3 The Role of Experimentation and Evidence-Based Engineering
9. **Philosophical Perspectives on Software Development Methodologies**
9.1 Agile vs. Plan-Driven Approaches
9.2 The Philosophical Foundations of Agile Practices
9.3 Critical Perspectives on Methodological Rigor
10. **Future Directions in the Philosophy of Software Engineering**
10.1 Emerging Trends and Technologies
10.2 Rethinking Software Engineering Paradigms
10.3 The Role of Artificial Intelligence in Software Development
11. **Conclusion**
11.1 Summary of Key Insights
11.2 The Ongoing Importance of Philosophy in Software Engineering
12. **References**
12.1 Academic Publications
12.2 Key Texts in Philosophy and Software Engineering
12.3 Online Resources and Further Reading
This table of contents aims to provide a comprehensive overview of the
philosophical dimensions that influence the field of software engineering,
fostering deeper understanding and encouraging critical thinking among
practitioners and scholars.
> Les,
> Good to hear from you, as always. I'm buried in some substantial customer
> work for the next couple of weeks, so not a lot of time for a detailed reply.
> However, here's the top-level summary of where I would want to go:
>
> First, I am pulling the "description" of Software Engineering from the
> soon-to-be-released version 4 of the "Guide to the Software Engineering Body
> of Knowledge" (aka "SWEBOK Guide" v4). The final publication version
> should be out in October or November (fingers crossed), but you can find the
> public review draft here:
>
> https://waseda.app.box.com/s/r1j1mavf3glhtf6qh5j0xxdeb4uaws5h
>
> Not a lot will change from this public review draft to the final publication
> version, mostly correcting some typographical errors and fixing some broken
> formatting. The content is substantially in place.
>
> Second, if you look inside SWEBOK Guide v4 you will see knowledge area
> Chapter 14, "Software Engineering Professional Practice". Specifically,
> section 1.2, starting on page 14-3, calls out "Codes of Ethics and
> Professional Conduct". My point here is that while SWEBOK Guide v4 does not
> go anywhere near the level of detail that you go into below, it at least
> makes explicit that a true Software Engineer must behave in an ethical
> manner. Quoting page 14-4:
>
> "Following their commitment to the health, safety, and welfare of the
> public, software engineers shall adhere to the ten principles according to
> the IEEE Code of Ethics adopted by the IEEE Board of Directors, June 2020."
>
> Finally, on a slightly different but still related topic, I got my
> Master's of Software Engineering from Seattle University. Seattle University
> is a Jesuit school. Every Jesuit school has a curriculum requirement that
> every degree program include an element of ethics. Not as an elective; as a
> core curriculum requirement.
>
> I was in Palo Alto, CA several years ago listening to a radio talk show
> while commuting. One comment that stuck out was that if you look at corporate
> CEOs and executive managers, those who graduated from Jesuit universities are
> significantly less likely to get involved in corporate malfeasance such as
> fraud, embezzlement, extortion, insider trading, etc. If I recall correctly,
> the interviewee declared that no corporate executive ever convicted of
> corporate malfeasance in the USA in, say, the last 50-100 years was a
> graduate of a Jesuit university. So maybe explicit education in ethics does
> make a difference?
>
> All the best,
>
> — steve
>
> On Aug 13, 2024, at 5:02 PM, Les Chambers <les at chambers.com.au> wrote:
>
> Steve
> It's wonderful to see this list light up so passionately. Rumours of the
> death of safety-critical systems debate were exaggerated.
> Your detailed breakdown of what a software and systems engineer should know
> resonated with me both for what it includes and excludes. It's correct but
> incomplete. Looking back on 50 years in systems engineering, I've concluded
> that something was missing from my education. You know something is missing
> when you're flummoxed - faced with a situation where you have no analytical
> skills to guide your next move. For me, it was ethical dilemmas and people
> problems. But this is a reductionist view; the bigger picture is: I was given
> no grounding in philosophy. No one taught me how to live as an engineer.
> This ignorance was annoying then but is deadly now, as engineers face
> mind-bending judgement calls such as, "Is this AI sentient ... ergo, is it
> in a safe state?"
>
> I refuse to retire and go quietly, so I've been reading philosophy and
> trying to stay on top of developments in artificial intelligence (an
> impossible task), specifically its application to safety-critical systems -
> Tesla and the like. It's an exciting but horrifying experience. Exciting due
> to the endless utility of the six artificially intelligent agents I converse
> with every day, and horrifying when I witness a neural network deployed to
> drive an automobile. It's doing my head in that a giant blob of parameters
> (a neural network) has replaced a highly deterministic multilayer control
> systems architecture of the kind we have been refining for the past 50
> years. This blob is not assembled subject to an unambiguous, complete and
> correct requirements specification and can therefore not be validated with a
> suite of human- or machine-executable module, unit, integration and systems
> tests. All we have is actors such as Mira Murati (Chief Technology Officer
> of OpenAI) shrugging, "Aw shucks, we don't understand why it works so well."
> Or gems from the likes of tech dudes such as Elon Musk: "Full Self-Driving
> V12.5 is good to go; it no longer spills my coffee."
>
> Clearly, we are witnessing the death of determinism in control systems. It
> is our sacred duty as professional engineers to turn this trend around "for
> the good of mankind". This is not a fad. Neural nets are a postmodern fact
> of life. You can take a robotaxi in San Francisco, Los Angeles, Las Vegas,
> Austin and Phoenix.
>
> Where do we start? Just as neural nets deployed in safety-critical
> applications need to be wrapped in a postmodern variant of systems
> engineering discipline, so do engineers need an upgrade to their educational
> wrapper - a solid grounding in philosophy at the core of all undergraduate
> engineering courses. And it must be COMPULSORY, not an elective.
>
> Over the years, in an unexamined, creeping manner, engineers have
> accumulated massive power. We have designed nuclear bombs, placing the world
> on a hair-trigger for the insanity of unwinnable nuclear war. Military
> technologists freely admit that the Ohio-class submarine is a greater threat
> to the planet than a meteor. We build deepfake tools that can mess with the
> minds of 3 billion people overnight. Artificial general intelligence will
> put this power on steroids; we therefore need to teach engineers how to
> wield it with the wisdom wrapped in philosophy. My thesis is that wisdom can
> and must be taught, not left in the nebulous netherworld we call experience,
> where the test comes first and the lesson later. Some decisions must be
> right the first time, there being no opportunity for continuous improvement
> on a dead, radioactive planet populated by 6 billion corpses.
>
> Graduate engineers need to be celebrated not only for the equations they
> can solve but also for who they are: educated individuals possessed of open,
> calm, self-controlled, stoic, stable, moral, measured, rational and logical
> minds, with an unshakeable commitment to the profession; virtuous, incapable
> of committing an immoral act; not culturally illiterate, laissez-faire
> technical automatons (a guaranteed outcome of your current curriculum,
> Steve). All of the above are learned behaviours that are reinforced by
> experience but will struggle for an engineer's attention unless the
> frameworks and principles are taught and valued at the engineering origin -
> the university.
>
> Your summary curriculum is engineering hygiene: necessary but not
> sufficient. We need to move our undergrads beyond algorithms to engage with
> the elements of our humanity that non-engineers are merrily moving into
> silicon - the squishy, nondeterministic stuff that goes on in the brain.
> An engineering education must instil a state of mind - a certainty in "who
> am I?". Over the years we have evolved from spot problem solver, to systems
> designer, to where we find ourselves today: Cognitive Systems Designer, a
> professional specializing in the creation of intelligent systems that
> replicate or enhance cognitive functions, such as perception, learning and
> problem-solving, by utilizing insights from philosophy, psychology,
> neuroscience and artificial intelligence.
>
> The problem is that much of the customer-facing AI product development work
> is currently being led by non-engineers. Look up Demis Hassabis, Sam Altman,
> Jack Clark et al.: they are neuroscientists, computer scientists and
> philosophers. Brilliant people, but lacking an engineering mindset, as
> evidenced by safety as an afterthought in chatbot development. I would be
> surprised if any of them has ever heard of the concept of Functional Safety.
> My point is that the brain that works at the AI system product coalface must
> have an engineering mindset with philosophical reasoning embedded. Designers
> make decisions that can't or won't be reversed, especially if they increase
> shareholder value. Frances Haugen testified that Facebook knowingly left
> algorithms injurious to user mental health in the mix to preserve cash flow.
> A thought experiment: what if the designers just said, "No!"? Where would
> you find the courage to do such a thing?
>
> And so we have come full circle, to a place where ancient philosophy offers
> a wealth of insights that can be practically useful in this cognitive state.
> My humble suggestions are:
>
> 1. Socratic Method: This technique emphasizes dialogue and questioning to
> stimulate critical thinking and illuminate ideas. Cognitive Systems
> Designers can use this method to refine their ideas, evaluate assumptions,
> and enhance problem-solving through collaborative discussions. Postmodern
> Systems Engineering is a team sport that will welcome synthetic agents and
> assistants.
>
> 2. Aristotelian Virtue Ethics: Aristotle's focus on virtue ethics
> encourages designers to consider the moral implications of their work. This
> perspective fosters a commitment to creating systems that prioritize human
> well-being, flourishing, and responsible use of technology. "For the benefit
> of mankind" means that, when interests conflict, we report to mankind, not
> our political or commercial masters. The courage to hold this line is a
> function of a clear view of who you are.
>
> 3. Realism vs. Idealism: Philosophical debates on realism and idealism can
> inform Cognitive Systems Designers about the nature of perception and
> reality. Understanding how different perspectives affect cognition can help
> in designing more intuitive systems. The naive engineer will be shocked to
> discover that reality is largely a function of personal perception, with the
> possible exception of gravity.
>
> 4. Teleology: The concept that everything has a purpose can guide designers
> in understanding the intended functions of cognitive systems. By clarifying
> the goals of their creations, designers can align technology with human
> cognitive processes and needs.
>
> 5. Mind-Body Dualism (Descartes): Exploring the relationship between the
> mind and body can inform the development of interfaces in cognitive systems
> that account for the physical and psychological experiences of users,
> enhancing user engagement and interaction. Research indicates that a
> component of our wisdom resides in the body, independent of the mind. For
> example, the body does not want to lie - witness the success of polygraph
> technology.
>
> 6. Stoicism: This philosophy emphasizes resilience and rationality in the
> face of challenges. For designers, it can inspire a focus on creating robust
> systems that empower users to cope with uncertainties and enhance
> decision-making. In bad situations, engineers need to stay calm and rational
> when surrounded by panic. Stoicism provides simple tools to deal with
> high-stress events. Marcus Aurelius instructs us: "It's not the event that
> sparks panic; it's how you react to it, and this is under your control."
>
> 7. Phenomenology: The study of experience and consciousness can provide
> valuable insights into user interaction with cognitive systems.
> Understanding how people perceive and interpret their experiences can lead
> to more empathetic and effective designs.
>
> 8. Pragmatism (William James): The pragmatic approach encourages designers
> to consider the practical outcomes of their technologies. Focusing on
> real-world applications and user impact can help create cognitive systems
> that are both functional and meaningful. What was the designer of the B61
> nuclear bomb thinking when he provided for configurable yield (0.3 to 340
> kilotons)? "Aw shucks, if we're mildly upset we'll kill a few thousand
> enemy, but if we're really angry we can smoke 2 million."
>
> And so I beat on (with apologies to F. Scott Fitzgerald), a boat against
> the current, drawing the profession back into the past to assure its future.
> It could be an act of stupid courage, attempting to nudge a local university
> in this direction through my association with a colleague who lectures in
> their postgraduate engineering stream.
>
> Here's hoping that all of the above will attract some comment from this
> list. In particular from the university educators amongst us. I read that
> many American universities are aggressively pursuing AI in education. Is
> anyone pursuing philosophy for engineers?
>
> And to the freshman engineer, a warning: suit up and front up to Marcus
> Aurelius, Epictetus, Seneca, Aristotle, Socrates et al. Listen to what they
> have to say; you don't want to find yourself in this situation:
>
> A broken wing rocks on the sand
> Beside a far-off sea
> In pitch black faith was placed in men
> Christ
> One of them was me!
>
> Cheers
> Les
>
> PS: A practical suggestion. Read Ryan Holiday, The Daily Stoic. It's not
> rocket science, five minutes a day could transform you into a Philosopher
> Engineer.
> --
> Les Chambers
> les at chambers.com.au
>
> https://www.chambers.com.au
> https://www.systemsengineeringblog.com
>
> +61 (0)412 648 992
--
Les Chambers
les at chambers.com.au
https://www.chambers.com.au
https://www.systemsengineeringblog.com
+61 (0)412 648 992