[SystemSafety] Another unbelievable failure (file system overflow)
Smith, Brian E. (ARC-TH)
brian.e.smith at nasa.gov
Wed Jun 10 17:38:43 CEST 2015
Gentlemen and Ladies,
Brian Smith here. I’m a new contributor, but have been following the various discussions almost daily. Here are a few thoughts…
On the ethics of technologies…
At http://plato.stanford.edu/entries/computing-responsibility/ you will find an interesting piece, "Computing and Moral Responsibility" (first published Wed Jul 18, 2012), that refutes the misconception that computationally intensive technology R&D, such as fully automated aircraft or cars (without pilots or drivers) or single-pilot concepts of operations for passenger planes, is an ethically neutral practice. To quote:
The design and use of technological artifacts is a moral activity and the choice for one particular design solution over another has real and material consequences.
Accountability … is different from liability. Liability is about looking for a person to blame and to compensate for damages suffered after the event. Once that person has been found, others can be let ‘off the hook’, which may encourage people to look for excuses, such as blaming the computer. Accountability, however, applies to all those involved. It requires a particular kind of organizational context, one in which answerability works to entice people to pay greater attention to system safety, reliability and sound design [emphasis mine], in order to establish a culture of accountability. An organization that places less value on accountability and that has little regards for responsibilities in organizing their production [or research] processes is more likely to allow their technological products to become incomprehensible.
Nissenbaum identifies four barriers to accountability in today's society: 1) the problem of many hands, 2) the acceptance of computer bugs as an inherent element of large software systems, 3) using the computer as scapegoat and 4) ownership without liability. According to Nissenbaum people have a tendency to shirk responsibility and to shift the blame to others when accidents occur. The problem of many hands and the idea that software bugs are an inevitable by-product of complex computer systems are too easily accepted as excuses for not answering for harmful outcomes. People are also inclined to point the finger at the complexity of the computer and argue that “it was the computer's fault” when things go wrong. Finally, she perceives a tendency of companies to claim ownership of the software they develop, but to dismiss the responsibilities that come with ownership.
FYI… I’m a member of the NASA Integrated Product Team that is evaluating the safety of the driverless car experiments being conducted on the Ames campus here in Mountain View (the headquarters of Google). I also live in the city and see Google AVs driving by my home almost daily. Without specific “metrics” to evaluate how these vehicles behave, from my subjective knothole, they seem to respond to various traffic situations just like cars with drivers. Nissan is also about to begin AV experiments here in our area.
Because of my busy schedule, I may not be able to respond in as rapid-fire a way as others on this list.
Brian E. Smith
Special Assistant for Aeronautics
Human Systems Integration Division
Bldg N262, Room 120; Mail Stop 262-11
NASA Ames Research Center
P.O. Box 1000
Moffett Field, CA 94035
(v) 650.604.6669, (c) 650.279-1068, (f) 650.604.3323
Never let an airplane or a motorcycle take you somewhere your brain didn't go five seconds earlier.
From: Les Chambers <les at chambers.com.au>
Date: Tuesday, June 9, 2015 at 5:57 PM
To: "safetyyork at phaedsys.com" <safetyyork at phaedsys.com>, "systemsafety at lists.techfak.uni-bielefeld.de" <systemsafety at lists.techfak.uni-bielefeld.de>
Subject: Re: [SystemSafety] Another unbelievable failure (file system overflow)
Chris
Thanks for responding. It's great that we've had some engagement on this issue. Even if we disagree, having the conversation is beneficial. Aristotle observed that, "It is the Mark of an educated mind to be able to entertain a thought without accepting it."
So, as an engineer with a self-confessed limited (but active) philosophical education, may I respond to your points?
On:
"... some of your examples are social rules not morality or ethics."
I don't think you can separate the two; they inform each other. A whip around the Internet revealed the following definition of ethics:
"... that branch of philosophy dealing with values relating to human conduct, with respect to the rightness and wrongness of certain actions and to the goodness and badness of the motives and ends of such actions."
The essence of my argument is that ethics evolved by exception and from observation of failed social behaviours. And the failures were self-evident in that the individuals "behaving badly" died and, ergo, did not reproduce (thus eliminating bad behaviour). So we've come to the concept of right behaviour by a process of elimination (literally). Churchill referred to this in a somewhat harsh comment on his ally across the pond: "Americans can be counted on to do the right thing ... after they've tried everything else."
This is why I am confident that you will not walk out of your front door this morning and stand in front of a moving train. The thought will not occur to you - it's only attractive to me in the stupefied mental state one enters après reviewing a PBL mathematical proof - because it's axiomatic that if one stands in front of a moving train, one will die. Those that stood in front of moving trains are all dead, did not reproduce, and the idea died with them and their unborn children.
Further, the IEEE doesn't discriminate, including invasion of privacy as an ethical issue.
On:
"Morals and ethics depend on where you stand. What your view of the situation is and the legitimate objectives 'your tribe' as you put it ..."
As I mentioned, my values are with Gary Snyder, back in the upper Palaeolithic. But we don't have to go back that far to find a good example of culture- and time-independent values/morals/ethics/concepts-of-right-behaviour. We are about to celebrate the 800th anniversary of the Magna Carta. It turns out that only three clauses of that document still remain in British law. They relate to the freedom of the church... But the essence of the document was: "... it is not fair to tax people without their consent, without giving them the opportunity to present their grievances. So, those that would levy taxes should first form an assembly and debate the matter." King John learned the wisdom of this the hard way. Weeks after the document was signed he got the Pope to annul it, alleging that he had signed it under duress. The Pope agreed that it was outrageous and totally unfair. Then the barons rose up and put King John out of business.
Moving forward a few hundred years, the United States of America exists today over righteous indignation about taxation without representation. Though the Magna Carta's words on the page did not survive in law, the fundamental principle did - it is argued that it was the idea behind Parliamentary democracy - because it plays to the self-interest and survival of the tribe (the human race in general). This is what I meant by ethics evolving through time. The fundamental idea is refined and projected on various situations in different ways, sometimes, as you mention, being used to justify bad behaviour. But the fundamental principle doesn't change.
On:
"Morals and ethics are not like engineering where 1+1=2"
I was attracted to engineering by mathematics, and later in life someone explained why this was so. Because mathematics gives us access to truth beyond intuitive reasoning (who knows, PBL may be almost there). But even later I have come to believe that our ethics hold the essence of truth. We demonstrate this in our everyday actions, some of us might die today because of them. They give some certainty to life. For example, I have never met you but I am 100 percent sure that this very morning, out your front door and down the street, short of a bus fare, you will not murder a passerby for their wallet. Civilisation would not stay glued together without ethics. And the fact that, in one form or another civilisation has stayed glued for millennia, must indicate that some aspects of ethical behaviour ARE eternal.
Returning to applications in safety engineering: if you look at systems engineers as a tribe, we need to address the ethical question of enforcing our own code of ethics. This amounts to flat refusal to agree to courses of action or work that produces machines whose actions are unethical by our standards, regardless of the pressure that may be applied. It also means pushing back on the common management assertion: "... this decision is above your pay grade, Pilgrim."
There is an excellent example of one of these situations occurring in Brisbane today. Refer: http://www.abc.net.au/news/2015-06-09/legacy-way-tunnel-builders-could-be-fined-250k-a-day/6531992
The skinny is this:
A 4.6-kilometre road tunnel is scheduled to open tomorrow, 11 June 2015.
From tomorrow the builder faces fines of up to $250,000 for every day the tunnel remains closed.
No surprises: the tunnel will not open tomorrow "... because it is undergoing final safety checks ... About 88,000 individual switches need to be tested as part of the safety checks."
I am not involved with this project and know nothing of the cast of characters involved BUT, having been in this situation myself, I can just imagine the pressure being applied to the systems test engineers, their managers and their managers' managers. "Hey, it's good enough, isn't it? Those defects. They're not serious, are they? Sign here, open the tunnel, or ... you will never work in Queensland again."
I sincerely hope that no one in the chain cracks under political and financial pressure (given the risk-raising $250,000-a-day penalty, my personal risk management strategy will be to avoid that tunnel for at least six months).
And if they don't crack the reason will be because they have ethics, the same ethics the Roman architects had when they built bridges 2000 years ago (and then stood under them while the first chariots rolled over).
Chris, I encourage you: return to your theological retreat, repair to the upper Palaeolithic, think harder, reflect deeper, channel "the power-vision of solitude". Except that your thinking on this subject may be, as yet, unevolved.
Cheers
Les
From: Chris Hills [mailto:safetyyork at phaedsys.com]
Sent: Saturday, June 6, 2015 9:08 PM
To: 'Les Chambers'; systemsafety at lists.techfak.uni-bielefeld.de
Subject: RE: [SystemSafety] Another unbelievable failure (file system overflow)
Hi Les
I stand by what I said. There are no moral or ethical absolutes.
You yourself said in closing: "Morals don't change over time. They evolve through time."
The same is true of ethics.
BTW some of your examples are social rules not morality or ethics. The social rules including privacy are even more prone to change.
Morals and ethics depend on where you stand. What your view of the situation is and the legitimate objectives of “your tribe” as you put it…. though should we be looking at the survival of the tribe, the race, the species or the planet? The correct moral/ethical thing to do will change depending on where you stand.
BTW I did not have an ”incomplete[ness of the] modern technical education” however my background also includes dead people and situations where I personally might have to kill people. So my comments are not theoretical. Other than my technical education I spent some time in a theological retreat discussing the very topic of morals and ethics. Those who had “absolute” moral and ethical values based them on a foundation that was not solid in the first place. Which rather undermined their arguments.
What is “appalling and dangerous” are some things that are done in the name of morality and ethics “for the good of the tribe” usually against another tribe. There have been things done in the last two decades that depending where you stand can be argued to be both moral, ethical and legal or immoral, unethical and illegal.
Morals and ethics are not like engineering where 1+1=2 .
So I repeat my question: Can someone give me one moral or ethical absolute?
Regards
Chris
Eur Ing Chris Hills BSc CEng MIET MBCS FRGS FRSA
Phaedrus Systems Ltd Tel: FREEphone 0808 1800 358
96 Brambling B77 5PG Vat GB860621831 Co Reg #04120771
http://www.phaedsys.com chills at phaedsys.com
From: Les Chambers [mailto:les at chambers.com.au]
Sent: 06 June 2015 03:29
To: safetyyork at phaedsys.com; 'Andy Ashworth'; 'Steve Tockey'
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: RE: [SystemSafety] Another unbelievable failure (file system overflow)
Chris
I find the statement, "There are no absolute morals nor ethics," appalling and dangerous for the following reasons:
Even though we are surrounded by nature, many of us find a way to believe that we are not part of it, that we are somehow exceptional, that our bodies and our minds and the way our society works have somehow not evolved alongside the creatures and the natural world that surrounds us.
I believe that the thing we call "morality" is an abstraction from human behaviours that have been successful over time. All enduring moral values can be traced back to the need for survival and can be attributed to the tribes that have survived. Us. For example, tribal groups that co-operate and work well together prosper. Lone wolves behaving "badly" don't reproduce and die out. Murder, in excess, has been proven to be ineffective for continued survival. If we take one clause from the IEEE standard for ethics, privacy, you could argue it came from the act of invading someone's personal space. Step inside someone's personal space and it's viewed as a threat, often leading to violence, which is well known to be mostly counterproductive. So we teach our children not to do it.
Many of us objected to the NSA and GCHQ stepping into our personal space and spying on our emails. Hence the US Congress has wound back the Patriot Act. Just about every well-accepted moral/ethical principle can probably be traced back to behaviours that are perceived to secure the survival of the tribe.
I accept that, from time to time, humanity takes detours, often fuelled by the hubris of exceptionalism. In the 1930s Adolf Hitler promulgated the idea that the German race was exceptional, which justified the destruction of less exceptional races for the good of all. I recently spent eleven days in Berlin and received an intensive tutorial from the locals on what a bad idea that was. I came away with the impression that the German nation is well and truly back on track with more effective rules for securing the longevity of their tribe.
Closer to home we have a sub culture that believes that detailed system specifications don't have to be written. "We are exceptional, we therefore don't need to think deeply before we act. We can be agile. We can put it together on-the-fly." I look forward to the day when this brand of exceptionalism will be viewed as yet another unfortunate detour from "right" behaviour.
In conclusion: why am I appalled? Simply because your statement that morals change over time IS appalling. It's been used to justify human behaviours that, quite apart from being disgusting, were grossly ineffective in assuring the survival of the tribe. Go to Berlin, do a 360-degree scan of the skyline, and you'll see a mountain with a decommissioned American spy station on top. It's not a mountain. It's the rubble pile from the bombing of Berlin.
And furthermore: the utterance of a sentence like this, on a list like this, is yet another indication of the incompleteness of the modern technical education. The day the technologists parted company with the wisdom of the poets was a sad one. I try to stay in touch. It calms me down. For example, this quote from Gary Snyder:
"As a poet I hold the most archaic values on earth. They go back to the upper Paleolithic: the fertility of the soil, the magic of animals, the power-vision of solitude, the terrifying initiation and rebirth, the love and ecstasy of the dance, the common work of the tribe."
Our society is more just than it has ever been but our survival is not a given. Nor are we exceptional. We are governed by the same principles that were true in the Paleolithic.
Morals don't change over time. They evolve through time. They indicate behaviours that assure survival and safeguarding them is "the common work of the tribe."
Cheers
Les
From: Chris Hills [mailto:safetyyork at phaedsys.com]
Sent: Friday, June 5, 2015 4:57 PM
To: 'Les Chambers'; 'Andy Ashworth'; 'Steve Tockey'
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: RE: [SystemSafety] Another unbelievable failure (file system overflow)
Well, give me one ethic or moral that is fixed and applies universally and constantly.
From: Les Chambers [mailto:les at chambers.com.au]
Sent: 05 June 2015 00:47
To: safetyyork at phaedsys.com; 'Andy Ashworth'; 'Steve Tockey'
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: RE: [SystemSafety] Another unbelievable failure (file system overflow)
For instance???
From: systemsafety-bounces at lists.techfak.uni-bielefeld.de [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Chris Hills
Sent: Friday, June 5, 2015 6:21 AM
To: 'Andy Ashworth'; 'Steve Tockey'
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Another unbelievable failure (file system overflow)
The problem is that ethics, like morality, are not fixed and change over time and culture. There are no absolute morals nor ethics.
Regards
Chris
Eur Ing Chris Hills BSc CEng MIET MBCS FRGS FRSA
Phaedrus Systems Ltd Tel: FREEphone 0808 1800 358
96 Brambling B77 5PG Vat GB860621831 Co Reg #04120771
http://www.phaedsys.com chills at phaedsys.com
From: systemsafety-bounces at lists.techfak.uni-bielefeld.de [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Andy Ashworth
Sent: 02 June 2015 19:11
To: Steve Tockey
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Another unbelievable failure (file system overflow)
Here in Canada, in order to be registered as an engineer you must have a recognized engineering degree and appropriate experience, and you must also pass a Professional Practice Exam consisting of two papers: law and ethics.
Andy
Sent from Andy's iPad
On Jun 2, 2015, at 14:05, Steve Tockey <Steve.Tockey at construx.com> wrote:
Les,
For what it's worth, Seattle University (a private, Jesuit university) requires discussions of ethics in all university degree programs. I got my masters degree there (Software Engineering) and I taught later as an adjunct professor. While we didn't have ethics in every course, the topic was highly encouraged by the university and came up fairly frequently.
-- steve
From: Les Chambers <les at chambers.com.au>
Date: Saturday, May 30, 2015 9:14 PM
To: Steve Tockey <Steve.Tockey at construx.com>, 'Robert Schaefer at 300' <schaefer_robert at dwc.edu>
Cc: "systemsafety at lists.techfak.uni-bielefeld.de" <systemsafety at lists.techfak.uni-bielefeld.de>
Subject: RE: [SystemSafety] Another unbelievable failure (file system overflow)
Steve
Thanks for referencing the code of ethics. It should be brought up more often. Unfortunately, for me, it makes depressing reading. Especially when you come upon paragraphs such as:
3.12. Work to develop software and related documents that respect the privacy of those who will be affected by that software.
Although he has probably never read it, there is a man who will probably never see his homeland again because he took these sentiments to heart and attempted his own corrective action. And what of the thousands of scientists, engineers and technologists who contributed to the construction of the software whose existence he exposed to the world?
My point is that non-compliance with this code of ethics is massive and almost universal. In fact, any engineer maintaining strict compliance with every paragraph of this code would be unemployable in our modern world.
Reading these paragraphs through the lens of experience, I am blown away by their flippancy; that would be lost on a graduate. From personal experience I can tell you that screwing up the courage to implement even one of these items can be a massive, life-changing event. They're all perfectly reasonable statements of how one should behave, much like "Thou shalt not kill, thou shalt not commit adultery ...". The issue lies in the moral courage to implement.
There is no quick fix to this problem as we are a decentralised, unorganised and generally fragmented lot. We don't have the luxury of the medical profession, which deals with a single organism. We can't simply state and righteously comply with the notion of "Do no harm." In fact, for us, the opposite is true: many of us work in industries where the primary purpose is to kill other human beings, and with high efficiency (fewer soldiers kill more enemy).
One thing we can do is deal with the problem at its root:
We are graduating incomplete human beings from science and engineering courses. There is insufficient focus on the moral issues surrounding the impact of our machines on humanity. For example, a study of applied philosophy, including ethics, should be a non-negotiable component of all engineering courses. Not just a final-year subject, but a subject for every year, with a weekly reflection on the content. Much like the weekly safety meetings I was forced to attend in the chemical processing industry.
I'm sure there will be howls of laughter at this, but, let me tell you it's the only thing that caused me to back a senior manager about five levels above my pay grade into a corner - he could physically not escape me short of punching me out and stepping over my body - and berate him until he promised to properly train his operators in the emergency procedures for a safety critical system.
Popping a few paragraphs up on the web would never have done the trick.
That experience was trivia compared to where we are headed. The massive computing power now available means that our software is beginning to take higher level decisions away from human beings. Some of these decisions are moral ones (refer my previous post on lethal autonomous weapons systems). "Shall I kill all humans associated with this structure, or no?"
At a recent engineering alumni meeting I asked the head of my old engineering Department how much philosophy is taught to undergraduate engineers. He chuckled. "It is available as an elective but less than one percent participate," he said.
I plan to speak to him again soon.
Cheers
Les
From: Steve Tockey [mailto:Steve.Tockey at construx.com]
Sent: Sunday, May 31, 2015 5:43 AM
To: Les Chambers; 'Robert Schaefer at 300'
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Another unbelievable failure (file system overflow)
Les (and all),
I think that one thing we all need to be really careful of is that it's way too easy for us technical people to push all of the blame onto the managers:
"It's the fault of those idiotic Pointy Haired Bosses because they gave us such a ridiculously short schedule, …"
And it's true to an extent that managers are guilty. But it's not 100% their fault. We technical people need to grow up and be more professional in our own behavior. Here's a great quote from Watts Humphrey:
"When people are directed by top management to run a mile in two minutes, what should they do? Experienced hardware engineers have learned to first test the directive. If it is truly firm, the best ones develop a comprehensive comparison of this job with their prior experience to show management why this schedule is unrealistic. They also dig in their heels and insist that management either change the schedule to one that makes sense or relieve them of responsibility for meeting the dates. When better engineers do this, managers have little choice, unless they want to do the design themselves. Unfortunately, all too many programmers start looking for their running shoes."
The IEEE Computer Society and the Association of Computing Machinery (ACM) got together several years ago and agreed on a "Software Engineering Code of Ethics and Professional Practice" (see https://www.acm.org/about/se-code). Of particular relevance to this discussion are items: 1.03, 3.01, 3.02, 3.09 (a personal favorite of mine), 5.11, and 6.11. It's a no-brainer that as long as the technical community doesn't push back then managers will just continue to push ever harder.
I often use the term "highly paid amateur programmer". I firmly believe that this is an appropriate label for the vast majority of software practitioners worldwide. Their behavior is wholly unprofessional and, in fact, entirely unethical.
I could (and often do--smile) go off on a rant about how auto mechanics behave far, far more professionally than typical programmers.
Cheers,
-- steve
From: Les Chambers <les at chambers.com.au>
Date: Friday, May 29, 2015 8:23 PM
To: Steve Tockey <Steve.Tockey at construx.com>, 'Robert Schaefer at 300' <schaefer_robert at dwc.edu>
Cc: "systemsafety at lists.techfak.uni-bielefeld.de" <systemsafety at lists.techfak.uni-bielefeld.de>
Subject: RE: [SystemSafety] Another unbelievable failure (file system overflow)
Steve
Clap, clap, clap, clap. At last, a serious metric, guaranteed to make a difference because it uses story patterns, the only facility guaranteed to change attitudes. George should go underground and embrace the onion router. He is clearly a dangerous radical.
However, Dilbert aside, it behoves us to dig deeper and look at causal factors. Somewhere further back in this stream the point was made that the good-programmer/bad-manager metaphor gets trotted out too often. This is very true; I've been guilty of it myself, having socialist leanings and having been in the presence of far too many disgustingly poor management decisions in my 40-year career. But we should ask, "How does a programmer or a manager become BAD?" I put it to the list that this is the exact same question as, "How does a person become a criminal?"
Most serial killers are the product of child abuse. Indeed, most criminals have had damaged childhoods: incompetent child rearing or no child rearing, not brought up, just kicked and told to get up; no role models or the wrong role models: street gangs, drug dealers, thieves and murderers. Bill Clinton addressed this once:
"People who grew up in difficult circumstances and yet are successful have one thing in common; at a crucial juncture in their adolescence, they had a positive relationship with a caring adult." (More at: http://www.chambers.com.au/public_resources/mentoring_wp.pdf)
The FBI specialists who hunt down serial killers have a saying, "The best indicator of future behaviour is past behaviour."
So, any way you want to look at this problem, the only way to break the endless cycle of "glitches" is: better child rearing. Anyone responsible for the rearing of a software developer or his or her manager should reflect on this.
Cheers
Les
PS: This "... has become clear" (at least to me), "later on."
From: systemsafety-bounces at lists.techfak.uni-bielefeld.de [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Steve Tockey
Sent: Saturday, May 30, 2015 4:34 AM
To: Robert Schaefer at 300
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Another unbelievable failure (file system overflow)
From what I remember about Scott Adams, at least in the early days he used a "three company rule". The majority of his comics come from ideas submitted by readers. His rule was that he had to see the same basic idea come from at least three different companies before he had confidence the problem was widespread enough to be understood/funny for a majority of readers. I don't know if he follows the same rule now, but it would make sense.
I agree, a socio-economic study of insights of Dilbert would be fascinating.
And, by the way, if anyone remembers the Software Engineering institute's "Capability Maturity Model" (CMM), here's a proposed update:
----- cut here -----
API Austin - First there were software metrics. With these, software developers and their management could finally measure something for the output of the software creation process. In the 80's these techniques flourished. Funny names for these measurements emerged, like "McCabe complexity" and "software volume".
Soon it was realized that there needed to be a way not only to measure the quality of the software output, but also to measure the quality of the engineering organization itself. The Capability Maturity Model, CMM, was developed in the early 90's. Organizations are audited by professionals and rated on a scale of 1 to 5. A low score means the software production process is chaotic, while 5 means that all aspects of software development are fully understood and carefully applied. Most organizations today weigh in at a meager 1, and there's a surprising number of 0's out there.
Now, a revolutionary new measurement technique has been developed by a small startup consulting firm in Austin, Texas. The new system is simply known as DCF. The simplicity and elegance of the new measuring system belies its power in accurately judging the soundness of a software organization.
The inventor of DCF and founder of the DiCoFact Foundation, George Kritsonis, says the new measurement system is "simple and fool-proof, but modifications are being made to make it management-proof as well".
One Sunday morning George was performing his normal ritual of reading the most important parts of the newspaper first, when he came across his favorite comic strip, "Dilbert" by Scott Adams. George and his work colleagues loved this comic strip and were amazed by how many of the silly storylines reminded them of actual incidents at their company.
They even suspected that Scott Adams was working there in disguise, or at least that there was a spy in the company feeding Scott daily material. Then George hit upon an idea that promised to make him millions: The Dilbert Correlation Factor (DCF).
George's idea was simple: "Take 100 random Dilbert comic strips and present them in a survey to all your engineering personnel. Include both engineers and management. Each person reads the strips and puts a check mark on each strip that reminds him of how his company operates. Collect all surveys and count the check marks. This gives you your Dilbert Correlation Factor, which can range, of course, from 0% to 100%. Average out the engineers' scores. Throw out the managers' surveys; we just have them do the survey to make them feel important. However, if many of them scowl during the survey, add up to 5 points to the DCF (in technical terms, this is your Management Dissing Fudge Factor, MDFF). Make sure to also throw out the surveys of engineers who laugh uncontrollably during the whole survey (remember their names for subsequent counseling). And that's all there is to it! Oh yeah, then walk around the building and count Dilbert cartoons on the walls. Don't forget coffee bars, bulletin boards, office doors and, of course, bathrooms." Add up to 10 points for this Dilbert Density Coefficient Adjustment (DDCA).
Interpreting the results is simple. Let's look at some ranges:
0% - 25%: You probably have a quality software organization. However, you guys need to lighten up! Maybe a few surprise random layoffs, or perhaps initiating a Quality Improvement Program, will do the trick to boost your company's DCF to a healthier level.
26% - 50%: This is also a sign of a good software organization, and is nearly ideal. You still manage to get a quality product out, and yet you still have some of the fun that only Dilbert lovers can identify with... Mandatory membership in social committees, endless e-mail debates about the right acronyms to use for the company products, and of course detailed weekly status reports where everyone lists "did status report" under accomplishments.
51% - 75%: This is the most typical DCF level for software houses today. Your software products are often in jeopardy due to the Dilbert-like environment they are produced in. You have a nice healthy dose of routine mismanagement, senseless endless meetings with no conclusions, miscommunications at all levels of the organization, and arbitrary commitments made to customers which send engineers into cataplexy.
76% - 100%: The best advice for this organization is this: Get the hell out of the software business. Hire the best cartoonist you can afford, have him join your project teams and document what he sees in comic strips... get 'em syndicated and you'll make a fortune!
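Tongue firmly in cheek, George's arithmetic is concrete enough to sketch in a few lines of Python. The function names, band summaries, and sample figures below are illustrative inventions, not part of the patented DCF system:

```python
def dilbert_correlation_factor(engineer_checks, num_strips=100,
                               scowling_managers=0, wall_cartoons=0):
    """Compute the DCF from survey check-mark counts.

    engineer_checks: check marks per engineer survey (managers' surveys
    and uncontrollable laughers already thrown out, per the procedure).
    """
    if not engineer_checks:
        raise ValueError("need at least one engineer survey")
    # Base DCF: average fraction of strips checked, as a percentage.
    base = 100.0 * sum(engineer_checks) / (num_strips * len(engineer_checks))
    mdff = min(5, scowling_managers)   # Management Dissing Fudge Factor, capped at 5
    ddca = min(10, wall_cartoons)      # Dilbert Density Coefficient Adjustment, capped at 10
    return min(100.0, base + mdff + ddca)

def interpret(dcf):
    """Map a DCF score onto the four diagnostic bands."""
    if dcf <= 25: return "quality organization -- lighten up"
    if dcf <= 50: return "good, nearly ideal"
    if dcf <= 75: return "typical software house, products in jeopardy"
    return "get out of the software business; hire a cartoonist"

# Three engineers checked 60, 70 and 80 strips; two managers scowled;
# twelve cartoons were counted on the walls (capped at 10 points).
score = dilbert_correlation_factor([60, 70, 80], scowling_managers=2,
                                   wall_cartoons=12)
print(score, interpret(score))  # 82.0 -> time to hire that cartoonist
```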
George has applied for a patent on his unique DCF system. He is anxious to become a high-priced consultant, going to lots of companies, doing his survey, getting the fee, and getting out before management realizes they've been ripped off and have to hire another high-priced consultant to come in and set things right. George reports, "I'm thinking about a do-it-yourself version for the future, too. I'd put Dilbert cartoons on little cards so they can be passed out to the engineers for the survey... I'll probably call it 'Deal-a-Dilbert'. I'm also thinking about a simple measurement system that lets employees find out their personality type and where they best fit into the organization. I call this the 'Dilbert/Dogbert Empathy Factor' or 'DDEF' for short."
----- end cut here -----
Cheers,
-- steve
From: Robert Schaefer at 300 <schaefer_robert at dwc.edu>
Date: Friday, May 29, 2015 5:11 AM
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Another unbelievable failure (file system overflow)
I would claim that this is not always prospect theory; sometimes it is dysfunction due to greed.
By deliberately not testing you can:
1. get the customer to become your beta tester, i.e., work for you for free;
2. directly or indirectly get the customer to pay you again for fixing your own mistakes;
3. leave no evidence of criminal negligence (when you are indeed criminally negligent: had you detected safety issues during testing, those issues would have been recorded in the testing documentation).
I would like to see, someday, a serious socio-economic study of the insights of the Dilbert comic (dilbert.com).
I have read in interviews with the cartoonist (Scott Adams) that people email him what they've experienced,
and he just draws it up. One might claim that what he does is all made up, but I have my doubts given what
I've experienced as a programmer in several large corporations over the past decades.
________________________________
From: systemsafety-bounces at lists.techfak.uni-bielefeld.de on behalf of Matthew Squair <mattsquair at gmail.com>
Sent: Friday, May 29, 2015 2:13 AM
To: Heath Raftery
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Another unbelievable failure (file system overflow)
An example of prospect theory?
Matthew Squair
MIEAust, CPEng
Mob: +61 488770655
Email: Mattsquair at gmail.com
Web: http://criticaluncertainties.com
On 29 May 2015, at 7:43 am, Heath Raftery <heath.raftery at restech.net.au> wrote:
On 28/05/2015 11:50 PM, Chris Hills wrote:
Static analysis isn't free. Testing isn't free.
Who determines the need for or business case for static analysis and test?
[CAH] normally (every report I have seen) static analysis saves a lot of
time and money.
The same is true of structured testing.
Funnily enough, the only experience I've had recommending static analysis is as the programmer to the manager. This is indeed the argument I use. A strange thing happens in business though (and perhaps my lack of comprehension explains why I'm the programmer and not the manager ;-) ): capital costs and investment are treated as worse than running costs. Buying and applying static analysis, even if it is cheaper in the long run, is always seen as less attractive than paying labour to deal with the consequences later.
Heath
_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE