I went to Wikipedia and looked up Murphy's Law, thereby running across all the similar laws, Sod's and Finagle's, and the many variations on how they're stated:
- Whatever can go wrong, will go wrong.
- Anything that can go wrong, will.
- Whatever can go wrong will go wrong, and at the worst possible time.
- If there's more than one way to do a job, and one of those ways will result in disaster, then somebody will do it that way.

- The perversity of the Universe tends towards a maximum.
- Bad fortune will be tailored to the individual.
- Good fortune will occur in spite of the individual’s actions.
The first list is generally considered to be Murphy’s Law variants, while the second list (Sod’s and Finagle’s Laws) has more to do with the human perception of fate, irony, and the injustice of the universe. The last statement is exemplified by the idea that it won’t rain if you take your umbrella. In our household, we call this “using unsympathetic magic.”
There are two ways of looking at Murphy’s Law, and one of them is the “perverse universe” or “jelly side down” viewpoint. That’s actually the more popular version, because people like to complain. There’s a physics version, for example, that says simply, “Mother Nature is a Bitch.”
Here, though, it’s worth quoting the Wikipedia entry:
> In any case, the phrase first received public attention during a press conference in which Stapp was asked how it was that nobody had been severely injured during the rocket sled tests. Stapp replied that it was because they always took Murphy's Law under consideration; he then summarized the law and said that in general, it meant that it was important to consider all the possibilities (possible things that could go wrong) before doing a test and act to counteract them. Thus Stapp's usage and Murphy's alleged usage are very different in outlook and attitude. One is sour, the other an affirmation of the predictable being able to be surmounted, usually by sufficient planning and redundancy.
So, properly understood, Murphy's Law is an engineering design principle, whose highest expression is the idea of Fail Safe: a design whose failure modes are all innocuous, rather than one whose failure modes, however rare, can have disastrous consequences. (The book and movie versions of Fail Safe, of course, used the phrase ironically.)
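Here's a toy sketch of the principle in code (the sensor interface and numbers are made up for illustration): every failure mode of the controller (bad reading, no reading, unexpected exception) resolves to the innocuous state, heater off, rather than leaving the hazard in place.

```python
# Hypothetical fail-safe heater controller: any failure resolves to "off".

def read_temperature(sensor):
    """Return a plausible temperature in Celsius, or raise if the sensor misbehaves."""
    value = sensor.read()  # assumed interface; may raise or return garbage
    if value is None or not (-40.0 <= value <= 150.0):
        raise ValueError(f"implausible reading: {value!r}")
    return value

def heater_should_be_on(sensor, setpoint_c=60.0):
    """Decide whether the heater should run; any failure means it doesn't."""
    try:
        temp = read_temperature(sensor)
    except Exception:
        return False          # fail safe: state unknown, so pick the innocuous one
    return temp < setpoint_c  # normal operation
```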
One of the most famous (and flagrant) violations of Murphy's Law as a design principle is to be found in nuclear reactor design of the '50s, '60s, and '70s, where safety was, for the most part, actively maintained, so a failure of the safety mechanisms could lead to disaster. The notorious Rasmussen report (Rasmussen, N. (ed.) (1975), Reactor Safety Study, WASH-1400, USNRC) calculated the likelihood of a catastrophic accident to be very small, but the calculation assumed statistical independence of adverse events, i.e., that one bad thing happening had no effect on whether other bad things happened.
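A back-of-the-envelope sketch (the numbers below are invented for illustration, not taken from WASH-1400) shows why that assumption matters so much. Three redundant safety systems that each fail once in a thousand demands look fabulously safe if they fail independently; add a modest common cause that can take out all three at once, and the common cause dominates.

```python
# Illustrative only: joint failure probability with and without a common cause.

p_single = 1e-3   # assumed per-demand failure probability of one safety system
p_common = 1e-4   # assumed probability of a common-cause event disabling all three

p_independent = p_single ** 3                                    # pure independence
p_with_common_cause = p_common + (1 - p_common) * p_single ** 3  # common cause added

print(f"independent failures only: {p_independent:.1e}")         # ~1.0e-09
print(f"with a common cause:       {p_with_common_cause:.1e}")   # ~1.0e-04
```

Five orders of magnitude separate the two answers, and the difference is entirely in the assumption, not the hardware.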
In the case of the Three Mile Island disaster, however, multiple bad things did happen, all more or less at once, at least partly because human error tends to come in clusters, like on the midnight shift, when people are not at their tippy-top. What saved TMI from being a full-bore catastrophe was the final, passive safety system: the containment structure. It's worth noting that had Chernobyl had a similar containment structure, the graphite fire would have rapidly consumed all the available oxygen and ceased. The Chernobyl radioisotopes would probably have remained within containment. It's also worth noting that there were many in the U.S. nuclear industry in the 1970s who argued that containment structures were a needless expense because reactors were so safe.
Another problem with the assumption of statistical independence of adverse events is the "bathtub curve," the characteristic shape of device failure rates over time. Simply put, things tend to break most often when they are either new (due to manufacturing flaws) or near the end of their designed life. In other words, the likelihood of multiple component failures is greater both when a device is new and when it is near the end of its design life. The latter happens because designers try not to pair very long-lived components with much shorter-lived ones, sturdier components being more expensive, so all the components tend to wear out at about the same time. Thus, multiple component failure is much more likely near the end of a device's design limits, a.k.a. the "Wonderful One-Hoss Shay" effect.
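A toy sketch of the bathtub curve (the Weibull parameters here are invented purely for illustration): model the hazard rate as an "infant mortality" term that falls with age plus a "wear-out" term that rises steeply near the design life. The combined rate is high for a brand-new device, low through mid-life, and climbing again as the design life approaches, which is when matched components start failing together.

```python
# Illustrative bathtub hazard: falling infant-mortality term + rising wear-out term.

def weibull_hazard(t, shape, scale):
    """Weibull hazard rate h(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def bathtub_hazard(t, design_life=10.0):
    infant = weibull_hazard(t, shape=0.5, scale=2.0)            # high early, falling
    wear_out = weibull_hazard(t, shape=5.0, scale=design_life)  # negligible early, rising late
    return infant + wear_out

for years in (0.1, 1, 5, 9, 10, 11):
    print(f"year {years:>4}: hazard ~ {bathtub_hazard(years):.3f} failures/yr")
```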
Evolution does this with people, too, and it's not just aging that produces a "design envelope." I wrote a paper, never published (though it made the rounds within the air geek community widely enough that I've seen signs some of my colleagues got the point), entitled "Does PM Mortality Follow a Bathtub Curve?", which suggests that there is an elevated human health response to particulate matter in the atmosphere at both high and low exposures. That, of course, is another, albeit related, topic.