System Accident
A system accident is an "unanticipated interaction of multiple failures" in a complex system. This complexity can either be technological or organizational, and often is both.

A system accident can be very easy to see in hindsight, but very difficult to see in foresight. Ahead of time, there are simply too many possible action pathways to seriously consider all of them.

These accidents often resemble Rube Goldberg devices in the way that small errors of judgment, flaws in technology, and insignificant damage combine to form an emergent disaster. System accidents were described in 1984 by Charles Perrow, who termed them "normal accidents", as having two main characteristics: interactive complexity and tight coupling. James T. Reason extended this approach with human reliability and the Swiss cheese model, now widely accepted in aviation safety and healthcare.

Once an enterprise passes a certain point in size, with many employees, specialization, backup systems, double-checking, detailed manuals, and formal communication, employees can all too easily fall back on protocol, habit, and "being right." Rather like trying to watch a complicated movie in an unfamiliar language, the narrative thread of what is going on can be lost. Other phenomena, such as groupthink, may be occurring at the same time, and real-world accidents almost always have multiple causes, not just the single cause that could have prevented the accident at the very last minute. In particular, it is a mark of a dysfunctional organization to simply blame the last person who touched something.

The processes of formalized organizations are often largely opaque. Perrow calls this "incomprehensibility."

There is an aspect of an animal devouring its own tail, in that more formality and more effort to get things just right can actually make the situation worse. The more organizational rigmarole involved in adjusting to changing conditions, the longer employees will delay reporting those conditions. The greater the emphasis on formality, the less likely employees and managers are to engage in real communication. And additional rules can make matters worse, both by adding yet another layer of complexity and by telling employees, once again, that they are not to think but simply to follow the rules.

Apollo 13, 1970

Three Mile Island, 1979

The 1979 Three Mile Island accident inspired Perrow's book Normal Accidents, in which a nuclear accident occurs as the result of an unanticipated interaction of multiple failures in a complex system. TMI was an example of a normal accident because it was "unexpected, incomprehensible, uncontrollable and unavoidable".

Perrow concluded that the failure at Three Mile Island was a consequence of the system's immense complexity. Such modern high-risk systems, he realized, were prone to failure however well they were managed. It was inevitable that they would eventually suffer what he termed a "normal accident". Therefore, he suggested, we might do better to contemplate a radical redesign or, if that was not possible, to abandon such technology entirely.


When systems exhibit both "high complexity" and "tight coupling", as at Three Mile Island, the risk of failure becomes high. Worse still, according to Perrow, "the addition of more safety devices -- the stock response to a previous failure -- might further reduce the safety margins if it adds complexity".

ValuJet 592, Everglades, 1996
