I came across this paragraph while reading an article about cognitive psychology on the Hershey Medical Center's www.hmc.psu.edu/informatics page. Although it deals specifically with medical failures, citing comments on the subject by Richard Cook, an anesthesiologist, I thought it was worth repeating as a general observation.
"Cook maintains that complex systems fail when a series of latent failures, each insufficient to cause an accident by itself, come together. He likens this to a pile of Swiss cheese slices: the latent failures are the holes, and when they line up, they form a tunnel through which safety falls. The result: an accident. No one person is to blame, yet all too often organisations respond to disaster by finding a culprit to blame, re-training the staff, issuing new regulations, and investing in "safer" technology. This sort of reaction is all the more likely because of what Cook calls "hindsight bias": the tendency to allow one's knowledge of the outcome to bias one's view of the events leading up to that outcome. But this reaction tends to obscure the complexities that actually led to the disaster, makes the system even more complex, and consequently introduces new opportunities for failure."