Saturday, April 26, 2003
Well, this has been an interesting week. I worked from home for most of the week so that I could provide assistance to my mother. She lives with us, and at the moment needs help with pretty much everything, from the moment she gets up until she goes to sleep. And sometimes in that midnight interval, too. She's not completely helpless, since she can move with a walker, but she's tethered to an oxygen tank, so wherever she goes, she trails a thin plastic line. She tires very easily, and can be freaked out by the most modest of problems. My partner and I have come to the realization that this is, in many ways, exactly what it was like when we took care of my daughter after she was born. She got over it, and can handle herself just fine now. We're hoping that my mother makes the same transition.
Sunday, April 20, 2003
Last night, I turned to my partner at one point and said "If I ever get like that, just shoot me."
The occasion was the arrival home of my mother from her extended hospital stay after bypass surgery. She had done well in the hospital, and was then transferred to a rehabilitation hospital. At the time, we were perplexed by the transfer, because she had seemed fine -- tiring quickly, and occasionally short of breath, but nothing that seemed unduly unsettling, given the circumstances. But now, at home, after two weeks in the rehab hospital, she seems frail, easily disoriented, and prone to panic.
Just shoot me.
Tuesday, April 15, 2003
I came across this paragraph while reading an article about cognitive psychology on the Hershey Medical Center's www.hmc.psu.edu/informatics page. Although it is specifically about medical failures -- it cites comments on the subject by Richard Cook, an anesthesiologist -- I thought it was worth repeating as a general observation.
"Cook maintains that complex systems fail when a series of latent failures, each insufficient to cause an accident by itself, come together. He likens this to a pile of Swiss cheese slices: the latent failures are the holes, and when they line up, they form a tunnel through which safety falls. The result: an accident. No one person is to blame, yet all too often organisations respond to disaster by finding a culprit to blame, re-training the staff, issuing new regulations, and investing in "safer" technology. This sort of reaction is all the more likely because of what Cook calls "hindsight bias": the tendency to allow one's knowledge of the outcome to bias one's view of the events leading up to that outcome. But this reaction tends to obscure the complexities that actually led to the disaster, makes the system even more complex, and consequently introduces new opportunities for failure."
"Cook maintains that complex systems fail when a series of latent failures, each insufficient to cause an accident by itself, come together. He likens this to a pile of Swiss cheese slices: the latent failures are the holes, and when they line up, they form a tunnel through which safety falls. The result: an accident. No one person is to blame, yet all too often organisations respond to disaster by finding a culprit to blame, re-training the staff, issuing new regulations, and investing in "safer" technology. This sort of reaction is all the more likely because of what Cook calls "hindsight bias": the tendency to allow one's knowledge of the outcome to bias one's view of the events leading up to that outcome. But this reaction tends to obscure the complexities that actually led to the disaster, makes the system even more complex, and consequently introduces new opportunities for failure."
Monday, April 14, 2003
I learned an interesting bit of trivia last night. The 'Ides' isn't something that happens only in March; every month has one. In four months of the year it falls on the fifteenth; in the others, on the thirteenth.
And in the newspaper on this month's Ides, unintentionally funny articles. One says that, well, sure, Hussein was bad, but he would have died eventually; this 'freedom' thing the Americans brought over will never go away. Therefore, we're in more trouble now than before. Another reports the lament of an Iraqi museum director that Iraqis looted his museum; the Americans should have provided security for it, therefore, it's the Americans' fault.
What troublemakers these Americans are! Glad to be one.
Friday, April 11, 2003
One of the functions of an expert should be to filter all -- or nearly all -- of the details of an activity from the person for whom the activity is being done. The expert should be a black box: all the recipient knows is that the function was done, and output was produced.
I was just reading the documentation for a capacity planning tool. The tool was written by some people who, I'm gonna assume, are smarter than me; certainly more experienced in what they do than me. The goal of the tool is to let someone of reasonable intelligence figure out some things about a mainframe computing environment -- things that would be really helpful to know, such as: if we continue to grow workload at the current rate, when will we run out of capacity? To a capacity planner, that's an 'it depends' question. (Actually, sometimes it seems like everything, including "What did you have for lunch today?", is an 'it depends' question to them). But in this case, it really is, because the question translates into this: If we continue to grow workload at the current rate, assuming that all of the components grow at the same rate, and assuming we know what we mean by 'grow', when will we get to a point where we either a) dislike the response time or turnaround, b) despise it, or c) have to stop running some things so that we have enough capacity to run others. This is where the person asking the question starts making hand motions which are intended to convey disgust with this pettifoggery; just answer the damn question already! And so the expert does, using round numbers, shrugs, and the occasional muttered 'it depends'.
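For what it's worth, once you swallow all those assumptions, the 'round numbers' answer is mostly compound-growth arithmetic. Here's a minimal back-of-the-envelope sketch in Python; the function name, the numbers, and the single 'utilization ceiling' are made up for illustration, not anything the tool actually does:

```python
import math

def months_until_ceiling(current_util, monthly_growth, ceiling):
    """Months until utilization hits the ceiling, assuming everything
    grows uniformly at one compound monthly rate (the big 'it depends')."""
    if current_util >= ceiling:
        return 0.0
    return math.log(ceiling / current_util) / math.log(1.0 + monthly_growth)

# Illustration only: a box running 62% busy, growing 3% a month, where
# 85% busy is the point at which people start to despise the response time.
print(round(months_until_ceiling(0.62, 0.03, 0.85), 1))  # about 10.7 months
```

Everything interesting, of course, is buried in those three inputs -- which is where the shrugs come from.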
What brought all this on was that the documentation I was reading seems to have been written for a bright five year old. It uses lots of complex concepts (the bright part) and says really obvious things (if you don't like the screen color, you can change it). Okay, perhaps that isn't so obvious, but there's something about the way they say it that makes me want to skip to another paragraph in search of immediately useful information. Just answer the damn question already! In this case, the expert (them) is not shielding the user (me) from the details. They might not be telling me everything about what it takes to do the function, but they're telling me way more than I wanted to know. I just want to push the button, turn the crank, and get the answer.
Is that too much to ask? (I can hear the expert's resounding answer.)