NPR posted a piece about food that minimizes the risk of choking. In Japan, more people die from choking than from traffic accidents, and the difficulty that elderly people have swallowing is a leading cause. The cooked food is pureed and then re-formed (with a thickener) into a dish that looks like regular food but is easier to swallow because no chewing is required.
Meanwhile, The Washington Post had an article about the importance of knowing the Heimlich maneuver and CPR, both of which can help someone who is choking.
These highlight both sides of managing risk: (1) preventing a potential problem (choking) by eating foods that are less likely to cause choking, and (2) following a contingency plan (the Heimlich maneuver) if someone does start choking.
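The two strategies act on different factors of the same expected loss: prevention lowers the probability of the event, while a contingency plan lowers the consequence if the event occurs. A minimal sketch of this structure, using entirely hypothetical numbers chosen only for illustration:

```python
# Illustrative expected-loss model of the two risk strategies described above.
# All probabilities and loss values are hypothetical, not data from the articles.

def expected_loss(p_event: float, loss_if_event: float) -> float:
    """Expected loss = probability of the event times its consequence."""
    return p_event * loss_if_event

baseline = expected_loss(p_event=0.010, loss_if_event=100.0)

# Prevention (easier-to-swallow food) reduces the probability of choking.
prevention = expected_loss(p_event=0.002, loss_if_event=100.0)

# A contingency plan (the Heimlich maneuver) reduces the consequence
# if choking does occur.
contingency = expected_loss(p_event=0.010, loss_if_event=30.0)

print(baseline, prevention, contingency)  # 1.0 0.2 0.3
```

Either lever reduces expected loss relative to the baseline; in practice the two are complementary rather than substitutes.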
Bill Murray, who played a weatherman who saves someone from choking in the 1993 movie Groundhog Day, saved a man from choking in a Phoenix restaurant in 2016 by using the Heimlich maneuver, which he learned while making the movie.
Wednesday, July 12, 2017
Saturday, August 29, 2015
Accepting the Risk of a Derailment
According to an article in The Wall Street Journal, based on documents from the National Transportation Safety Board investigation into the 2014 crude oil train derailment in Lynchburg, Virginia, CSX Corp. knew of a flaw in the section of track where the derailment occurred.
On April 29, 2014, a track inspection had revealed the flaw, and CSX decided to replace a 40-foot piece of track on May 1. The accident, which caused an estimated $1 million in damage, occurred on Wednesday, April 30.
The track inspection indicates that CSX was monitoring the risk of a derailment, and their decision to replace the track segment (given the inspection result) shows that they were reacting to the increased risk (indicated by the precursor: the internal flaw).
Until the NTSB releases its final report, we can propose scenarios that illustrate the difficulty of risk management: The decision to continue using that line before the replacement was done suggests that someone at CSX was willing to accept the derailment risk. Risk mitigation has costs, and greater risk aversion costs more. Perhaps the cost of closing that line (with the consequent disruption to shipping and revenue) for two days was too large.
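The tradeoff in that scenario is a textbook risk-acceptance calculation: compare the certain cost of mitigation against the expected cost of the accident. The sketch below uses invented figures for the closure cost and the derailment probability (the post reports only the roughly $1 million damage estimate), purely to show the structure of such a decision:

```python
# Hypothetical risk-acceptance comparison for the track-flaw scenario.
# The closure cost and derailment probability are invented for illustration;
# only the ~$1 million damage figure comes from the reporting.

closure_cost = 2 * 400_000       # assumed cost of closing the line for two days
p_derailment = 0.05              # assumed probability of derailment before repair
derailment_cost = 1_000_000      # damage estimate for the actual accident

expected_accident_cost = p_derailment * derailment_cost

# A risk-neutral decision maker who believed these numbers would keep the
# line open, since the expected accident cost is below the certain closure cost.
accept_risk = expected_accident_cost < closure_cost
print(expected_accident_cost, accept_risk)  # 50000.0 True
```

Of course, this framing ignores risk aversion and non-monetary consequences (injuries, environmental damage, reputation), which is precisely why such decisions are harder than the arithmetic suggests.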
Another possibility is that those who detected the flaw and scheduled the track replacement failed to communicate the increased risk to those responsible for operations on that track.
Thursday, May 14, 2015
Columbia's Final Mission
In my Engineering Decision Making and Risk Management class last week we discussed the space shuttle Columbia accident using the case study Columbia's Final Mission by Michael Roberto et al. (http://www.hbs.edu/faculty/Pages/item.aspx?num=32441).
The case study highlights topics related to making decisions in the presence of ambiguous threats, including the nature of the response, organizational culture, and accountability.
It also discusses the results of the Columbia Accident Investigation Board (http://www.nasa.gov/columbia/home/CAIB_Vol1.html).
When discussing the case in class, a key activity is re-enacting a critical Mission Management Team (MMT) meeting, which gives students a chance to identify opportunities to improve decision making.
My class also discussed the design of warning systems, risk management, risk communication, different decision-making processes, and problems in decision making, all of which reinforced the material in the textbook (http://www.isr.umd.edu/~jwh2/textbook/index.html).
We concluded that the structure of the MMT made effective risk communication difficult, and that key NASA engineers and managers failed to describe the risk to those in charge.
Moreover, the decision-makers used a decision-making process that prematurely accepted a claim that the foam strike would cause only a turn-around problem, not a safety-of-flight issue, and this belief created another barrier to those who were concerned about the safety of the astronauts and wanted more information.
Failures such as the Columbia accident are opportunities to learn, and case studies are a useful way to record and transmit information about failures, which is essential to learning.
We learned how ineffective risk communication and poor decision-making processes can lead to disaster.