Saturday, June 27, 2015

Will it work? The Failure (and Success) of the Little Joe II

When designing a brand-new product or system, an engineer asks (and is asked): "Will it work?"  The performance of the design is a major source of uncertainty, and because mission or commercial success depends upon the answer to this fundamental question, information about the design's performance is valuable.  Testing a prototype is a common way to get this information.  If the prototype works, then there is less uncertainty, which makes accepting the design more attractive.
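
One rough way to see why this information is valuable is to treat the prototype test as a source of information in a simple decision analysis.  The Python sketch below uses hypothetical probabilities and payoffs, chosen only for illustration, to compare the expected value of deciding without a test against the expected value of deciding after a (perfect) prototype test; the difference is the value of the information the test provides.

# Minimal sketch: the value of a prototype test as expected value of information.
# All probabilities and payoffs are hypothetical, for illustration only.

p_works = 0.7            # prior probability that the design will work
payoff_success = 100.0   # payoff if we accept the design and it works
payoff_failure = -60.0   # payoff if we accept the design and it fails
payoff_reject = 0.0      # payoff if we reject (abandon) the design

# Decision without testing: accept the design only if its expected payoff
# beats rejecting it.
ev_accept = p_works * payoff_success + (1 - p_works) * payoff_failure
ev_no_test = max(ev_accept, payoff_reject)

# Decision with a perfect prototype test: we learn whether the design works
# before committing, so we accept it only when the test succeeds.
ev_with_test = p_works * payoff_success + (1 - p_works) * payoff_reject

print("Expected value without testing:", ev_no_test)                 # 52.0
print("Expected value with a perfect test:", ev_with_test)           # 70.0
print("Value of the test information:", ev_with_test - ev_no_test)   # 18.0

With these made-up numbers the test is worth 18 units of payoff.  Real tests are imperfect and costly, but the same logic applies.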

During the development of the Apollo spaceflight program, the engineers at NASA and its contractors designed and conducted numerous tests of the many components and systems, especially those critical to the safety of the astronauts.  To test the Launch Escape System (LES), the Convair Division of General Dynamics constructed an 86-foot-tall rocket named "Little Joe II" and used it to conduct tests at the White Sands Missile Range in New Mexico between 1964 and 1966.

During test A-003 in May 1965, the Little Joe II rocket broke apart unexpectedly, which triggered the LES, and the LES worked exactly as it was designed to do.  It carried the command module (an unmanned boilerplate version) away from the failing rocket.  After reaching 19,000 feet, the LES and command module separated, and the command module's parachute system lowered it to the ground.  This surprise was extremely valuable because it demonstrated that the LES would work, even under extreme conditions.  Although the rocket failed, the test was a success.

For more information, see this section from a NASA publication: http://history.nasa.gov/SP-4205/ch4-2.html.  See also a video about the LES and this particular test at https://www.youtube.com/watch?v=AqeJzItldSQ.  The Johnson Space Center and the New Mexico Museum of Space History have examples of the Little Joe II rocket.

Saturday, June 20, 2015

Option Awareness

MITRE organized a Decision Making in Complex Systems Technical Exchange Meeting at their McLean, Virginia, site this week, and the meeting included valuable presentations on modeling complex systems, visualizing their performance, and supporting decision making.  (Full disclosure: I was one of the speakers.)

At the meeting, Jill Drury and Gary L. Klein discussed their research on option awareness (OA), which is "the perception and comprehension of the relative desirability of available options, as well as the underlying factors and trade-offs that explain that desirability."
Their experimental research has shown that OA decision support tools that present the distribution of performance for each option can help decision-makers select the most robust alternatives, understand the factors that affect their performance, and generate new options.
Their collaborators include Mark Pfaff and others at Indiana University.
For more details and examples of the visualization, see their paper in the Journal of Cognitive Engineering and Decision Making, which can be found at http://www.iupui.edu/~grappa/publications/Supporting_Complex_Decision_Making_Through_OA_Pfaff_et_al_2012_JCEDM.pdf
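
To illustrate the general idea (this is not their tool or their data), the sketch below compares two hypothetical options by the distribution of their cost across many plausible scenarios.  On a single point estimate the options look identical, but the distributions reveal which one is more robust.

# Minimal sketch of the option awareness idea: compare options by the
# distribution of their performance across scenarios, not by point estimates.
# The options, scenarios, and cost model are hypothetical.

import random
import statistics

random.seed(1)

def cost(option, severity):
    # Hypothetical cost model: a baseline cost plus a sensitivity to how
    # severe the scenario turns out to be.
    baseline, sensitivity = option
    return baseline + sensitivity * severity

options = {
    "Option A": (40, 6),   # cheap in mild scenarios, sensitive to severe ones
    "Option B": (60, 2),   # costlier baseline, but robust across scenarios
}

scenarios = [random.uniform(0, 10) for _ in range(1000)]  # scenario severities

for name, option in options.items():
    costs = sorted(cost(option, s) for s in scenarios)
    print(name,
          "median cost:", round(statistics.median(costs), 1),
          "90th-percentile cost:", round(costs[int(0.9 * len(costs))], 1))

Both options have roughly the same median cost (about 70), so a point estimate cannot distinguish them; the 90th-percentile costs (roughly 94 versus 78) show that Option B is the more robust choice.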

MITRE also recently hosted the 12th International Naturalistic Decision Making Conference.
The conference website is http://www2.mitre.org/public/ndm/index.html.

Tuesday, June 2, 2015

Improving design decision making

Here at the ISERC in Nashville this week, I picked up the January 2014 issue of
IIE Transactions on Occupational Ergonomics and Human Factors (http://www.tandfonline.com/toc/uehf20/current) and found the article "Adapting Engineering Design Tools to Include Human Factors" by Judy Village et al.

The article describes how researchers at Ryerson University (in Toronto) worked with human factors specialists and engineers at a large electronics manufacturer to change how that firm designs the assembly process for its new products.  The changes led to design tools that help the firm's engineers consider human factors issues during the design process to make assembly easier, safer, and faster.  That is, the changes modified the objectives and constraints used to make assembly process design decisions.  For example, the design must satisfy a human factors target by scoring well on 22 items related to human factors.
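
As a concrete (and hypothetical) illustration of such a constraint, the sketch below checks a candidate assembly process design against a human factors target.  The article does not publish the firm's actual items or scoring scheme, so the ratings and target here are invented.

# Minimal sketch of a human factors design constraint.  Assume each of the
# 22 items is rated 0-3 by a human factors specialist and the design must
# reach a hypothetical target total score before it can be released.

HF_TARGET = 50   # hypothetical target total score
NUM_ITEMS = 22

def meets_human_factors_target(item_scores):
    # True if the design scores well enough across all 22 items.
    if len(item_scores) != NUM_ITEMS:
        raise ValueError("expected %d item scores, got %d" % (NUM_ITEMS, len(item_scores)))
    return sum(item_scores) >= HF_TARGET

# Example: ratings for one candidate assembly process design.
scores = [3, 2, 3, 2, 2, 3, 1, 2, 3, 2, 2, 3, 2, 2, 3, 2, 1, 3, 2, 2, 3, 2]
print("Accept design" if meets_human_factors_target(scores) else "Rework design")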

In addition, the process used to develop these new design tools is interesting.  According to the article, "an action research approach was used, where researchers were embedded in the organization and together took action" to plan, implement, and improve the tools.  This process emphasized understanding the design process and its metrics, tools, and language and then identifying opportunities to improve the design tools with feasible, desirable changes (that is, the changes had to "fit the design process" and "provide important metrics for business performance").

Although the new design tools may be specific to this firm, the process used to develop them can be applied elsewhere.  The authors state that their contribution includes "the lessons learned about the process of adapting internal engineering tools"; that is, they have shown how an organization can improve design decision making.