
Human Performance in uncertain environments: What makes avalanche forecasting hard?

Authors: Laura Maguire

Abstract

This article ran in The Avalanche Review issue 37.4 (April 2019). Much of the human factors literature in snow safety decries the fallibility of humans and their poor decision-making, but this article posits that the work done by avalanche professionals is quite exceptional. While it is true that part of risk perception and decision making can be subject to flawed thinking and dangerous biases, this is only one small dimension of a much broader understanding of what constitutes expert performance in uncertain, changing environments. This article will describe the conditions that make snow safety difficult while drawing comparisons to highly skilled practice in other high risk/high consequence domains, and then make a pitch for using this perspective to explore some useful avenues for making progress on accident prevention.
Human Performance in uncertain environments.
By LAURA MAGUIRE
If you have been reading much about human factors lately, you might agree that mountain
safety professionals are due for a pep talk.
However, contrary to the extensive literature decrying the fallibility of humans and their poor
decision-making, the work done by avalanche professionals is quite exceptional.
So while it is true that part of how we perceive risk and make decisions can be subject to
flawed thinking and dangerous biases, this is only one small dimension of a much broader
understanding of what constitutes expert performance in uncertain, changing environments.
This article will describe the conditions that make snow safety difficult while drawing
comparisons to highly skilled practice in other high risk/high consequence domains and then
make a pitch for using this perspective to explore some useful avenues.
What makes snow safety hard?
Looking into a field of practice and asking ‘what makes this hard?’ is a fundamental question
for understanding what it means to be an expert in that kind of work. It’s a frame of
reference that provides insights into the kinds of challenges faced by practitioners where
simple proceduralization is not possible (or desired).
In other words, how do experts cope when the data is ambiguous, analysis remains
uncertain, and rules are underspecified, insufficient, or inapplicable? The avalanche
community already recognizes the limitations of strict rule following in making judgements
about the snowpack. For example, in the Canadian Avalanche Association Observational
Guidelines and Reporting Standards for Weather, Snowpack and Avalanches (OGRS), the word “rule” appears seven times, and six of those instances indicate that a definitive rule is impossible! (The seventh describes a rule of thumb and notes that variability should be expected.)
If the rules by themselves are unable to prescriptively define safe decisions, yet many
outcomes are successful, then avalanche professionals must be doing something right!
Given this paradox, we can make a guess that there is sophisticated cognitive work - in
perception, reasoning, evaluation and judgement - that goes into successfully managing the
ambiguity in forecasting and guiding work.
Given the extensive literature on the technical aspects of forecasting you might think this
already exists. Yes and no. There is a solid foundation of work based on introspective self-reports from interviews and surveys of experienced practitioners, but “cognitive psychologists have noted there are limitations to what people can actually tell us about their mental processes” (Nisbett & Wilson, 1977, p. 232). This means studies based on self-reports will only provide a partial understanding. In addition, observational studies can be
similarly limited. As the ‘fluency law’ (Woods, 2005) notes, most experts are really good at
what they do; this often makes their work look easy to observers so it’s hard to ‘see’ just how
difficult it actually is! We need to use other methods and triangulate them to uncover
cognitive aspects of managing risk in the mountains. A good place to start doing this is by
examining ‘the hard stuff.’
Challenges to performance in avalanche forecasting
Given what we know - based on studies in healthcare, air traffic management, and mission
control in space exploration (Cook & Rasmussen, 2005; Patterson, Watts-Perotti, & Woods,
1999; Smith, McCoy, & Orasanu, 2001) - there are common patterns to what
can make a job hard. In mountain environments, salience & discriminability, change, goal
conflicts, and coordination are of particular interest. Studying how experienced practitioners
handle these difficult aspects of their work helps provide insight into the nature of domain-specific expertise.
Salience & discriminability of cues.
Salience refers to how noticeable or discriminable information is given the backdrop of the
environment and all the other possible sources of information. For example, a rapidly propagating fracture is a very salient cue - there is the movement of the snow shifting (even slightly) as the crack runs across a slope, changes in light and texture on the surface of the snow as the fracture line opens up, and perhaps even an auditory signal.
But that example is a bit of an outlier because to uncover many of the meaningful signals or
cues about what is happening in the snowpack you literally have to go digging to extract the
right kinds of information. In other high-consequence monitoring environments, like nuclear power plants or intensive care units, there are often layered sensor networks providing real-time data to aid the operator in monitoring the state of the system across multiple variables.
In those environments, threshold alarms or visual displays can highlight minor variations to
inform the plant operator or nurse that the state is changing. When out in the field, snow
safety professionals have to accurately perceive often very subtle cues. And even when assessing existing data from previous reports or online databases, the
information is presented in a way that requires mental effort to extract the meaningful data
from the background information.
This represents the first of the challenges in forecasting - data has varying levels of salience and is collected at different points in time - so variations in that data require ongoing interpretation to recognize when conditions change in a meaningful way.
Change is a constant.
Notice that in the last paragraph I didn’t say if conditions change. That is because another key factor in what makes avalanche forecasting cognitively demanding is that conditions are continually changing, often in an unpredictable and interactive manner.
While we may understand conceptually how wind loading can influence the snowpack, we have only imprecise measurements of how much load is setting up and where. Telemetry devices can fail or become rimed over, providing false information and adding to uncertainty. In addition, measurements are taken at discrete points in time rather than continuously. This means relevant cues take time to accumulate, which can slow decisions about the trajectory of the stability (Is it increasing? Decreasing? How slowly or quickly? What might this mean for my guiding plans today? What other factors will inform not only my assessment but my planning or revision when conditions change?)
Noticing change (and rates of change) and interpreting its meaning is best supported with
continuous telemetry with low time delay. However, field data will always come with some
form of time delay (for instance, waiting for the sun to come up to be able to visually inspect
a cornice, or in the time it takes to ski out to a slope and dig a pit). In any system where the hazard, once triggered, is largely unstoppable, lag severely compromises the capacity to manage the variability (and its corresponding risk) common in that system.
Multiple competing demands.
Forecasting work, like most high-demand practice, occurs within a system subject to constraints. For instance, control work must be completed before the hill can open to guests anxiously waiting for first chair, or incoming weather limits the window for the helicopter, meaning you have to reprioritize the plan.
Even the most safety-conscious organization does not exist solely to eliminate risk - instead, it controls risk to meet other objectives. These goals - running a successful ski hill, keeping a highway open, or enabling crews safe transit to keep a construction project on schedule - are subject to tradeoffs in order to maintain safe operations.
Managing these competing demands is part of successful expert performance. Pilots, at the bare minimum, are obviously expected not to crash the plane, but they also balance responsibilities for having on-time departures, ensuring passenger comfort, minimizing fuel costs and keeping accurate flight records. These additional demands require consideration in making or revising plans as disruptions occur. Similarly, the analysis that a ski guide has to do to ensure clients who’ve paid thousands of dollars get to safely ski great lines imposes additional cognitive burden.
Coordination is key.
The multiple goals of the work system go hand-in-hand with another aspect that makes
forecasting work hard: the need to coordinate across a distributed network.
The coordination may be so others can provide information (like when a team calls in the results of their control work), adjust their actions (say, changing the pickup location with the cat driver), provide approvals, or communicate with the public and other impacted users.
Well-coordinated groups run smoothly - minimizing downtime or unnecessary risk - and help
to proactively identify issues. Coordination breakdowns increase the cognitive work by
introducing lag or requiring more effort to determine what others are doing and how that may
impact your plans.
So, what does this mean?
Outlining the characteristics that make the work hard makes it clear that snow safety work is cognitively demanding. Describing the “hard work” in this way provides more specific explanations of why things sometimes go wrong. Research into the cognitive work of
avalanche forecasting in ski resort operations (Maguire & Percival, 2018) provides an
example of how this new perspective can reframe how we think about professional practice.
Further studies can help generate rich descriptions that allow for comparing and contrasting
performance across a variety of conditions and work environments. This gives us a more nuanced understanding of when things like heuristics and biases help us cope with dynamic and demanding environments and when they can get us into trouble. This kind of data also
provides promising design directions for engineering tools and technologies to better support
practitioners. No matter your opinion on why people make mistakes, avoiding oversimplifications about how work gets done is critical.
While we know a lot about the technical expertise in snow safety professions, making visible
the strategies used to get work done in context - with all the messy details, goal conflicts,
and complexities - can help improve training programs, develop new technologies, refine
procedures and enhance teamwork to support more successful outcomes.
Many thanks to Greg Gagne for the conversation that inspired this article and his feedback
and to Jesse Percival for his feedback.
References:
Cook, R., & Rasmussen, J. (2005). “Going solid”: a model of system dynamics and
consequences for patient safety. Quality & Safety in Health Care, 14(2), 130-134.
Maguire, L. M., & Percival, J. (2018). Sensemaking in the snow: Examining the cognitive
work of avalanche forecasting in a Canadian ski operation. Presented at the
International Snow Science Workshop, Innsbruck, Austria.
Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on
mental processes. Psychological Review, 84(3), 231-259.
Patterson, E. S., Watts-Perotti, J., & Woods, D. D. (1999). Voice loops as coordination aids
in space shuttle mission control. Computer Supported Cooperative Work (CSCW), 8(4), 353-371.
Smith, P. J., McCoy, E., & Orasanu, J. (2001). Distributed cooperative problem-solving in the air traffic management system. In E. Salas & G. Klein (Eds.), Linking Expertise and Naturalistic Decision Making (pp. 369-384). Erlbaum.
Woods, D. (2005). Studying cognitive systems in context: the cognitive systems triad.
Institute for Ergonomics, The Ohio State University, Columbus, OH. Retrieved from
http://csel.eng.ohio-state.edu/productions/woodscta/media/triad_intro_final.pdf
Author Bio:
Photo credit: Joshua Kutryk
Laura Maguire “pre-tired” at age 18 and spent 10 years ski bumming around Western
Canada. Now she is a PhD student & researcher with the Cognitive Systems Engineering
lab at The Ohio State University where she studies adaptive human performance in high
risk, high consequence environments. She has a Masters degree in Human Factors &
Systems Safety from Lund University in Sweden. When not in a library, she is out skiing,
climbing, biking or reading in a hammock.
PHOTOS FOR ARTICLE:
Photo credit: Jesse Percival. Extracting meaningful data to support ongoing assessments is part of the cognitive demands in snow safety.
Photo credit: Jesse Percival. Cognition is distributed amongst the team. Multiple, overlapping perspectives broaden assessments and tentative hypotheses about what is happening in the snowpack.
Photo credit: Jesse Percival. Canadian Coast Range: Mt. Horetzky.