Human-Centered Computing
72 1094-7167/02/$17.00 © 2002 IEEE IEEE INTELLIGENT SYSTEMS
A Rose by Any Other Name… Would Probably Be Given an Acronym

Robert R. Hoffman, Paul J. Feltovich, and Kenneth M. Ford, University of West Florida
David D. Woods, Ohio State University
Gary Klein, Klein Associates
Anne Feltovich, Grinnell College

Editors: Robert R. Hoffman, Patrick J. Hayes, and Kenneth M. Ford
Institute for Human and Machine Cognition, University of West Florida
rhoffman@ai.uwf.edu

The rose is a rose,
And was always a rose.
But the theory now goes
That the apple’s a rose,
And the pear is, and so’s
The plum, I suppose.
The dear only knows
What will next prove a rose.
You, of course, are a rose—
But were always a rose.

“The Rose Family,” Robert Frost, 1928

In this essay, we concern ourselves with characterizations of the “new” approaches to the design of complex sociotechnical systems, and we use a biological classification scheme to organize the discussion. Until fairly recently, the design of complex sociotechnical systems was primarily known as cognitive engineering or cognitive systems engineering (CSE), a term introduced in the 1980s to denote an emerging branch of applied cognitive psychology.1,2 Research focused on such topics as human–computer interaction, the psychology of programming, display design, and user friendliness. Although some have sought to make the term cognitive engineering seem less of an oxymoron by doing work that somehow looks like actual engineering, a number of new terms have emerged, all of which might be considered members of the “genus” Human-Centered Computing. Researchers, research organizations, funding sources, national study groups and working groups, and even entire national funding programs espouse these approaches. A number of varieties have entered the judging competition, as the “acronym soup” in Figure 1 shows.

This variety has come about for many reasons. Some individuals have proposed terms to express views that they believe are new. Others have proposed terms as a consequence of the social and competitive nature of science and science funding, leading to turf wars and the need for individuals to win awards and claim niches that set themselves and their ideas apart from the crowd. The obvious, and obviously incorrect, question is, “Which term is the right one?” As we hope to suggest in this essay, this question is rather like the quest for the blue rose. Using the rose metaphor, and taking some liberties with Latin, we organize the essay around a set of “genuses” into which the individual “varieties” seem to fall.

Rosaceae Traditionum Contrarium
Rosaceae: The rose family
Traditionum: The act of handing over
Contrarium: Opposite or contrast

This genus includes those varieties that express a reaction against some less desirable alternative, often left unnamed (we will make up names to fill the voids). A proposed umbrella term is Human-Centered Systems, which people have used to denote

• Programs of college study—for example, at Cornell University3
• System development funding programs—for example, the 2001 program at the US National Coordination Office for Information Technology Research and Development4 and the 1999 program at the US Department of Transportation5
• A journal’s subtitle—AI & Society: Journal of Human-Centered Systems and Machine Intelligence
A report to the US National Science Foundation presented proceedings from an HCS workshop, which included position papers from 51 researchers spanning disciplines including electronics, psychology, medicine, and the military.6,7 Although all said their work and ideas were human-centered, they had diverse opinions about precisely what human-centering is all about. To some, human-centering is

• A philosophical and humanistic position regarding workplace ethics and aesthetics
• A software design process that results in really user-friendly interfaces
• A description of what makes for a good tool—that is, the computer does all the adapting
• An emerging interdiscipline, requiring institutionalization and special training programs

To some, a human-centered system is

• Any system that enhances human performance
• Any system that plays any kind of role in mediating human interactions
Two main themes underlie this discussion. First, HCS is really technology driven, with human issues and concerns being an add-on rather than the primary engine of change. Second, HCS is driven by a reaction against what is perceived to be a naughty tradition in the design of information-processing technology—a tradition we might dub Technology-Centered Design (TCD). In TCD, system developers specify the requirements for machines, then implement or prototype the requirements, and finally produce devices and software. Then they go away, leaving users to cope with what they have built. Indeed, experience has shown that devices that are designed according to the design-then-train philosophy “force users to adapt to the system. The user is entangled with the system terminology and jargons that are the designer’s view of the world.”8

Many lessons learned over recent decades have pointed toward a need for an alternative to TCD, sometimes tagged as Participatory Design. These lessons span a range, including insights from significant accidents caused by differences between designers’ intentions and users’ understanding and cognitive capabilities (for instance, the Three Mile Island incident). But the lessons also come much closer to home.9 We have all experienced, for example, the frustrations of learning to use upgraded software, advertised and lauded for its new capabilities by those who designed it and are therefore familiar with it. The new capabilities, however, usually require significant relearning, backpedaling, kludging, and work-arounds. Bells and whistles often go unused and even unnoticed. The vision for HCS is to create systems on the basis of an analysis of human tasks and an awareness of human capabilities, and then determine that the systems result in a performance gain and are adaptable to changing human needs.7
At about the time that the NSF began its HCS work, NASA Ames launched an effort on what it called Human-Centered Computing.10,11 This designation is more forthright than HCS in that the “systems” being designed and built clearly are computational systems (and not other things, such as teapots). HCC also more directly reflects a reaction against a tradition that we might dub Machine-Centered Computing (MCC). HCC’s core idea is to build systems that amplify and extend human cognitive, perceptual, and collaborative capabilities.12 In this approach, system design must be leveraged by known facts and principles of human psychology and not just by the decontextualized principles that computer scientists or electronics engineers are accustomed to using as guidance (that is, MCC). Human-centered systems must complement humans and are not intended to imitate or replace them, as the Turing model for AI would have us believe.13,14

Even more speciated than HCC is the designation Human-Centered Processes (HCP), the term used by a Working Group of the European Association for Operational Research. At first blush, the HCP designation seems odd in the present context: Humans engage in cognitive processes, so how could a process not be centered on (or in) them? The answer is that this group focuses on the design of systems for manufacturing and industrial-process control. The Euro Group shares the sentiments of the HCS and HCC communities regarding goals such as support for distributed teams and dissatisfaction with user-hostile systems.15
JULY/AUGUST 2002 computer.org/intelligent 73
[Figure 1. The acronym soup of terms that have been offered to designate “the” new approach to cognitive engineering: Human-Centered Computing (HCC), Human-Centered Systems (HCS), Human-Centered Processes (HCP), Human-System Integration (HSI), Cognitive Systems Engineering (CSE), Decision-Centered Design (DCD), Practice-Centered Design (PCD), Work-Oriented Design (WOD), Learner-Centered Design (LCD), Contextual Design (CD), Use-Centered Design (UCD), User-Centered Design (UCD), Participatory Design (PD), Situation Awareness-Oriented Design (SAOD), Client-Centered Design (CCD), and Customer-Centered Systems (CCS).]
Contextual Design (CD) also expresses a reaction in computer science, but one focused against what might be called Laboratory-Based Design (LBD), for want of a better term. The basic idea in CD is that the design process cannot be conducted by cloistered designers and programmers feeding designs to the user. Rather, designers must become field researchers and immerse themselves in the application domain to fully understand domain practice and the context of the prospective designs’ use.16 CD advocates represent the impact on computer science of ethnography (also known as cognitive anthropology, situated cognition, and “cognition in the wild”),17 especially the works of such people as Edwin Hutchins18 and Jean Lave.19
Related to the spirit of CD are two additional varieties in the genus Rosaceae Traditionum Contrarium, called User-Centered Design and Participatory Design, which express a reaction in human factors psychology, ecological psychology, and applied cognitive psychology against the traditional approach, TCD. The UCD concept traces its origins to cognitive engineering around 198020 and is alive and well in the software engineering community.21 The core idea is that the machine must satisfy the needs of the people who will use the system, and therefore those people need to be involved in the system’s design.9,22,23 (Ironically, UCD as it is manifested in the software engineering community is actually somewhat designer-centered: It relies heavily on decontextualized principles of usability, interface design, and so on, rather than trying to deeply understand actual work contexts. Furthermore, users’ involvement often only takes the form of user commentary on design ideas and clever prototypes—the “satisficing” criterion—rather than something more powerful, such as a full empirical study of performance to assess usefulness and usability.)
Similarly, in the field calling itself Human-Computer Interaction, some use the terms Client-Centered Design (CCD) and Customer-Centered Systems (CCS) to express the idea that designers must interact with and satisfy clients and customers (who are not necessarily the end users but can be the ones paying for the work).16

In a moment of reflection on the ecological approach to human–machine integration,24 John Flach and Cynthia Dominguez realized that the goal for systems designers is not really to build tools that support particular users, as the UCD designation suggests, because more than one individual might use any given system.25 The goal is to provide information (possibilities for perception) and “affordances” (possibilities for action)—to support uses rather than users. Hence, we find the term Use-Centered Design. Some have suggested the term Work-Oriented Design (WOD),2,26,27 and more recently Practice-Centered Design (PCD), to clarify this subtle but important point.28–30

[Figure 2. The original Fitts’ List. Reprinted with permission from Human Engineering for an Effective Air Navigation and Traffic Control System, National Academy of Sciences, Washington, D.C., 1951. Reproduced courtesy of the National Academy Press.]
The Theme to the Contraria

The contrast of Technology-Centered and Machine-Centered Design with all the others—HCS, HCC, HCP, UCD, UCD, CCD, CCS, WOD, and PCD—shows perhaps most clearly in David Woods’ analysis of Fitts’ List, reproduced in Figure 2.31,32 This list was developed during and just after World War II by human factors psychologist Paul Fitts and others who were designing cockpits, radar devices, and the like for the US Army Air Force. Fitts’ List emphasizes the things that machines and people do well, but it clearly slants toward the view that we humans need machines to make up for our human frailties, limitations, and penchant for error. The focus of design according to this tradition is to have machines mitigate human error, emotionality, memory limitations, and so on.

Advocates of the new approaches have reacted against the Fitts’ List tradition:

It has sometimes appeared as if the central role of human factors has been to catalogue the limitations of the human information processor so that these limits could be taken into account in the design.… However, we would argue that … the human is currently the most valuable resource for linking information and action … it is not a question of protecting the system against human variability … but how to fully utilize the intelligence, skill, and imagination of the human against the complexities inherent in these domains.25

Woods formalized this view by offering a different kind of list that is much in accord with HCC advocates.33 Woods’ “Un-Fitts List” presents a rich view, one that does not concentrate on human shortcomings. Table 1 contains one version of this list that emphasizes what people do well and how they create machines to enhance those competencies. They do so, for example, by creating algorithms that are well suited to bounded conditions and thus balance people’s tendency to get stuck in “local” views and action patterns.
The idea that motivates an Un-Fitts list is this, paraphrased:

Approaches to design will not succeed if they maintain one conceptual space for the environment (machine, world) and another conceptual space for the human (information processing). Cognitive Engineering rests on a foundation of treating people and machines that do cognitive work as a single unit of analysis. Success depends on creating a conceptual space in which humans and the environment are jointly represented.25

We express the spirit of the Un-Fitts List in a nutshell by what we call the Aretha Franklin Principle, named after the singer because of her well-known recording of the song “Respect”:

Do not devalue the human in order to justify the machine. Do not criticize the machine in order to rationalize the human. Advocate the human–machine system in order to amplify both.
Rosaceae Urgentis Paniculae
Rosaceae: The rose family
Urgentis: Urgent
Paniculae: A type of flower shape, used also to denote a type of swelling; etymologically related to the word used to denote the fear induced by the Greek god Pan—hence, “panic”

Terms that belong to this next genus represent panic attacks in reaction to the perception that the engine of change (technology) is overwhelming. A clear case is the term Human-System Integration (HSI). Professional meetings that have used this term (for example, the November 2001 Human Systems Integration Symposium, sponsored by the American Society of Naval Engineers) have resounded with excited and ardent cries for the computer science community to cope with the design challenges for the next generation of systems, in which fewer people will have to do more work using more computers. A great deal of concern has been expressed over an imminent potential disaster when people (more or less poorly trained) are confronted with new and highly complex technologies (more or less human-centered) that themselves run new and highly complex systems (for example, ships to be manned by only 90 people).
Table 1. An “Un-Fitts” list.

Machines are constrained in that:
• Sensitivity to context is low and is ontology-limited; they need people to keep them aligned to the context.
• Sensitivity to change is low and recognition of anomaly is ontology-limited; they need people to keep them stable given the variability and change inherent in the world.
• Adaptability to change is low and is ontology-limited; they need people to repair their ontologies.
• They are not “aware” of the fact that the model of the world is itself in the world; they need people to keep the model aligned with the world.

People are not limited in that:
• Sensitivity to context is high and is knowledge- and attention-driven; yet they create machines to help them stay informed of ongoing events.
• Sensitivity to change is high and is driven by the recognition of anomaly; yet they create machines to help them align and repair their perceptions, because they rely on mediated stimuli.
• Adaptability to change is high and is goal-driven; yet they create machines to affect positive change following situation change.
• They are aware of the fact that the model of the world is itself in the world; yet they create machines to computationally instantiate their models of the world.

Rosaceae Foci Explicationis
Rosaceae: The rose family
Foci: Fireplace or hearth
Explicationis: Analysis or explanation

A number of terms express a particular focus point for empirical analysis. These terms fall into two subspecies. One is Rosaceae Foci Explicationis Psychologicus. Varieties in this subspecies focus on the empirical investigation of certain psychological faculties and the design of systems that support the exercise of those faculties. Decision-Centered Design (DCD), for instance, focuses the empirical analysis on revealing the decisions that domain practitioners have to make, and the information requirements for those decisions.34,35 The Psychologicus subspecies also includes varieties designated as Situation Awareness-Oriented Design (SAOD)36 and Learner-Centered Design (LCD).37

Additional varieties fall in the other subspecies, Rosaceae Foci Explicationis Individualis, which focuses on individual differences. This subspecies includes a local variant of User-Centered Design, which emphasizes such things as autoadaptive systems and expert systems. Both are systems that interact with users on the basis of models of the individual users (that is, their learning history).
Having laid out the acronyms using the rose metaphor of varieties (for terms) and genuses (for the themes and origins behind these terms), we can now attempt to clarify the “acronym soup.”
Rosaceae Foci Explicationis Pluralis
Rosaceae: The rose family
Foci: Fireplace or hearth
Explicationis: To analyze and explain
Pluralis: Plural

Our analysis leads to the question of why there are not more varieties, rather than fewer. We can easily imagine quite a few new hybrids. For instance, much research that uses methods of cognitive field research aims to reveal the mental models of domain practitioners, models of their knowledge, and models of their reasoning and strategies. So, we might have Mental Model-Oriented Design (MMOD). For the Individualis subspecies, we might invoke Individual Differences-Centered Design (IDCD) and Trainee-Centered Design (TCD). Indeed, we might create as many new varieties as there are psychological faculties to analyze.
This conclusion leads us to recognize a
problem with the acronym soup having to
do with the difference between the words
“analysis” and “design.” Empirically oriented researchers commonly generate data and identify leverage points that might lead to ideas for new tool designs, but they do not actually build tools. Indeed, so-called design activities are often analytical research activities that are intended to yield critical information for generating design ideas. This information can include things such as domain practitioners’ reasoning, where new technology might be brought to bear. Thus, some activities whose name includes “design” should really be called something using the word “analysis.” For instance, some activities that have been referred to as Decision-Centered Design should be called Decision-Centered Analysis. Some activities that are referred to as Situation Awareness-Oriented Design should be called Situation Awareness-Oriented Analysis—and so on. Furthermore, we could invoke notions such as Mental Model-Oriented Design versus Mental Model-Oriented Analysis. Each type of analysis is conducted in the service of design.

However, this raises a larger, more important question: What is the design process? Has anyone laid out a process, perhaps to complement the many detailed published descriptions of the empirical methodology that is used in analysis (for example, methods of cognitive task analysis and knowledge elicitation)?38,39 Often, individuals who have conducted analyses seem to identify a leverage point and then recognize how they might adapt a known idea or innovation to the case at hand. In short, there is no specific
design “process.”

[Figure 3. A Concept Map illustrating commonalities among the soup’s acronyms: applied cognitive science, which embraces methods from human factors engineering (ergonomics) and cognitive anthropology (ethnomethodology, sociology), takes as its unit of analysis a system of humans (agents), the workplace (including technologies), and tools (including computational tools); its goal is the creation of systems that enhance people; the pertinent cognitive processes include perception, comprehension, planning, and collaboration; and it depends on psychosocial analysis and on the evaluation of usefulness and usability, leading to the identification of leverage points for human-centered computational tools.]

What makes this issue
important is that customers in government
say that what they most desperately need is
a specification of the design process. This is
a topic to be pursued in a future column.
Rosaceae Cogitationis Multiflorae
Rosaceae: The rose family
Cogitationis: Thoughts or ideas
Multiflorae: Trans-species root stock

We return to the initial genus and the idea of a root stock—that is, that all the designations in the acronym soup have some commonalities. The Concept Map in Figure 3 depicts these commonalities: the goal, the systems stance, the cognitive processes that must be the focus for analysis, and the method. Method includes both the analytical method (cognitive field research in the service of design) and the evaluative method (that is, that new designs must be empirically evaluated for usefulness and usability).
At least one more difficulty remains. If we begin with the overall premise that design is the design of tools (including information systems) and that tools by definition involve humans-at-work, then what is actually gained by referring to either Human-Centered Design or Practice-Centered Design? After all, if the design is not for humans and does not support human practice, what good is it? Referring even to the hybrid Human Practice-Centered Design would be redundant and could be reduced to the single word “Design.” Thus we find ourselves unable to completely escape our genealogical history. The designation of Practice-Centered Design speaks to tradition in human factors engineering, mandating that we avoid all the pitfalls of Technology-Centered Design and Machine-Centered Design. The designation of Practice-Centered Design also serves as a reminder that not all the tools that we build are necessarily computational tools. The designation of Human-Centered Computing speaks to tradition in computer science, also mandating that we avoid all the pitfalls of Technology-Centered Design and Machine-Centered Design.

The Concept Map in Figure 4 places all the acronyms in the soup into a meaningful framework based on these considerations. We thus have differently hued variants of the same variety of rose. They are all rooted in the same soil. All drink the same water. All reach toward the same light. To turn a phrase, ex uno plura. From one comes many.
[Figure 4. The acronym soup formatted in a Concept Map that lays out the underlying relationships: the design of complex cognitive systems builds on Cognitive Systems Engineering, engages domain practitioners (variously referred to as constituencies, users, or clients), and depends on psychosocial analysis, whose foci (learning, situation awareness, decision making) suggest designations such as Learner-Centered, Situation Awareness-Oriented, Decision-Centered, User-Centered, Client-Centered, Customer-Centered, Participatory, and Human-Centered Design.]

Acknowledgments

We prepared this article through participation in the Advanced Decision Architectures Collaborative Technology Alliance, sponsored by the US Army Research Laboratory under cooperative
agreement DAAD19-01-2-0009.
“The Rose Family” by Robert Frost was reprinted by permission of Henry Holt & Co., which published it in The Poetry of Robert Frost, edited by Edward Connery Lathem. © 1928, 1969 by Henry Holt and Co., © 1956 by Robert Frost.
References

1. D.A. Norman, “Cognitive Engineering,” User-Centered System Design: New Perspectives on Human-Computer Interaction, D.A. Norman and S.W. Draper, eds., Lawrence Erlbaum, Hillsdale, N.J., 1986, pp. 31–61.
2. J. Rasmussen, A.M. Pejtersen, and L.P. Goodstein, Cognitive Systems Engineering, John Wiley & Sons, New York, 1994.
3. Cornell Univ., “Undergraduate Program Courses in Human-Centered Systems,” 2001; www.fci.cornell.edu/infoscience/human-systems.html.
4. “Networked Computing for the 21st Century: Human Centered Systems,” Nat’l Coordination Office for Information Technology Research and Development, 2001; www.hpcc.gov/pubs/blue99/hucs.html#technologies.
5. “Human-Centered Systems: The Next Challenge in Transportation,” US Dept. of Transportation, Washington, D.C., 1999.
6. T. Flanagan et al., eds., Human-Centered Systems: Information, Interactivity and Intelligence, tech. report, Nat’l Science Foundation, Washington, D.C., 1997, pp. 266–268.
7. R. Kling and L. Star, “Human Centered Systems in the Perspective of Organizational and Social Informatics,” Computers and Soc., vol. 28, no. 1, 1998, pp. 22–29.
8. C. Ntuen, “A Model of System Science for Human-Centered Design,” Human-Centered Systems: Information, Interactivity and Intelligence, J. Flanagan et al., eds., Nat’l Science Foundation, Washington, D.C., 1997, p. 312.
9. D.A. Norman, The Psychology of Everyday Things, Basic Books, New York, 1988.
10. W.J. Clancey, Situated Cognition: On Human Knowledge and Computer Representations, Cambridge Univ. Press, Cambridge, U.K., 1997.
11. C.M. Seifert and M.G. Shafto, “Computational Models of Cognition,” Handbook of Cognitive Neuropsychology, J. Hendler, ed., vol. 9, Elsevier, Amsterdam, 1994.
12. R.R. Hoffman, P.J. Hayes, and K.M. Ford, “Human-Centered Computing, Thinking In and Outside the Box,” IEEE Intelligent Systems, vol. 16, no. 5, Sept./Oct. 2001, pp. 76–78.
13. K.M. Ford and P. Hayes, “On Computational Wings: Rethinking the Goals of Artificial Intelligence,” Scientific Am., Winter 1998, pp. 78–83.
14. R.R. Hoffman et al., “The Triples Rule,” IEEE Intelligent Systems, vol. 17, no. 3, May/June 2002, pp. 62–65.
15. European Working Group on Human-Centered Processes, 2001; www-hcp.enst-bretagne.fr.
16. H. Beyer and K. Holtzblatt, Contextual Design: Defining Customer-Centered Systems, Academic Press, San Diego, 1998.
17. M. Cole, Y. Engeström, and O. Vazquez, eds., Mind, Culture, and Activity: Seminal Papers from the Laboratory of Comparative Human Cognition, Cambridge Univ. Press, Cambridge, U.K., 1997.
18. E. Hutchins, Cognition in the Wild, MIT Press, Cambridge, Mass., 1995.
19. J. Lave, Cognition in Practice: Mind, Mathematics, and Culture in Everyday Life, Cambridge Univ. Press, Cambridge, U.K., 1988.
20. D.A. Norman and S.W. Draper, User-Centered System Design: New Perspectives on Human-Computer Interaction, Lawrence Erlbaum, Mahwah, N.J., 1986.
21. A. Cooper, The Inmates Are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity, Sams Publishing, Indianapolis, Ind., 1999.
22. T.K. Landauer, The Trouble with Computers: Usefulness, Usability and Productivity, MIT Press, Cambridge, Mass., 1997.
23. K.L. McGraw and K. Harbison, User-Centered Requirements, Lawrence Erlbaum, Mahwah, N.J., 1997.
24. J.M. Flach et al., eds., Global Perspectives on the Ecology of Human-Machine Systems, Lawrence Erlbaum, Mahwah, N.J., 1994.
25. J.M. Flach and C.O. Dominguez, “Use-Centered Design,” Ergonomics in Design, July 1995, pp. 19–24.
26. P. Ehn, Work-Oriented Design of Computer Artifacts, Arbetslivscentrum, Stockholm, Sweden, 1988.
27. K. Vicente, Cognitive Work Analysis, Lawrence Erlbaum, Mahwah, N.J., 1999.
28. D.D. Woods, “Designs Are Hypotheses about How Artifacts Shape Cognition and Collaboration,” Ergonomics, vol. 41, 1998, pp. 168–173.
29. D.D. Woods et al., Studying Cognitive Work in Context: Facilitating Insight at the Intersection of People, Technology and Work, tech. report, Cognitive Systems Eng. Laboratory, Inst. for Ergonomics, Ohio State Univ., Columbus, Ohio, 2002; http://csel.eng.ohio-state.edu/woodscta.
30. D.D. Woods and N. Sarter, “Learning from Automation Surprises and Going Sour Accidents,” Cognitive Engineering in the Aviation Domain, N. Sarter and R. Amalberti, eds., Lawrence Erlbaum, Mahwah, N.J., 2000, pp. 327–353.
31. S.S. Potter et al., “Bootstrapping Multiple Converging Cognitive Task Analysis Techniques for System Design,” Cognitive Task Analysis, J.M. Schraagen and S.F. Chipman, eds., Lawrence Erlbaum, Mahwah, N.J., 2000, pp. 317–340.
32. D.D. Woods and D. Tinapple, “W3: Watching Human Factors Watch People at Work,” 43rd Ann. Meeting Human Factors and Ergonomics Soc., Sept. 1999; http://csel.eng.ohio-state.edu/hf99.
33. D.D. Woods, “Steering the Reverberations of Technology Change on Fields of Practice: Laws That Govern Cognitive Work,” Proc. 24th Ann. Meeting Cognitive Science Soc., Lawrence Erlbaum, Mahwah, N.J., 2002.
34. G. Klein et al., “Applying Decision Requirements to User-Centered Design,” Int’l J. Human-Computer Studies, vol. 46, 1997, pp. 1–15.
35. D.W. Klinger, “A Decision-Centered Design Approach to Case-Based Reasoning: Helping Engineers Prepare Bids and Solve Problems,” Advances in Agile Manufacturing, P.T. Kidd and W. Karwowski, eds., IOS Press, Manchester, U.K., 1994, pp. 393–396.
36. M.R. Endsley, “Designing for Situation Awareness in Complex Systems,” Proc. Second Int’l Workshop Symbiosis of Humans, Artifacts and the Environment, Japan Soc. for the Promotion of Science, Kyoto, Japan, 2001, pp. 175–190.
37. E. Soloway, M. Guzdial, and K.E. Hay, “Learner-Centered Design: The Challenge for HCI in the 21st Century,” Interactions, vol. 1, 1994, pp. 36–48.
38. R.R. Hoffman et al., “Eliciting Knowledge from Experts: A Methodological Analysis,” Organizational Behavior and Human Decision Processes, vol. 62, 1995, pp. 129–158.
39. J.M. Schraagen, S. Chipman, and V. Shalin, eds., Cognitive Task Analysis, Lawrence Erlbaum, Mahwah, N.J., 2001.
Robert R. Hoffman is a research scientist at the University of West Florida’s Institute for Human and Machine Cognition and a faculty associate at the university’s Department of Psychology. Contact him at the Inst. for Human and Machine Cognition, 40 Alcaniz St., Pensacola, FL 32501; rhoffman@ai.uwf.edu.

Paul J. Feltovich is a research scientist at the Institute for Human and Machine Cognition, University of West Florida, Pensacola. His research interests include expert–novice differences in complex cognitive skills, conceptual understanding for complex knowledge, and novel means of instruction in complex and ill-structured knowledge domains. He received a BS in mathematics from Allegheny College and a PhD in educational psychology from the University of Minnesota. Contact him at the Inst. for Human and Machine Cognition, 40 Alcaniz St., Pensacola, FL 32501; pfeltovich@ai.uwf.edu.

Kenneth M. Ford is the founder and director of the University of West Florida’s Institute for Human and Machine Cognition. Contact him at the Inst. for Human and Machine Cognition, 40 Alcaniz St., Pensacola, FL 32501; kford@ai.uwf.edu.

David D. Woods is a professor of industrial and systems engineering and codirector of the Cognitive Systems Engineering Laboratory at The Ohio State University. He received the 1995 Laurels award and the 1984 Westinghouse Engineering Achievement Award and is a fellow of the Human Factors and Ergonomics Society, the American Psychological Society, and the American Psychological Association. He received his degrees in experimental psychology from Purdue University. Contact him at Cognitive Systems Eng. Lab, Industrial and Systems Eng., 210 Baker Systems, Ohio State Univ., 1971 Neil Ave., Columbus, OH 43210; woods@csel.eng.ohio-state.edu.

Gary Klein is chief scientist of Klein Associates. His work involves recognitional decision making. He received his PhD in experimental psychology from the University of Pittsburgh. Contact him at Klein Associates, 1750 Commerce Center Blvd. North, Fairborn, OH 45324-3987; gary@decisionmaking.com.

Anne Feltovich is a graduate of the Illinois Mathematics and Science Academy and is now majoring in classics at Grinnell College, anticipating graduate studies in classical archaeology. Contact her at the Inst. for Human and Machine Cognition, attn. Paul Feltovich, 40 Alcaniz St., Pensacola, FL 32501; pfeltovich@ai.uwf.edu.