The trouble with UNIX: The user interface is horrid

From Datamation November 1981
The system design is elegant but user interface is not.
by Donald A. Norman
UNIX is a highly touted operating system. De-
veloped at the Bell Telephone Laboratories
and distributed by Western Electric, it has
become a standard operating system in uni-
versities, and it promises to become a stan-
dard for micro and mini systems in homes,
small businesses, and schools. But for all of
its virtues as a system--and it is indeed an
elegant system--UNIX is a disaster for the
casual user. It fails both on the scientific prin-
ciples of human engineering and even in just
plain common sense.
If UNIX is really to become a general
system, then it has got to be fixed. I urge
correction to make the elegance of the system
design be reflected as friendliness towards the
user, especially the casual user. Although I
have learned to get along with the vagaries of
UNIX's user interface, our secretarial staff per-
sists only because we insist.
And even I, a heavy user of computer
systems for 20 years, have had difficulties:
copying the old file over the new, transferring
a file into itself until the system collapsed,
and removing all the files from a directory
simply because an extra space was typed in
the argument string. The problem is that
UNIX fails several simple tests.
• Command names, language, functions, and
syntax are inconsistent.
• The command names, formats, and syntax
seem to have no relationship to their functions.
• UNIX is a recluse, hidden from the user,
silent in operation. The lack of interaction
makes it hard to tell what state the system is
in, and the absence of mnemonic structures
puts a burden on the user's memory.
What is good about UNIX? The system
design, the generality of programs, the file
structure, the job structure, the powerful op-
erating system command language (the
"shell"). Too bad the concern for system
design was not matched by an equal concern
for the human interface.
One of the first things you learn when
you start to decipher UNIX is how to list the
contents of a file onto your terminal. Now this
sounds straightforward enough, but in UNIX
UNIX is an operating system developed by
Dennis Ritchie and Ken Thompson of Bell
Laboratories. UNIX is trademarked by Bell
Labs and is available under license from
Western Electric. Although UNIX is a rela-
tively small operating system, it is quite
powerful and general. It has found consider-
able favor among programming groups, es-
pecially in universities, where it is primarily
used with DEC computers--various versions
of the PDP-11 and the VAX. The operating
system and its software are written in a
high level programming language called C,
and most of the source code and documenta-
tion is available on-line. For programmers,
UNIX is easy to understand and to modify.
For the nonexpert programmer, the
important aspect of UNIX is that it is con-
structed out of a small, basic set of concepts
and programming modules, with a flexible
method for interconnecting existing mod-
ules to make new functions. All system ob-
jects--including all I/O channels--look like
files. Thus, it is possible to cause input and
output for almost any program to be taken
from or to go to files, terminals, or other
devices, at any time, without any particular
planning on the part of the module writer.
UNIX has a hierarchical file structure. Users
can add and delete file directories at will and
then "position" themselves at different lo-
cations in the resulting hierarchy to make it
easy to manipulate the files in the neighbor-
hood. The command interpreter of the op-
erating system interface (called the
"shell") can take its input from a file,
which means that it is possible to put fre-
quently used sequences of commands into a
file and then invoke that file (just by typing
its name), thereby executing the command
strings. In this way, the user can extend the
range of commands that are readily availa-
ble. Many users end up with a large set of
specialized shell command files. Because
the shell includes facilities for passing argu-
ments, for iterations, and for conditional
operations, these "shell programs" can do
quite a lot, essentially calling upon all sys-
tem resources (including the editors) as sub-
routines. Many nonprogrammers have dis-
covered that they can write powerful shell
programs, thus significantly enhancing the
power of the overall system.
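The kind of shell program described above can be sketched in a few lines. This is a modern Bourne-shell sketch, not code from the article; the name "countwords" and its behavior are invented for illustration. It shows the three facilities the text names: argument passing, iteration, and a conditional.

```shell
# A sketch of a small "shell program" (illustrative, not from the
# article): it takes file names as arguments, loops over them, and
# branches on whether each file exists.
countwords() {
    for f in "$@"            # iterate over the arguments passed in
    do
        if [ -f "$f" ]       # conditional: process only existing files
        then
            wc -w < "$f"     # call a system program as a "subroutine"
        else
            echo "$f: no such file"
        fi
    done
}

# demonstrate on a throwaway file
echo "one two three" > /tmp/demo.$$
countwords /tmp/demo.$$ missingfile
rm /tmp/demo.$$
```

Because the script is just another file of commands, it is invoked by name like any system program, which is exactly the extensibility the paragraph describes.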
By means of a communication chan-
nel known as a pipe, the output from one
program can easily be directed (piped) to the
input of another, allowing a sequence of
programming modules to be strung together
to do some task that in other systems would
have to be done by a special purpose pro-
gram. UNIX does not provide special pur-
pose programs. Instead, it attempts to pro-
vide a set of basic software tools that can be
strung together in flexible ways using I/O
redirection, pipes, and shell programs.
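The pipe mechanism can be seen in a classic one-liner (my example, not the article's): counting the distinct words in some input by chaining four general-purpose tools, none of which was written with this task in mind.

```shell
# Count distinct words by stringing general tools together with pipes;
# no special-purpose "count distinct words" program is needed.
echo "to be or not to be" |
    tr ' ' '\n' |    # split: one word per line
    sort |           # bring duplicate words together
    uniq |           # drop the duplicates
    wc -l            # count the lines that remain -- prints 4
```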
Technically, UNIX is just the operating sys-
tem. However, because of the way the sys-
tem has been packaged, many people use
the name to include all of the programs that
come on the distribution tape. Many people
have found it easy to modify the UNIX sys-
tem and have done so, which has resulted in
hordes of variations on various kinds of
computers. The "standard UNIX" discussed
in the article is BTL UNIX Version 6 (May
1975). The Fourth Berkeley Edition of UNIX
is more or less derived from UNIX Ver-
sion 7 (September 1978), with considerable
parallel development at the University of
California, Berkeley and some input from
other versions. I am told that some
of the complaints in the article have been
fixed; however, Version 6 is still used by
many people.
The accompanying article is written
with a heavy hand, and it may be difficult to
discern that I am a friend of UNIX. The nega-
tive tone should not obscure the beauty and
power of the operating system, file struc-
ture, and the shell. UNIX is indeed a superior
operating system. I would not use any other.
Some of the difficulties detailed result from
the fact that many of the system modules
were written by the early users of UNIX, not
by the system designers; a lot of individual
idiosyncrasies have gotten into the system.
It is my hope that the positive aspects of the
article will not be overlooked. They can be
used by all system designers, not just by
those working on UNIX.
Some other systems
need these comments a lot more than does UNIX.
even this simple operation has its drawbacks.
Suppose I have a file called "testfile." I want
to see what is inside of it. How would you
design a system to do it? I would have written
a program that listed the contents onto the
terminal, perhaps stopping every 24 lines if
you had signified that you were on a display
terminal with only a 24-line display. UNIX,
however, has no basic listing command, and
instead uses a program meant to do something
else. Thus if you want to list the contents of
a file called "HappyDays," you use the com-
mand named "cat":
cat HappyDays
Why cat? Why not? After all, as Humpty
Dumpty said to Alice, who is to be the boss,
words or us? "Cat," short for "concatenate"
as in, take file1 and concatenate it with file2
(yielding one file, with the first part file1, the
second file2) and put the result on the "stan-
dard output" (which is usually the terminal):
cat file1 file2
Obvious, right? And if you have only one file,
why cat will put it on the standard output--the
terminal--and that accomplishes the goal
(except for those of us with video terminals,
who watch helplessly as the text goes stream-
ing off the display).
The UNiX designers believe in the
principle that special-purpose functions can
be avoided by clever use of a small set of
system primitives. Why make a special func-
tion when the side effects of other functions
will do what you want? Well, for several
reasons:
• Meaningful terms are considerably easier
to learn than nonmeaningful ones. In comput-
er systems, this means that names should re-
flect function, else the names for the function
will be difficult to recall.
• Making use of the side effects of system
primitives can be risky. If cat is used unwise-
ly, it will destroy files (more on this in a
moment).
• Special functions can do nice things for
users, such as stop at the end of screens, or put
on page headings, or transform nonprinting
characters into printing ones, or get rid of
underlines for terminals that can’t do that.
Cat, of course, won’t stop at terminal or page
boundaries, because doing so would disrupt
the concatenation feature. But still, isn’t it
elegant to use cat for listing? Who needs a
print or a list command? You mean "cat"
isn’t how you would abbreviate concatenate?
It seems so obvious, just like:
cc      C compiler
cd      change working directory
passwd  change password
grep    search file for pattern
Notice the lack of consistency in forming the
command name from the function. Some
names are formed by using the first two con-
sonants of the function name. Editor, howev-
er, is "ed," concatenate is "cat," and
"date" and "echo" are not abbreviated at
all. Note how useful those two-letter abbre-
viations are. They save almost 400 millisec-
onds per command.
Similar problems exist with the names
of the file directories. UNIX is a file-oriented
system, with hierarchical directory struc-
tures, so the directory names are very impor-
tant. Thus, this paper is being written on a file
named "unix" and whose "path" is /csl/
norman/papers/CogEngineering/unix. The
name of the top directory is "/", and csl,
norman, papers, and CogEngineering are the
names of directories hierarchically placed be-
neath "/". Note that the symbol "/" has two
meanings: the name of the top level directory
and the symbol that separates levels of the
directories. This is very difficult to justify to
new users. And those names: the directory for
"users" and "mount" are called, of course,
"usr" and "mnt." And there are "bin,"
"lib," and "tmp" (binary, library, and
temporary). UNIX loves abbreviations, even
when the original name is already very short.
To write "user" as "usr" or "temp" as "tmp"
saves an entire letter: a letter a day must keep
the service person away. But UNIX is inconsis-
tent; it keeps "grep" at its full four letters,
when it could have been abbreviated as "gr"
or "gp." (What does grep mean? "Global
REgular expression, Print"--at least that’s
the best we can invent; the manual doesn’t
even try. The name wouldn’t matter if grep
were something obscure, hardly ever used,
but in fact it is one of the more powerful,
frequently used string processing com-
mands.)
Another important routine
goes by the name of
"dsw." Suppose you acci-
dentally create a file whose
name has a nonprinting character in it. How
can you remove it? The command that lists the
files on your directory won’t show nonprinting
characters. And if the character is a space (or
worse, a "*"), "rm" (the program that re-
moves files) won't accept it. The name "dsw"
was evidently written by someone at Bell Labs
who felt frustrated by this problem and hacked
up a quick solution. Dsw goes to each file in
your directory and asks you to respond "yes"
or "no," whether to delete the file or keep it.
How do you remember dsw? What on
earth does the name stand for? The UNIX
people won't tell; the manual smiles the wry
smile of the professional programmer and
says, "The name dsw is a carryover from the
ancient past. Its etymology is amusing."
Which operation takes place if you say
"yes"? Why, the file is deleted of course. So
if you go through your files and see impor-
tant-file, you nod to yourself and say, yes, I
had better keep that one. You type in "yes,"
and destroy it forever. There’s no warning;
dsw doesn’t even document itself when it
starts, to remind you of which way is which.
Berkeley UNIX has finally killed dsw, saying
"This little known, but indispensable facility
has been taken over..." That is a fitting
commentary on standard UNIX, a system that
allows an "indispensable facility" to be "lit-
tle known."
The symbol "*" means "glob" (a
typical UNIX name: the name tells you just
what it does, right?). Let me illustrate with
our friend, "cat." Suppose I want to collect a
set of files named paper.1 paper.2 paper.3
and paper.4 into one file. I can do this with
cat paper.1 paper.2 paper.3 paper.4>newfilename
UNIX provides "glob" to make the job even
easier. Glob means to expand the filename by
examining all files in the directory to find all
that fit. Thus, I can redo my command:
cat paper*>newfilename
where paper* expands to {paper.1 paper.2
paper.3 paper.4}. This is one of the typical
virtues of UNIX; there are a number of quite
helpful functions. But suppose I had decided
to name this new file "paper.all"--a pretty
logical name:
cat paper*>paper.all
Disaster. In this case, paper* expands to pa-
per.1 paper.2 paper.3 paper.4 paper.all, and
so I am filling up a file from itself:
cat paper.1 paper.2 paper.3 paper.4
paper.all>paper.all
Eventually the file will burst. Does UNIX
check against this, or at least give a warning?
No such luck. The manual doesn’t alert users
to this either, although it does warn of anoth-
er, related infelicity: "Beware of ’cat a b > a’
and ’cat b a > a’, which destroy the input files
before reading them." Nice of them to tell us.
The command to remove all files that
start with the word "paper"
rm paper*
becomes a disaster if a space gets inserted by mistake:
rm paper *
for now the file "paper" is removed, as well
as every file in the entire directory (the power
of glob). Why is there not a check against
such things? I finally had to alter my version
of rm so that when I said to remove files, they
were moved to a special directory named
"deleted" and preserved there until I logged
off, leaving me lots of time for second
thoughts and catching errors. This illustrates
the power of UNIX: what other operating sys-
tem would make it so easy for someone to
completely change the operation of a system
command? It also illustrates the trouble with
UNIX: what other operating system would
make it so necessary to do so? (This is no
longer necessary now that we use Berkeley
UNIX--more on this in a moment.)
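The protective rm described above can be sketched as a small shell function. This is a reconstruction for illustration only, not the author's actual code; the name "saferm" and the use of a "deleted" directory under $HOME are my assumptions based on his description.

```shell
# Sketch of a protective "remove" (reconstruction, not the author's
# code): files are moved into a holding directory instead of being
# destroyed, leaving time for second thoughts and catching errors.
saferm() {
    mkdir -p "$HOME/deleted"        # create the holding directory if needed
    mv -- "$@" "$HOME/deleted/"     # move, rather than delete, the files
}
```

A cleanup step run at logout would then empty the holding directory for real, matching the behavior the author describes.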
The standard text editor is
called Ed. I spent a year
using it as a vehicle to see how people
deal with such confusing things. Ed's major
property is his shyness; he doesn't like to talk.
You invoke Ed by saying, reasonably
enough, "ed." The result is silence: no re-
sponse, no prompt, no message, just silence.
Novices are never sure what that silence
means. Ed would be a bit more likable if he
answered, "thank you, here I am," or at least
produced a prompt character, but in UNIX
silence is golden. No response means that
everything is okay; if something had gone
wrong, it would have told you.
Then there is the famous append mode
error. To add text into the buffer, you have to
enter "append mode." To do this, you sim-
ply type "a," followed by RETURN. Now
everything that is typed on the terminal goes
into the buffer. (Ed, true to form, does not
inform you that it is now in append mode;
when you type "a" followed by RETURN,
the result is silence.) When you are done
adding text, you are supposed to type a line
that contains only a "." on it. This gets you
out of append mode.
Want to bet on how many extra peri-
ods got inserted into text files, or how many
commands got inserted into texts, because the
users thought that they were in command
mode and forgot that they had not left append
mode? Does Ed tell you when you have left
append mode? Hah! This problem is so obvi-
ous that even the designers recognized it, but
their reaction, in the tutorial introduction to
Ed, was merely to note wryly that even expe-
rienced programmers make this mistake.
While they may be able to see humor in the
problem, it is devastating to the beginning
secretary, research assistant, or student trying
to use UNIX as a word processor, an experi-
mental tool, or just to learn about computers.
How good is your sense of humor?
Suppose you have been working on a file for
an hour and then decide to quit work, exiting
Ed by saying "q." The problem is that Ed
would promptly quit. Woof, there went your
last hour’s work. Gone forever. Why, if you
had wanted to save it you would have said so,
right? Thank goodness for all those other peo-
ple across the country who immediately re-
wrote the text editor so that we normal people
(who make errors) have some other choices
besides Ed, editors that tell you politely when
they are working, that tell you if they are in
append or command mode, and that don't let
you quit without saving your file unless you
are first warned, and then only if you say you
really mean it.
As I wrote this paper I sent out a
message on our networked message system
and asked my colleagues to tell me of their
favorite peeves. I got a lot of responses, but
there is no need to go into detail about them;
they all have much the same flavor, mostly
complaining about the lack of consistency
and the lack of interactive feedback. Thus,
there is no standardization of means to exit
programs (and because the "shell" is just
another program as far as the system is con-
cerned, it is very easy to log yourself off the
system by accident). There are very useful
pattern matching features (such as the "glob"
function), but the different programs use the
symbols in inconsistent ways. The UNIX
copy command (cp) and the
related C programming language "string-
copy" (strcpy) reverse the meaning of their
arguments, and UNIX move (mv) and copy
(cp) operations will destroy existing files
without any warning. Many programs take
special "argument flags," but the manner of
specifying flags is inconsistent, varying
Prof. Norman praises the UNIX system de-
sign but makes a number of caustic remarks
about command names and other aspects of
the human interface. These might be ig-
nored, since he has no experimental tests to
justify them; or they might even be taken as
flattery of UNIX,
since he does not name any
system he likes better; but some of his
comments are worth discussing.
Most of the command names Nor-
man points to are indeed strange; some,
such as dsw, were removed several years
ago (by the way, to repair the discourtesy of
the rnanual, dsw meant "delete from
switches"). However, it is not clear that it
makes much difference what the command
names are. T. K. Landauer, K. Galotti, and
S. Hartwell recently tried teaching people a
version of the editor in which "append,"
"delete," and "substitute" were called
"allege," "cypher," and "deliberate." It
didn’t seem to have much effect on learning
time, and afterwards the users would say
things like "I alleged three lines and delib-
erated a comma on the last one" just like
subjects who had learned the ordinary ver-
sion of the editor ("A Computer Command
By Any Other Name: A Study of Text Edit-
ing Terms," available from the authors at
Bell Labs.)
In addition to the amusing but sec-
ondary discussion of command names,
Prof. Norman does raise some significant
issues: (1) whether systems should be ver-
bose or terse; (2) whether they should have a
few general commands or many special-
purpose ones; and (3) whether they should
try to anticipate typical mistakes. Experi-
mental results on these issues would be wel-
come; meanwhile, the armchair evidence is
not all on one side.
UNIX is undoubtedly near an extreme
of terseness, partly because it was originally
designed for slow hardcopy terminals.
However, the terseness is very valuable
when connecting processes. If the com-
mand that lists the logged-on users prints a
heading above the list, you can’t tell how
many users are on by feeding the command
output to a line counter. If the editor types
acknowledgments now and then, its output
may not be directly usable as input some-
where else. Of course, you could feed it
through something which strips off the extra
remarks, but presumably that program
would add its own chatty messages.
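Lesk's point can be seen in one line: because the user-listing command prints nothing but the data, its output can feed a line counter directly. Here the user list is simulated with printf, since the real "who" output depends on who happens to be logged in.

```shell
# Simulated "who" output piped into a line counter: the count is the
# number of users only because there is no heading line to miscount.
printf 'alice\nbob\ncarol\n' | wc -l    # prints 3
```

A command that printed a heading above the list would make the count off by one, and every consumer of its output would need to strip the heading first.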
Prof. Norman complains about us-
ing "cat" for a command which prints files,
rather than having a special-purpose com-
mand for the purpose (there is one, by the
way: "pg"). Having a few general-purpose
commands is a definite aid to system learn-
ing. In practice, it is not the novices who use
the alternatives to "cat"; it is the experts,
who want something better adapted to their
special needs and are willing to learn anoth-
er command. In general, people are quite
good at recognizing special uses of com-
mands in context, probably because it is a
lot like things they have to do every day in
English. To take an analogy from program-
ming languages, one doubts that Prof. Nor-
man would advocate a separate operator for
"+" in integer arithmetic and "+" in
floating point arithmetic. There are many
advantages to a small, general-purpose set
of commands. Having only one way to do
any given task minimizes software mainte-
nance while maximizing the ability of two
users to help each other with advice. But
this implies that whenever a general com-
mand and a specific command do the same
thing, the specific command should be re-
moved. It would be a definite service if the
"cognitive engineers" could tell us how
many commands are .reasonable, to give
some guidance on, for example, whether
"merge" should be a separate command or
an option on "sort" (on UNIX it is a sort
option) and whether the terminal drivers
should be separate commands or options on
a graphics output command (on UNIX they
are separate). The best rule of thumb we
have today is that designing the system so
that the manual will be as short as possible
minimizes learning effort.
Prof. Norman seems to think that the
computer should try to anticipate user prob-
lems, and refuse commands that appear
dangerous. The computer world is undoubt-
edly moving in this direction; strong typing
in programming languages is a good exam-
ple. The "ed" editor has warned for some
years if the user tries to quit without writing
a file. The "vi" editor has an "undo" fea-
ture, regardless of the complexity of the
command which has been executed. Such a
facility is undoubtedly the best solution. It
lets the user recognize his mistakes and back
out of them, rather than expecting the sys-
tem to foresee them. It is really not possible
to anticipate the infinite variety of possible
user mistakes; as every programmer who
has ever debugged anything knows, it is
hard enough to deal with the correct inputs
to a program. Human hindsight is undoubt-
edly better than machine foresight.
A large number of Prof. Norman's
comments are pleas for consistency. UNIX
has grown more than it has been built, with
many people from many places tossing soft-
ware into the system. The ability of the
system to accept commands so easily is one
of its main strengths. However, it results in
command names like "finger" for what
Bell Labs called "whois" (identify a user)
and "more," "cat," or "pg" for what
Prof. Norman would rather call "list." The
thought of a
UNIX Command Standardiza-
tion Committee trying to impose rules on
names is a frightening alternative. Much of
the attractiveness of UNIX derives from its
hospitality to new commands and features.
This has also meant a diversity of names and
styles. To some of us this diversity is attrac-
tive, while to others the diversity is frustrat-
ing, but to hope for the hospitality without
the diversity is unrealistic.
--Michael Lesk
Murray Hill, N.J.
from program to program.
The version of UNIX I now use is
called the Fourth Berkeley Edition for the
VAX, distributed by Joy, Babaoglu, Fabry,
and Sklower at the University of California,
Berkeley (henceforth, Berkeley UNIX). This
is both good and bad.
Among the advantages: History lists,
aliases, a richer and more intelligent set of
system programs (including a list program, an
intelligent screen editor, an intelligent set of
routines for interacting with terminals accord-
ing to their capabilities), and a job control that
allows one to stop jobs right in the middle,
start up new ones, move things from back-
ground to foreground (and vice versa), exam-
ine files, and then resume jobs. The shell has
been amplified to be a more powerful pro-
gramming language, complete with file han-
dling capabilities, if--then--else statements,
while, case, and other goodies of structured
programming (see box, p. 130).
Aliases are worthy of special com-
ment. Aliases let users tailor the system to
their own needs, naming things in ways they
can remember; names you devise yourself are
easier to recall than names provided to you.
And aliases allow abbreviations that are
meaningful to the individual, without burden-
ing everyone else with your cleverness.
To work on this paper, I need only
type the word "unix," for I have set up an
alias called "unix" that is defined to be equal
to the correct command to change directories,
combined with a call to the editor (called
"vi" for "visual" on this system) on the file:
alias unix "chdir /csl/norman/papers/CogEngineering; vi unix"
These Berkeley UNIX features have proven to
be indispensable: the people in my laboratory
would probably refuse to go back to standard
UNIX.
The bad news is that Berkeley UNIX is
jury-rigged on top of regular UNIX, so it can
only patch up the faults: it can’t remedy them.
Grep is not only still grep, but there is an
egrep and an fgrep.
And the generators of Berkeley UNIX
have their problems: if Bell Labs people are
smug and lean, Berkeley people are cute and
overweight. Programs are wordy. Special
features proliferate. The system is now so
large that it no longer fits on the smaller
machines: our laboratory machine, a DEC 11/
45, cannot hold the latest release of Berkeley
UNIX (even with a full complement of mem-
ory and a reasonable amount of disk). I wrote
this paper on a VAX.
Learning the system for
setting up aliases is not
easy for beginners, who
may be the people who
need them most. You have to set them up in a
file called .cshrc, not a name that inspires
confidence. The "period" in the filename
means that it is invisible--the normal method
of directory listing programs won’t show it.
The directory listing program, Is, comes with
19 possible argument flags, which can be
used singly or in combinations. The number
of special files that must be set up to use all the
facilities is horrendous, and they get more
complex with each new release from Berke-
ley. It is very difficult for new users. The
program names are cute rather than systemat-
ic. Cuteness is probably better than standard
UNIX’s lack of meaning, but there are limits.
The listing program is called "more" (as in,
"give me more"), the program that tells you
who is on the system is called "finger," and a
keyword help file--most helpful, by the
way--is called "apropos." I used the alias
feature to rename it "help."
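The renaming just described is a one-line entry in the .cshrc file mentioned earlier. The lines below are illustrative of that kind of setup (csh alias syntax); the project alias repeats the one quoted earlier in the article.

```shell
# Illustrative .cshrc entries (csh alias syntax; a config fragment).
alias help apropos          # give the keyword help program a memorable name
alias unix "chdir /csl/norman/papers/CogEngineering; vi unix"
```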
One reader of a draft of this paper--a
systems programmer--complained bitterly:
"Such whining, hand-wringing, and general
bitchiness will cause most people to dismiss it
as over-emotional nonsense.... The UNIX
system was originally designed by systems
programmers for their own use and with no
intention for others using it. Other hackers
liked it so much that eventually a lot of them
started using it. Word spread about this won-
derful system, and the rest you probably
know. I think that Ken Thompson and Dennis
Ritchie could easily shrug their shoulders and
say ’But we never intended it for other than
our personal use.’ "
This complaint was unique, and I
sympathize with its spirit. It should be re-
membered, though, that UNIX is nationally
distributed under strict licensing agreements.
Western Electric’s motives are not altogether
altruistic. If UNIX had remained a simple ex-
periment on the development of operating
systems, then complaints could be made in a
more friendly, constructive manner. But UNIX
is more than that. It is taken as the very model
of a proper operating system. And that is
exactly what it is not.
In the development of the system as-
pects of UNIX, the designers have done a mag-
nificent job. They have been creative, and
systematic. A common theme runs through
the development of programs, and by means
of their file structure, the development of
"pipes" and "redirection" of both input and
output, plus the power of the iterative "shell"
system-level commands, one can easily com-
bine system level programs into self-tailored
systems of remarkable power. For system
programmers, UNIX is a delight. It is well
structured, with a consistent, powerful phi-
losophy of control and structure.
Why was the same effort not put into
the design at the level of the user? The answer
is complex, but one reason is the fact that
there really are no well known principles of
design at the level of the user interface. So, to
remedy the harm I may have caused with my
heavy-handed sarcasm, let me attempt to pro-
vide some positive suggestions based upon
research conducted by myself and others into
the principles of the human information pro-
cessing system.
Cognitive engineering is a new disci-
pline, so new that it doesn’t exist, but it ought
to. Quite a bit is known about the human
information processing system, enough that
we can specify some basic principles for de-
signers. People are complex entities and can
adapt to almost anything. As a result, design-
ers often design for themselves, without re-
gard for other kinds of users.
The three most important concepts for
system design are these:
1. Be consistent. A fundamental set
of principles ought to be evolved and fol-
lowed consistently throughout all phases of
the design.
2. Provide the user with an explicit
model. Users develop mental models of the
devices with which they interact. If you do
not provide them with one, they will make
one up themselves, and the one they create is
apt to be wrong.
Do not count on the user fully under-
standing the mechanics of the device. Both
secretaries and scientists may be ignorant of
the difference between the buffer, the work-
ing memory, the working files, and the per-
manent files of a text editor. They are apt to
believe that once they have typed something
into the system, it is permanently in their
files. They are apt to expect more intelligence
from the system than the designer knows is
there. And they are apt to read into comments
(or the lack of comments) more than you have
intended.
Feedback is of critical importance in
helping establish the appropriate mental mod-
el and in letting the user keep its current state
in synchrony with the actual system.
3. Provide mnemonic aids. For most
purposes it is convenient to think of human
memory as consisting of two parts: a short-
term memory and a long-term memory (mod-
ern cognitive psychology is developing more
sophisticated notions, but this is still a valid
approximation). Five to seven items is about
the limit for short-term memory. Thus, do not
expect a user to remember the contents of a
message for much longer than it is visible on
the terminal. Long-term memory is robust,
but it faces two difficulties: getting stuff in so
that it is properly organized, and getting stuff
out when it is needed. Learning is difficult,
unless there is a good structure and it is visible
to the learner.
There are lots of sensible memory aids
that can be provided, but the most powerful
and sensible of all is understanding. Make the
command names describe the function that is
desired. If abbreviations must be used, adopt
a consistent policy of forming them. Do not
deviate from the policy, even when it appears
that a particular command warrants doing so.
System designers take note. Design
the system for the person, not for the comput-
er, not even for yourself. People are also
information processing systems, with varying
degrees of knowledge and experience.
Friendly systems treat users as normal, intel-
ligent adults who are sometimes forgetful and
are rarely as knowledgeable about the world
as they would like to be. There is no need to
talk down to the user, nor to explain every-
thing. But give the users a share in under-
standing by presenting a consistent view of
the system. Their response will be your re-
ward.
Partial research support was provided by
Contract N00014-79-C-0323, NR 157-437
with the Personnel and Training Research
Programs of the Office of Naval Research,
and was sponsored by the Office of Naval
Research and the Air Force Office of Scientif-
ic Research. I thank the members of the
research group for their helpful suggestions
and descriptions of misery. In particular, I
wish to thank Phil Cohen, Tom Erickson,
Jonathan Grudin, Henry Halff, Go
man, and Mark Wallen for their analysis of
Gary Perlman and Mark Wallen pro-
vided a number of useful suggestions.
Donald A. Norman is professor of psy-
chology and director of the program in
cognitive science at the University of
California, San Diego. He has degrees
in electrical engineering from MIT and
the University of Pennsylvania, and a
doctorate in psychology from the Uni-
versity of Pennsylvania. He is the author
of seven books, including Human
Information Processing, Academic
Press, N.Y., 1977.