Keyboard Shortcut Users: They Are Faster at More than Just Typing
Jo Rain Jardina, S. Camille Peres, Vickie Nguyen, Ashitra Megasari, Katherine R. Griggs,
Rosalinda Pinales, & April N. Amos
University of Houston-Clear Lake
Software efficiency may be important for employees who want to be viewed as valuable assets in
a company. One efficient method people can employ is the use of the keyboard to issue
commands (KICs) because KICs are faster than other methods, such as menus or icons. Furthermore,
using KICs may reduce the risk of repetitive strain injuries (RSIs). This paper examines the
question of whether KIC users utilize different types of interaction techniques more quickly than
non-KIC users. Participants were exposed to five conditions, each consisting of different
computer tasks. One condition was used to determine KIC usage, the independent variable. The
other four conditions were used to objectively measure performance in time (i.e., efficiency), one
of the dependent variables. After each condition, participants completed the NASA-TLX survey,
which was used as a subjective measure of workload, the second dependent variable. Task
performance correlated strongly or moderately with KIC usage for all conditions, which indicates
that KIC users finished all tasks more quickly than other
users, even when they used techniques other than KICs
to accomplish those tasks. There was no relationship between KIC usage
and subjective workload.
Software usage serves various purposes in many
environments such as in educational settings and the
workplace. Employees may use software to enter and
edit data or to communicate with others (Dennerlein &
Johnson, 2006). Given the pervasive use of software,
employees who desire to improve their overall efficiency
are well served by adopting more efficient techniques
and strategies with software (Shneiderman, 1982).
Indeed, better knowledge of how to use software
efficiently may allow the user to utilize the software’s
functions to its full extent (Bhavnani & John, 2000).
However, little is known about who adopts efficient
techniques and why. Furthermore, even less is known
about how those who have adopted efficient techniques
compare to those who have not in overall productivity.
Those who want to work in an efficient manner may
benefit from using the keyboard to issue commands
(KICs) instead of the mouse. When users take advantage
of KICs they are taking advantage of efficient methods
that are faster than utilizing other methods, such as menu
and icon methods (Lane, Napier, Peres, & Sándor, 2005).
Using KICs can potentially increase a user's
productivity in a single work session relative to someone
who does not utilize KICs. Individuals who spend long
periods of time using the computer without shortcuts
face the potential of injury, such as repetitive strain
injury (RSI) (Fagarasanu & Kumar, 2003). Thus, when
an individual uses KICs, this could potentially reduce
the risk of RSIs (Dennerlein & Johnson, 2006).
Reducing injury and being a more efficient user are just
some of the advantages of KICs, but there are likely
other advantages to being a KIC user. The data presented
in this paper address the question of whether there is a
relation between someone’s usage of KICs and their
performance on other tasks when they use methods other
than KICs. Specifically, if someone is a KIC user, will
they use other methods of issuing commands (i.e., icon
issued or menu issued) more quickly than a non-KIC
user?
Thirty-three University of Houston-Clear Lake
students (12 male; age M = 27.44, SD = 7.73)
participated in the experiment. The participants were
given US$15 or 1.5 course participation credits for
participating in the experiment.
The experiment was carried out on a PC with
Windows XP, a 3GHz Pentium 4 processor, 2.0 GB
RAM, and 17” monitor. Observations were recorded
with Noldus Observer XT and uLog software. The
equipment used by the participants (which included the
computer equipment and workstation furniture) was set
up based upon current ergonomic guidelines (ANSI &
HFES, 2007).
PROCEEDINGS of the HUMAN FACTORS and ERGONOMICS SOCIETY 53rd ANNUAL MEETING—2009 975
Copyright 2009 by Human Factors and Ergonomics Society, Inc. All rights reserved. 10.1518/107118109X12524443343917
To approximate participants’ perception of the
workload they thought the different types of interaction
techniques required, the participants completed the
National Aeronautics and Space Administration Task
Load Index (NASA-TLX), a survey that measures the
amount of workload experienced during a certain task
(Hart & Staveland, 1988). The NASA-TLX was selected
because it supplies a single score and it has been used in
the past with other experiments when workload was
measured (Hart & Staveland, 1988). The single score is
based on the six subscale ratings, which include mental
demand, effort, frustration, physical demand, temporal
demand, and performance (Hart & Staveland, 1988). At
the conclusion of this experiment, participants
completed a demographic survey that included questions
about computer usage, extracurricular activities, and work-related activities.
The procedures for this experiment are the same as
those used by Peres, Nguyen, Kortum, Akladios, Wood,
and Muddimer (2009), which we will briefly describe.
The experiment lasted approximately 1.5 hours. After
reading and signing an informed consent, participants
were seated at a computer workstation within an
observation room and given instructions to carry out the
first section called command implementation. Following
the completion of the command implementation section,
the participants were given explicit, step-by-step
instructions before starting each of the four remaining
sections, which were referred to as keyboard, icon,
mouse right-click, and mouse dragging. All participants
were stopped five minutes after the start of a
section, even if they had not fully completed all the tasks
in the section.
All five of the sections within the experiment
involved editing a document in Microsoft Word 2003.
Participants were asked to perform such tasks as finding
a certain phrase and then italicizing that phrase. See
Table 1 for more detailed information on the sections.
There were 25 editing tasks for the keyboard, icon, and
mouse right-click sections, 12 tasks for the command
implementation section, and 45 related tasks for the
mouse dragging section.
Table 1: Information on how edits (e.g., Copy and Paste) were
carried out in each section.

Command implementation: Participants could choose any
methods to issue the editing commands.

Keyboard: Participants issue commands by using combinations
of keys; for instance, issuing the copy command by pressing
"Ctrl" and "C" at the same time.

Icon: Participants issue commands by clicking the appropriate
icon on the toolbar.

Mouse right-click: The mouse buttons were switched for this
section so that participants issue commands by right-clicking
on the appropriate icon.

Mouse dragging: Participants move a selection of the document
by dragging the selection. This could be done by holding down
the left or both mouse button(s) and dropping it in the desired
location by releasing the button(s).
Participants were given explicit instructions for all
sections to compensate for the differences in their
experiences with different editing tasks and to eliminate
confusion associated with performing unfamiliar tasks.
Any questions the participants had were answered before
they began their tasks. If they started to utilize the wrong
editing method, the experimenter would repeat the
instructions in order to quickly correct the error.
After each section, participants answered the NASA-
TLX, which took approximately five minutes to
complete. The survey was read to the participants and
they reported their answers verbally. After the last
NASA-TLX was administered, participants completed
the demographic survey. Finally, participants were
debriefed and dismissed.
The percentage of the total number of times
participants used the keyboard to issue commands
(KICs) during the command implementation section was
calculated for each participant. This percentage was used
as a measure of the participants' KIC usage.
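As an illustrative sketch of this measure (the log format and variable names below are our own assumption, not the authors' instrumentation), KIC usage can be computed from a per-participant record of the method used for each command:

```python
# Hypothetical command log from the command implementation section:
# each entry records which method a participant used for one command.
command_log = ["keyboard", "menu", "keyboard", "icon", "keyboard", "menu"]

def kic_usage_percent(log):
    """Percentage of commands issued via the keyboard (KICs)."""
    return 100 * log.count("keyboard") / len(log)

print(kic_usage_percent(command_log))  # 3 of 6 commands -> 50.0
```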
The time in seconds it took for participants to
complete a particular section was used as the objective
measure of their performances. This was done for all
sections except for the mouse dragging section because
none of the participants completed all tasks in this
section. Instead, the percentage of the tasks completed
was used as the performance measure. For perception of
required workload, the NASA-TLX scores for each
participant for each section were calculated. To examine
the relation of KIC usage with objective and subjective
measures of performance, all variables were correlated
using Pearson’s correlations. Demographic variables
were also correlated with the dependent and independent variables.
For this experiment we employed a cross-sectional,
correlational design. The command implementation was
the first section participants completed, which ensured
they were only using prior knowledge, and not
knowledge gained during the other sections of the
experiment. The other four sections were
counterbalanced across participants in a Latin square design.
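A minimal sketch of one way such counterbalancing can be generated; the paper does not specify which Latin square construction was used, so the cyclic scheme below is an assumption for illustration only:

```python
def latin_square(conditions):
    """Cyclic Latin square: each condition appears exactly once
    in every row (participant group) and every column (position)."""
    n = len(conditions)
    return [[conditions[(i + j) % n] for j in range(n)] for i in range(n)]

# The four counterbalanced sections from the experiment.
orders = latin_square(["keyboard", "icon", "right-click", "dragging"])
for order in orders:
    print(order)
```

Each participant group then receives one row of the square as its section order, so no single section systematically benefits from practice effects.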
Our independent variable was the percentage of KIC
usage by each participant during the command
implementation task. Our two dependent variables were
the objective performance, which is the time in seconds
it took for a participant to complete a particular section;
and the subjective impression of workload, which is the
single workload score from the NASA-TLX for each
participant from each section.
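For illustration, a simplified unweighted variant (often called Raw TLX) averages the six subscale ratings; note that the full NASA-TLX weights the subscales via pairwise comparisons, and the paper does not state which scoring variant was used, so this sketch is an assumption:

```python
# The six NASA-TLX subscales (Hart & Staveland, 1988).
SUBSCALES = ["mental", "physical", "temporal", "performance",
             "effort", "frustration"]

def raw_tlx(ratings):
    """Unweighted mean of the six 0-100 subscale ratings (Raw TLX)."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

# Hypothetical ratings for one participant on one section.
ratings = {"mental": 40, "physical": 20, "temporal": 35,
           "performance": 25, "effort": 45, "frustration": 15}
print(raw_tlx(ratings))  # (40+20+35+25+45+15)/6 = 30.0
```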
Pearson correlations were calculated to determine
the strength of the relationship between participants’
KIC usage data and the dependent variables; i.e., their
performance for each section, total NASA-TLX scores
for each section, their NASA-TLX subscale scores for
each section, demographic variables (age, sex), and
computer usage (years of personal computer experience,
hours per week of using a personal computer). In
addition, participants’ performance for each section,
demographic data, and computer usage were correlated
with their NASA-TLX section scores and subscales.
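The Pearson correlation used throughout is the standard product-moment coefficient; a minimal sketch follows, with fabricated illustrative numbers (not the study's data) showing the kind of negative usage-time relationship reported below:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Fabricated example: higher KIC usage paired with faster completion.
kic_usage = [0.10, 0.25, 0.40, 0.60, 0.90]
completion_time = [300, 280, 250, 230, 200]   # seconds
print(round(pearson_r(kic_usage, completion_time), 3))  # approx. -0.99
```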
As seen in Table 2, the correlations between
participants’ KIC usage and their section completion
results by command type are significant (all p’s < .05).
Table 2 provides further information on the correlations
and corresponding p values.
Table 2. Mean values, standard errors, Pearson correlations,
and p values for correlations between KIC usage and section
performance (completion time in seconds; mouse dragging is
percentage of tasks completed).

Section                  Mean     SE      r       p
Command implementation   223.55   58.57   -.700   .000
Icon                     230.50   48.93   -.434   .012
Keyboard                 268.63   40.93   -.744   .000
Mouse right-click        241.85   48.23   -.495   .003
Mouse dragging           28%      .071    .459    .007
Negative correlations were found between KIC
usage and command implementation, icon, keyboard,
and mouse right-click section completion times. These
negative correlations indicate that the more participants
used KICs in the command implementation section, the
faster they completed the tasks in the subsequent
sections. Correlations found between KIC usage and
command implementation and keyboard completion
times were the strongest of all the relationships examined.

Figure 1: Positive linear relationship between KIC usage and
the percentage of tasks completed by mouse dragging commands.
Additionally, a positive correlation (r(31) = .459, p <
.01) was found between KIC usage and the completion
of the mouse dragging tasks. Figure 1 shows this
positive relationship that suggests that the greater use of
KICs was related to a higher completion rate for the
mouse dragging tasks.
Negative correlations were found between KIC
usage and the participants’ NASA-TLX effort subscale
score for command implementation and icon tasks (see
Table 3). This negative relationship implies that
participants with higher percentages of KIC usage felt
that they did not have to put as much effort into their
performance in order to complete the command
implementation and icon tasks.
Table 3: Mean values, standard errors, Pearson correlations,
and p values for correlations between KIC usage and section
NASA-TLX Effort subscale scores.

Section                  Mean    SE      r       p
Command implementation   18.39   22.29   -.388   .026
Icon                     18.76   17.18   -.346   .048
Keyboard                 21.30   20.28   -.259   .145
Mouse right-click        25.58   22.64   -.171   .342
Mouse dragging           30.64   24.48   -.317   .072
The only significant correlation between
participants’ performance in a section and the NASA-
TLX subscales was for the Frustration subscale. Table 4
shows the correlations between the performance in the
five sections and this subscale. As shown in this table,
participants who spent less time completing the
command implementation section also felt less frustrated
while completing the command implementation and icon
sections.
Table 4: Mean values, standard errors, Pearson correlations,
and p values for correlations between command implementation
completion time and section NASA-TLX Frustration subscale
scores.

Section                  Mean    SE      r      p
Command implementation   4.18    7.54    .389   .025
Icon                     5.45    11.85   .355   .043
Keyboard                 11.39   17.40   .303   .087
Mouse right-click        12.33   15.17   .189   .292
Mouse dragging           15.67   23.56   .054   .767
No other significant correlations were found
between KIC usage or section completion results and
other section NASA-TLX subscale scores. Nor were
significant relationships found for demographic data or
computer usage that were relevant to our current
research questions.
It is not surprising that there is a strong negative
correlation between KIC usage and time needed to
complete the keyboard section. It is reasonable to believe
that those who used KICs in the command
implementation section would be able to complete the
KIC section in less time than the participants who
typically use other methods. It is surprising, however,
that participants with high percentages of KIC usage
tended to complete the non-keyboard tasks faster than
participants with low KIC usage. Regardless of the
reason, the evidence does support that KIC users are
faster at more than just typing.
Future research should examine why this is the case.
We found no evidence that demographics, hours per
week spent in using a computer, or number of years of
experience with computers determine KIC usage or
speed on non-keyboard tasks. However, more detailed
investigation is needed to determine how participants
regularly use their computers; those who use their
computers mostly for business-type applications may
have a different level of proficiency than those who use
them to play games. Similarly, information about hand
dexterity might be important to explain why KIC users
are faster at other computer tasks than non-KIC users.
It was also surprising to find that there is no
correlation between KIC usage and NASA-TLX scores.
This means there is no evidence to support that KIC
users experienced lower levels of workload when
completing the sections than non-KIC users. Thus,
although KIC users were more efficient at the sections
(completed them more quickly), they did not perceive
the workload as lower.
None of the measures collected for this experiment
investigated users’ comfort levels or self-efficacy with
the tasks or their perceived performance on the tasks.
This could be done using an instrument like the
Computer Anxiety Rating Scale (CARS) or a simple
self-efficacy scale. Perhaps there is a correlation
between KIC usage and self-efficacy while working with
word processors. Identifying a relationship between the
two would provide insight for instructors and employers
as to how to either identify employees who will easily
adapt to using KICs or how to train employees to use
KICs. It would also be interesting to see if similar results
extend to applications other than word processors; for
instance: spreadsheets, databases, or applications that are
less familiar to the participants.
This research is important because it demonstrates
that there is an advantage to using KICs. Because KIC
users were faster at all of the computer tasks examined, training
employees to use KICs can help them use their time
more productively when working on computers,
increasing the economic benefits to their employers.
ANSI & HFES. Human Factors Engineering of Computer
Workstations. (2007). ANSI/HFES 100-2007. Santa
Monica, CA: Human Factors and Ergonomic Society.
Bhavnani, S. K. & John, B. E. (2000). The strategic use of
complex computer systems. Human-Computer
Interaction, 15, 107-137.
Dennerlein, J. T. & Johnson, P. W. (2006). Different computer
tasks affect the exposure of the upper extremity to
biomechanical risk factors. Ergonomics, 49, 45-61.
Fagarasanu, M. & Kumar, S. (2003). Carpal tunnel syndrome
due to keyboarding and mouse tasks: A review.
International Journal of Industrial Ergonomics, 31, 119-
Hart, S. G. & Staveland, L. E. (1988). Development of NASA-
TLX (Task Load Index): Results of empirical and
theoretical research. In P. Hancock & N. Meschkati
(Eds.), Human Mental Workload (pp. 139-183).
Amsterdam, North Holland: Elsevier Science.
Herot, C. F. (1982). Graphical user interfaces. In Y. Vassiliou
(Ed.), Human Factors and Interactive Computer Systems
(pp. 83-103). Norwood, NJ: Ablex Publishing Corporation.
Lane, D. M., Napier, H. A., Peres, S. C., & Sandor, A. (2005).
Hidden costs of graphical user interfaces: Failure to make
the transition from menus and icon toolbars to keyboard
shortcuts. International Journal of Human-Computer
Interaction, 18, 133-144.
Peres, S. C., Nguyen, V., Kortum, P., Akladios, M., Wood, B.,
& Muddimer, A. (2009). Software ergonomics: Relating
subjective and objective measures. Proceedings of
SIGCHI conference on Human Factors in Computing
Systems, Boston, MA USA, ACM Press.
Shneiderman, B. (1982). The future of interactive systems and
the emergence of direct manipulation. In Y. Vassiliou
(Ed.), Human Factors and Interactive Computer Systems
(pp. 1- 27). Norwood, NJ: Ablex Publishing Corporation.
Vassiliou, Y. & Jarke, M. (1982). Query languages – A
taxonomy. In Y. Vassiliou (Ed.), Human Factors and
Interactive Computer Systems (pp. 47 - 82). Norwood,
NJ: Ablex Publishing Corporation.