Keyboard Shortcut Users: They Are Faster at More than Just Typing
Jo Rain Jardina, S. Camille Peres, Vickie Nguyen, Ashitra Megasari, Katherine R. Griggs,
Rosalinda Pinales, & April N. Amos
University of Houston-Clear Lake
Houston, Texas
Software efficiency may be important for employees who want to be viewed as valuable assets in a company. One efficient method people can employ is using the keyboard to issue commands (KICs), because KICs are faster than other methods such as menus or icons. Furthermore, using KICs may reduce the risk of repetitive strain injuries (RSIs). This paper examines whether KIC users utilize different types of interaction techniques more quickly than non-KIC users. Participants were exposed to five conditions, each consisting of different computer tasks. One condition was used to determine KIC usage, the independent variable. The other four conditions were used to objectively measure performance in time (i.e., efficiency), one of the dependent variables. After each condition, participants completed the NASA-TLX survey, which served as a subjective measure of workload, the second dependent variable. Task performance correlated strongly or moderately with KIC usage for all conditions, indicating that KIC users finished all tasks more quickly than other users, even when they used techniques other than KICs to accomplish those tasks. There was no relationship between KIC usage and subjective workload.
INTRODUCTION
Software usage serves various purposes in many environments, such as educational settings and the
workplace. Employees may use software to enter and
edit data or to communicate with others (Dennerlein &
Johnson, 2006). Given the pervasive use of software,
employees who desire to improve their overall efficiency
are well served by adopting more efficient techniques
and strategies with software (Shneiderman, 1982).
Indeed, better knowledge of how to use software efficiently may allow the user to utilize the software's functions to their full extent (Bhavnani & John, 2000).
However, little is known about who adopts efficient
techniques and why. Furthermore, even less is known
about how those who have adopted efficient techniques
compare to those who have not in overall productivity.
Those who want to work in an efficient manner may
benefit from using the keyboard to issue commands
(KICs) instead of the mouse. KICs are faster than other methods of issuing commands, such as menus and icon toolbars (Lane, Napier, Peres, & Sándor, 2005), so using them can increase a user's productivity in a single sitting relative to a user who relies on other methods. In addition, individuals who spend long periods of time using the computer without shortcuts face the potential of injury, such as repetitive strain injury (RSI) (Fagarasanu & Kumar, 2003); using KICs could therefore reduce the risk of RSIs (Dennerlein & Johnson, 2006).
Reducing injury and being a more efficient user are just
some of the advantages of KICs, but there are likely
other advantages to being a KIC user. The data presented
in this paper address the question of whether there is a
relation between someone’s usage of KICs and their
performance on other tasks when they use methods other
than KICs. Specifically, if someone is a KIC user, will
they use other methods of issuing commands (i.e. icon
issued or menu issued) more quickly than a non-KIC
user?
METHODS
Participants
Thirty-three University of Houston-Clear Lake students (12 male; age M = 27.44 years, SD = 7.73) participated in the experiment. Participants received USD 15 or 1.5 course participation credits for their participation.
Equipment
The experiment was carried out on a PC with
Windows XP, a 3GHz Pentium 4 processor, 2.0 GB
RAM, and 17” monitor. Observations were recorded
with Noldus Observer XT and uLog software. The
equipment used by the participants (which included the
computer equipment and workstation furniture) was set up in accordance with current ergonomic guidelines (ANSI & HFES, 2007).
Instruments
To assess the workload that participants perceived each type of interaction technique to require, participants completed the National Aeronautics and Space Administration Task Load Index (NASA-TLX), a survey that measures the workload experienced during a given task (Hart & Staveland, 1988). The NASA-TLX was selected because it supplies a single score and has been widely used in previous experiments measuring workload (Hart & Staveland, 1988). The single score is based on six subscale ratings: mental demand, effort, frustration, physical demand, temporal demand, and performance (Hart & Staveland, 1988). At the conclusion of the experiment, participants completed a demographic survey that included questions about computer usage, extracurricular activities, and work-related activities.
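The single NASA-TLX score mentioned above is derived from the six subscale ratings. As an illustration only (the exact scoring procedure used in this study is not specified here), the sketch below computes the simplest common composite, an unweighted mean of the six ratings on a 0-100 scale; Hart and Staveland's full procedure can also weight the subscales using pairwise comparisons.

```python
# Illustrative sketch only: an unweighted ("raw TLX") composite score, i.e., the
# mean of the six subscale ratings, each assumed to be on a 0-100 scale. The
# weighted NASA-TLX procedure (pairwise-comparison weights) is not shown here.

SUBSCALES = ("mental_demand", "effort", "frustration",
             "physical_demand", "temporal_demand", "performance")

def raw_tlx(ratings: dict) -> float:
    """Return the unweighted composite workload score for one participant and section."""
    missing = [s for s in SUBSCALES if s not in ratings]
    if missing:
        raise ValueError(f"missing subscale ratings: {missing}")
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

# Hypothetical ratings for one participant on one section:
example = {"mental_demand": 40, "effort": 30, "frustration": 10,
           "physical_demand": 15, "temporal_demand": 35, "performance": 20}
print(raw_tlx(example))  # -> 25.0
```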
Procedure
The procedures for this experiment were the same as those used by Peres, Nguyen, Kortum, Akladios, Wood, and Muddimer (2009); we describe them briefly here.
The experiment lasted approximately 1.5 hours. After
reading and signing an informed consent, participants
were seated at a computer workstation within an
observation room and given instructions to carry out the
first section called command implementation. Following
the completion of the command implementation section,
the participants were given explicit, step-by-step
instructions before starting each of the four remaining
sections, which were referred to as keyboard, icon,
mouse right-click, and mouse dragging. All participants
were stopped after five minutes from the start of a
section, even if they had not fully completed all the tasks
in the section.
All five sections of the experiment involved editing a document in Microsoft Word 2003. Participants were asked to perform tasks such as finding a certain phrase and then italicizing it. See Table 1 for more detailed information on the sections.
There were 25 editing tasks for the keyboard, icon, and
mouse right-click sections, 12 tasks for the command
implementation section, and 45 related tasks for the
mouse dragging section.
Table 1: Information on how edits were carried out in each section

Section | Input Device | Editing Methods (e.g., Copy and Paste)
Command Implementation | Participant's choice | Participants could choose any method to issue the editing commands.
Keyboard | Keyboard | Participants issued commands using keystroke combinations, for instance issuing the copy command by pressing "Ctrl" and "C" at the same time.
Icon | Mouse | Participants issued commands by left-clicking the appropriate icon on the icon toolbar.
Mouse Right-Click | Mouse | The mouse button functions were switched for this section, so participants issued commands by right-clicking the appropriate icon on the icon toolbar.
Mouse Dragging | Mouse | Participants moved a selection of the document by dragging it, holding down the left or both mouse button(s) and dropping the selection at the desired location by releasing the button(s).
Participants were given explicit instructions for all
sections to compensate for the differences in their
experiences with different editing tasks and to eliminate
confusion associated with performing unfamiliar tasks.
Any questions the participants had were answered before
they began their tasks. If they started to utilize the wrong
editing method, the experimenter would repeat the
instructions in order to quickly correct the error.
After each section, participants answered the NASA-
TLX, which took approximately five minutes to
complete. The survey was read to the participants and
they reported their answers verbally. After the last
NASA-TLX was administered, participants completed
the demographic survey. Finally, participants were
debriefed and dismissed.
Measures
For each participant, we calculated the percentage of commands that were issued with the keyboard (KICs) during the command implementation section. This percentage served as the measure of the participant's KIC usage.
The time in seconds it took participants to complete a section was used as the objective measure of performance for all sections except mouse dragging; because no participant completed all of the mouse dragging tasks, the percentage of tasks completed was used as the performance measure for that section. For the perception of required workload, the NASA-TLX scores for each participant for each section were calculated. To examine the relation of KIC usage with the objective and subjective measures of performance, all variables were correlated using Pearson's correlations. Demographic variables were also correlated with the dependent and independent variables.
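As a rough illustration of the two computations described above (not the authors' actual analysis code), the sketch below derives a KIC-usage percentage for each participant and correlates it with a section performance measure using Pearson's r; all names and values are hypothetical.

```python
# Illustrative sketch with hypothetical data: compute each participant's KIC-usage
# percentage from the command implementation section, then correlate it with the
# time (in seconds) taken to finish another section using Pearson's r.
from scipy.stats import pearsonr

def kic_usage_percent(keyboard_commands: int, total_commands: int) -> float:
    """Percentage of commands issued via the keyboard in the command implementation section."""
    return 100.0 * keyboard_commands / total_commands

# Hypothetical counts (keyboard-issued, total) for four participants:
command_counts = [(10, 12), (2, 12), (6, 12), (0, 12)]
kic_usage = [kic_usage_percent(k, t) for k, t in command_counts]

# Hypothetical completion times for the icon section, in seconds:
icon_section_time = [180.0, 290.0, 230.0, 310.0]

r, p = pearsonr(kic_usage, icon_section_time)
print(f"r = {r:.3f}, p = {p:.3f}")  # a negative r means more KIC use goes with faster completion
```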
Design
For this experiment we employed a cross-sectional,
correlational design. The command implementation was
the first section participants completed, which ensured
they were only using prior knowledge, and not
knowledge gained during the other sections of the
experiment. The other four sections were counterbalanced across participants in a Latin Square design.
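As a sketch of the general counterbalancing technique (the specific square used in the study is not reported), a cyclic Latin square assigns each participant one of four section orders so that every section appears once in each ordinal position across a block of four participants.

```python
# Illustrative sketch of cyclic Latin-square counterbalancing for the four
# instructed sections; the actual square used in the study is not reported.
SECTIONS = ["Keyboard", "Icon", "Mouse Right-Click", "Mouse Dragging"]

def section_order(participant_index: int) -> list:
    """Return the section order for a participant (0-based index), rotating cyclically."""
    shift = participant_index % len(SECTIONS)
    return SECTIONS[shift:] + SECTIONS[:shift]

for i in range(4):
    print(i, section_order(i))
# 0 ['Keyboard', 'Icon', 'Mouse Right-Click', 'Mouse Dragging']
# 1 ['Icon', 'Mouse Right-Click', 'Mouse Dragging', 'Keyboard']
# 2 ['Mouse Right-Click', 'Mouse Dragging', 'Keyboard', 'Icon']
# 3 ['Mouse Dragging', 'Keyboard', 'Icon', 'Mouse Right-Click']
```

Note that a simple cyclic square controls for serial position but not for immediate carry-over between specific pairs of sections; a balanced (Williams) square would be needed for that.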
Our independent variable was the percentage of KIC
usage by each participant during the command
implementation task. Our two dependent variables were
the objective performance, which is the time in seconds
it took for a participant to complete a particular section;
and the subjective impression of workload, which is the
single workload score from the NASA-TLX for each
participant from each section.
RESULTS
Pearson correlations were calculated to determine
the strength of the relationship between participants’
KIC usage data and the dependent variables, i.e., their
performance for each section, total NASA-TLX scores
for each section, their NASA-TLX subscale scores for
each section, demographic variables (age, sex), and
computer usage (years of personal computer experience,
hours per week of using a personal computer). In
addition, participants’ performance for each section,
demographic data, and computer usage were correlated
with their NASA-TLX section scores and subscales.
As seen in Table 2, the correlations between participants' KIC usage and their performance in each section were significant (all ps < .05). Table 2 reports the correlations and corresponding p values.
Table 2: Mean values, standard errors, Pearson correlations, and p values for correlations between KIC usage and section performance. (Mouse Dragging performance is the percentage of tasks completed rather than time in seconds.)

Section | Performance | St. Error | Correlation with KIC Usage | p
Command Implementation | 223.55 s | 58.57 | -.700 | .000
Icon | 230.50 s | 48.93 | -.434 | .012
Keyboard | 268.63 s | 40.93 | -.744 | .000
Mouse Right-Click | 241.85 s | 48.23 | -.495 | .003
Mouse Dragging | 28% | .071 | .459 | .007
Negative correlations were found between KIC
usage and command implementation, icon, keyboard,
and mouse right-click section completion times. These
negative correlations indicate that the more participants
used KICs in the command implementation section, the
faster they completed the tasks in these sections. The correlations between KIC usage and
command implementation and keyboard completion
times were the strongest out of all relationships
examined.
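For scale, squaring the strongest of these coefficients gives the shared variance:

```latex
r^{2} = (-.744)^{2} \approx .55
```

That is, roughly half of the between-participant variance in keyboard-section completion time is associated with KIC usage.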
Figure 1: Positive linear relationship between KIC usage and the amount of tasks completed by mouse dragging commands. [Scatterplot; x-axis: Percent KIC Usage, 0.0-1.0; y-axis: Dragging Completed Tasks (percent), 0.0-0.5.]
Additionally, a positive correlation (r(31) = .459, p <
.01) was found between KIC usage and the completion
of the mouse dragging tasks. Figure 1 shows this
positive relationship that suggests that the greater use of
KICs was related to a higher completion rate for the
mouse dragging tasks.
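As a check on the reported statistics, the p value for this correlation follows from the usual t transformation of a Pearson r with df = n - 2 = 31:

```latex
t = r\sqrt{\frac{n-2}{1-r^{2}}}
  = 0.459\sqrt{\frac{31}{1-0.459^{2}}}
  \approx 2.88,
\qquad df = 31,
\qquad p_{\text{two-tailed}} \approx .007
```

which matches the value reported in Table 2.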
Negative correlations were found between KIC
usage and the participants’ NASA-TLX effort subscale
score for command implementation and icon tasks (see
Table 3). This negative relationship implies that participants with higher percentages of KIC usage reported expending less effort to complete the command implementation and icon tasks.
Table 3: Mean values, standard errors, Pearson correlations, and p values for correlations between KIC usage and section NASA-TLX effort subscale scores.

Section | NASA-TLX Effort Score | St. Error | Correlation with KIC Usage | p
Command Implementation | 18.39 | 22.29 | -.388 | .026
Icon | 18.76 | 17.18 | -.346 | .048
Keyboard | 21.30 | 20.28 | -.259 | .145
Mouse Right-Click | 25.58 | 22.64 | -.171 | .342
Mouse Dragging | 30.64 | 24.48 | -.317 | .072
The only significant correlations between participants' performance in a section and the NASA-TLX subscales involved the frustration subscale. Table 4 shows the correlations between command implementation completion time and the frustration subscale scores for the five sections. As shown in this table, participants who spent less time completing the command implementation section also reported less frustration while completing the command implementation and icon tasks.
Table 4: Mean values, standard errors, Pearson correlations, and p values for correlations between command implementation completion time and section NASA-TLX frustration subscale scores.

Section | NASA-TLX Frustration Score | St. Error | Correlation with Command Implementation Performance | p
Command Implementation | 4.18 | 7.54 | .389 | .025
Icon | 5.45 | 11.85 | .355 | .043
Keyboard | 11.39 | 17.40 | .303 | .087
Mouse Right-Click | 12.33 | 15.17 | .189 | .292
Mouse Dragging | 15.67 | 23.56 | .054 | .767
No other significant correlations were found between KIC usage or section completion results and the other section NASA-TLX subscale scores, nor were there significant relationships involving demographic data or computer usage that were relevant to our current question.
DISCUSSION
It is not surprising that there is a strong negative
correlation between KIC usage and time needed to
complete the keyboard section. It is reasonable to believe that those who used KICs in the command implementation section would complete the keyboard section in less time than participants who typically use other methods. It is surprising, however, that participants with high percentages of KIC usage tended to complete the non-keyboard tasks faster than participants with low KIC usage. Regardless of the reason, the evidence supports the conclusion that KIC users are faster at more than just typing.
Future research should examine why this is the case.
We found no evidence that demographics, hours per week spent using a computer, or years of experience with computers predict KIC usage or speed on non-keyboard tasks. However, more detailed
investigation is needed to determine how participants
regularly use their computers; those who use their
computers mostly for business-type applications may
have a different level of proficiency than those who use
them to play games. Similarly, information about hand
dexterity might be important to explain why KIC users
are faster at other computer tasks than non-KIC users.
It was also surprising to find that there was no correlation between KIC usage and NASA-TLX scores. This means there is no evidence that KIC users experienced lower levels of workload when completing the sections than non-KIC users did. Thus,
although KIC users were more efficient at the sections
(completed them more quickly), they did not perceive
the workload as lower.
None of the measures collected for this experiment
investigated users’ comfort levels or self-efficacy with
the tasks or their perceived performance on the tasks.
This could be done using an instrument like the
Computer Anxiety Rating Scale (CARS) or a simple
self-efficacy scale. Perhaps there is a correlation
between KIC usage and self-efficacy while working with
word processors. Identifying a relationship between the
two would provide insight for instructors and employers
as to how to either identify employees who will easily
adapt to using KICs or how to train employees to use
KICs. It would also be interesting to see if similar results
extend to applications other than word processors; for
instance: spreadsheets, databases, or applications that are
less familiar to the participants.
This research is important because it demonstrates that there is an advantage to using KICs. Because KIC users were faster at all of the computer tasks examined here, training employees to use KICs may help them use their time more productively when working on computers, increasing the economic benefits to their employers.
REFERENCES
ANSI & HFES (2007). Human factors engineering of computer workstations (ANSI/HFES 100-2007). Santa Monica, CA: Human Factors and Ergonomics Society.
Bhavnani, S. K. & John, B. E. (2000). The strategic use of
complex computer systems. Human-Computer
Interaction, 15, 107-137.
Dennerlein, J. T. & Johnson, P. W. (2006). Different computer
tasks affect the exposure of the upper extremity to
biomechanical risk factors. Ergonomics, 49, 45-61.
Fagarasanu, M. & Kumar, S. (2003). Carpal tunnel syndrome
due to keyboarding and mouse tasks: A review.
International Journal of Industrial Ergonomics, 31, 119-
136.
Hart, S. G. & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. In P. Hancock & N. Meshkati (Eds.), Human Mental Workload (pp. 139-183). Amsterdam: North-Holland.
Herot, C. F. (1982). Graphical user interfaces. In Y. Vassiliou
(Ed.), Human Factors and Interactive Computer Systems
(pp. 83 - 103). Norwood, NJ: Ablex Publishing
Corporation.
Lane, D. M., Napier, H. A., Peres, S. C., & Sandor, A. (2005).
Hidden costs of graphical user interfaces: Failure to make
the transition from menus and icon toolbars to keyboard
shortcuts. International Journal of Human-Computer
Interaction, 18, 133-144.
Peres, S. C., Nguyen, V., Kortum, P., Akladios, M., Wood, B.,
& Muddimer, A. (2009). Software ergonomics: Relating
subjective and objective measures. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA: ACM Press.
Shneiderman, B. (1982). The future of interactive systems and
the emergence of direct manipulation. In Y. Vassiliou
(Ed.), Human Factors and Interactive Computer Systems
(pp. 1- 27). Norwood, NJ: Ablex Publishing Corporation.
Vassiliou, Y. & Jarke, M. (1982). Query languages – A
taxonomy. In Y. Vassiliou (Ed.), Human Factors and
Interactive Computer Systems (pp. 47 - 82). Norwood,
NJ: Ablex Publishing Corporation.