Hindawi Publishing Corporation
Advances in Human-Computer Interaction
Volume 2011, Article ID 202701, 8 pages
doi:10.1155/2011/202701
Research Article
Working towards Usable Forms on the World Wide Web:
Optimizing Date Entry Input Fields
Javier A. Bargas-Avila, Olivia Brenzikofer, Alexandre N. Tuch,
Sandra P. Roth, and Klaus Opwis
Department of Psychology, Center for Cognitive Psychology and Methodology, University of Basel, 4055 Basel, Switzerland
Correspondence should be addressed to Javier A. Bargas-Avila, javier.bargas@unibas.ch
Received 15 February 2010; Revised 13 May 2011; Accepted 7 June 2011
Academic Editor: Armando Bennet Barreto
Copyright © 2011 Javier A. Bargas-Avila et al. This is an open access article distributed under the Creative Commons Attribution
License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly
cited.
When an interactive form on the World Wide Web requires users to fill in exact dates, this can be implemented in several ways. This paper discusses an empirical online study with n=172 participants that compared six different designs of input fields for date entries. The results revealed that using a drop-down menu is best when format errors must be avoided, whereas using only one input field and placing the format requirements to the left of or inside the text box led to faster completion times and higher user satisfaction.
1. Introduction
Most websites use interactive online forms as the main
contact point between users and the company. The design of
these forms can be a crucial factor for the success of online
transactions. Users do not visit a website with the intention
or goal of filling in a form. Focusing on the example of
online shopping: once users have chosen the items that
they wish to buy, they want to complete their shopping as
quickly, easily, and safely as possible. In this context, a form
may often be perceived as a hurdle. If a form is difficult to use, it may even lead to customers aborting the transaction,
resulting in loss of profit [1]. A successful revision and rede-
sign of a suboptimal online form may result in an increased
completion rate in the range of 10%–40% [1]. The eBay User
Experience and Design Group reported that a redesign of
the eBay registration form made a significant contribution
to eBay’s business and user success [2].
A growing body of research and guidelines has been published on how to make online forms more usable (e.g., [1, 3, 4]). Some of these recommendations have been empirically tested; others have been derived from the experience and best practices of usability experts. Although the knowledge in this field is increasing, there are still many open questions when it comes to designing an online form.
2. Theoretical Background
In the last decade, many aspects of online forms have been
explored. There are several aspects of usable form inter-
action: (1) form content, (2) form layout, (3) input types, (4) error handling, and (5) form submission. The following
section provides a brief summary of the most important re-
sults within these areas. This study will explore aspects within
the area of “input types”.
Form Content. There are many different aspects to consider
when designing web forms. One of the basic guidelines of
user-centered design is to map the natural environment,
which is already familiar to the user, as closely as possible
to the virtual one [5]. If users are familiar with a concept in
real life, it is probable that they will also understand this con-
cept if it is applied to the online environment. In the case
of web forms, this may, for example, be achieved by using
a layout analogous to paper forms. Beaumont et al. [4] state
that users’ preferred input types for providing answers online
are textboxes. A demonstration by Nielsen [6] showed that
providing a separate drop-down menu for entering the street
type (e.g., road, street, and avenue) caused people to turn
back to the previous field because they were used to entering
the street type into the textbox for the address. Miller and
Jarrett [7] recommend not using too many different input
types in one form as this can cause confusion, and Beaumont
et al. [4] suggest keeping an intuitive order of the questions,
for example, first ask for the name, then the address and, at
the end, for the telephone number.
To keep forms simple and fast, Beaumont et al. [4] rec-
ommend asking only those questions that really need to be
answered, for example, the shipping address in the case of
an online shop. Other “nice-to-know” questions only an-
noy users and require more time to fill in the form. On the
other hand, these questions may provide insight into the
user population and may be helpful for marketing purpos-
es. In this case, users must be enabled to distinguish between
required and optional fields [8,9]. Nowadays, this is often
realized through the use of asterisks. Pauwels et al. [10] ex-
amined whether highlighting required fields by color coding
leads to faster completion time compared to an asterisk
next to required fields. Participants were faster, made fewer
errors, and were more satisfied when the required fields were
highlighted in color. Tullis and Pons [11] found that people
were fastest at filling in required fields when the required and
optional fields were separated from each other.
Form Layout. Penzo [12] examined the position of labels
relative to the input field in a study using eye-tracking. He
compared left-, right- and top-aligned labels and came to the
conclusion that with left-aligned labels people needed nearly
twice as long to complete the form as with right-aligned
labels. Additionally, the number of fixations needed with
right-aligned labels was halved. The fastest performance,
however, was reached with top-aligned labels, which required
only one fixation to capture both the label and the input
field at the same time. As a result of this study, Wroblewski
[1] recommends using left-aligned labels for unfamiliar data
where one wants users to slow down and consider their an-
swers. On the other hand, if the designer wants users to
complete the form as quickly as possible, top-aligned labels
are recommended. Another advantage of top-aligned labels
is that label length does not influence placement of the
input fields. Based on an eye-tracking study, Das et al. [13]
recommend right-aligned labels in the context of forms with
multiple columns. Finally, Jarrett [14] emphasizes that the question of label placement is secondary compared to whether users know what to fill into the fields, whether they are willing to reveal the information, and whether validations prevent them from entering the answers of their choice. The
literature review shows that there is little consensus when it
comes to label placement.
In terms of form layouts, Robinson [15] recommends
that a form should not be divided into more than one col-
umn. A row should only be used to answer one question.
Concerning the length of input fields, Wroblewski [1] recommends matching the length of the field to the length of the
expected answer. This provides a clue or affordance to users
as to what kind of answer is expected from them. Christian
et al. [16] examined date entry with two separate text
fields for month and year. Participants gave more answers in
the expected format (two characters for the month and four
for the year) if the field for the month was half the size of
the one for the year. In another study by Couper et al. [17],
people gave more incorrect answers if the size of the input
field did not fit the length of the expected input.
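As an illustration of this affordance, the following minimal sketch (not taken from the cited studies; names are illustrative) sizes a separate month and year field so that their visible widths and maximum lengths hint at the expected two- and four-digit answers:

```typescript
// Illustrative sketch: field width and maximum length hint at the expected
// answer length (MM vs. YYYY), in the spirit of [1, 16].
function createMonthYearFields(): { month: HTMLInputElement; year: HTMLInputElement } {
  const month = document.createElement("input");
  month.size = 2;       // visible width of roughly two characters
  month.maxLength = 2;  // two digits expected
  const year = document.createElement("input");
  year.size = 4;        // roughly twice the width of the month field
  year.maxLength = 4;   // four digits expected
  return { month, year };
}
```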
Input Types. Another question in web form design relates to
which input type should be used. As mentioned, Beaumont
et al. [4] recommend using textboxes as often as possible.
However, if the number of possible answers has to be restrict-
ed, radio buttons, checkboxes, or drop-down menus can be
used [8]. These input types are also recommended to avoid
errors, prevent users from entering unavailable options, and
simplify the decision process. Radio buttons and drop-down
menus are used for choosing only one option (single choice);
with checkboxes, users can select as many options as they
like. Concerning the use of drop-down menus and radio but-
tons, Miller and Jarrett [7] see the advantage of radio buttons
in the fact that all options are visible at once whereas the
advantage of drop-down menus lies in the saving of screen
real estate. With the help of the Keystroke-Level Model [18], it can be calculated theoretically that interaction with a drop-down menu takes longer than interaction with radio buttons, mainly because of the additional point-and-click (PK) needed to open the drop-down menu; a rough estimate is sketched after this paragraph. In an empirical study, Healey
[19] found that on the single-question level, radio buttons
were faster to choose from than drop-down menus, but the
use of drop-down menus instead of radio buttons did not
affect the overall time to fill in the whole questionnaire.
Hogg and Masztal [20] could not find any differences in
the time needed to select answers between radio buttons
and drop-down menus. Heerwegh and Loosveldt [21] found
that people needed significantly more time to select options
from drop-down menus than from radio buttons, but these
findings could not be replicated in a second study. Concern-
ing the drop-out rate, no differences between radio buttons
and drop-down menus could be found [19–21]. According
to Miller and Jarrett [7], radio buttons should be used
when two to four options are available; with more than four
options they recommend using drop-down menus. When
drop-down menus are used, Beaumont et al. [4] suggest
arranging the options in an order with which the user is
already familiar (e.g., for weekdays, the sequence Monday,
Tuesday, etc.). Where there is no intuitive sequence, an
alphabetical order should be considered. If users are required
to indicate multiple options, Bargas-Avila et al. [22] show
that checkboxes (instead of list boxes) enhance usability and
user satisfaction—at least when a smaller number of options
are provided.
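As an illustration of this calculation, the following back-of-the-envelope estimate uses commonly cited KLM operator times (P for pointing with the mouse, K for a keystroke or button press); the exact figures depend on the operator set chosen, and mental-preparation and homing operators are omitted for simplicity:

```latex
% Rough KLM estimate with P ~ 1.1 s and K ~ 0.2 s (values as commonly cited for [18])
\begin{align*}
  T_{\text{radio}}     &= P + K \approx 1.1 + 0.2 = 1.3\ \text{s}\\
  T_{\text{drop-down}} &= \underbrace{(P + K)}_{\text{open menu}} + \underbrace{(P + K)}_{\text{select option}} \approx 2.6\ \text{s}
\end{align*}
```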
A frequent issue concerning data input is the design of
date entries. With date entries, it is important that they are
entered in the expected format to avoid confusion between
month and day. There are many different ways of designing
input fields for date entries and many possibilities for how
they have to be completed. Christian et al. [16] examined date entries where the month and year fields consisted of two separate text boxes. Their study revealed that 92.9%–95.8% of respondents provided their answer in the correct format when symbols (MM and YYYY) were used to state the restrictions.
Positioning the date instructions to the right of the year
field led to fewer correct answers. Linderman and Fried [8]
suggest using drop-down menus to ensure that no invalid
dates are entered. There are other ways of designing date
entries and their format requirements, for example, placing
the requirements inside the answer boxes or using a single
text box. Currently no studies are known to the authors that
compare these different versions. Concerning the formatting
of other answers, accepting entries in every format is recom-
mended, as long as this does not cause ambiguity [8]. This
prevents users from having to figure out which format is re-
quired and avoids unnecessary error messages.
Error Handling. It is important to guide users as quickly
and error-free as possible through forms. Errors should be
avoided from the start by explaining restrictions in advance.
Often, errors cannot be avoided; in this case, it is important
to help users to recover from them as quickly and easily as
possible. To assure usable error messages on the web, Nielsen
[23] and Linderman and Fried [8] state that an error message
must be written in a familiar language and clearly state what
the error is and how it can be corrected. Nielsen [23] also
advises never deleting the completed fields after an error has
occurred, as this can be very frustrating for users. Bargas-
Avila et al. [24] compared six different ways of presenting an
error message, including inline validation, pop-up windows,
and embedded error messages. People made fewer consecu-
tive errors when error messages appeared embedded in the
form next to the corresponding input fields or one by one
in a pop-up window. This was only the case if the error
messages showed up at the end after clicking the send button.
If the error messages appeared at the moment the erroneous
field was left (inline validation), the participants made signif-
icantly more errors completing the form. They simply ignored or, in the case of pop-up windows, even clicked away the appearing error messages without reading them.
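The pattern that performed well in [24], embedded error messages that appear only after the send button is clicked, can be illustrated with a small sketch; the class name and message text are illustrative and not taken from the study:

```typescript
// Sketch: validate only on submit and show messages embedded next to the
// corresponding fields; completed fields are left untouched, following [23].
function validateOnSubmit(form: HTMLFormElement): void {
  form.addEventListener("submit", (event) => {
    // Remove messages from a previous attempt.
    form.querySelectorAll(".error-message").forEach((node) => node.remove());

    let hasErrors = false;
    for (const field of Array.from(form.querySelectorAll("input[required]"))) {
      const input = field as HTMLInputElement;
      if (input.value.trim() === "") {
        hasErrors = true;
        const message = document.createElement("span");
        message.className = "error-message";
        message.textContent = "Please fill in this field.";
        input.insertAdjacentElement("afterend", message);
      }
    }
    if (hasErrors) event.preventDefault(); // block submission until corrected
  });
}
```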
Form Submission. At the end of the fill-in process, the form
has to be submitted. This is usually realized through a but-
ton with an action label. Linderman and Fried [8] suggest
disabling the submit button as soon as it has been clicked to
avoid repeated submissions due to long loading time. Some
web forms also offer a reset or cancel button in addition to
the submit button. Many experts recommend eliminating
such a button as it can be clicked by accident and does not
provide any real additional value [1,8,15]. After a successful
transaction, the company should confirm the receipt of the
user’s data by e-mail [1,8].
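A minimal sketch of the disable-on-submit safeguard might look as follows; the changed button label is an illustrative choice, not taken from the source:

```typescript
// Sketch: disable the submit button once it has been clicked to avoid
// accidental double submissions during long loading times.
function disableOnSubmit(form: HTMLFormElement): void {
  form.addEventListener("submit", () => {
    const button = form.querySelector<HTMLButtonElement>("button[type='submit']");
    if (button) {
      button.disabled = true;
      button.textContent = "Sending..."; // simple feedback while the request is in flight
    }
  });
}
```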
3. Goal of the Study
In date entry on the World Wide Web, there can be ambigu-
ities, especially in a global context. In most European coun-
tries, dates are written in the format day/month/year, where-
as in the US the day follows the month. In an online environ-
ment, where a date is filled into an input field, it is therefore
important to tell users in which format the date has to be
entered to avoid confusion. Prior studies have shown that
providing format restrictions to users in advance lowers error
rates and increases user satisfaction [25].
There are many ways of stating how the date entry has to be formatted. A study by Christian et al. [16] examined
date entry using two separate input fields for month and year.
They placed above the corresponding input fields the words
“Month” and “Year” or the characters “MM” and “YYYY”,
respectively. They found that people provided more correct
answers using characters. Another factor that decreased
errors was when the year field was twice as long as the month
field. Their study also showed that the way the question was
asked, namely, “when” versus “what month and year”, had
no influence on the number of correct answers. Also, the
position of the symbols—left, above, or right of the corre-
sponding input fields—did not change the percentage of cor-
rect answers. Using a half-size month box, separating the month box from the year box, and grouping a symbolic in-
struction with the corresponding input field led to a rate
between 92.9% and 95.8% of participants reporting the date
in the desired format.
There are more possibilities for designing input fields
for date entries with formatting requirements, for example,
using one field and placing the requirements next to or inside
the field. There are even ways where no formatting require-
ments need to be stated, namely, drop-down menus and pop-
up calendars. Because no studies are known that have tested the performance of these possibilities, an online experiment was conducted to shed more light on these options.
4. Methods
4.1. Error Categorization. When entering dates in forms,
there are two different types of errors that can occur.
(1) Wrong format. The user enters the correct date (e.g., his/her birthday) but chooses a wrong format. This can happen, for instance, when the required format is month-day-year and the user enters first the day and then the month and year, or when two-digit numbers are enforced but the user enters the date using single digits. Format errors usually stem from insufficient communication of the applied format restrictions or from users overlooking the instructions.
(2) Wrong date. The user enters the wrong date. This usually happens if the wrong keys are pressed on the keyboard or the wrong entries are selected in a menu or calendar widget. Both error types are illustrated in the sketch below.
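To make the two categories concrete, the following minimal sketch (a hypothetical function, not part of the study software) classifies an entry against the dd.mm.yyyy format used later in the experiment:

```typescript
// Hypothetical classifier for the two error types described above, assuming
// the dd.mm.yyyy format with two-digit day/month and four-digit year.
type DateEntryResult = "ok" | "wrong format" | "wrong date";

function classifyDateEntry(input: string, expected: Date): DateEntryResult {
  // Wrong format: the entry does not match the required pattern.
  const match = /^(\d{2})\.(\d{2})\.(\d{4})$/.exec(input.trim());
  if (!match) {
    return "wrong format";
  }
  const [, day, month, year] = match;
  // Wrong date: the format is fine, but the value differs from the requested date.
  const sameDate =
    Number(day) === expected.getDate() &&
    Number(month) === expected.getMonth() + 1 &&
    Number(year) === expected.getFullYear();
  return sameDate ? "ok" : "wrong date";
}

// Example with one of the dates shown in Figure 2 (17.05.1957):
const target = new Date(1957, 4, 17); // months are zero-based in JavaScript
classifyDateEntry("17.5.1957", target);  // "wrong format" (single-digit month)
classifyDateEntry("18.05.1957", target); // "wrong date"
classifyDateEntry("17.05.1957", target); // "ok"
```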
4.2. Design. This study was conducted as an online experiment in which six different date entry designs were compared using a one-way related (within-subjects) design.
We chose four designs that required users to enter the dates using variations of text entry fields, one design in which the dates were entered with drop-down menus, and one design that provided a calendar widget. The six designs used as independent
variables are illustrated and explained in Figure 1.
As dependent variables, the following metrics were as-
sessed.
(i) Wrong format: date entries in a wrong format (see Section 4.1 for a definition).
(ii) Completion time: time needed to fill in the dates.
(iii) Wrong date: dates that did not correspond to the ones required (see Section 4.1 for a definition).
(iv) User satisfaction: a satisfaction questionnaire measured whether entering the date was perceived as being comfortable and efficient.
Version 1 (separate). Three separate input fields are used for day, month, and year, and symbols above the corresponding input fields state how many digits have to be entered. The year field is twice the size of the month and day fields.
Version 2 (drop-down). Three separate drop-down menus are used for day, month, and year. The menu for the year includes dates from 1900 to 2007.
Version 3 (left). Only one input field is used. The formatting requirements are stated in the form of symbols to the left of the input field.
Version 4 (inside, permanent). The formatting requirement is placed inside the input field. It stays visible when the user clicks inside the field and has to be overwritten.
Version 5 (inside). The formatting requirement is placed inside the input field. It disappears when the user clicks into the input field.
Version 6 (calendar). A calendar pops up when the icon to the right of the input field is clicked.
Figure 1: Design versions used as independent variables: input types for date entries (translated by the authors). [The figure shows the field labels Day, Month, Year, the symbols DD, MM, YYYY, and the format hint dd.mm.yyyy.]
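For illustration, the following sketch approximates how two of these versions could be built in the browser; it is not the authors' original implementation, and the exact markup used in the study is not reported:

```typescript
// Illustrative approximations of two of the tested designs.

// Version 5 (inside): a single text field whose format hint disappears on focus.
function createInsideVersion(): HTMLInputElement {
  const field = document.createElement("input");
  field.type = "text";
  field.value = "dd.mm.yyyy"; // format hint shown inside the field
  field.addEventListener("focus", () => {
    if (field.value === "dd.mm.yyyy") field.value = ""; // hint vanishes when the field is activated
  });
  return field;
}

// Version 2 (drop-down): three select elements for day, month, and year (1900-2007).
function createDropDownVersion(): HTMLSelectElement[] {
  const range = (from: number, to: number) =>
    Array.from({ length: to - from + 1 }, (_, i) => from + i);
  return [range(1, 31), range(1, 12), range(1900, 2007)].map((values) => {
    const select = document.createElement("select");
    for (const value of values) {
      const option = document.createElement("option");
      option.value = option.textContent = String(value).padStart(2, "0");
      select.appendChild(option);
    }
    return select;
  });
}
```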
4.3. Participants. A total of n=172 subjects participated
in the study. Fifty-three of the participants were male, 113
were female, and six did not specify their gender. The mean
age was 30.33 years (SD =12.07), with the youngest person
being 15 and the oldest 68 years old. All participants were
recruited through the University recruitment database and
contacted by e-mail. In this database, people interested in
participating in user studies can leave their contact address.
As an incentive, they had the chance of winning an iPod Shuffle
or one out of 10 USB memory sticks.
4.4. Procedure. The study was conducted online. The authors
selected five arbitrary dates that were used for the experi-
ment. Each of these five dates was presented to each partic-
ipant in all six design versions. This led to a total of 30 cycles for each participant (5 dates × 6 designs). All cycles
were presented to participants in random order to counter
learning effects.
First, the participants received a short introduction and instruction. They were told that they would see 30 tasks, each presenting a date, and that they would need to copy this date as quickly as possible using the provided input mechanism.
Figure 2: Example of how a task was presented to users (translated by the authors). [The screenshot shows the date 17.05.1957 to be copied, three input fields labeled DD, MM, YYYY, a “Next” button, and the progress steps Welcome, Instruction, Questions 1/30, Evaluation, Demography, Thank you.]
After acknowledging this instruction, the study started. The
presentation on each screen consisted of a date to fill in and
the corresponding input field. Figure 2 shows an example for
version 1 (separate). The task was to fill in the date presented
in the input field as quickly as possible and to click on the
button to proceed to the next screen. In line with the stated
format requirements, dates had to be entered using two digits
for the day and the month and four digits for the year.
At the end, a post-test questionnaire appeared where each
design version was presented again on the same screen, with
the questions “Filling in the date was comfortable” and “I
could fill in the date quickly and efficiently” placed under
each version. Participants had to answer these questions for
each design on a six-point Likert scale (scale: 1 = does not apply; 6 = applies). Finally, participants were asked for their
age and gender and thanked for their participation.
5. Results
All data were checked for outliers (difference larger than
three standard deviations), normal distribution, and linear-
ity. For a better fit to these criteria, response time was log-
transformed. To analyze the differences between the six ver-
sions, the mean for the dependent variables of the five dates
was calculated for each input type. In version 6 (calendar),
where there was the possibility of choosing the date via the
pop-up calendar, only data where the calendar was used
were included in the calculation (n = 126; it was possible
to enter the date without using the calendar by typing
directly into the date field). The outlier analysis revealed that
one subject took an abnormally long time to answer ver-
sion 2 (drop-down). Therefore, this subject was excluded
from further analyses. An alpha level of .05 was used for all
statistical tests.
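As a rough sketch of this preprocessing (not the authors' analysis script; the paper does not state whether the three-standard-deviation criterion was applied before or after the transformation), the log-transform and outlier check could look like this:

```typescript
// Sketch: log-transform response times and drop observations that lie more
// than three standard deviations from the mean of the transformed values.
function mean(xs: number[]): number {
  return xs.reduce((sum, x) => sum + x, 0) / xs.length;
}

function standardDeviation(xs: number[]): number {
  const m = mean(xs);
  return Math.sqrt(xs.reduce((sum, x) => sum + (x - m) ** 2, 0) / (xs.length - 1));
}

function removeOutliers(responseTimesSeconds: number[]): number[] {
  const logTimes = responseTimesSeconds.map((t) => Math.log(t));
  const m = mean(logTimes);
  const sd = standardDeviation(logTimes);
  return responseTimesSeconds.filter((_, i) => Math.abs(logTimes[i] - m) <= 3 * sd);
}
```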
5.1. Errors: Wrong Format. The number of format errors was
not normally distributed and the assumption of variance
homogeneity was violated. Therefore, to test whether the six
design versions differed in the number of answers that were
given in a wrong format (see Section 4.1), a non-parametric Friedman ANOVA was used. Results indicate that there were significant differences between the six versions, χ²r(5) = 168.864, p < .001. As shown in Table 1, version 2 (drop-
down) and version 6 (calendar) performed significantly
better than the other four versions. They both had zero
entries in a wrong format, because these versions make it
impossible to enter a date in a wrong format. There was no
difference between the other four versions; they all led to the
same number of incorrect date formats.
5.2. Response Time. Response time data were not normally
distributed and therefore log-transformed. To test whether
there were differences between the six versions for the time
needed to fill in the dates, a one-way ANOVA for related
samples was conducted. Because the sphericity assumption
was violated, degrees of freedom were adjusted using Green-
house-Geisser. Again, for version 6 (calendar) only data
where the calendar was used were included in the analysis.
The global analysis revealed significant differences in the log-transformed mean response time, F(2.72, 339.49) = 112.07, p < .001, with versions 2 (drop-down) and 6 (calendar) requiring more time to be filled in than the other four, F(1, 125) = 290.90, p < .001. The fastest performance was
reached with versions 3 (left) and 5 (inside), which were both
faster than versions 1 (separate) and 4 (inside, permanent),
F(1, 125) = 72.88, p < .001. Mean values for the time needed to fill in the dates are shown in Table 2.
5.3. Errors: Wrong Dates. Data did not meet assumptions of
distribution and sphericity. Therefore, again a nonparamet-
ric Friedman ANOVA was applied to compare the number
of wrong dates entered. Table 1 shows the mean values for all groups. The six versions differed significantly, χ²r(5, N = 126) = 76.63, p = .001. Post hoc analysis revealed that in
version 6 (calendar) more dates that differed from the ones
required (wrong dates, see Section 4.1) were entered than in
the other five versions (see Table 1).
5.4. Satisfaction, Efficiency, and Conformance Ratings. Con-
cerning the question whether the action of entering the
dates was perceived as being comfortable, the global analysis
revealed significant differences between the six versions, χ²r(5, N = 121) = 69.622, p < .001, with version 4 (inside,
permanent) being perceived as significantly less comfortable
than each of the other versions. There was no significant dif-
ference between the other five versions regarding perceived
comfortableness (see Table 3). Significant differences were
found between the six versions for the perceived efficiency
of entering the dates, χ²r(5, N = 116) = 78.016, p < .001.
Again, version 4 (inside, permanent) was perceived as being
less efficient than the other versions. Version 2 (drop-down)
was perceived as being less efficient than versions 3 (left) and
5 (inside). In addition, version 3 (left) was rated significantly
more efficient than the calendar (see Table 4).
Table 1: Statistical parameters for the number of errors: entries in a wrong format and wrong date.

Version                          Wrong format         Wrong date
                                 M (%)     SD         M (%)     SD
Version 1 (separate)             18.5      30.5       3.0       8.1
Version 2 (drop-down)            0         0          2.1       7.6
Version 3 (left)                 22.3      33.5       1.6       5.6
Version 4 (inside, permanent)    22.3      32.8       1.7       5.7
Version 5 (inside)               25.7      32.5       1.9       7.5
Version 6 (calendar)             0         0          18.3      29.7
Table 2: Statistics for time needed to fill the dates.

Version                          M (s)     SD
Version 1 (separate)             3.68      2.69
Version 2 (drop-down)            6.94      2.88
Version 3 (left)                 3.61      2.82
Version 4 (inside, permanent)    4.15      2.49
Version 5 (inside)               3.41      2.32
Version 6 (calendar)             5.03      5.40
6. Discussion
6.1. Discussion of the Findings. The results show that no for-
matting errors occurred with the drop-down and calendar
entry options. However, this benefit came with the cost of
longer input times. Concerning the other four versions, none
of them outperformed another regarding the mean number
of registered formatting errors. There is a trade-off between the number of formatting errors and completion time. If it is
crucial that the requested date is entered in the correct format
(e.g., the date for a flight when buying an online ticket),
either the calendar or the drop-down version can be used at
the expense of more time. If, on the other hand, completion
time is a more important factor than correctly formatted
answers (e.g., optional fields like one’s date of birth), one
of the other versions can be considered. When analyzing
completion time, versions 3 (left) and 5 (inside) performed
best and received the highest satisfaction ratings. Therefore
when formatting errors are secondary, these two versions can
be recommended. In version 5 (inside), where the format
advice disappears as soon as the field is activated, users
might accidentally delete the format requirement using the
tab key. This is a legitimate argument, which could not be
tested in this study as there was only one input field per
screen, and tabbing was not possible or necessary. Version 1
(separate), which has been proposed by Beaumont et al. [4]
and Christian et al. [16], did not lead to fewer formatting
errors and took more time to be filled in than versions 3
(left) and 5 (inside). Version 4 (inside, permanent) was rated
as being significantly less comfortable and efficient than the
other versions. As it does not seem to prevent formatting
errors any better than other versions, using this type of date
entry field should be avoided. Table 4 gives a performance
overview for the tested versions.
Concerning the difference between the drop-down and calendar versions, using the drop-down menu to ensure that
fewer errors occur might have some advantages. With the
calendar interface, more incorrect dates were registered (see
Section 5.3). With calendars, it is usually possible to enter
the date manually in the field without using the calendar.
Therefore, it may happen that people either do not see the
calendar icon or decide not to use it. In this study, 28% of
participants never used the calendar and only 25% used it
for all five dates that they had to enter. If a designer wants
to force the use of the calendar, it has to be implemented
in such a way that the calendar automatically pops up
when clicking into or activating the corresponding input
field. Also, the calendar is the only interface element that
requires the usage of a mouse: in situations where a mouse
is unavailable (e.g., mobile applications) or the form is used
by a handicapped user population, this version might bear
some serious disadvantages compared to regular drop-down
menus. A calendar that can be activated by the user can be
very helpful for date entries where it is important to know
the exact weekday of the date. This type of calendar is, for
example, commonly used on travel websites to book flights
and hotels. However, in cases where the required date lies far
in the future or past, it is often not practical to use a calendar
to enter a date, because it will require many clicks to arrive at
the desired timeframe.
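One way to realize the auto-popup behavior mentioned above is to open the calendar as soon as the date field receives focus; in the following minimal sketch, showCalendar and hideCalendar are hypothetical stand-ins for an actual calendar widget:

```typescript
// Sketch: the calendar opens automatically when the field is activated, so
// users cannot overlook it; showCalendar/hideCalendar are hypothetical helpers.
declare function showCalendar(anchor: HTMLInputElement): void;
declare function hideCalendar(): void;

function attachAutoCalendar(field: HTMLInputElement): void {
  field.readOnly = true; // optional: force selection through the calendar only
  field.addEventListener("focus", () => showCalendar(field));
  field.addEventListener("blur", () => hideCalendar());
}
```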
The reported findings should be tested in a complete
form to see whether these rules also apply in a “natural
environment”. The interaction with the labels should be
taken into account, especially in version 3 (left), where the
label and the formatting advice are placed to the left of the
input field. In this case, it should also be tested whether
the formatting advice should be associated with the label or
the input field.
6.2. Limitations. This study helps to clear up an important
question regarding the design of usable forms: how should
date entry input fields be presented to reduce errors and
processing time, and to increase user satisfaction? At the
same time, it must be stressed that the presented findings
have to be regarded as a first step. The study was conducted
in a rigorous laboratory setting. First, all chosen tasks were artificial: participants did not have to enter real dates that they had to retrieve from their memory. They simply had to copy dates from an instruction into a user interface; it remains to be seen whether these results scale to more realistic settings.
Table 3: Subjective evaluation of the six versions.

Version                          Comfortableness       Speed/Efficiency
                                 M        SD           M        SD
Version 1 (separate)             3.75     1.71         3.80     1.71
Version 2 (drop-down)            3.65     1.85         3.30     1.74
Version 3 (left)                 4.04     1.44         4.23     1.44
Version 4 (inside, permanent)    2.66     1.38         2.72     1.39
Version 5 (inside)               3.95     1.39         4.10     1.50
Version 6 (calendar)             3.70     1.82         3.35     1.73

6-point Likert scale: 1 = does not apply; 6 = applies.
Table 4: Performance overview of the six versions.

Version                          Avoid wrong format    Avoid wrong date    Time efficiency    User satisfaction
Version 1 (separate)             0                     +                   0                  0
Version 2 (drop-down)            +                     +                   −                  0
Version 3 (left)                 0                     +                   +                  +
Version 4 (inside, permanent)    0                     +                   0                  −
Version 5 (inside)               −                     +                   +                  +
Version 6 (calendar)             +                     −                   −                  0
Second, the tasks were very repetitive, something that is
usually not found in online forms. Third, the interactions
were not embedded in a realistic setting, like for instance a
shopping or registration process. The ecological validity of
the presented findings is therefore low. Fourth, we instructed
the participants to enter the dates as quickly as possible. This
performance-oriented setting may differ from a more relaxed,
natural task setting.
To overcome these limitations, future studies may vary
interface elements within a real setting using real tasks.
Another question is whether these results can be generalized
beyond web forms to, for example, forms in mobile applica-
tions.
6.3. Future Outlook. Regarding the future outlook, there are
many open questions concerning usable forms that must be
answered. In recent years, new developments such as Web 2.0 have led to new ways of implementing interactions on the Internet. Nowadays, for example, users are becoming more and more accustomed to receiving immediate feedback in web forms through the use of AJAX technology. It remains
to be seen if these technologies can be used to enhance date
selection in online forms.
There is a growing body of empirical research and best
practice recommendations by usability experts to achieve
usable forms on the World Wide Web. Most studies—like the
one presented here—choose to explore one specific aspect
or interface element to find the optimal solution. Usually
this is done in a laboratory situation, using abstract forms
with artificial tasks. In the near future, this knowledge needs
to be consolidated in practical guidelines. These guidelines
must be empirically tested using real forms in realistic user
situations, to see whether they really lead to better usability,
manifested in faster form-completion time, fewer errors,
higher user satisfaction, and reduced dropout rate.
References
[1] L. Wroblewski, Web Form Design: Filling in the Blanks,Rosen-
feld Media, 2008.
[2] J. Herman, “A process for creating the business case for
user experience projects,” in Proceedings of the Conference on
Human Factors in Computing Systems, pp. 1413–1416, ACM,
New York, NY, USA, 2004.
[3] J. Bargas-Avila, O. Brenzikofer, S. Roth, A. Tuch, S. Orsini, and
K. Opwis, “Simple but crucial user interfaces in the world wide
web: introducing 20 guidelines for usable web form design,” in
User Interfaces (INTECH ’10), R. Matrai, Ed., pp. 1–10, 2010.
[4] A. Beaumont, J. James, J. Stephens, and C. Ullman, Usable
Forms for the Web, Glasshaus, Birmingham, UK, 2002.
[5] J. Garrett, The Elements of User Experience,NewRiders,
Indianapolis, Ind, USA, 2002.
[6] J. Nielsen, “Drop-down menus: use sparingly,” 2000, http://
www.useit.com/alertbox/20001112.html.
[7] S. Miller and C. Jarrett, “Should I use a drop-down? Four
steps for choosing form elements on the web,” 2001, http://
www.formsthatwork.com/files/Articles/dropdown.pdf.
[8] M. Linderman and J. Fried, Defensive Design for the Web: How
to Improve Error Messages, Help, Forms, and Other Crisis Points,
New Riders Publishing, Thousand Oaks, Calif, USA, 2004.
[9] T. Wilhelm and C. Rehmann, “Nutzergerechte Formulargestaltung,” 2006, http://www.eresult.de/studien_artikel/forschungsbeitraege/formulargestaltung.html.
[10] S. L. Pauwels, C. Hübscher, S. Leuthold, J. A. Bargas-Avila,
and K. Opwis, “Error prevention in online forms: use color
instead of asterisks to mark required-fields,” Interacting with
Computers, vol. 21, no. 4, pp. 257–262, 2009.
[11] T. Tullis and A. Pons, “Designating required vs. optional input
fields,” in Proceedings of the Conference on Human Factors in
Computing Systems, pp. 259–260, ACM, New York, NY, USA,
1997.
[12] M. Penzo, “Label placement in forms,” 2006, http://www.
uxmatters.com/MT/archives/000107.php.
[13] S. Das, T. McEwan, and D. Douglas, “Using eye-tracking to
evaluate label alignment in online forms,” in Proceedings of
the 5th Nordic Conference on Human-Computer Interaction:
Building Bridges, pp. 451–454, ACM, 2008.
[14] C. Jarrett, “Label placement in forms: what’s best?” in Proceed-
ings of the 22nd British HCI Group Annual Conference on People
and Computers: Culture, Creativity, Interaction-Volume 2,pp.
229–230, British Computer Society, 2008.
[15] D. Robinson, “Better web forms,” 2003, http://www.7nights.com/dkrprod/gwt_four.php.
[16] L. M. Christian, D. A. Dillman, and J. D. Smyth, “Helping
respondents get it right the first time: the influence of
words, symbols, and graphics in web surveys,” Public Opinion
Quarterly, vol. 71, no. 1, pp. 113–125, 2007.
[17] M. P. Couper, M. W. Traugott, and M. J. Lamias, “Web survey
design and administration,” Public Opinion Quarterly, vol. 65,
no. 2, pp. 230–253, 2001.
[18] S. K. Card, T. P. Moran, and A. Newell, “Keystroke-level
model for user performance time with interactive systems,”
Communications of the ACM, vol. 23, no. 7, pp. 396–410, 1980.
[19] B. Healey, “Drop downs and scroll mice: the effect of response
option format and input mechanism employed on data quality
in web surveys,” Social Science Computer Review, vol. 25, no. 1,
pp. 111–128, 2007.
[20] A. Hogg and J. J. Masztal, “Drop-down, radio buttons, or
fill-in-the-blank? Effects of attribute rating scale type on web
survey responses,” in Proceedings of the ESOMAR Congress—
Marketing Transformation (ESOMAR ’01), Rome, Italy, 2001.
[21] D. Heerwegh and G. Loosveldt, “An evaluation of the effect
of response formats on data quality in web surveys,” Social
Science Computer Review, vol. 20, no. 4, pp. 471–484, 2002.
[22] J. A. Bargas-Avila, O. Brenzikofer, A. N. Tuch, S. P. Roth, and K. Opwis, “Working towards usable forms on the world wide web:
optimizing multiple selection interface elements,” Advances in
Human-Computer Interaction, vol. 2011, Article ID 347171, 6
pages, 2011.
[23] J. Nielsen, “Error message guidelines,” 2001, http://www.useit
.com/alertbox/20010624.html.
[24] J. A. Bargas-Avila, G. Oberholzer, P. Schmutz, M. de Vito, and
K. Opwis, “Usable error message presentation in the world
wide web: do not show errors right away,” Interacting with
Computers, vol. 19, no. 3, pp. 330–341, 2007.
[25] J. A. Bargas-Avila, S. Orsini, H. Piosczyk, D. Urwyler, and K.
Opwis, “Enhancing online forms: use format specifications for
fields with format restrictions to help respondents,” Interacting
with Computers, vol. 23, no. 1, pp. 33–39, 2011.