A Visualization Tool for Eye Tracking Data Analysis in the Web
Raphael Menges
University of Koblenz, Germany
raphaelmenges@uni-koblenz.de
Sophia Kramer
University of Koblenz, Germany
skramer@uni-koblenz.de
Stefan Hill
University of Koblenz, Germany
shill@uni-koblenz.de
Marius Nisslmüller
University of Koblenz, Germany
mnisslmueller@uni-koblenz.de
Chandan Kumar
University of Koblenz, Germany
kumar@uni-koblenz.de
Steen Staab
University of Stuttgart, Germany
steen.staab@ipvs.uni-stuttgart.de
ABSTRACT
Usability analysis plays a signicant role in optimizing Web in-
teraction by understanding the behavior of end users. To support
such analysis, we present a tool to visualize gaze and mouse data
of Web site interactions. The proposed tool provides not only the
traditional visualizations with xations, scanpath, and heatmap,
but allows for more detailed analysis with data clustering, demo-
graphic correlation, and advanced visualization like attention ow
and 3D-scanpath. To demonstrate the usefulness of the proposed
tool, we conducted a remote qualitative study with six analysts,
using a dataset of 20 users browsing eleven real-world Web sites.
CCS CONCEPTS
• Human-centered computing → Visualization systems and tools.
KEYWORDS
Web interaction, gaze visualization, scanpath, heatmap
ACM Reference Format:
Raphael Menges, Sophia Kramer, Stefan Hill, Marius Nisslmüller, Chandan
Kumar, and Steen Staab. 2020. A Visualization Tool for Eye Tracking
Data Analysis in the Web. In Symposium on Eye Tracking Research and
Applications (ETRA ’20 Short Papers), June 2–5, 2020, Stuttgart, Germany.
ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3379156.3391831
1 INTRODUCTION
Eye tracking is used for analyzing attention in several domains, such as medical, sports, commerce, and human-computer interaction studies [Blascheck et al. 2016; Duchowski 2002; Holmqvist et al. 2011; Nielsen and Pernice 2009; Poole and Ball 2006]. Gaze data from eye trackers, i. e., attention of users on screen contents, provides feedback on the implicit behavior of users, which is arguably more natural and intuitive to interpret than the conventional indicators such as self-reported feedback or plain clickstream analysis [Schiessl et al. 2003]. As eye-tracking hardware is becoming
Also with University of Southampton, UK
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
ETRA ’20 Short Papers, June 2–5, 2020, Stuttgart, Germany
©2020 Copyright held by the owner/author(s). Publication rights licensed to ACM.
ACM ISBN 978-1-4503-7134-6/20/06. . . $15.00
https://doi.org/10.1145/3379156.3391831
cheaper and widely available, the relevance of gaze data and its role in human-computer interaction studies is increasing significantly. One of the most common interfaces for human-computer interaction is the Web page. Users interact with Web pages through a Web browser that presents the contents of a Web page through a viewport. In this setting, gaze-based usability analysis could help us to understand how users interact with a Web page so that we can address shortcomings and design issues. Recent works even describe crowd sourcing of eye tracking on Web sites [Eraslan et al. 2018], i. e., collecting data from a large number of users. For usability studies on the Web, usually one image per Web page is composed, which offers a common space for interaction data of all users who have visited that Web page. The images can be enriched with collected interaction data, i. e., gaze and mouse data, such that analysts may correlate the contents of the Web page with the interaction behavior of users. Analyzing the sheer amount of gaze data on Web page representative images for several users is a challenging task, and there is a need for effective visualization tools to support efficient analysis.
We developed a Web-based tool for the visualization of gaze data. It allows for data import, offers various visualizations, includes filtering and correlation of user data, and enables automatic clustering of gaze data to tackle the challenge of cluttered visualizations with the increasing amount of data. Our dataset and the tool can be accessed online at https://eyevis.west.uni-koblenz.de and are open to use, unlike the existing commercial platforms. We have evaluated the tool and visualizations with six analysts from the fields of information systems, psychology, science, and usability analysis.
2 RELATED WORK
There are various commercial tools that allow visualization of gaze data on an image of the stimulus. For example, eye-tracking devices are sold in combination with desktop-based software like Tobii Studio [Tobii AB 2016] and SMI beGaze [SensoMotoric Instruments 2020]. Recently, Web-based products have emerged, offering eye-tracking usability studies of Web sites either with dedicated hardware [CoolTool 2020; EYEVIDO GmbH 2020] or a Web cam [Eyezag GbR 2018; RealEye sp. z o. o. 2020; Tobii AB 2020]. However, there are not many publicly available tools that provide effective visualization of gaze data or enrich the features of the commercial products. Burch and Kumar [Burch et al. 2019a] proposed a Web-based visual analytics tool where users can upload, share, and explore data with others. However, the tool is designed to detect strategic eye movement patterns when the areas of interest (AOIs) are predefined on a static stimulus. For Web usability studies, the AOIs need not be predefined, and it is important to show gaze data overlaid on the composed images of the pages of a Web site. The challenge in visualizing gaze data in relation to the Web page stimulus is its spatial and temporal property, which must be mapped to the two-dimensional space of a Web page. Existing tools for visualization mostly implement scanpath and heatmap visualizations to map gaze data onto the image of a Web page [Eraslan et al. 2019].
A scanpath [Goldberg and Kotval 1999; Holmqvist et al. 2011] represents each fixation as a circle. The transition between two fixations is plotted as a connecting line, which signifies a saccade. Scanpath visualizations maintain the temporal property of gaze data. However, they do not aggregate the gaze data of multiple users. The scanpath visualization even tends to be cluttered if scanpaths of multiple users are overlaid.
A heatmap [Bojko 2009] shows the distribution of gaze data across the stimulus. It aggregates the fixations of multiple users and shows hotspots, aka heat, of attention. The level of attention is color coded. There is no recognized standard for constructing a heatmap; rather, specific types of heatmaps are designed for specific purposes. Heatmaps allow for a fast impression of the overall attention on a Web page. However, any temporal information of attention is lost. In a heatmap visualization there is no difference whether a user first looked at the product picture and then at the buy button or vice versa.
Researchers have proposed more elaborate visualizations that we take as inspiration [Blascheck et al. 2017; Burch et al. 2019a,b]. However, in these approaches, usability analysts must define an AOI on a Web page. The tools can compute the attention and interaction with each AOI. Besides displaying statistics, one can display the transitions in attention from one AOI to another AOI across the users with arrows [Holmqvist et al. 2003].
3 VISUALIZATIONS
Instead of relying solely on the traditional scanpath and heatmap visualizations, we propose to combine them for more interactive analysis, and introduce new approaches, like attention flow with an automatic AOI computation, and a 3D-scanpath that maps the temporal information onto the depth axis.
Combination of heatmap and scanpath. A heatmap provides a good overview of general attention on the entire Web page. A scanpath preserves the temporal property of the attention and is well suited to understand individual interactions, especially before clicking a link or button. In our tool, we combine both visualizations to provide an overview and simultaneously drill down to individual interaction behavior. Initially, we visualize a heatmap that shows the overall attention of users on the Web page. In addition, we display indicators of mouse clicks on top of the heatmap (the color of the indicator represents individual users). A click on an indicator hides the heatmap and displays the scanpath before the mouse click has happened. The time span of the fixation history shown by the scanpath can be easily adjusted. We have introduced further functions for convenience, e. g., one can draw a rectangle to select multiple indicators at once and have the corresponding scanpaths displayed. A click on the background resets the selection and we display the heatmap again. See Figure 1 for an example on data we recorded on the homepage of Tesla (tesla.com, November 2018).
Figure 1: Combination of heatmap and scanpath
Weighted heatmap. In case of a heatmap, fixations of all users are aggregated and represented as a scalar intensity value of attention per pixel. That attention might be either a long fixation by only one user, many short fixations of many users, or something in-between. Thus, one can decide to consider fixation duration or fixation density as intensity value. There exist various problems in psychology or media science where either fixation length or the number of fixations is part of the research question. Hence, we propose to let the analyst decide how to aggregate the fixations. For each fixation, the neighborhood density is calculated and can be weighted with the length of the fixation. We use the F-measure [Sasaki 2007] to let an analyst change the importance of either density or length of fixation. See Figure 2 for the difference between both intensity computations. Normalization of the number of neighbors is achieved by dividing by the number of neighbors of the fixation with the most neighbors on the screenshot. For the fixation length, we normalize by the maximum fixation length on the screenshot.
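The weighting described above can be sketched as follows. The paper does not publish the tool's implementation, so the function and parameter names (weightedIntensities, neighborRadius, beta) are our own; the sketch applies the F-measure to the normalized neighbor density and the normalized fixation length, with beta shifting importance between the two:

```javascript
// Sketch: per-fixation intensity for the weighted heatmap.
// A fixation is {x, y, duration}; neighborRadius and beta are assumed knobs.
function weightedIntensities(fixations, neighborRadius, beta) {
  // Neighborhood density: count fixations within the radius (incl. itself).
  const counts = fixations.map((f) =>
    fixations.filter(
      (g) => Math.hypot(f.x - g.x, f.y - g.y) <= neighborRadius
    ).length
  );
  const maxCount = Math.max(...counts);
  const maxDuration = Math.max(...fixations.map((f) => f.duration));
  return fixations.map((f, i) => {
    const density = counts[i] / maxCount;    // normalized neighbor density
    const length = f.duration / maxDuration; // normalized fixation length
    // F-measure: beta < 1 emphasizes density, beta > 1 emphasizes length.
    const b2 = beta * beta;
    return ((1 + b2) * density * length) / (b2 * density + length);
  });
}
```

With beta = 1 this reduces to the harmonic mean of density and length, so a fixation scores highest when it is both long and in a dense region.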
Attention ow. The order of attention on contents is of high
interest for analysts. For example, users might look at the price,
photos, and ratings of a product depending on the placement of
those information. Inspired from visualizations of AOI transitions,
we provide a visualization of attention ow. However, instead of
having the analyst dening an AOI, we provide automatic clustering
of xations to retrieve hot spots of attention. We display arrows
for transitions between clusters, where color and size signify the
number of transitions, i. e., dark and red means more switches, thin
and green means fewer switches.
First, we cluster the xations with K-Means [Lloyd 1982]. K-
Means has the number of clusters as a parameter and uses random
initialization of cluster centroids. We also oer the OPTICS (Order-
ing Points to Identify the Clustering Structure) algorithm [Ankerst
Number of
Fixations
Length of
Fixations
Figure 2: Weighted heatmap visualization
A Visualization Tool for Eye Tracking Data Analysis in the Web ETRA ’20 Short Papers, June 2–5, 2020, Stugart, Germany
Figure 3: Attention ow visualization
et al
.
1999], which has the two parameters of neighborhood radius
and minimal number of points to form a cluster. Neighborhood
radius denes how far a point can be from a found cluster such
that it is put into that cluster. Thus, a higher value results in larger
clusters. The minimal number of points to form a cluster acts as
threshold for a cluster to be considered valid. In contrast to K-
Means, the OPTICS algorithm leads for unchanged input data and
same parameters to the same clusters at every execution, why it is
preferred by the analysts in our study. We then display the convex
hull of each cluster with a black line that is outlined in white color.
The xations in each cluster are displayed with white circles so an
analyst can see an estimate of the number of xations that lie in
each cluster. After dening the clusters, we go over each cluster and
its contained xations. If the next xation lies within a dierent
cluster, a transition is recorded. See Figure 3 for an example.
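The transition recording can be sketched as follows; this is our own illustration (the tool's code is not shown in the paper), operating on the sequence of cluster labels assigned to consecutive fixations of one scanpath:

```javascript
// Sketch: count attention-flow transitions between clusters.
// clusterLabels holds the cluster of each fixation, in temporal order.
function countTransitions(clusterLabels) {
  const transitions = new Map(); // "from->to" -> number of switches
  for (let i = 0; i + 1 < clusterLabels.length; i++) {
    const from = clusterLabels[i];
    const to = clusterLabels[i + 1];
    if (from !== to) { // only record switches into a different cluster
      const key = `${from}->${to}`;
      transitions.set(key, (transitions.get(key) || 0) + 1);
    }
  }
  return transitions;
}
```

The resulting counts would then drive the color and thickness of the arrows between cluster hulls.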
3D-Scanpath. Traditional scanpaths are useful; however, they can become dense and hard to interpret while analyzing data of multiple users or a high number of fixations. Therefore, we integrate a three-dimensional scanpath visualization, where the spatial coordinates remain on the x- and y-dimension, with the screenshot of the Web page in the background, and the temporal property is encoded in the z-dimension (to deal with dense information). We render the screenshot of the Web page on a plane geometry in the background. In the foreground, we display the fixations with numbers indicating their index. The size of the number represents the length of the respective fixation. An analyst can manipulate the camera angle, position, and zoom level. In addition, we add dotted lines from the fixation toward the background plane. This supports an analyst in mapping fixations onto their position on the page, see Figure 4.
4 TOOL
The tool is implemented as a modern single-page application with different views, using JavaScript, Canvas, and the ThreeJS library. The single-page architecture allows us to stick with a global scripting context and store all data consistently across the tool. Nevertheless, a visitor (usability analyst) would experience the available views as different Web pages. The analyst can navigate between the views with the global navigation on top of the Web page.
Figure 4: 3D-scanpath visualization
Figure 5: Screenshot of the visualization view in our tool
The initial home view contains a brief explanation of what the tool is about. It shows how to use the tool and redirects the user to the other views, e. g., to the help view when the analyst requires support in using the tool. The data import view allows uploading images as image files and gaze and mouse data as .csv files. Prominently placed at the top, the button “Try the demo data!” imports the data and the images from our dataset, which can be used to explore our tool without requiring a dataset of your own. We describe the dataset in the next section. The visualization view allows the analyst to select the studies that will be displayed, the visualizations to use, and parameters to adjust the visualization settings. It is also possible to filter users, and select or deselect the users, based on attributes available in the dataset, e. g., age and gender. In addition, a time window of interest can be chosen. See Figure 5, which is a screenshot of the visualization view, displaying the scanpaths of multiple users on the Tesla homepage from our dataset.
5 DATASET
To be able to demonstrate the tool and its evaluation, we have recorded a dataset of participants from our university browsing various Web sites. The dataset contains gender, age, eye correction, Web experience, and English skills of each participant, alongside the Web site addresses, screenshots, transitions, and gaze and mouse data on the Web pages. The setup consisted of a laptop with a 15.6-inch screen that rendered with 1600 × 900 pixels, a wired mouse, and a Tobii 4C eye-tracking device with 90 Hz sampling frequency. We used the EYEVIDO recording software [EYEVIDO GmbH 2020] to store all data and export the dataset. The software employs a stitching approach to create images that represent the Web pages. It separates fixed elements like menu bars and renders them separately onto the top or bottom of the image [Menges et al. 2018]. We used popular Web sites (tesla.com, microsoft.com, apple.com, harvard.edu) and less known Web sites (roverp6cars.com, suzannecollinsbooks.com). The tasks for the participants ranged from reading a specific text to finding a link or button on the page. For this, the participants were first presented with a message describing the task for each Web page and then asked to navigate the Web page like they would under everyday circumstances. In total, we recorded data from 20 participants, which is enough for eye tracking studies [Eraslan et al. 2016]. There were 25,000 fixations, 20,000 mouse movements, and 650 mouse clicks on 164 Web pages from eleven different domains.
6 EVALUATION
We evaluated the visualizations and the tool in a qualitative study. Similar to recent work in gaze data visualizations [Kurzhals et al. 2016], we recruited participants from the target audience, i. e., usability analysts. The study was performed remotely, using the benefit of the Web-based implementation of our tool. We asked the participants to open our survey form in one Web browser tab and the tool in a second tab.
Participants. We recruited six participants (four female, two male). Each participant had performed at least one eye-tracking usability study in the past. One participant had even analyzed over 20 eye-tracking usability studies. The age ranged from 25 to 59 years, with an average of 36 years. Five out of six participants provided their profession. One participant works in the field of information systems, two in psychology, one as UX expert, and one as a scientific employee. We asked whether they are aware of the concepts of “heatmap” (6 have checked with “yes”), “fixation” (6), “area-of-interest” (6), “attention flow” (4), and “scanpath” (3). Furthermore, we asked for feedback on which visualization tools they have used so far. Answers included Tobii Studio [Tobii AB 2016], Blickshift Analytics [Blickshift GmbH 2020], and EYEVIDO Lab [EYEVIDO GmbH 2020].
Procedure. We rst asked participants for demographic details,
profession, and experience, as described above. Then, the partici-
pants were directed to use-case scenarios in our tool. The use-case
scenarios instructed them for using the visualizations to analyze
user data and answer specic questions, i. e., “how many users
clicked a particular link?,” “the order of attention shift for a group
of users,” etc. For example, the participants used the combination
of heatmap and scanpath to observe the users who clicks on “Learn
More” on the Apple Web site. They used the attention ow to de-
termine on which elements the users shifted their attention after
looking at the portrait of Suzanne Collins on her homepage. They
used the 3D-scanpath to nd out which users have seen the “Further
performances” link on the Web site of the Vienna opera. The partic-
ipants judged the usefulness of every visualization from 1 to 5, with
5 being very useful. Finally, the participants were asked to perform
free browsing in the tool for ve minutes. During that time, most
participants also discovered the weighted heatmap visualization.
The participants concluded the study with a SUS rating.
Results. Participants answered the use-case scenario questions mostly correctly, which indicates that the visualizations sufficed for them to understand gaze data and user behavior. With regard to participants’ feedback on visualizations, the combination of heatmap and scanpath was well received, with a score of 4 out of 5 for usefulness. However, when selecting too many mouse click indicators, the scanpath can overwhelm an analyst, as one participant commented: “I had difficulty understanding which scan paths belonged to which participant.” The attention flow received a superior score of 4.3 out of 5 on average. However, comments like “I did not understand the different colors of the attention flows.” indicate that the color coding of the transition arrows might be improved. In contrast, the 3D-scanpath visualization was not well received and scored only 2.8 out of 5 on usefulness. The feedback ranges from minor comments like “For the 3d visualization it would be helpful if you could click on the number and the corresponding line would be highlighted,” over “3D-Visualization as heat map would also be nice.” toward “It looks very messy.” See the box plot in Figure 6 for the answer distribution.
Figure 6: Usefulness of visualizations and the tool (1 lowest to 5 highest) as rated in the remote study by experts.
We also asked for general feedback after the free browsing, and the participants were positive in general. Some example comments are: “Weighted heatmap was the best feature, along with the clusters and attention flows! I found the visualizations very helpful!,” or “The possibility to show fixations before a click is very nice.” There were some negative comments, especially about the 3D-scanpath, e. g., the question on “which visualization would be useful for your field” was once answered with “All except the 3d visualization”. However, another participant mentioned as most useful visualization the “3D-scanpath.” It is to be noted that the 3D-scanpath was novel for all participants, whereas two participants were familiar with the concepts of weighted heatmap and attention flow. Although not specifically evaluated, the weighted heatmap was popular among the participants in the free browsing task.
The functionality of the overall tool was appreciated by the participants, and they rated the tool’s usefulness with 4.2 out of 5. On average, SUS was scored with 70.4, which indicates a good usability of our tool [Brooke 2013]. There was explicit feedback such as “I like the filter function on the user data!” and “There should be a small button ‘hide’ to QUICKLY hide/show all visualizations.” The participants also suggested some new visualizations, like “Can I combine mouse data and e. g., heatmaps?”
7 CONCLUSION
In this paper, we present a Web-based visualization tool to support eye-tracking usability analysis of Web sites. The tool offers both traditional and new visualizations to display and analyze the attention and interaction of users. We recorded a dataset of users interacting with Web sites and used it in a remote study with experts to evaluate the visualizations and the tool. The evaluation feedback was positive and encouraged us to develop the tool further to provide the community with a modern and easy-to-use platform to visualize gaze data. In the future, we aim to improve the visualizations as per usability analysts’ feedback, and enhance the tool with further data processing steps, AOI editing tools, and data mining algorithms to enable more in-depth analysis.
ACKNOWLEDGMENTS
The authors acknowledge the financial support by the Federal Ministry of Education and Research of Germany under the project number 01IS17095B. We would also like to acknowledge the students who were part of the research lab in which the tool has been developed: Christian Brozmann, Chuyi Sun, Daniel Vossen, Nadja Jelicic, Niklas Ecker, Oleksandr Kovtunov, and Orkut Karacalik.
REFERENCES
Mihael Ankerst, Markus M. Breunig, Hans-Peter Kriegel, and Jörg Sander. 1999. OP-
TICS: Ordering Points to Identify the Clustering Structure. SIGMOD Rec. 28, 2 (June
1999), 49–60. https://doi.org/10.1145/304181.304187
Tanja Blascheck, Markus John, Kuno Kurzhals, Steffen Koch, and Thomas Ertl. 2016.
VA2: A Visual Analytics Approach for Evaluating Visual Analytics Applications.
IEEE Transactions on Visualization and Computer Graphics 22, 1 (Jan 2016), 61–70.
https://doi.org/10.1109/TVCG.2015.2467871
Tanja Blascheck, Kuno Kurzhals, Michael Raschke, Michael Burch, Daniel Weiskopf, and Thomas Ertl. 2017. Visualization of Eye Tracking Data: A Taxonomy and Survey.
Computer Graphics Forum 36, 8 (2017), 260–284. https://doi.org/10.1111/cgf.13079
Blickshift GmbH. 2020. Blickshift. https://www.blickshift.com accessed on 28th March
2020.
Agnieszka Bojko. 2009. Informative or Misleading? Heatmaps Deconstructed. In
Proceedings of the 13th International Conference on Human-Computer Interaction.
Part I: New Trends. Springer-Verlag, Berlin, Heidelberg, 30–39. https://doi.org/10.1007/978-3-642-02574-7_4
John Brooke. 2013. SUS: A Retrospective. Journal of Usability Studies 8, 2 (Feb. 2013),
29–40. http://dl.acm.org/citation.cfm?id=2817912.2817913
Michael Burch, Ayush Kumar, and Neil Timmermans. 2019a. An Interactive Web-
Based Visual Analytics Tool for Detecting Strategic Eye Movement Patterns. In
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
(ETRA ’19). Association for Computing Machinery, New York, NY, USA, Article 93, 5 pages. https://doi.org/10.1145/3317960.3321615
Michael Burch, Alberto Veneri, and Bangjie Sun. 2019b. EyeClouds: A Visualization
and Analysis Tool for Exploring Eye Movement Data. In Proceedings of the 12th
International Symposium on Visual Information Communication and Interaction
(VINCI’2019). Association for Computing Machinery, New York, NY, USA, Article 8, 8 pages. https://doi.org/10.1145/3356422.3356423
CoolTool. 2020. CoolTool. https://cooltool.com accessed on 28th March 2020.
Andrew T. Duchowski. 2002. A breadth-first survey of eye-tracking applications.
Behavior Research Methods, Instruments, & Computers 34, 4 (1 Nov. 2002), 455–470.
https://doi.org/10.3758/BF03195475
Şükrü Eraslan, Serkan Karabulut, Mehmet Can Atalay, and Yeliz Yeşilada. 2019. Evalu-
ation of Visualisation of Scanpath Trend Analysis (ViSTA) Tool. Balkan Journal of
Electrical and Computer Engineering 7, 4 (2019), 373–383.
Şükrü Eraslan, Yeliz Yeşilada, and Simon Harper. 2016. Eye Tracking Scanpath Analysis
on Web Pages: How Many Users?. In Proceedings of the Ninth Biennial ACM Sympo-
sium on Eye Tracking Research & Applications (ETRA ’16). Association for Computing
Machinery, New York, NY, USA, 103–110. https://doi.org/10.1145/2857491.2857519
Şükrü Eraslan, Yeliz Yeşilada, and Simon Harper. 2018. Crowdsourcing a Corpus
of Eye Tracking Data on Web Pages: A Methodology. In Proceedings of the 11th
International Conference on Methods and Techniques in Behavioral Research, Spink
A.J. et al. (Ed.).
EYEVIDO GmbH. 2020. EYEVIDO Portal. https://eyevido.de accessed on 28th March
2020.
Eyezag GbR. 2018. Eyezag. https://eyezag.de accessed on 28th March 2020.
Joseph Goldberg and Xerxes Kotval. 1999. Computer interface evaluation using eye
movements: Methods and constructs. International Journal of Industrial Ergonomics
24 (10 1999), 631–645. https://doi.org/10.1016/S0169-8141(98)00068-7
Kenneth Holmqvist, Jana Holšánová, Maria Barthelson, and Daniel Lundqvist. 2003.
Reading or Scanning? A Study of Newspaper and Net Paper Reading. 2 (12 2003).
https://doi.org/10.1016/B978-044451020-4/50035-9
Kenneth Holmqvist, Marcus Nyström, Richard Andersson, Richard Dewhurst, Jarodzka
Halszka, and Joost van de Weijer. 2011. Eye Tracking : A Comprehensive Guide to
Methods and Measures. Oxford University Press, United Kingdom.
Kuno Kurzhals, Marcel Hlawatsch, Florian Heimerl, Michael Burch, Thomas Ertl, and
Daniel Weiskopf. 2016. Gaze Stripes: Image-Based Visualization of Eye Tracking
Data. IEEE Transactions on Visualization and Computer Graphics 22, 1 (2016),
1005–1014. https://doi.org/10.1109/TVCG.2015.2468091
Stuart P. Lloyd. 1982. Least Squares Quantization in PCM. IEEE Trans. Inf. Theor. 28, 2
(March 1982), 129–137. https://doi.org/10.1109/TIT.1982.1056489
Raphael Menges, Hanadi Tamimi, Chandan Kumar, Tina Walber, Christoph Schaefer,
and Steen Staab. 2018. Enhanced Representation of Web Pages for Usability
Analysis with Eye Tracking. In Proceedings of the 2018 ACM Symposium on Eye
Tracking Research & Applications (ETRA ’18). ACM, New York, NY, USA, Article 18,
9 pages. https://doi.org/10.1145/3204493.3204535
Jakob Nielsen and Kara Pernice. 2009. Eyetracking Web Usability (1st ed.). New Riders
Publishing, Thousand Oaks, CA, USA.
Alex Poole and Linden J. Ball. 2006. Eye tracking in human-computer interaction and
usability research: Current status and future prospects. 211–219.
RealEye sp. z o. o. 2020. RealEye. https://www.realeye.io accessed on 28th March
2020.
Yutaka Sasaki. 2007. The truth of the F-measure. Teach Tutor Mater (1 2007).
Michael Schiessl, Sabrina Duda, Andreas Thölke, and Rico Fischer. 2003. Eye tracking
and its application in usability and media research. MMI interaktiv 6, 6 (1 2003).
SensoMotoric Instruments. 2020. beGaze. https://gazeintelligence.com/smi-product-
manual accessed on 28th March 2020.
Tobii AB. 2016. Tobii Studio User’s Manual. Version 3.4.5.
Tobii AB. 2020. Sticky. https://www.sticky.ai accessed on 28th March 2020.
... In [8] card sorting method is applied to construct the shape of the site focused on user shared cognition. In [9] the use of the eye tracking tool is exposed, as an input for the analysis of user behavior when interacting with the web. Artificial intelligence has also been applied to enhance data collection and perform sustained analysis in machine learning algorithms; Thus, Zainab [10] applies clustering techniques as an input to categorize data obtained from the web, which are then analyzed from a standard system usability scale. ...
... As indicated by research [7,8] the user must be able to retrieve information easily, therefore, it is important to provide adequate search systems, because the websites of educational institutions allow the presentation of academic offerings, news and achievements in teaching, research and networking. In this area, the importance of the understanding of a website is highlighted [9,10], as shown in this research, most Ecuadorian institute sites present information, but care must be taken to ensure that this is done through appropriate web interfaces that speed up understanding. ...
Conference Paper
Techniques for evaluating usability continue to be innovated. This document shares the application of a heuristic-based framework for measuring web usability - SIRIUS, complemented by two machine learning techniques for clustering: a) Hierarchical, with the Ward.2 method and Euclidean; and b) Kmeans clustering. For data processing, CRISP-DM has been proposed as a general method. Since our objective is to evaluate the usability characteristics of the websites of the Technical and Technological Institutes of Ecuador, data has been obtained from the web portals of 83 Institutes (34 public and 49 private). As a result, three clusters have been obtained, which encompass the 10 aspects of the framework, and which allow us to identify the levels of usability of technological institutes. As a result, 18 institutes have been categorized into the group of websites with above-average usability (cluster1), 32 institutes with below-average usability (cluster2), and the remainder with an acceptable degree of usability. The method used and proposed has made it possible to have a general usability map of the web portals of the technical and technological institutes of a country, as input for decision-making.
... For the study of the elicited implicit responses (visual responses and automatic facial emotional responses), the subjects viewed the stimuli with the "Sticky by Tobii Pro" program (https://www.tobiipro.com/es/products/sticky-by-tobii-pro/). It is a tool developed by Tobii, a company established as a reference in the development of software and hardware for neurocommunication research (Menges et al., 2020), which records the participants' eye movements and quantifies their emotions through facial micro-expressions online, via the computer's webcam, through a link provided to the participants. The program can detect the subjects' faces and pupils and predict the gaze point at a frequency of 30 Hz, with a precision of 100 px and an average visual-angle error of 4.17 degrees, by running a deep neural network (AI). ...
Chapter
Full-text available
In the current educational context of confinement, due to the global health emergency, social networks such as WhatsApp have made it possible to carry out pedagogical practice with children, educator-families, and the community. WhatsApp, as an instant messaging application, became a tool through which student teachers in the Early Childhood Education program could deliver content via digital capsules. This research is developed methodologically as a case study, presenting a successful educational proposal that the student teacher designed for the recovery of the natural heritage of the locality of El Asiento. Through the digital capsules, the student teacher generated educational content so that the children could learn to use games, which progressively allowed them to reclaim their natural environment.
... Beyond the techniques mainly dedicated to visualizing the visual scanning behavior of each individual, techniques able to support the visualization of aggregated gaze data, and therefore to indicate the overall visual behavior across multiple observers, could have a significant influence on both the interpretation and the modeling of visual perception and attention processes. The significance of visualization methods in eye-tracking analyses and usability studies is also demonstrated by the fact that much software, including standalone packages (e.g., OGAMA [18]) and toolboxes (e.g., the EyeMMV toolbox [19]) working locally or in a Web environment (see, e.g., the tools described in previous studies [20,21]), implements several methods that are used in conjunction with the typical metric-based analyses. ...
Article
Full-text available
Gaze data visualization constitutes one of the most critical processes during eye-tracking analysis. Considering that modern devices are able to collect gaze data in extremely high frequencies, the visualization of the collected aggregated gaze data is quite challenging. In the present study, contiguous irregular cartograms are used as a method to visualize eye-tracking data captured by several observers during the observation of a visual stimulus. The followed approach utilizes a statistical grayscale heatmap as the main input and, hence, it is independent of the total number of the recorded raw gaze data. Indicative examples, based on different parameters/conditions and heatmap grid sizes, are provided in order to highlight their influence on the final image of the produced visualization. Moreover, two analysis metrics, referred to as center displacement (CD) and area change (AC), are proposed and implemented in order to quantify the geometric changes (in both position and area) that accompany the topological transformation of the initial heatmap grids, as well as to deliver specific guidelines for the execution of the used algorithm. The provided visualizations are generated using open-source software in a geographic information system.
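The abstract's main input, a statistical grayscale heatmap computed from aggregated raw gaze data, can be sketched as a simple binning step. This is an illustration only; the stimulus size, grid resolution, and max-normalization are assumptions, and the cartogram transformation and the CD/AC metrics themselves are not reproduced here.

```python
import numpy as np

def gaze_heatmap(points, width, height, grid=10):
    """Bin raw gaze points from multiple observers into a coarse
    grayscale heatmap grid (counts normalized to 0..1), making the
    result independent of the total number of recorded samples."""
    hm = np.zeros((grid, grid))
    for x, y in points:
        gx = min(int(x / width * grid), grid - 1)
        gy = min(int(y / height * grid), grid - 1)
        hm[gy, gx] += 1
    return hm / hm.max() if hm.max() > 0 else hm

# two nearby samples fall into one cell, one sample into another
pts = [(100, 100), (110, 105), (600, 400)]
hm = gaze_heatmap(pts, width=800, height=600, grid=10)
```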
... The VERP Explorer [Demiralp et al. 2015] combines views based on standard visualization techniques with a specific focus on recurrence plots in a very specific domain; hence, it does not allow exploring eye movement data in a more conventional way. Another specific visualization tool focuses on the combination of standard eye movement data visualizations [Menges et al. 2020], enriched by more advanced ones such as attention flows or 3D scanpath representations. Many more tools can be found that focus on very specific application domains such as fixation distances [Burch et al. 2019a], parallel scan paths [Raschke et al. 2014], or video visual analytics [Kurzhals et al. 2014, 2016], but they typically do not provide the standard easy-to-understand visualizations, nor do they allow easily exchanging the views and sharing the found insights with others. ...
Article
Introduction There have been few published applications of the second phase of Cognitive Work Analysis (CWA), Control Task Analysis, particularly the Contextual Activity Template (CAT). The current study aimed to share lessons learnt from utilizing an online survey as a novel approach to developing a CAT. The application domain was sport, specifically football goalkeeping. A secondary aim was to apply the CAT to the goalkeeping role and gain the perceptions of both goalkeeping coaches and players on the functions and situations specific to a goalkeeper during match-play. Methods Ten SMEs with high-level expertise in goalkeeping coaching and/or playing participated in an online survey including a series of demographic, Likert-scale and open-ended questions regarding goalkeeper-specific functions and match-play situations. Eight goalkeeper match-play situations and 18 specific functions were included. Results A CAT model was created demonstrating the match-play situations where specific goalkeeper functions occur. Three function groupings were identified: broad (six or more functions), moderate (between three and five functions), and specific (below three functions). Discussion Utilizing online surveys to develop a CAT model is a novel approach within the Human Factors and Ergonomics (HFE) literature. Further, the CAT represents a first-of-its-kind analysis in the football performance literature. Strengths and limitations of using online surveys for the development of a CAT are discussed. In conclusion, the work suggests flexible approaches can be used to develop HFE models.
Conference Paper
Full-text available
Eye tracking as a tool to quantify user attention plays a major role in research and application design. For Web page usability, it has become a prominent measure to assess which sections of a Web page are read, glanced or skipped. Such assessments primarily depend on the mapping of gaze data to a Web page representation. However, current representation methods, a virtual screenshot of the Web page or a video recording of the complete interaction session, suffer either from accuracy or scalability issues. We present a method that identifies fixed elements on Web pages and combines user viewport screenshots in relation to fixed elements for an enhanced representation of the page. We conducted an experiment with 10 participants and the results signify that analysis with our method is more efficient than a video recording, which is an essential criterion for large scale Web studies.
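The core idea of mapping gaze data to a page representation with fixed elements can be sketched as follows. This is not the paper's implementation; the region format, function name, and return convention are illustrative assumptions.

```python
def map_gaze_to_page(gx, gy, scroll_x, scroll_y, fixed_regions):
    """Map a viewport gaze sample to page coordinates. Points over
    fixed (position: fixed) elements stay in viewport space, since
    those elements do not move with scrolling; all other points are
    offset by the current scroll position. Regions are assumed to be
    (x, y, w, h) tuples in viewport coordinates."""
    for (x, y, w, h) in fixed_regions:
        if x <= gx < x + w and y <= gy < y + h:
            return gx, gy, "fixed"
    return gx + scroll_x, gy + scroll_y, "page"

# a gaze point on a fixed header stays put; one below it maps to the page
header = [(0, 0, 800, 50)]
on_header = map_gaze_to_page(10, 10, 0, 500, header)
on_content = map_gaze_to_page(10, 100, 0, 500, header)
```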
Conference Paper
Full-text available
Eye tracking studies are increasingly used for understanding how people interact with web pages. However, these studies are typically costly and time-consuming. Some of the main reasons are the difficulty in finding participants, the cost of equipment and the need for a separate session for each participant. Due to these reasons, eye tracking studies are typically conducted with a small number of users. Therefore, the statistical analysis of the collected data tends to be underpowered which possibly causes a failure to detect a significant difference (Type II Error). One of the solutions for this problem is to allow researchers to crowdsource eye tracking data from other researchers and combine them in an appropriate way to have a large amount of data for their studies, if possible. In this paper, we propose a methodology for creating a corpus of eye tracking data on web pages. This corpus will facilitate crowdsourcing eye tracking data collected on web pages.
Article
This survey provides an introduction into eye tracking visualization with an overview of existing techniques. Eye tracking is important for evaluating user behaviour. Analysing eye tracking data is typically done quantitatively, applying statistical methods. However, in recent years, researchers have been increasingly using qualitative and exploratory analysis methods based on visualization techniques. For this state-of-the-art report, we investigated about 110 research papers presenting visualization techniques for eye tracking data. We classified these visualization techniques and identified two main categories: point-based methods and methods based on areas of interest. Additionally, we conducted an expert review asking leading eye tracking experts how they apply visualization techniques in their analysis of eye tracking data. Based on the experts' feedback, we identified challenges that have to be tackled in the future so that visualizations will become even more widely applied in eye tracking research.
Conference Paper
The number of users required for usability studies has been a controversial issue for over 30 years. Some researchers suggest a certain number of users to be included in these studies. However, they do not focus on eye tracking studies for analysing eye movement sequences of users (i.e., scanpaths) on web pages. We investigate the effects of the number of users on scanpath analysis with our algorithm that was designed for identifying the most commonly followed path by multiple users. Our experimental results suggest that it is possible to approximate the same results with a smaller number of users. The results also suggest that more users are required when they serendipitously browse on web pages in comparison with when they search for specific information or items. We observed that we could achieve 75% similarity to the results of 65 users with 27 users for searching tasks and 34 users for browsing tasks. This study guides researchers to determine the ideal number of users for analysing scanpaths on web pages based on their budget and time.
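A common way to compare scanpaths, as referenced by the 75% similarity figure above, is string-edit similarity over AoI label sequences. The sketch below uses the standard Levenshtein distance with max-length normalization; this normalization choice is an assumption, not necessarily the metric used in the paper.

```python
def levenshtein(a, b):
    """Edit distance between two AoI-sequence strings."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def similarity(a, b):
    """Normalized similarity in [0, 1]: 1 - distance / max length."""
    longest = max(len(a), len(b)) or 1
    return 1.0 - levenshtein(a, b) / longest

# scanpaths encoded as strings of AoI labels; one substitution out of four
sim = similarity("ABCD", "ABCE")  # 0.75
```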
Article
Eye tracking is a technique whereby an individual’s eye movements are measured so that the researcher knows both where a person is looking at any given time and the sequence in which the person’s eyes are shifting from one location to another. Tracking people’s eye movements can help HCI researchers to understand visual and display-based information processing and the factors that may impact the usability of system interfaces. In this way, eye-movement recordings can provide an objective source of interface-evaluation data that can inform the design of improved interfaces. Eye movements also can be captured and used as control signals to enable people to interact with interfaces directly without the need for mouse or keyboard input, which can be a major advantage for certain populations of users, such as disabled individuals. We begin this article with an overview of eye-tracking technology and progress toward a detailed discussion of the use of eye tracking in HCI and usability research. A key element of this discussion is to provide a practical guide to inform researchers of the various eye-movement measures that can be taken and the way in which these metrics can address questions about system usability. We conclude by considering the future prospects for eye-tracking research in HCI and usability testing.
Article
We present a new visualization approach for displaying eye tracking data from multiple participants. We aim to show the spatio-temporal data of the gaze points in the context of the underlying image or video stimulus without occlusion. Our technique, denoted as gaze stripes, does not require the explicit definition of areas of interest but directly uses the image data around the gaze points, similar to thumbnails for images. A gaze stripe consists of a sequence of such gaze point images, oriented along a horizontal timeline. By displaying multiple aligned gaze stripes, it is possible to analyze and compare the viewing behavior of the participants over time. Since the analysis is carried out directly on the image data, expensive post-processing or manual annotation are not required. Therefore, not only patterns and outliers in the participants' scanpaths can be detected, but the context of the stimulus is available as well. Furthermore, our approach is especially well suited for dynamic stimuli due to the non-aggregated temporal mapping. Complementary views, i.e., markers, notes, screenshots, histograms, and results from automatic clustering, can be added to the visualization to display analysis results. We illustrate the usefulness of our technique on static and dynamic stimuli. Furthermore, we discuss the limitations and scalability of our approach in comparison to established visualization techniques.
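The gaze stripes construction, cropping a small thumbnail of the stimulus around each gaze point and laying the crops out along a horizontal timeline, can be sketched as below. The grayscale frame, patch size, and clamping behavior are illustrative assumptions.

```python
import numpy as np

def gaze_stripe(frame, gaze_points, half=2):
    """Build one gaze stripe: for each (x, y) gaze point, crop a
    (2*half+1)-square patch of the stimulus image around it and
    append the patches along a horizontal timeline.
    `frame` is an H x W grayscale array."""
    h, w = frame.shape
    patches = []
    for x, y in gaze_points:
        # clamp so the patch stays fully inside the image
        x = min(max(x, half), w - half - 1)
        y = min(max(y, half), h - half - 1)
        patches.append(frame[y - half:y + half + 1, x - half:x + half + 1])
    return np.hstack(patches)  # one row of thumbnails, ordered by time

# toy 10x10 "stimulus"; two gaze points produce a 3x6 stripe
frame = np.arange(100).reshape(10, 10)
stripe = gaze_stripe(frame, [(3, 3), (7, 7)], half=1)
```

Stacking one such stripe per participant then allows comparing viewing behavior over time without defining areas of interest, which is the core idea of the technique.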
Article
Evaluation has become a fundamental part of visualization research and researchers have employed many approaches from the field of human-computer interaction like measures of task performance, thinking aloud protocols, and analysis of interaction logs. Recently, eye tracking has also become popular to analyze visual strategies of users in this context. This has added another modality and more data, which requires special visualization techniques to analyze this data. However, only few approaches exist that aim at an integrated analysis of multiple concurrent evaluation procedures. The variety, complexity, and sheer amount of such coupled multi-source data streams require a visual analytics approach. Our approach provides a highly interactive visualization environment to display and analyze thinking aloud, interaction, and eye movement data in close relation. Automatic pattern finding algorithms allow an efficient exploratory search and support the reasoning process to derive common eye-interaction-thinking patterns between participants. In addition, our tool equips researchers with mechanisms for searching and verifying expected usage patterns. We apply our approach to a user study involving a visual analytics application and we discuss insights gained from this joint analysis. We anticipate our approach to be applicable to other combinations of evaluation techniques and a broad class of visualization applications.