Triangulation in UX Studies: Learning from Experience
Ingrid Pettersson, Volvo Car Group, Sweden, ingrid.pettersson@volvocars.com
Andreas Riener, University of Applied Sciences Ingolstadt (THI), Germany, andreas.riener@thi.de
Anna-Katharina Frison, University of Applied Sciences Ingolstadt (THI), Germany, anna-katharina.frison@thi.de
Jesper Nolhage, Volvo Car Group, Sweden, jesper.nolhage@volvocars.com
Florian Lachner, University of Munich (LMU), Germany, florian.lachner@ifi.lmu.de
Copyright is held by the owner/author(s). DIS'17 Companion, June 10–14, 2017, Edinburgh, United Kingdom. ACM 978-1-4503-4991-8/17/06. http://dx.doi.org/10.1145/3064857.3064858
Abstract
While the consideration of User Experience (UX) has become embedded in research and design processes, UX evaluation remains a challenging and much-debated area for both researchers in academia and practitioners in industry. A variety of evaluation methods have been developed, or adapted from related fields, to address identified methodological gaps. Although the importance of mixed-methods and data-driven approaches for obtaining well-founded results from studies of interactive systems has been emphasized numerous times, there is still no mature understanding of, or concrete recommendation for, when and in which ways to combine different methods, theories, and data related to the UX of interactive systems. This workshop aims to gather experiences with user studies from UX professionals and academics in order to contribute to the knowledge of mixing methods, theories, and data in UX evaluation. We will discuss individual experiences, best practices, risks, and gaps, and reveal commonalities among triangulation strategies.
Author Keywords
User Experience; Evaluation; Mixed Methods; Triangulation
ACM Classification Keywords
H.5.2 [Information interfaces and presentation (e.g., HCI)]:
Evaluation/methodology
Introduction
As an academic discipline, the field of User Experience (UX) research has a multi-disciplinary heritage, encompassing a variety of perspectives on studying human experiences with products, systems, and services. This heritage has led to a wide spectrum of methods for studying users' experiences. Traditional human-computer interaction (HCI) theory has passed on methodological approaches akin to those used in usability evaluation studies; other disciplines that have significantly influenced UX research include the social sciences, ethnography, and philosophy.
There have been great efforts in academia to create new methods for effectively evaluating UX, aimed at both academic and industrial application [1]. Our proposition in this workshop, however, is that we often do not need to develop new methods, but rather to use the wide flora of existing UX evaluation tools and approaches more efficiently. UX evaluation is no longer unknown territory, and we want to encourage reflection on established approaches as well as on lessons learned along the way. We want to explore the existing know-how of UX professionals from academia and industry in combining different UX evaluation methods (e.g., qualitative and quantitative methods) within so-called mixed-methods approaches and triangulation strategies.
Background & Motivation
Past workshops in the ACM community have already ex-
plored UX methods from different perspectives [4, 6, 3,
5]. However, a focus on triangulation, also called mixed
methods, or multi-method approaches, is still missing. To
combine different ways of research to get a more holistic
view on UX is nowadays one of the key areas for further UX
research [1, 4, 8]. Within a SIG session Roto et al. 2009
[4] analyzed UX evaluation methods in the industrial and
Figure 1: How can holistic User Experience (UX) evaluation be
optimized by triangulation?
academic context. They revealed that rich data can be col-
lected by applying mixed methods e.g., through the com-
bination of system logging with subjective user statements
from questionnaires and interviews. The authors conclude
that mixing methods allows to understand the reasoning be-
hind the concept of UX. Van Turnhout et al. [7] investigated
common mixed research approaches of the NordiCHI pro-
ceedings 2012 to lay a foundation for further research and
a more thoughtful application of multi-methods. However,
best practices for using such multi-method perspectives in-
spired by the needs of academia and industry are not yet
explored in depth.
Employing a mix of methods and theories to study a subject has been claimed to contribute to more reliable, holistic, and well-motivated understandings of a phenomenon [2]. Furthermore, a mixed-methods approach can uncover unexpected results and generate important, unforeseen research questions while at the same time providing answers to those new questions. This is particularly important for complex topics such as the concept of UX. We argue that investigating UX design and evaluation from different angles will lead to a well-founded understanding of UX.
Workshop Theme & Goal
Researchers and practitioners have developed their own best practices over decades, based on experience, reflection, theoretical background, or intuition. We want to bring this wide-spread knowledge together and learn from each other by uncovering basic challenges, aims, and strategies related to UX work.

The workshop will be an opportunity to share experiences with different UX evaluation methods, to collect empirical data on practices, and to jointly suggest ways of improving the learning process from user studies. Finally, we want to support a more holistic understanding of the quality of a given experience, applicable to research projects in both academia and industry. Specifically, we want to answer the following questions:

• What are the motivations and the outcomes of different UX research and evaluation methods?
• How do we best draw conclusions from multiple and different sources, such as qualitative and quantitative or attitudinal and behavioral data?
• Can combinations of the contrasting theories that exist in UX be better exploited, and if so, how?
• How can we define best practices, and where are the gaps or development needs in mixed-methods approaches in the field of UX?
Duration
The presented theme and questions will be discussed and elaborated in a single full-day workshop.
Intended Outcome & Future Work
Our ambition is for the workshop to develop and spread knowledge and awareness of how to get more out of UX studies. Consequently, participants will be able to apply particular methods more efficiently and effectively. A cooperatively developed mixed-methods map will summarize the outcomes. In combination with an already ongoing literature review of documented UX studies, the outcomes of the workshop will reveal the state of the art in the use of mixed-methods approaches in UX research. Further future work can be identified during the day and within the networking session.
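To make the planned mixed-methods map more tangible, the following sketch represents evaluation methods as data annotated with the kind of evidence they yield. It is a tentative illustration only: the method list, the attributes, and the pairing rule are hypothetical placeholders we chose for this example, not outcomes of the workshop.

from itertools import combinations

# Speculative sketch: a mixed-methods map as a simple data structure.
# Each UX evaluation method is annotated with the type of data it yields
# and the source of that data, so that complementary (triangulation-
# friendly) pairings can be queried.
methods = {
    "interview":      {"data": "qualitative",  "source": "attitudinal"},
    "questionnaire":  {"data": "quantitative", "source": "attitudinal"},
    "system logging": {"data": "quantitative", "source": "behavioral"},
    "observation":    {"data": "qualitative",  "source": "behavioral"},
}

def complementary_pairs(methods):
    """Yield method pairs that differ in both data type and data source,
    i.e., candidates for triangulating across all four quadrants."""
    for (a, ma), (b, mb) in combinations(methods.items(), 2):
        if ma["data"] != mb["data"] and ma["source"] != mb["source"]:
            yield a, b

for a, b in complementary_pairs(methods):
    print(f"{a} + {b}")
# Prints, e.g., "interview + system logging": subjective qualitative
# accounts triangulated with objective behavioral measures.

A real map would of course carry richer attributes (cost, study phase, required expertise), but even this minimal form shows how documented pairings could be queried rather than rediscovered per study.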
REFERENCES
1. Javier A. Bargas-Avila and Kasper Hornbæk. 2011. Old wine in new bottles or novel challenges: A critical analysis of empirical studies of user experience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2689–2698.
2. R. Burke Johnson, Anthony J. Onwuegbuzie, and Lisa A. Turner. 2007. Toward a definition of mixed methods research. Journal of Mixed Methods Research 1, 2 (2007), 112–133.
3. Marianna Obrist, Virpi Roto, and Kaisa Väänänen-Vainio-Mattila. 2009. User experience evaluation: Do you know which method to use? In CHI'09 Extended Abstracts on Human Factors in Computing Systems. ACM, 2763–2766.
4. Virpi Roto, Marianna Obrist, and Kaisa Väänänen-Vainio-Mattila. 2009a. User experience evaluation methods in academic and industrial contexts. In Proceedings of the Workshop UXEM, Vol. 9. Citeseer.
5. Virpi Roto, Kaisa Väänänen-Vainio-Mattila, Effie Law, and Arnold Vermeeren. 2009b. User experience evaluation methods in product development (UXEM'09). In IFIP Conference on Human-Computer Interaction. Springer, 981–982.
6. Kaisa Väänänen-Vainio-Mattila, Virpi Roto, and Marc Hassenzahl. 2008. Now let's do it in practice: User experience evaluation methods in product development. In CHI'08 Extended Abstracts on Human Factors in Computing Systems. ACM, 3961–3964.
7. Koen van Turnhout, Arthur Bennis, Sabine Craenmehr, Robert Holwerda, Marjolein Jacobs, Ralph Niels, Lambert Zaad, Stijn Hoppenbrouwers, Dick Lenior, and René Bakker. 2014. Design patterns for mixed-method research in HCI. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational. ACM, 361–370.
8. Arnold P. O. S. Vermeeren, Effie Lai-Chong Law, Virpi Roto, Marianna Obrist, Jettie Hoonhout, and Kaisa Väänänen-Vainio-Mattila. 2010. User experience evaluation methods: Current state and development needs. In Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries. ACM, 521–530.