Figure - uploaded by Andreas M. Klein
Source publication
The UEQ+ is a modular framework for the construction of UX questionnaires. The researcher can pick those scales that fit his or her research question from a list of 16 available UX scales. Currently, no UEQ+ scales are available to allow measuring the quality of voice interactions. Given that this type of interaction is increasingly essential for t...
Contexts in source publication
Context 1
... the scales followed in the order Response behaviour, Response quality, and Comprehensibility, with the corresponding candidate items in the order shown in Tables 1 to 3. The study was done in the German language. ...
Context 2
... show in the following the items used and the loadings on the corresponding factor, which is the basis for the selection of the four items that represent the scale. Table 1 shows the items and loadings for the factor corresponding to the scale Response behaviour. Items 2, 6, 7, and 9 were selected with an introductory sentence as follows: Table 2 shows all items and loadings for the scale Response quality. ...
Similar publications
User engagement has become a much-cited construct in human-computer interaction (HCI) design and evaluation research and practice. Since engagement is constructed as a positive and desirable outcome of users' interactions, more frequent and longer interactions are considered evidence of engagement. Disengagement, when discussed, is considered a best avoided outcome of...
Citations
... For post-evaluation, we propose the UEQ+ framework [69], as it can be adapted to further dimensions depending on the context, goal, or purpose. The UEQ+ framework [69], based on the User Experience Questionnaire (UEQ) [41], provides a flexible UX assessment approach in different languages, e.g., Spanish [66].From the VUI perspective, the UEQ+ provides three specific VUI voice quality scales designed to measure the following UX aspects: response behavior (i.e., the VUI reacts like a human conversationalist), response quality (i.e., the user's intention is fulfilled), and comprehensibility (i.e., the user's intention is recognized without any special formulation) [32,33]. The UEQ+ framework currently consists of 20 scales (e.g., efficiency, dependability, trust, and trustworthiness of content), which can be combined arbitrarily to create a product-related questionnaire that measures a variety of UX aspects [68]. ...
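To make the modular idea concrete, the following is a minimal Python sketch of how a product-related questionnaire could be assembled from selected scales; the scale objects and item wordings are placeholders, not the official UEQ+ items.

```python
# Minimal sketch: assembling a product-related questionnaire from selected UEQ+ scales.
# The item wordings below are placeholders, not the official UEQ+ items.
from dataclasses import dataclass

@dataclass
class Scale:
    name: str
    items: list  # four bipolar item pairs, e.g. "negative anchor / positive anchor"

AVAILABLE_SCALES = {
    "Efficiency":        Scale("Efficiency",        ["<item 1>", "<item 2>", "<item 3>", "<item 4>"]),
    "Response behavior": Scale("Response behavior", ["<item 1>", "<item 2>", "<item 3>", "<item 4>"]),
    "Response quality":  Scale("Response quality",  ["<item 1>", "<item 2>", "<item 3>", "<item 4>"]),
    "Comprehensibility": Scale("Comprehensibility", ["<item 1>", "<item 2>", "<item 3>", "<item 4>"]),
}

def build_questionnaire(selected_scale_names):
    """Pick the scales that fit the research question (any combination is allowed)."""
    return [AVAILABLE_SCALES[name] for name in selected_scale_names]

# Example: one classic UEQ+ scale combined with the three VUI voice quality scales.
for scale in build_questionnaire(["Efficiency", "Response behavior",
                                  "Response quality", "Comprehensibility"]):
    print(f"{scale.name}: {len(scale.items)} items")
```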
... Mixed-method approaches in VUI research are common [7,38,27]. For example, if one wants to assess a VUI customer service response regarding train schedules (the given context), the selector's recommendation could be a mixed-method approach: the UEQ+ with the six appropriate scales (e.g., efficiency, dependability, perspicuity, response behavior, response quality, and comprehensibility [32]) in combination with another tool (e.g., interviews). Interviews provide deeper insights into users' experiences with products [27]. ...
Voice user interfaces (VUI) come in various forms of software or hardware, are controlled by voice, and can help the user in their daily life. Despite VUIs being readily available on smartphones, they have a low adoption rate. This can be attributed to challenges such as the misunderstanding of voice commands as well as privacy and data security concerns. Still, there are intensive VUI users, but they also raise concerns that may be independent of culture. Hence, we will discuss in our paper the various areas that should be considered when developing VUIs to increase user acceptance and foster a positive user experience (UX). We propose exploring the context of use and UX aspects to understand users' needs while using VUIs. Additionally, we suggest using the UEQ+ scales for VUI assessment to compare different design decisions. All of our suggestions can help VUI developers to design better VUIs.
... Recent research includes several attempts to define important UX aspects of VUI using an expert-driven process (Hone and Graham, 2000; Kocaballi et al., 2019; Klein et al., 2020a). To the best of our knowledge, however, there is no user-driven identification of relevant UX aspects for VUIs that is based on up-to-date user data. ...
... Examples of other UEQ+ scales are Attractiveness, Novelty, and Efficiency. The voice quality scales are constructed with consideration of human-computer interaction (HCI) and the VUI design process (Klein et al., 2020a). User, system, and context all influence HCI significantly (Hassenzahl and Tractinsky, 2006). ...
Voice User Interfaces (VUIs) are becoming increasingly available while users raise concerns, e.g., about privacy issues. User Experience (UX) helps in the design and evaluation of VUIs with a focus on the user. Knowledge of the relevant UX aspects for VUIs is needed to understand the user’s point of view when developing such systems. Known UX aspects are derived, e.g., from graphical user interfaces or expert-driven research. The user’s opinion on UX aspects for VUIs, however, has thus far been missing. Hence, we conducted a qualitative and quantitative user study to determine which aspects users take into account when evaluating VUIs. We generated a list of 32 UX aspects that intensive users consider for VUIs. These overlap with, but are not limited to, aspects from established literature. For example, while Efficiency and Effectivity are already well known, Simplicity and Politeness are inherent to known VUI UX aspects but are not necessarily in focus. Furthermore, Independency and Context-sensitivity are among the new UX aspects for VUIs.
... and a high affinity for technology interaction (M = 4.304, SD = 0.86). After interacting with MentalBuddy, the participants completed the User Experience Questionnaire (UEQ) [24] and UEQ+ scales for voice interaction [10], and participated in a brief semi-structured interview. ...
Voice interactions with conversational agents are becoming increasingly ubiquitous. At the same time, stigmas around mental health are beginning to break down, but there remain significant barriers to treatment. Mental health conditions are highly prevalent and people fail to receive help due to lack of access, information, or structures. We aim to address these problems by investigating the applicability of voice-based conversational agents for mental health. In this paper, we introduce our first prototype, MentalBuddy, present initial user feedback, and discuss the potential ethical implications of using conversational agents in mental health applications. With proper considerations, conversational interfaces have the potential to create scalable access to mental health prevention, diagnosis, and therapy.
... Although the UEQ method has been widely used to measure the user experience of various applications, including e-learning [6]-[11], academic administration management systems [12], [13], e-commerce [14]-[16], health applications [17], [18], and e-government [19], to date very little research has made use of UEQ+ because this framework is still relatively new. Studies that have used UEQ+ include research measuring the user experience of voice assistants [20], [21], a mobile examination application [22], and a university website [23]. ...
This study aims to measure and evaluate the user experience of the Microsoft Teams application as a learning and video conferencing platform using the UEQ+ framework, which is a further development of the UEQ method. With the UEQ+ framework, questionnaires can be designed by customizing the user experience variables according to the application to be measured, so the research results are expected to be more accurate and relevant. The user experience scales measured in this questionnaire include: efficiency, perspicuity, dependability, trust, usefulness, intuitive use, trustworthiness of content, quality of content, and clarity. After the questionnaires were distributed, 149 responses were obtained and processed with the Data Analysis Tools provided for UEQ+. In conclusion, respondents have a positive impression of Microsoft Teams as a video conferencing application and learning platform. The most important scales representing the quality of Microsoft Teams are usefulness, clarity, trustworthiness of content, and quality of content.
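As a rough illustration of this kind of analysis, the sketch below computes per-scale means under the common UEQ convention of recoding 7-point answers (1..7) to -3..+3 and averaging the items of a scale; the response data are invented, and the official UEQ+ Data Analysis Tools may differ in detail.

```python
# Sketch of per-scale scoring, assuming 7-point item answers coded 1..7 are recoded
# to -3..+3 and averaged per scale (the usual UEQ convention). The responses are
# invented; the official UEQ+ Data Analysis Tools may differ in detail.
import statistics

# Hypothetical answers of one participant: scale -> four item ratings (1..7).
responses = {
    "efficiency":  [6, 7, 5, 6],
    "perspicuity": [5, 6, 6, 5],
    "usefulness":  [7, 6, 7, 6],
}

def scale_mean(item_answers):
    """Recode 1..7 to -3..+3 and average the items of a scale."""
    return statistics.mean(answer - 4 for answer in item_answers)

for scale, answers in responses.items():
    print(f"{scale}: {scale_mean(answers):+.2f}")  # values near +3 indicate a very positive impression
```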
... This questionnaire, implemented through the conversational agent, was used by a group of 40 participants to evaluate two products. Later, these same participants evaluated the agent using the UEQ+ [15] questionnaire, constructed from the scales for voice interfaces proposed in [16]. ...
In recent years there has been an increase in voice interfaces, driven by developments in Artificial Intelligence and the expansion of commercial devices that use them, such as smart assistants present on phones or smart speakers. One field that could take advantage of the potential of voice interaction is self-administered survey data collection, such as standardized UX evaluation questionnaires. This work proposes a set of conversational design patterns for standardized UX evaluation questionnaires that use semantic differential scales as a means of collecting quantitative information on user experience, as is the case of AttrakDiff and the UEQ (User Experience Questionnaire). The presented design patterns seek to establish a natural conversation adapted to the user, the preservation of context between subsequent questions, the minimization of statements, and repair mechanisms for statements not completely understood by the user or the voice agent, such as eliciting the explanation of a concept or a repetition.
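As a hedged illustration of one such pattern (repairing an answer that cannot be mapped to the 7-point semantic differential), here is a small text-based Python simulation; the prompts and the word-to-value mapping are invented for illustration and are not the patterns defined in this work.

```python
# Text-based simulation of a single questionnaire turn with a simple repair mechanism.
# Prompts and the word-to-value mapping are invented for illustration; they are not
# the conversational design patterns defined in the cited work.
WORD_TO_VALUE = {
    "very annoying": 1, "annoying": 2, "slightly annoying": 3, "neutral": 4,
    "slightly enjoyable": 5, "enjoyable": 6, "very enjoyable": 7,
}

def ask_item(left, right, max_repairs=2):
    """Ask one bipolar item and re-prompt (repair) if the answer cannot be mapped."""
    prompt = f"On a scale from '{left}' to '{right}', how did you experience the product? "
    for _ in range(max_repairs + 1):
        answer = input(prompt).strip().lower()
        if answer in WORD_TO_VALUE:
            return WORD_TO_VALUE[answer]
        # Repair: explain the expected answer format instead of failing silently.
        prompt = (f"Sorry, I did not catch that. You can answer, for example, "
                  f"'{left}', 'neutral', or '{right}': ")
    return None  # give up after repeated misunderstandings

if __name__ == "__main__":
    print("Recorded value:", ask_item("annoying", "enjoyable"))
```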
... Existing questionnaires should be extended or a new questionnaire should be created to evaluate VAs, which should lead to improvements in VAs. For example, a new and flexible method is the modular UEQ+ framework, which is based on various scales used to construct a product-specific questionnaire and for which three VUI scales have been developed but not yet validated [16]. Others, however, focus on exploring current users, use cases, and systems to understand VA interaction, as well as finding design patterns [2,6,28]. ...
... Another category could be response quality, which combines other answer options such as the ability to answer more quickly, sound more natural, distinguish between users, and recognize feelings. These categories merge appropriate response options, as do, e.g., the voice quality scales of the UEQ+ framework, which contain four bipolar item pairs with 7-point Likert-type scales [16,25]. These scales contain very similar product characteristics to those that are assigned to UX aspects. ...
... Therefore, we intend to ask them about how they use VAs in order to identify research gaps regarding assessment methods and the VA context of use [14]. Our study results can help extend measurement methods, e.g., scale construction for the UEQ+ framework regarding VUI assessment [16]. Hence, we must define relevant UX criteria depending on the VA use case and apply factor analysis to identify single factors [17]. ...
Currently, voice assistants (VAs) are trendy and highly available. The VA adoption rate of internet users differs among European countries and also in the global view. Due to speech intelligibility and privacy concerns, using VAs is challenging. Additionally, user experience (UX) assessment methods and VA improvement possibilities are still missing, but are urgently needed to overcome users’ concerns and increase the adoption rate. Therefore, we conducted an intercultural study of technology-based users from Germany and Spain, expecting that higher improvement potential would outweigh concerns about VAs. We investigated VA use in terms of availability versus actual use, usage patterns, concerns, and improvement proposals. Comparing Germany and Spain, our findings show that nearly the same amount of intensive VA use is found in both technology-based user groups. Despite cultural differences, further results show very similar tendencies, e.g., frequency of use, privacy concerns, and demand for VA improvements.
... We must keep in mind that users have different needs (e.g., children vs. parents vs. people with disabilities) that could result in different design solutions. Research has already started to explore with questionnaires [9,10] and interviews the risks and opportunities of VAs (e.g., cultures [6,22], use cases [1,3], existing users, or potential users [1,3,22]). ...
... An example could be using ambient light to encode whether or not the microphone is listening [11]. In addition, we can design prototypes with different users, design concepts, and use cases and employ qualitative (e.g., interviews) and quantitative evaluation (e.g., questionnaires for VUIs [9]) to determine which designs are accepted by users or raise concerns. I also developed systems for user groups that need more protection, such as children [16,19,20]. ...
... The advantage of the context analysis is the structured way in which assumptions are backed up or put into context by users' statements. Additionally, we should consider using the existing UEQplus Framework with newly developed VUI scales to measure VAs [9,21]. ...
Voice Assistants (VAs), Speech-based Technology (SBT), and Voice User Interfaces (VUI) have gained popularity in recent years; however, privacy concerns stop people from using them. Although the terms are used for slightly different purposes, we mainly use "VAs" for simplicity. To increase the adoption rate and address users’ needs, we propose reducing barriers with Acceptance by Design. By that, we mean using, e.g., VA designs that implicitly show if they collect data or how much data is collected. We can do that by first exploring the user requirements and concerns with methods from the Human-Centered Design Process [4] to provide guidelines for the future. This is a complex challenge that requires a deep understanding from different disciplines to design VAs for users skeptical about them. The HCD offers a holistic approach for capturing user requirements or discovering new dimensions of the UX that influence the user’s perception or awareness of the interactive system. We must keep in mind that users have different needs (e.g., children vs. parents vs. people with disabilities) that could result in different design solutions. Research has already started to explore with questionnaires [9, 10] and interviews the risks and opportunities of VAs (e.g., cultures [6, 22], use cases [1, 3], existing users, or potential users [1, 3, 22]). Furthermore, these guidelines might help others to design future interactive systems for different use cases, such as augmented reality glasses for university students in labs.
... In a second step, we updated the list of UX factors from Hinderks et al. (2020b) by including the UX factors from the UX questionnaire 'User Experience Questionnaire Plus (UEQ+)' (Schrepp and Thomaschewski, 2019b). Additionally, we included the UX factors from Klein et al. (2020) and Otten et al. (2020). These UX factors are specifically for voice assistants, such as Alexa, Siri, or Google Home. ...
... Table 4: Overview of all UX Factors from Hinderks et al. (2020b), Schrepp and Thomaschewski (2019b), Klein et al. (2020), and Otten et al. (2020). Dependability: The product always responds to user interaction in a predictable and consistent way. (Table columns: UX factor | Hinderks | Schrepp | Klein | Otten.) Ease of Use: The product is easy to operate. X Efficiency: The user can reach their goals with minimum time required and minimum physical effort. ...
Context. Agile methods are increasingly being used by companies, to develop digital products and services faster and more effectively. Today's users not only demand products that are easy to use, but also products with a high User Experience (UX). Agile methods themselves do not directly support the development of products with a good user experience. In combination with UX activities, it is potentially possible to develop a good UX.
Objective. The objective of this PhD thesis is to develop a UX Lifecycle, to manage the user experience in the context of Agile methods. With this UX Lifecycle, Agile teams can manage the UX of their product, in a targeted way.
Method. We developed the UX Lifecycle step by step, according to the Design Science Research Methodology. First, we conducted a Structured Literature Review (SLR) to determine the state of the art of UX management. The result of the SLR concludes in a GAP analysis. On this basis, we derived requirements for UX management. These requirements were then implemented in the UX Lifecycle. In developing the UX Lifecycle, we developed additional methods (UX Poker, UEQ KPI, and IPA), to be used when deploying the UX Lifecycle. Each of these methods has been validated in studies, with a total of 497 respondents from three countries (Germany, England, and Spain). Finally, we validated the UX Lifecycle, as a whole, with a Delphi study, with a total of 24 international experts from four countries (Germany, Argentina, Spain, and Poland).
Results. The iterative UX Lifecycle (Figure 1) consists of five steps: Initial Step 0 ‘Preparation’, Step 1 ‘UX Poker’ (before development/Estimated UX), Step 2 ‘Evaluate Prototype’ (during development/Probable UX), Step 3 ‘Evaluate Product Increment’ (after development/Implemented UX), and a subsequent Step 4 ‘UX Retrospective’.
With its five steps, the UX Lifecycle provides the structure for continuously measuring and evaluating the UX, in the various phases. This makes it possible to develop the UX in a targeted manner, and to check it permanently. In addition, we have developed the UX Poker method. With this method, the User Experience can be determined by the Agile team, in the early phases of development. The evaluation study of UX Poker has indicated that UX Poker can be used to estimate the UX for user stories. In addition, UX Poker inspires a discussion about UX, that results in a common understanding of the UX of the product. To interpret the results from the evaluation of a prototype and product increment, we developed or derived the User Experience Questionnaire KPI and Importance-Performance Analysis. In a first study, we were able to successfully apply the two methods and, in combination with established UEQ methods, derive recommendations for action, regarding the improvement of the UX. This would not have been possible without their use. The results of the Delphi study, to validate the UX Lifecycle, reached consensus after two rounds. The results of the evaluation and the comments lead to the conclusion, that the UX Lifecycle has a sufficiently positive effect on UX management.
Conclusion. The goal-oriented focus on UX factors and their improvement, as propagated in the UX Lifecycle, is a good way of implementing UX management in a goal-oriented manner. By comparing the results from UX Poker, the evaluation of the prototype, and the product increment, the Agile team can learn more about developing a better UX within a UX retrospective. The UX Lifecycle will have a positive effect on UX management. The use of individual components of the UX Lifecycle, such as UX Poker or Importance-Performance Analysis, already helps an Agile team to improve the user experience. But only in combination with the UX Lifecycle and the individual methods and approaches presented in this PhD thesis is targeted management of the user experience possible, in our view. This was the initial idea of this PhD thesis, which we are convinced we could implement.
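To illustrate the two measures named in the Results above, the sketch below computes an importance-weighted KPI from scale means and a simple importance-performance classification; the numbers and thresholds are invented, and the exact definitions of the UEQ KPI and the IPA are those given in the cited work.

```python
# Sketch of a KPI as an importance-weighted mean of scale means, plus a simple
# importance-performance classification. Numbers and thresholds are invented;
# the exact UEQ KPI and IPA definitions are those given in the cited work.

scale_results = {
    # scale: (mean UX rating on -3..+3, mean rated importance on 1..7)
    "efficiency":    (1.8, 6.5),
    "perspicuity":   (2.1, 5.8),
    "dependability": (0.4, 6.2),
    "novelty":       (1.0, 3.9),
}

def kpi(results):
    """Importance-weighted average of the per-scale means."""
    total_importance = sum(importance for _, importance in results.values())
    return sum(mean * importance for mean, importance in results.values()) / total_importance

def ipa_quadrant(mean, importance, mean_threshold=1.0, importance_threshold=5.0):
    """Classic IPA quadrants: high-importance, low-performance scales come first."""
    if importance >= importance_threshold and mean < mean_threshold:
        return "concentrate here"
    if importance >= importance_threshold:
        return "keep up the good work"
    if mean < mean_threshold:
        return "low priority"
    return "possible overkill"

print(f"KPI: {kpi(scale_results):+.2f}")
for scale, (mean, importance) in scale_results.items():
    print(f"{scale:13s} -> {ipa_quadrant(mean, importance)}")
```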
... The UEQ+ contains 20 scales to measure specific UX aspects, which can be combined into a product-related questionnaire. The newly constructed scales for voice quality consider the UX aspects of VUIs and fill the voice interaction gap within UEQ+ [13]. ...
Voice user interfaces (VUI) are currently a trending topic but the ability to measure and improve the user experience (UX) is still missing. We aim to develop a tool selector as a web application that can provide a suitable tool to measure UX quality of VUIs. The UX tool selector for VUIs will include a UX measurement toolbox containing several existing and new VUI assessment methods. The UX tool selector will provide context-dependent measurement recommendations without prior extensive research to evaluate and improve VUIs.
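Purely as a hypothetical sketch of what a context-dependent recommendation could look like (this is not the authors' tool selector, and the contexts and recommendations are invented), a rule-based lookup might be structured as follows:

```python
# Hypothetical sketch of a rule-based, context-dependent recommendation for VUI UX
# measurement. This is not the authors' tool selector; contexts and recommendations
# are invented placeholders.
RECOMMENDATIONS = {
    "customer service VUI": {
        "quantitative": ["UEQ+ (efficiency, dependability, perspicuity, "
                         "response behavior, response quality, comprehensibility)"],
        "qualitative":  ["semi-structured interviews"],
    },
    "smart speaker at home": {
        "quantitative": ["UEQ+ (trust, response quality, comprehensibility)"],
        "qualitative":  ["diary study"],
    },
}

def recommend(context_of_use):
    """Return a (hypothetical) mixed-method recommendation for a given context of use."""
    return RECOMMENDATIONS.get(context_of_use,
                               {"quantitative": ["UEQ+"], "qualitative": ["interviews"]})

print(recommend("customer service VUI"))
```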
... Stimulation). Useful tool for evaluating [7]. Existing tools measure usability [8]. Research gap: UEQ+ lacks scales for VUI. ...
Presentation at the World Usability Day in Osnabrück on 12 November 2020