Chapter

Evaluating Usability Evaluation Methods: Criteria, Method and a Case Study

DOI: 10.1007/978-3-540-73105-4_63. In: J. Jacko (Ed.), Human-Computer Interaction, Part I, HCII 2007, LNCS 4550, pp. 569-578. Springer.

ABSTRACT: The paper proposes an approach to comparative usability evaluation that incorporates relevant criteria identified in previous work. The approach is then applied in a case study in which an academic website is evaluated with four widely used usability evaluation methods (UEMs): heuristic evaluation, cognitive walkthrough, think-aloud protocol, and co-discovery learning.

Related publications:
  • ABSTRACT: The unquestionable relevance of the web in our society has led to an enormous growth in websites offering all kinds of services to users. In this context, although usability is crucial to the development of successful websites, many sites barely consider expert recommendations for building usable designs. Chief among these recommendations is including usability measurement in the development process. One of the most widely accepted methods for expert usability evaluation is heuristic evaluation, on which there is abundant literature; however, clear and specific guidelines for use during development and evaluation are lacking, which probably contributes to the generalized deficiency in web usability. What is missing is a heuristic-based evaluation method whose measurement is adapted to the type of website being evaluated. This paper defines Sirius, a heuristic-based evaluation framework for expert evaluations that takes different types of websites into account, together with a specific set of evaluation criteria and a usability metric that quantifies the usability level achieved by a website depending on its type. (An illustrative sketch of such a metric appears after this list.)
    Journal of Systems and Software, 01/2012. Impact factor: 1.14.
  • ABSTRACT: To avoid use errors when handling medical equipment, it is important to develop products with a high degree of usability. This can be achieved by performing usability evaluations during product development to detect and mitigate potential usability problems. A commonly used method is the cognitive walkthrough (CW), but it has three weaknesses: a poor high-level perspective, insufficient categorisation of detected usability problems, and difficulty in getting an overview of the analytical results. This paper presents a further development of CW, called enhanced cognitive walkthrough (ECW), that aims to overcome these weaknesses. ECW is a proactive analytical method for analysing potential usability problems. It has been employed to evaluate user interface designs of medical equipment such as home-care ventilators, infusion pumps, dialysis machines, and insulin pumps, and has proved capable of identifying several potential use problems in these designs.
    Advances in Human-Computer Interaction, 10/2013.
  • ABSTRACT: The physical interaction aspects of embedded systems have been neglected in the software engineering field compared with internal software issues. Physical interfaces suffer from interaction complexities that lead to usage difficulty and poor acceptance by end-users, while usability techniques focus on overall usability issues and overlook the in-depth physicality aspects of the interface and interaction. This study proposes a physicality-focused quantitative evaluation method to assist embedded system developers in managing the interaction complexities of their products. Developers' acceptance of the proposed method was assessed in a user study, and the results suggested strong acceptance. The aim of the study is to re-emphasise the natural physical aspects of embedded system interfaces, leading to more intuitive interaction.
    Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, 11/2013.
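
For readers unfamiliar with heuristic-based usability metrics of the kind described in the Sirius abstract above, the following Python sketch illustrates the general idea: each heuristic criterion receives an expert rating and a relevance weight that depends on the type of website, and the usability level is reported as a weighted percentage. The criterion names, weights, and formula are hypothetical illustrations only, not the published Sirius metric.

    # Hypothetical sketch of a heuristic-based usability metric adapted to
    # website type (illustrative only; NOT the published Sirius formula).
    from typing import Dict

    # Assumed relevance weights per website type (invented values).
    RELEVANCE_BY_SITE_TYPE: Dict[str, Dict[str, float]] = {
        "e-commerce": {"navigation": 1.0, "error_handling": 0.8, "aesthetics": 0.5},
        "academic":   {"navigation": 0.8, "error_handling": 0.5, "aesthetics": 0.3},
    }

    def usability_score(site_type: str, scores: Dict[str, float]) -> float:
        """Return the usability level as a weighted percentage of the maximum.

        `scores` maps each heuristic criterion to an expert rating in [0, 10];
        the relevance weights depend on the type of website being evaluated.
        """
        weights = RELEVANCE_BY_SITE_TYPE[site_type]
        achieved = sum(weights[c] * scores[c] for c in weights)
        maximum = sum(weights[c] * 10 for c in weights)
        return 100.0 * achieved / maximum

    if __name__ == "__main__":
        ratings = {"navigation": 7, "error_handling": 9, "aesthetics": 6}
        print(f"Usability level: {usability_score('academic', ratings):.1f}%")

In this sketch, the same expert ratings yield different usability levels for different site types because the relevance weights change; a real framework would also define the heuristic criteria, rating scale, and how non-applicable criteria are handled.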
