Conference Paper

Studying applications for touch-enabled mobile phone keypads

DOI: 10.1145/1347390.1347396 Conference: Proceedings of the 2nd International Conference on Tangible and Embedded Interaction 2008, Bonn, Germany, February 18-20, 2008
Source: DBLP


We present a platform for evaluating mobile phone applications that make use of an additional dimension for key presses. Capacitive sensors on each key allow the device to detect both a mere touch of a button and the force of a press. A set of applications well known from current mobile phones has been extended with functionality that exploits these new possibilities. We present the results of a study conducted with this prototype and draw conclusions for the design and implementation of such applications.
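The two-dimensional key input described above can be sketched as a simple classifier over two sensor readings. This is an illustrative assumption, not the paper's implementation: the function name, units, and threshold values are hypothetical.

```python
# Hypothetical sketch of the key-press model: a capacitive reading separates
# "touched" from "idle", and a force reading separates a light touch from an
# actual press. Thresholds and names are illustrative, not from the paper.

def key_state(capacitance: float, force: float,
              touch_threshold: float = 0.5,
              press_threshold: float = 2.0) -> str:
    """Classify a key as 'idle', 'touched', or 'pressed'."""
    if capacitance < touch_threshold:
        return "idle"
    if force < press_threshold:
        return "touched"   # finger resting on the key, e.g. to preview content
    return "pressed"       # full press triggers the normal key action

print(key_state(0.1, 0.0))  # idle
print(key_state(0.8, 0.5))  # touched
print(key_state(0.9, 3.0))  # pressed
```

Applications could react differently to the "touched" state (for instance, showing a preview) while keeping the "pressed" state for the conventional key action.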



  • Source
    • "We used this work as inspiration for our own, but simplified ours to use a standard QWERTY layout and only two levels of pressure. Holleis et al. [4] added touch sensors to a mobile phone pad to allow users to preview content by touching a key (in effect giving an extra pressure level). Their qualitative study showed people generally liked the touch feature. "
    ABSTRACT: This paper describes the design and evaluation of a touchscreen-based pressure keyboard to investigate the possibilities of pressure as a new method of input for mobile devices. A soft press on the touchscreen generated a lowercase letter, a hard press an uppercase one. The aim was to improve input performance when entering mixed-case text, or shifted characters often used for emoticons etc. An experiment compared two different forms of pressure input (Dwell and Quick Release) against a standard shift-key keyboard, with users sitting and walking. Results showed that Quick Release was the fastest for input of mixed-case text, with Dwell being the most accurate, even when users were mobile. The results demonstrate that pressure input can outperform a standard shift-key keyboard design for mobile text entry.
    Proceedings of the 11th Conference on Human-Computer Interaction with Mobile Devices and Services, Mobile HCI 2009, Bonn, Germany, September 15-18, 2009; 01/2009
  • Source
    ABSTRACT: Besides key presses and text input, modern mobile devices support advanced interactions such as taking pictures, gesturing, and reading NFC tags, as well as physiological and environmental sensors. Implementing applications that benefit from this variety of interactions is still difficult: support for developers and interaction designers remains basic, and tools and frameworks are rare. This paper presents a prototyping environment that allows fully functional, high-fidelity prototypes deployable on the actual devices to be created quickly and easily. With this work, we target the gap between paper prototyping and integrated development environments. Since new interaction techniques can be significantly faster or slower to use than conventional mobile user interfaces, it is essential to assess the impact of interface design decisions on interaction time. The presented tool therefore supports implicit and explicit user performance evaluations during all phases of prototyping. This approach builds on the original Keystroke-Level Model (KLM), as well as extensions of it, which allow interaction times to be estimated in early phases of development with a simulated prototype. An underlying state-graph structure enables automatic checks of the application logic. This tool helps user interface designers and developers create efficient and consistent novel applications.
    Pervasive Computing, 6th International Conference, Pervasive 2008, Sydney, Australia, May 19-22, 2008, Proceedings; 01/2008
  • Source
    ABSTRACT: In the context of tangibility, mobile phones are rapidly becoming sensor-rich handheld computers with the potential to take better advantage of our physical capabilities and our lifetime of experiences interacting both in and with the world around us. In this paper, we analyse four different ways in which mobiles can be used to represent and control digital information, showing that each resulting interaction style is characterized by a unique coordination of the user's attention and two hands in relation to the mobile device. We present our analysis in terms of a framework that can be used to critically examine future schemes of bimanual interaction with mobile phones.
    Proceedings of the 3rd International Conference on Tangible and Embedded Interaction 2009, Cambridge, UK, February 16-18, 2009; 01/2009
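The two-level pressure keyboard described in the first cited abstract above (soft press for lowercase, hard press for uppercase) can be sketched as a threshold on peak press force. The threshold value and function name here are illustrative assumptions, not taken from that paper.

```python
# Minimal sketch of a two-level pressure keyboard: a soft press yields a
# lowercase letter, a hard press an uppercase one. The threshold value is a
# hypothetical assumption in arbitrary force units.

HARD_PRESS_THRESHOLD = 3.0

def char_for_press(key: str, peak_pressure: float) -> str:
    """Return the character produced by pressing `key` with `peak_pressure`."""
    return key.upper() if peak_pressure >= HARD_PRESS_THRESHOLD else key.lower()

# Typing "Hi" without a shift key: one hard press, then one soft press.
text = char_for_press("h", 4.2) + char_for_press("i", 1.1)
print(text)  # Hi
```

In the cited study, the two input variants (Dwell and Quick Release) differ in when the pressure decision is committed, not in this basic mapping from pressure level to case.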