Conference Paper

Exploring back-of-device interaction


Abstract

Back-of-device interaction is gaining popularity as an alternative input modality for mobile devices; however, it is still unclear how the back of the device relates to other interactions. My research explores the relationship between hand grip on the back of the device and other interactions. To investigate this relationship, I will use a touch-target application to study hand-grip patterns, and then analyse the correlation between touch targets and hand grip. Finally, I will explore the possibilities that open up once the relationship between touch target and hand grip is established in a quantifiable way.
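A minimal sketch of how such a quantifiable relationship might be tested, assuming each trial records one grip feature (e.g. a capacitive reading from the back of the device) alongside the position of the acquired target; the names and synthetic data are illustrative, not from the paper:

```python
# Hypothetical sketch: quantifying the grip / touch-target relationship.
import numpy as np

def grip_target_correlation(grip_features, target_xs):
    """Pearson correlation between a grip feature and target position."""
    grip = np.asarray(grip_features, dtype=float)
    targets = np.asarray(target_xs, dtype=float)
    return np.corrcoef(grip, targets)[0, 1]

# Synthetic example: the grip feature shifts as targets move right.
rng = np.random.default_rng(0)
targets = rng.uniform(0, 1080, size=200)           # target x in pixels
grip = 0.4 * targets + rng.normal(0, 80, 200)      # correlated grip feature
print(f"r = {grip_target_correlation(grip, targets):.2f}")
```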


... It also offers significant benefits to users by freeing the other hand, which requires physical and psychological attention when participating in mobile activities (Karlson et al., 2006). Physical and attitudinal studies on mobile device users were first conducted on businessmen to provide design implications for touch screen interfaces that allow one-handed interaction (Baudisch and Chu, 2009; Faizuddin et al., 2014; Faizuddin and Noor, 2013; Ng et al., 2014). However, since the generalization of these findings to ordinary users was limited, several approaches for single-handed mobile interaction have been proposed in the fields of computer engineering and ergonomics (Gustafsson et al., 2010; Karlson et al., 2005; Park and Han, 2010). ...
Article
Purpose The purpose of this paper is to examine the effects of interaction techniques (e.g. swiping and tapping) and the range of thumb movement on interactivity, engagement, attitude, and behavioral intention in single-handed interaction with smartphones. Design/methodology/approach A 2×2 between-participant experiment (technological features: swiping and tapping × range of thumb movement: wide and narrow) was conducted to study the effects of interaction techniques and thumb movement ranges. Findings The results showed that the range of thumb movement had significant effects on perceived interactivity, engagement, attitude, and behavioral intention, whereas no effects were observed for interaction techniques. A narrow range of thumb movement had more influence on the interactivity outcomes than a wide range of thumb movement. Practical implications While the subject of actual and perceived interactivity has been discussed, the issue has not previously been applied to smartphones. Based on the research results, the mobile industry may come up with a design strategy that balances feature- and perception-based interactivity. Originality/value This study adopted the perspective of the hybrid definition of interactivity, which includes both actual and perceived interactivity. Interactivity effect outcomes were mediated by perceived interactivity.
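As a sketch of how effects in such a 2×2 between-participant design are typically tested (the paper does not publish its analysis script, so the test choice and all data below are assumptions), a two-way ANOVA over synthetic placeholder ratings:

```python
# Hypothetical two-way ANOVA for a 2x2 between-participant design.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(1)
data = pd.DataFrame({
    "technique": ["swipe", "swipe", "tap", "tap"] * 10,
    "thumb_range": ["wide", "narrow"] * 20,
    # Placeholder interactivity ratings with noise, one row per participant.
    "rating": np.tile([5.1, 6.0, 5.2, 6.2], 10) + rng.normal(0, 0.5, 40),
})

# Main effects of technique and thumb range, plus their interaction.
model = ols("rating ~ C(technique) * C(thumb_range)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))
```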
... • Applying additional hardware [12,33]
• Using the existing rear-facing camera [32,1]
• Using the existing internal sensors [19]
Up until now, all research on possible back-of-device interaction for mobile devices has focused on designer-defined gesture sets; there have been no explorations of user-defined gesture sets. This is critical because research has shown that user-defined gestures are easier to learn and recall [30], easier to perform [15], and more appropriate than designer-defined gestures [15]. ...
Conference Paper
Many studies have highlighted the advantages of expanding the input space of mobile devices by utilizing the back of the device. We extend this work by performing an elicitation study to explore users' mapping of gestures to smartphone commands and identify their criteria for using back-of-device gestures. Using the data collected from our study, we present elicited gestures and highlight common user motivations, both of which inform the design of back-of-device gestures for mobile interaction.
... Reference [13] conducted the performance analysis of the rear touch and found that the index finger performed well on the small-sized display device. Recently, [14], [15] explored the relationship between hand grip from the back of the device and user interactions. With regard to single-handed rear touch panel operation, different ways to hold hand-held devices have been investigated, and five spontaneous ways to hold tablets have been described [16] and analyzed using a kinematic chain model [17]. ...
Article
Full-text available
To improve single-handed operation of mobile devices, a rear touch panel has potential for user interactions. In this paper, a basic study of operational control achieved simply through drag and tap of the index finger on a rear touch panel is conducted. Since a user has to hold the handheld device firmly with the thumb and fingers, the movable range of the tip of the index finger is limited. This restriction requires a user to perform several dragging actions to move a cursor to a distant target. Considering this kinematic restriction, a technique optimized for rear operation is proposed, wherein not only the position but also the velocity of fingertip movement is taken into account. Movement time, the number of dragging operations, and the throughput of the proposed technique were evaluated against the generic technique using Fitts's law. Experiments were conducted with ten participants performing target selection in reciprocal 1D pointing tasks. The combinations of two ways of holding the device (landscape and portrait) and two directions of dragging (horizontal and vertical) were considered. As a result, the proposed technique achieved 5 to 13% shorter movement times, 20 to 40% higher throughput, and no increase in the number of dragging operations even for longer-distance targets. In addition, further analysis showed that there are advantageous combinations of holding orientation and dragging direction, which would be beneficial for better design of single-handed user interactions using rear touch.
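For reference, a minimal sketch of the Fitts's-law measures used in such evaluations, assuming the common Shannon formulation; the helper names and example values are illustrative:

```python
# Fitts's law: index of difficulty and throughput for a pointing condition.
import math

def index_of_difficulty(distance, width):
    """Shannon formulation: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Throughput in bits/s for one reciprocal pointing condition."""
    return index_of_difficulty(distance, width) / movement_time_s

# Example: a 600 px movement to a 40 px target completed in 1.2 s.
print(f"ID = {index_of_difficulty(600, 40):.2f} bits")
print(f"TP = {throughput(600, 40, 1.2):.2f} bits/s")
```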
Conference Paper
We present βTap, a Back-of-device (BoD) tap detection software for mobile devices that uses commodity sensors, without the need to instrument the device. Although just basic interactions are supported (namely, single and double taps), βTap is highly accurate and performance-friendly, since it uses a low-cost yet highly discriminative set of features. Our software is publicly available at the Google Play Store, so that others can build upon our work.
Conference Paper
Back-of-device (BoD) interaction using current smartphone sensors (e.g. accelerometer, microphone, or gyroscope) has recently emerged as a promising novel input modality. Researchers have used a different number of features derived from these commodity sensors, however it is unclear what sensors and which features would allow for practical use, since not all sensor measurements have an equal value for detecting BoD interactions reliably and efficiently. In this paper, we primarily focus on constructing and selecting a subset of features that is a good predictor of BoD tap-based input while ensuring low energy consumption. As a result, we build several classifiers for a variety of use cases (e.g. single or double taps with the dominant or non-dominant hand). We show that a subset of just 5 features provides high discrimination power and results in high recognition accuracy. We also make our software publicly available, so that others can build upon our work.
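A minimal sketch of the feature-selection idea under stated assumptions: rows are sensor-derived feature vectors labelled tap or non-tap, the 5 most discriminative features are kept (matching the subset size the paper reports), and the classifier choice (Gaussian naive Bayes) plus all data are illustrative rather than the authors':

```python
# Hypothetical sketch: keep the k most discriminative features, then classify.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 20))        # 20 candidate sensor features
y = rng.integers(0, 2, size=500)      # 1 = BoD tap, 0 = other motion
X[y == 1, :3] += 1.5                  # make a few features informative

clf = make_pipeline(SelectKBest(f_classif, k=5), GaussianNB())
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```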
Conference Paper
Full-text available
This paper presents BoD Shapes, a novel authentication method for smartphones that uses the back of the device for input. We argue that this increases the resistance to shoulder surfing while remaining reasonably fast and easy-to-use. We performed a user study (n = 24) comparing BoD Shapes to PIN authentication, Android grid unlock, and a front version of our system. Testing a front version allowed us to directly compare performance and security measures between front and back authentication. Our results show that BoD Shapes is significantly more secure than the three other approaches. While performance declined, our results show that BoD Shapes can be very fast (up to 1.5 seconds in the user study) and that learning effects have an influence on its performance. This indicates that speed improvements can be expected in long-term use.
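An illustrative sketch, not the authors' implementation, of the underlying idea: encode a shape drawn on the back of the device as a sequence of stroke directions and accept only an exact match with the enrolled secret. The coordinate convention (y up), direction quantization, and enrolled shape are all assumptions:

```python
# Hypothetical BoD shape matching via direction quantization.
import math

ENROLLED = ["up", "right", "down"]    # hypothetical enrolled secret

def strokes_to_directions(points):
    """Quantize consecutive touch points into 4 compass directions (y up)."""
    names = ["right", "up", "left", "down"]
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0)
        dirs.append(names[round(angle / (math.pi / 2)) % 4])
    return dirs

def authenticate(points):
    """Accept only if the drawn shape matches the enrolled sequence."""
    return strokes_to_directions(points) == ENROLLED

print(authenticate([(0, 0), (0, 5), (5, 5), (5, 0)]))  # up, right, down -> True
```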
Conference Paper
Full-text available
This paper presents a novel user interface for handheld mobile devices by recognizing hand grip patterns. Particularly, we consider the scenario where the device is provided with an array of capacitive touch sensors underneath the exterior cover. In order to provide the users with intuitive and natural manipulation experience, we use pattern recognition techniques for identifying the users' hand grips from the touch sensors. Preliminary user studies suggest that filtering out unintended user hand grip is one of the most important issues to be resolved. We discuss the details of the prototype implementation, as well as engineering challenges for practical deployment.
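A sketch of the grip-identification-with-rejection idea under stated assumptions: grips arrive as vectors of capacitive readings, known grips are stored as template vectors, and a reading too far from every template is filtered out as unintended. The templates and threshold are placeholders:

```python
# Hypothetical nearest-template grip classifier with unintended-grip rejection.
import numpy as np

TEMPLATES = {                       # illustrative per-grip mean sensor vectors
    "one_handed_right": np.array([0.9, 0.1, 0.8, 0.2]),
    "two_handed":       np.array([0.7, 0.7, 0.6, 0.6]),
}
REJECT_DISTANCE = 0.5               # tuned threshold, an assumption

def classify_grip(reading):
    """Return the nearest grip template, or None for unintended contact."""
    best = min(TEMPLATES, key=lambda g: np.linalg.norm(reading - TEMPLATES[g]))
    if np.linalg.norm(reading - TEMPLATES[best]) > REJECT_DISTANCE:
        return None                 # filter out unintended hand grip
    return best

print(classify_grip(np.array([0.85, 0.15, 0.75, 0.25])))  # one_handed_right
print(classify_grip(np.array([0.0, 0.0, 0.1, 0.0])))      # None (rejected)
```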
Conference Paper
Full-text available
In this paper, we explore how to add pointing input capabilities to very small screen devices. On first sight, touchscreens seem to allow for particular compactness, because they integrate input and screen into the same physical space. The opposite is true, however, because the user's fingers occlude contents and prevent precision. We argue that the key to touch-enabling very small devices is to use touch on the device backside. In order to study this, we have created a 2.4" prototype device; we simulate screens smaller than that by masking the screen. We present a user study in which participants completed a pointing task successfully across display sizes when using a back-of device interface. The touchscreen-based control condition (enhanced with the shift technique), in contrast, failed for screen diagonals below 1 inch. We present four form factor concepts based on back-of-device interaction and provide design guidelines extracted from a second user study.
Conference Paper
Full-text available
As mobile and tangible devices are getting smaller and smaller, it is desirable to extend the interaction area to their whole surface area. The HandSense prototype employs capacitive sensors for detecting when it is touched or held against a body part. HandSense is also able to detect in which hand the device is held, and how. The general properties of our approach were confirmed by a user study. HandSense was able to correctly classify over 80 percent of all touches, discriminating six different ways of touching the device (hold left/right, pick up left/right, pick up at top/bottom). This information can be used to implement or enhance implicit and explicit interaction with mobile phones and other tangible user interfaces. For example, graphical user interfaces can be adjusted to the user's handedness.
Conference Paper
Full-text available
Touch is a compelling input modality for interactive devices; however, touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with. In this paper, we present LucidTouch, a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device. The key to making this usable is what we call pseudo-transparency: by overlaying an image of the user's hands onto the screen, we create the illusion of the mobile device itself being semi-transparent. This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand. LucidTouch also supports multi-touch input, allowing users to operate the device simultaneously with all 10 fingers. We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front, due to reduced occlusion, higher precision, and the ability to make multi-finger input.
Conference Paper
Automatic screen rotation improves the viewing experience and usability of mobile devices, but current gravity-based approaches do not support postures such as lying on one side, and manual rotation switches require explicit user input. iRotate Grasp automatically rotates the screens of mobile devices to match users' viewing orientations based on how users are grasping the devices. Our insight is that users' grasps are consistent for each orientation, but differ significantly between orientations. Our prototype embeds a total of 32 light sensors along the four sides and the back of an iPod Touch, and uses a support vector machine (SVM) to recognize grasps at 25 Hz. We collected 6 users' usage data under 54 different conditions: 1) grasping the device using the left, right, and both hands, 2) scrolling, zooming, and typing, 3) in portrait, landscape-left, and landscape-right orientations, and 4) while sitting and lying down on one side. Results show that our grasp-based approach is promising, and our iRotate Grasp prototype could correctly rotate the screen 90.5% of the time when training and testing on different users.
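A minimal sketch of the grasp-to-orientation pipeline, assuming each sample is one frame of the 32 light-sensor readings described above; the synthetic data, labels, and SVM hyperparameters are placeholders, not the authors' trained model:

```python
# Hypothetical SVM mapping 32 sensor readings to a screen orientation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)
ORIENTATIONS = ["portrait", "landscape_left", "landscape_right"]

# Synthetic training set: 300 grasp samples of 32 sensor values each.
X = rng.random((300, 32))
y = rng.integers(0, 3, size=300)
X[np.arange(300), y] += 2.0          # make grasps separable per orientation

clf = SVC(kernel="rbf").fit(X, [ORIENTATIONS[i] for i in y])

def rotate_screen(sensor_frame):
    """Classify one 32-value sensor frame; caller polls this at ~25 Hz."""
    return clf.predict(sensor_frame.reshape(1, -1))[0]

print(rotate_screen(X[0]))
```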
Conference Paper
Multitouch tablets, such as iPad and Android tablets, support virtual keyboards for text entry. Our 64-user study shows that 98% of the users preferred different keyboard layouts and positions depending on how they were holding these devices. However, current tablets either do not allow keyboard adjustment or require users to manually adjust the keyboards. We present iGrasp, which automatically adapts the layout and position of virtual keyboards based on how and where users are grasping the devices without requiring explicit user input. Our prototype uses 46 capacitive sensors positioned along the sides of an iPad to sense users' grasps, and supports two types of grasp-based automatic adaptation: layout switching and continuous positioning. Our two 18-user studies show that participants were able to begin typing 42% earlier using iGrasp's adaptive keyboard compared to the manually adjustable keyboard.
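An illustrative sketch of the layout-switching half of this idea, assuming a grasp classifier like the ones sketched above; the layout table, grasp names, and positioning scheme are hypothetical:

```python
# Hypothetical grasp-to-keyboard adaptation table.
LAYOUTS = {
    "left_hand":  {"layout": "split", "anchor_x": 0.25},
    "right_hand": {"layout": "split", "anchor_x": 0.75},
    "two_hands":  {"layout": "full",  "anchor_x": 0.50},
}

def keyboard_for_grasp(grasp):
    """Pick layout and horizontal position before the first keystroke."""
    return LAYOUTS.get(grasp, LAYOUTS["two_hands"])  # sane default

print(keyboard_for_grasp("left_hand"))
```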
Conference Paper
A novel and intuitive way of accessing applications on mobile devices is presented. The key idea is to use the grip pattern, which is naturally produced when a user tries to use the mobile device, as a clue to determine the application to be launched. To this end, a capacitive touch sensor system is carefully designed and installed underneath the housing of the mobile device to capture the information of the user's grip pattern. The captured data is then recognized by a minimum distance classifier and a naive Bayes classifier. A recognition test is performed to validate the feasibility of the proposed user interface system.
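A sketch of the minimum distance classifier named in the abstract, which amounts to a nearest-centroid rule: each application is represented by the centroid of its training grip patterns, and a new grip launches the application with the closest centroid. The data, channel count, and app names here are synthetic:

```python
# Nearest-centroid (minimum distance) grip classifier on synthetic data.
import numpy as np
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(3)
apps = ["camera", "phone", "browser"]
# 50 grip samples per app, 16 capacitive channels per sample.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(50, 16)) for i in range(3)])
y = np.repeat(apps, 50)

clf = NearestCentroid().fit(X, y)
print(clf.predict(rng.normal(loc=1, scale=0.3, size=(1, 16))))  # ['phone']
```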
Conference Paper
We propose a vision of a grasp-based interaction system where users' intentions are inferred by the way they hold and interact with a device. In this paper we specifically discuss the Bar of Soap, a multi-function handheld prototype that uses grasp-based interactions to switch between modes. This prototype relies on the hypothesis that users share a set of stereotyped grasps associated with common multi-function modes. We show that using common machine learning techniques our device can reliably distinguish five separate classes based on the users' grasps. While this interaction is currently implemented in a multi-function handheld, we anticipate the existence of many scenarios where grasp recognition could provide a more intuitive or useful interface.
Conference Paper
We present a new mobile interaction model, called double-side multi-touch, based on a mobile device that receives simultaneous multi-touch input from both the front and the back of the device. This new double-sided multi-touch mobile interaction model enables intuitive finger gestures for manipulating 3D objects and user interfaces on a 2D screen.
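An illustrative sketch (an assumption, not the authors' code) of one such double-sided gesture: opposite vertical motion of front and back touch points is mapped to rotation of a 3D object about the screen's horizontal axis:

```python
# Hypothetical double-sided "pinch twist" mapping to a 3D rotation.
def rotation_from_touches(front_dy, back_dy, gain=0.5):
    """Map opposite vertical motion on the two faces to an x-axis rotation.

    front_dy, back_dy: per-frame finger displacement in pixels on each face.
    Returns a rotation delta in degrees; gain is an arbitrary tuning factor.
    """
    return gain * (front_dy - back_dy)

# Front finger moves up 10 px while the back finger moves down 10 px:
print(rotation_from_touches(-10, +10))  # -10.0 degrees about the x axis
```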
Double-side multi-touch input for mobile devices, CHI '09 Extended Abstracts on Human Factors in Computing Systems
  • Erh-Li (Early) Shen
  • Sung-Sheng (Daniel) Tsai
  • Hao-Hua Chu
  • Yung-Jen (Jane) Hsu
  • Chi-Wen (Euro) Hsu
The bar of soap: a grasp recognition system implemented in a multi-functional handheld device, CHI '08 Extended Abstracts on Human Factors in Computing Systems
  • Brandon T. Taylor