Conference Paper

Investigating How Smartphone Movement is Affected by Body Posture


Abstract

We present an investigation into how hand usage is affected by different body postures (Sitting at a table, Lying down and Standing) when interacting with smartphones. We theorize a list of factors (smartphone support, body support and muscle usage) and explore their influence on the tilt and rotation of the smartphone. From this we draw a list of hypotheses that we investigate in a quantitative study. We varied the body postures and grips (Symmetric bimanual, Asymmetric bimanual finger, Asymmetric bimanual thumb and Single-handed), studying the effects through a dual pointing task. Our results showed that the body posture Lying down had the most movement, followed by Sitting at a table and finally Standing. We additionally generate reports of motions performed using different grips. Our work extends previous research conducted with multiple grips in a sitting position by including other body postures. We anticipate that UI designers will use our results to inform the development of mobile user interfaces.
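Tilt and rotation measures of this kind are in practice derived from the phone's built-in motion sensors. As a minimal illustrative sketch (not the authors' actual pipeline), assuming a standard phone axis convention and a quasi-static device so that gravity dominates the accelerometer signal, pitch and roll can be estimated per sample:

```python
import math

def tilt_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate device pitch and roll (degrees) from one accelerometer
    sample, assuming the phone is quasi-static so gravity dominates."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

Summing the absolute changes in such angles over a trial yields the kind of per-posture movement measure compared in this study.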


... Also, thanks to the advancements made in hardware technologies and their continually shrinking size following Moore's law [86], mobile device form factors are becoming smaller. They allow diverse interactions using both hands [17,39,91] or a single-handed grip [23,39]. Consequently, this makes them sufficiently portable to be used in different contexts and situations (e.g., checking emails while walking, or even using GPS while driving). ...
... However, single-handed touch interaction has been widely documented and reviewed in the literature. It is a commonly observed hand grip in situations such as standing [39], walking [17,91], or holding items and being encumbered [23], as well as in situations where users are distracted [88] or where physical and visual attention is occupied by the other hand [20]. ...
Thesis
Full-text available
The growth of mobile applications used in all aspects of life, and the ease of interaction they provide through tactile input, has allowed users to interact with touchscreens in many different ways and in different contexts, including situations where the interaction complements or competes with a primary task that either needs full attention or is cognitively demanding. Meanwhile, in the last few decades, touch interaction has been enhanced with tactile feedback that provides stimulation when touching the surface. In particular, we are interested here in ultrasonic technologies that let users perceive different textures when moving a finger on the surface. In this context, we seek to understand how to use tactile textures to successfully complete secondary tasks without disturbing the primary one. First, we determined the effectiveness of tactile textures in a real-world scenario. We conducted three studies. From the first study, we found that users are perfectly able to use and recognize tactile textures when performing a physical activity such as walking or taking public transport. In the next studies, we considered attention-saturating and cognitively demanding primary tasks to challenge the secondary one. Our findings also showed that users were perfectly able to recognize textures in such contexts. Through an in-depth analysis, we reveal new findings about how people perceive tactile textures when performing another primary task. An important result of our work was that, in some situations, users were able to perceive tactile textures eyes-free while performing another primary task. At the same time, as tactile feedback is only felt when the user's finger moves over the surface following rectilinear gesture trajectories, we ran a fourth study seeking to better understand the characteristics of stroke gestures and how they are produced eyes-free. Our results indicate that gestures made eyes-free were geometrically different from gestures produced in the presence or absence of visual feedback, with a lower, though still quite good, recognition rate (95.53%). Finally, we introduce a new interaction technique that uses tactile textures to deliver information eyes-free, relying on the sense of touch to provide information to the fingertips depending on the user's finger speed and the perceived texture. When evaluating this technique, our findings indicated that users can learn and become accustomed to the dynamics of eyes-free tactile channel selection, and can reliably discriminate between different tactile patterns during multi-channel selection with an accuracy of up to 90% when using two finger-speed levels.
... The children bent their backs while using smartphones in the range of 8.50-8.54 degrees and reclined backward by 12-24 degrees while viewing the screen [9]. Both of these ranges are dangerous with respect to back and neck injuries. ...
... Hansraj KK. found that flexing the neck by 60 degrees while using the phone is the same as applying a 60-pound weight to the spine [10]. Although several studies have focused on the use of smartphones in sitting positions for adolescents and adults [5,7,8,11], a dearth of factual investigation exists on the use of smartphones among school-aged children, especially in lying down positions [12]. Lying down posture had the maximum physical movement while using a smartphone, followed by the sitting posture, which created a lot of movement, intensifying the ergonomic risk [12]. ...
... Although several studies have focused on the use of smartphones in sitting positions for adolescents and adults [5,7,8,11], a dearth of factual investigation exists on the use of smartphones among school-aged children, especially in lying down positions [12]. Lying down posture had the maximum physical movement while using a smartphone, followed by the sitting posture, which created a lot of movement, intensifying the ergonomic risk [12]. A lack of information exists concerning risk factors for the development of musculoskeletal pain among primary school students. ...
Article
Full-text available
School-age children increasingly use smartphones to conduct their learning activities, and reports of disorders related to smartphone use are increasing, including visual-related symptoms, stress, and musculoskeletal pain. This study aimed to examine risk factors for musculoskeletal pain among primary school students using smartphones. A cross-sectional study was conducted with 233 school-aged children in Nakhon Si Thammarat, Thailand. Data were collected with the Nordic Musculoskeletal Questionnaire for musculoskeletal symptoms, together with ISO 11226:2000. Through Chi-square, t-test, and logistic regression analysis, factors independently associated with musculoskeletal pain were determined. An important factor in the development of musculoskeletal pain was prolonged smartphone use for longer than 60 min, particularly among children aged 6–9 years old. Almost 53% of the students used their smartphones while lying down, and using a smartphone in a prone position was 7.37 times more dangerous than sitting. The lying position tilts numerous parts of the body at varying angles, especially the upper arm. The risk of musculoskeletal complaints must be reduced by educating parents, children, and the relevant government organizations about safe smartphone usage. The mentioned factors may be used to anticipate the onset of musculoskeletal pain caused by smartphone use in young children.
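For readers unfamiliar with how a figure such as the 7.37 odds ratio is produced, the following is a minimal, hypothetical sketch of a logistic-regression workflow; the file and column names are invented for illustration and are not the study's data:

```python
# Fit a logistic regression on survey data and exponentiate the
# coefficients to obtain odds ratios (hypothetical column names).
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey.csv")  # hypothetical survey file
predictors = df[["prone_posture", "use_over_60_min", "age_6_to_9"]]
X = sm.add_constant(predictors)
model = sm.Logit(df["musculoskeletal_pain"], X).fit()

odds_ratios = np.exp(model.params)    # e.g. prone_posture -> ~7.4
conf_int = np.exp(model.conf_int())   # 95% CIs on the odds-ratio scale
```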
... Indeed, BoD fingers perform supportive micro-movements while the thumb moves on the front side. These are necessary to maintain a stable grip [14,15] and to increase the thumb's range [9,14], and arise from the limited independence of finger movements [18]. ...
... An important basis for minimizing unintended inputs is an understanding of the supportive micro-movements which occur while holding and interacting with the device. Previous work analyzed supportive micro-movements by quantifying the amount of grip shifts on different smartphones through video observations [13,14] and built-in IMUs [9,14,15]. However, we have no understanding of how fingers move on the rear and which unintended inputs they would generate there. ...
... Previous work found that users tilt the device towards their thumb to reach more distant targets (e.g., at the top left corner) and away from their thumb to reach targets at the bottom right corner [9,14]. Eardley et al. [13-15] referred to all movements which increase reachability as "grip shifts" and explored them for different device sizes and tasks. Based on video recordings with manually identified key points and accelerometer values, they quantified the number of grip shifts during common smartphone tasks. ...
Conference Paper
Full-text available
Additional input controls such as fingerprint scanners, physical buttons, and Back-of-Device (BoD) touch panels improve the input capabilities of smartphones. While previous work showed the benefits of input beyond the touchscreen, unfavorably designed input controls force detrimental grip changes and increase the likelihood of unintended inputs. Researchers investigated all fingers' comfortable areas to avoid grip changes. However, there is no understanding of unintended BoD inputs, which frustrate users and lead to embarrassing mistakes. In this paper, we study the BoD areas in which unintended inputs occur during interaction with the touchscreen. Participants performed common tasks on four smartphones which they held in the prevalent single-handed grip while sitting and walking. We recorded finger movements with a motion capture system and analyzed the unintended inputs. We identified comfortable areas on the back in which no unintended inputs occur and found that the fewest unintended inputs occurred on 5" devices. We derive three design implications for BoD input to help designers consider reachability and unintended inputs.
... Most critically, re-grasping destabilizes the device grip and causes increased device motion. This makes users feel insecure in holding their device [13,15] and can lead to accidental drops, breaking the screen or other components. This out-of-reach area grows with screen size. ...
... Dependent Variables were trial completion Time [ms], users' Success [0,1], i.e., whether they selected the correct target or not, and the Gesture Footprint caused by the touches on the screen, to capture how far users had to move their thumb. To quantify device motion and grip stability as in [13,15], Rotation captured device rotation [°] around the x-, y-, and z-axis at 60 Hz. After each Technique, users were asked how much they agreed (i) that they had to regularly change their device grip before acquiring a target, (ii) that they maintained a stable grip while selecting a target, and (iii) that the Technique was easy to apply, on a 7-point Likert scale (7 = totally agree). ...
... For the dichotomous Success data, we ran McNemar and Cochran's Q tests. For the analysis of Rotation data we followed [13,15] by summing up the absolute angles of device motion change around each axis and ran repeated-measures ANOVAs on the log-transformed data. Likert scale data was compared using Friedman tests. ...
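For concreteness, a minimal sketch of that Rotation metric, assuming the orientation angles arrive as an (n_samples, 3) array per trial:

```python
import numpy as np

def total_rotation(angles: np.ndarray) -> float:
    """Sum of absolute device-rotation changes around the x-, y- and
    z-axes over one trial; `angles` is (n_samples, 3) in degrees,
    logged at 60 Hz."""
    return float(np.abs(np.diff(angles, axis=0)).sum())

# Per-trial totals are then log-transformed before the repeated-measures
# ANOVA; np.log1p is one reasonable choice (an assumption -- the papers
# only say "log-transformed") since it keeps motionless trials finite.
```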
Conference Paper
Smartphones are used predominantly one-handed, using the thumb for input. Many smartphones, however, have grown beyond 5". Users cannot tap everywhere on these screens without destabilizing their grip. ForceRay (FR) lets users aim at an out-of-reach target by applying a force touch at a comfortable thumb location, casting a virtual ray towards the target. Varying pressure moves a cursor along the ray. When reaching the target, quickly lifting the thumb selects it. In a first study, FR was 195 ms slower and had a 3% higher selection error than the best existing technique, BezelCursor (BC), but FR caused significantly less device movement than all other techniques, letting users maintain a steady grip and removing their concerns about device drops. A second study showed that an hour of training speeds up both BC and FR, and that both are equally fast for targets at the screen border.
... Each had the same structure but investigated different factors: the first asked participants to generate UIs for different body postures given a specific grip, while in the second we fixed the body postures and let participants generate UIs for various hand grips. In both workshops, participants were given empirical data on hand grip, body posture and the corresponding tilting range of the smartphone, as extracted from [9,10]. ...
... However, other research looked into hand movements when using a single-handed grip, tilting the device to allow the thumb to reach more items [22,20,4]. This research established a precedent of using smartphone sensors to further the interaction experience within the physical realm. Research by Eardley et al. [9,10] was the first to provide a detailed table of empirically derived metrics for the factors (hand grips, body posture, phone form factors, target position) that affect movement. ...
... [4] investigated how smartphone movement varied across physical positions during different activities (sitting, standing, walking and sitting on a moving bus). Finally, [10] looked systematically at hand-smartphone interaction and compared body postures with multiple grips. ...
Conference Paper
In this paper we explore how screen-based smartphone interaction can be enriched when designers focus on the physical interaction issues surrounding the device. These consist of the hand grips used (Symmetric bimanual, Asymmetric bimanual with thumb, Single handed, Asymmetric bimanual with finger), body postures (Sitting at a table, Standing, Lying down) and the tilting of the smartphone itself. These physical interactions are well described in the literature, and several research papers provide empirical metrics describing them. In this paper, we go one step further by using this data to generate new screen-based interactions. We achieved this by conducting two workshops to investigate how smartphone interaction design can be informed by the physicality of smartphone interaction. By analysing the outcomes, we provide 14 new screen interaction examples with additional insights comparing outcomes for various body postures and grips.
... With desktop computers, the usage is consistent: one is facing the monitor and typically using mouse and keyboard. In contrast, mobile devices can be used in a variety of ways: either hand held, placed on a table, or even body-worn as with smartwatches; while sitting, standing, or lying down; and with one or two hands [8,30,31]. In consequence, the viewing angle, orientation, and distance to the display can differ, which likely affects readability of the visualized content. ...
... The way in which we hold a device also determines which parts of the interface or visualization are easy to reach. For example, when holding and using the device with just one hand, content in the opposite display corner of the hand is typically harder to access [31]. In general, the usage type is closely coupled to device factors such as size, weight, and interaction modalities [32,102]. ...
... As the majority of fingers are placed on the back when holding the device in common grips, we can assume that they are not used for input. Thus, reliably differentiating between combinations of input fingers (predominantly left/right thumbs and index fingers [15,16,40,43,55]) already extends the input vocabulary with useful features such as shortcuts and secondary actions. While finger identification is not a new challenge, there is no data set available which includes capacitive images of touches by each finger on a capacitive touchscreen. ...
... This extends the touch input vocabulary as the second thumb can be used for secondary actions, similar to the right mouse button. Moreover, previous work showed that using both thumbs is already a common posture for most users [15,16,43,55]. In addition to an offline validation, we demonstrate the usefulness of our thumb l/r model, suitable use cases, and the model's accuracy during real use cases on a commodity smartphone in the following. ...
Conference Paper
Full-text available
Touchscreens enable intuitive mobile interaction. However, touch input is limited to 2D touch locations, which makes it challenging to provide shortcuts and secondary actions similar to hardware keyboards and mice. Previous work presented a wide range of approaches to provide secondary actions by identifying which finger touched the display. While these approaches are based on external sensors, which are inconvenient, we use capacitive images from mobile touchscreens to investigate the feasibility of finger identification. We collected a dataset of low-resolution fingerprints and trained convolutional neural networks that classify touches from eight combinations of fingers. We focused on combinations that involve the thumb and index finger, as these are mainly used for interaction. As a result, we achieved an accuracy of over 92% for a position-invariant differentiation between left and right thumbs. We evaluated the model and two use cases that users find useful and intuitive. We publicly share our dataset (CapFingerId), comprising 455,709 capacitive images of touches from each finger on a representative mutual-capacitive touchscreen, and our models to enable future work using and improving them.
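As a sketch of the kind of model the abstract describes, a small convolutional network over low-resolution capacitive images might look as follows; the 28x16 input size, layer sizes, and class count are assumptions for illustration, not the paper's architecture:

```python
import torch
import torch.nn as nn

class FingerNet(nn.Module):
    """Classify a 1x28x16 capacitive touch image into one of
    n_classes finger combinations (shapes are assumptions)."""
    def __init__(self, n_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 28x16 -> 14x8
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 14x8 -> 7x4
        )
        self.classifier = nn.Linear(32 * 7 * 4, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

logits = FingerNet()(torch.zeros(1, 1, 28, 16))  # smoke test
```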
... Additionally, the study by Eardley et al. (2018) found that lying down and sitting produced the most movement, respectively. With symptoms in the upper arm (10%) and lower arm (5%), this raised the ergonomic risk. ...
Article
The aim of this paper is to analyse trends by reviewing the growing body of research on smartphone overuse in humans. This literature review focuses exclusively on original research articles. It begins by explaining key concepts, such as the levels and symptoms of excessive smartphone usage. The term "Nomophobia", often known as "no-mobile-phone phobia", is frequently linked to poor mobile technology use: it is the anxiety of not being able to use, or be reachable via, one's cell phone. Additionally, the review highlights the fundamental challenges and methodological issues discussed in the existing studies. The paper explores the relationship between student smartphone overuse and academic performance. While some researchers have identified gender differences in smartphone usage, others have found minimal correlation between gender and smartphone use. Several studies also link smartphone usage patterns to musculoskeletal problems. Moreover, research indicates that mobile phone use while driving is a major contributor to road accidents today. It is also crucial to understand the impact of the negative side of mobile phone usage on mental health, academic performance, and social interactions. Research on this topic helps in developing effective interventions and policies to promote healthier usage habits. Therefore, raising awareness about smartphone overuse and its consequences is crucial. Parents should actively monitor their children's schedules to help prevent them from developing a dependency on mobile devices.
... We calibrated the duration of this sequence based on prior work on kinesthetic feedback (five minutes for the red hand game played with Pose-IO [43]) and the constraint to keep the total duration of the experiment reasonable for participants (sixty minutes in total, as in Muscle-IO [22]). The sequence was replayed under different conditions: the hand resting on a table (Figure 4a), standing up with the hand alongside the body (4b), holding the smartphone (4c), holding a large object with both hands (4d), resting the head in the hand wearing the device (4e), and shaking hands with the experimenter (4f), representative of a variety of everyday interactions involving the hand [23,42,45,53,65,84]. The order of these conditions was randomized per participant. ...
... They are not only related to usability but also health and performance [6,21,46,60]. Several studies have observed how interfaces as well as devices affect users' posture [16,23,65] and investigated adaptive interfaces for better comfort and performance [35,63,67]. ...
... Negulescu and McGrenere [11] identified that, in different postures, users adjust their grip at runtime in such a way that they can point at the location accurately. Eardley et al., in their recent works [2-4], focused on understanding the influence of body posture and device size on mobile phone interactions. However, they did not specifically examine the parameters responsible for one-handed thumb-based interaction. ...
... This means action items (in this case the annotation interface) should be within the functional area of the thumb. Since distinctive body postures use distinctive sets of muscles [3], we ensured that, at least for standing interaction, asymmetric bimanual input with the thumb is comfortable [29]. ...
Conference Paper
Full-text available
Collecting accurate and precise emotion ground truth labels for mobile video watching is essential for ensuring meaningful predictions. However, video-based emotion annotation techniques either rely on post-stimulus discrete self-reports, or allow real-time, continuous emotion annotations (RCEA) only for desktop settings. Following a user-centric approach, we designed an RCEA technique for mobile video watching, and validated its usability and reliability in a controlled, indoor (N=12) and later outdoor (N=20) study. Drawing on physiological measures, interaction logs, and subjective workload reports, we show that (1) RCEA is perceived to be usable for annotating emotions while watching mobile videos, without increasing users' mental workload, and (2) the resulting time-variant annotations are comparable with the intended emotion attributes of the video stimuli (classification error for valence: 8.3%; arousal: 25%). We contribute a validated annotation technique and an associated annotation fusion method that is suitable for collecting fine-grained emotion annotations while users watch mobile videos.
... The neck pain and disability scores were related to the DCDS, while these parameters were not associated with the DSDS on the smartphone. It was concluded that the DCDS might be more indicative of neck disability than the DSDS because, while lying down, individuals may assume different body postures that do not involve static neck flexion, and because the daily total screen time involved both static and non-static postural conditions [44]. ...
Article
Background: The time spent on mobile phones during the daytime is increasing with the rapid lifestyle of the young population, who use them for different purposes such as texting and calling, and who are more dependent on networking through them. Objective: The aim of the present study was to compare cervical-postural characteristics between groups with regard to their daily calling duration on smartphones, and to determine the relationship between daily calling duration and potential cervical pain and disability. Methods: Sixty-three university students were included in the study. Participants were divided according to their daily calling durations on smartphones. The joint-repositioning-error sense, craniovertebral angle, cervicothoracic muscle strength, and endurance of the neck flexors were measured, and potential pain and disability levels were assessed. Results: University students who spent twenty minutes or more daily calling on a smartphone had a significantly higher joint-repositioning-error sense. Additionally, a fair relationship was found between daily calling time on the smartphone and potential neck pain and disability. Conclusions: Prolonged calling duration on smartphones could affect the cervical joint-repositioning-error sense in university students. This might be related to potential discomfort in the cervical region over the longer term.
... For instance, the Skin-On Smartphone supports back-of-device interaction [47], which lets users interact with the device without occluding the screen. It can also sense how users are grasping the device to enable additional applications or context detection [18,19], as the skin covers both the back and the side of the smartphone. For instance, Figure 13-c shows an adaptive Pie menu whose location depends on the handedness of the phone grasp. ...
Conference Paper
Full-text available
We propose a paradigm called Skin-On interfaces, in which interactive devices have their own (artificial) skin, thus enabling new forms of input gestures for end-users (e.g. twist, scratch). Our work explores the design space of Skin-On interfaces by following a bio-driven approach: (1) from a sensory point of view, we study how to reproduce the look and feel of the human skin through three user studies; (2) from a gestural point of view, we explore how gestures naturally performed on skin can be transposed to Skin-On interfaces; (3) from a technical point of view, we explore and discuss different ways of fabricating interfaces that mimic human skin sensitivity and can recognize the gestures observed in the previous study; (4) we assemble the insights of our three exploratory facets to implement a series of Skin-On interfaces, and we also contribute a toolkit that enables easy reproduction and fabrication.
... The influence of a user's grip of a phone on touch performance is well known. Recent examples are work by Eardley et al. [6,7] as well as by Lehmann and Kipp [18]. However, these papers only compared macro-changes in grip and not smaller grip adjustments such as those we observed. ...
Conference Paper
Touch accuracy is not just dependent on the performance of the touch sensor itself. Instead, aspects like phone grip or occlusion of the screen have been shown to also have an influence on accuracy. Yet, these are all dependent on one underlying factor: the size and proportions of the user's hand. To better understand touch input, we investigate how 11 hand features influence accuracy. We find that thumb length in particular correlates significantly with touch accuracy and accounts for about 12% of touch error variance. Furthermore, we show that measures of some higher level interactions also correlate with hand size.
Article
This paper investigates the relationship between menu design and hand positions in relation to the assessment of end users, with a main focus on usability, user preference, and potential adaptations to different hand positions. Sixteen (N=16) participants first took part in a co-design workshop, in which they proposed menu designs for different hand grips. Based on the design proposals, a selection of menu designs was derived and implemented in a mobile app prototype, on which a menu selection study was conducted to investigate the performance and perceived usability of the menus in one-handed and two-handed interaction. The results include user ratings and performance, which highlight the need for mobile menus to be adapted to different hand positions. Based on that, we derive design recommendations for more adaptive, user-centric and ergonomic mobile menu designs to match the natural interactions of users.
Chapter
Using smartphones while moving is challenging and can be dangerous. Eyes-free input gestures can provide a means to use smartphones without the need for visual attention from users. In this study, we investigated the effect of different moving speeds (standing, walking, or jogging) and different phone locations (phone held freely in the hand, or phone placed inside a shoulder bag) on eyes-free input gestures with a smartphone. Our results from 12 male participants showed that gesture entry duration is not affected by moving speed or phone location; however, other gesture features, such as length, height, width, area, and phone orientation, are mostly affected by moving speed or phone location. Eyes-free gesture features thus vary significantly as the user's environmental factors, such as moving speed or phone location, change, and should be considered by designers. Keywords: Eyes-free gestures; user's moving speed; phone's location; gesture features; phone movement; mobile device
Article
We introduce a novel one-handed input technique for mobile devices that is not based on pointing, but on motion matching -where users select a target by mimicking its unique animation. Our work is motivated by the findings of a survey (N=201) on current mobile use, from which we identify lingering opportunities for one-handed input techniques. We follow by expanding on current motion matching implementations - previously developed in the context of gaze or mid-air input - so these take advantage of the affordances of touch-input devices. We validate the technique by characterizing user performance via a standard selection task (N=24) where we report success rates (>95%), selection times (~1.6 s), input footprint, grip stability, usability, and subjective workload - in both phone and tablet conditions. Finally, we present a design space that illustrates six ways in which motion matching can be embedded into mobile interfaces via a camera prototype application.
Chapter
With digital learning making waves, students and teachers have moved to virtual platforms to impart knowledge and disseminate information. Digital learning overlaps with a wide range of education systems, including traditional formal and informal educational systems. It has the potential to change the entire education system in the COVID-19 pandemic situation around the world, which is why it has become one of the prime research topics. As online learning grows, it is important to explore students' overall experiences in online learning environments. It is often difficult for online students to find the right time to study in their busy schedules; they may study anywhere, anytime: at the dinner table, in front of the TV, or in bed. Doing so causes a whole lot of aches and pains when using computers, mobiles, or any other online study equipment without a good ergonomic setup. Ergonomics is the process of making the work environment more efficient by reducing human fatigue and restlessness and maximizing safety. Sitting statically while doing study or typing tasks on personal computers is frequently found to be awkward. People, especially students, are unaware of the importance of adopting the right posture when working on computers and mobiles. For this reason, this study was conducted to help students identify the risk level of sitting with awkward posture during their online study. This was done using the RULA method, and the scores reveal that some steps are needed for further improvement. Some ergonomic tips are provided to help online students avoid bad postural habits and fatigue while working on phones, tablets, laptops, and desktops.
Article
Full-text available
Recently, there has been a phenomenon wherein smartphones have begun to serve as substitutes for televisions in domestic spaces. However, there have been few studies investigating the relationship between smartphone use and domestic spaces. To obtain a deeper understanding, this study investigates how smartphone use has influenced the behaviors of residents within their domestic spaces. In particular, this study first aims to examine the interrelation between smartphone use and the domestic spaces of single-person student households. To this end, this study conducted online surveys and semi-structured interviews based on non-probability purposive sampling. The research participants were asked about their media technology use and the behavioral characteristics of their single-person households, the differences in their preferences between watching television and using their smartphones, and the relationships between smartphone use and the physical environment of single-person households. The findings of this study suggest that participants often lose interest in the vertical walls of their domestic spaces due to the postures they maintain while using smartphones. Instead of paying attention to vertical walls, smartphone screens have become versatile places wherein people can communicate with others and enjoy leisure activities. Based on these findings, this study proposes three design suggestions.
Article
Previous research demonstrated users' ability to accurately recognize tactile textures on mobile surfaces. However, the experiments were only run in a lab setting, and users' ability to recognize tactile textures in a real-world environment remains unclear. In this paper, we investigate the effects of physically challenging activities on tactile texture recognition. We consider five conditions: (1) seated in an office, (2) standing in an office, (3) seated in a tramway, (4) standing in a tramway and (5) walking in the street. Our findings indicate that when walking, performance deteriorated compared to the remaining conditions. However, despite this deterioration, the recognition rate stayed higher than 82%, suggesting that tactile textures can be effectively recognized and used in different physically challenging activities, including walking.
Conference Paper
In this paper, we investigated how "lying down" body postures affect smartphone user interface (UI) design, extending previous research that studied body postures, handgrips, and the movement of the smartphone. We did this in three steps: (1) an online survey that examined what types of lying down postures participants adopted when operating a smartphone; (2) a breakdown of these lying down postures in terms of body angle (i.e., users facing down, facing up, and on their side) and body support; (3) an experiment examining the effects that these body angles and body supports had on the participants' handgrips. We found that the smartphone moves the most (is the most unstable) in the "facing up (with support)" condition. Additionally, we discovered that the participants' preferred body postures were those that produced the least smartphone motion (more stability).
Thesis
Full-text available
Billions of mobile devices are used worldwide for a significant number of important tasks in our personal and professional lives. Unfortunately, mobile devices are prone to interaction challenges as a result of the changing contexts of use, resulting in the user experiencing a situational impairment. For example, when typing in a vehicle being driven over an uneven road, it is difficult to avoid incorrect key presses. Situational visual impairments (SVIs) are one type of usability and accessibility challenge mobile device users face (e.g., not being able to read and reply to an important email when outside under bright sunlight), which suggests that current mobile industry practices are insufficient for supporting designers when addressing SVIs. However, there is little HCI research that provides a comprehensive understanding of SVIs through qualitative research. Considering that we primarily interact with mobile devices through the screen, it is arguably important to further research this area. Understanding the true context of SVIs will help to identify adequate solutions. To address this, I recruited 174 participants for an online survey and 24 participants across Australia and Scotland for a two-week ecological momentary assessment to establish what factors contribute to SVIs experienced when using a mobile device. My findings revealed that SVIs are a complex phenomenon with several interacting factors. I introduce a mobile device SVI Context Model to conceptualise the problem. I identified that mobile content design was the most practical first step towards addressing SVIs. Following this, I surveyed 43 mobile content designers and ran four follow-on interviews to identify how often SVIs were considered and how I could provide effective support. I found key similarities and differences between accessibility and designing to reduce SVIs. The participants requested guidelines, education, and digital design tools for improved SVI design support. I focused on identifying the necessary features and implementation for an SVI design tool that would support designers, because this would have an immediate and positive influence on addressing SVIs. Next, I surveyed 50 mobile app designers online to understand how mobile app interfaces are designed. I identified a wide variety of tools and practices used, and the participants also raised challenges for designing mobile app interfaces that had implications for users experiencing SVIs. Using my new understanding of SVIs and the challenges mobile designers face, I ran two design workshops. The purpose of the first workshop was to generate ideas for SVI design tools that would fit within a typical designer's workflow. I then created high-fidelity prototypes to elicit more informed feedback in the second workshop. To address the problem of insufficient support for designers, I present a set of recommendations for developing SVI design tools to support designers in creating mobile content that reduces SVIs in different contexts. The recommendations provide guidance on how to incorporate SVI design support into existing design software (e.g., Sketch) and future design software. Design software companies following my recommendations will lead to an improved set of tools for designers to expand mobile content designs to different contexts.
The development and inclusion of these designs within mobile apps (e.g., allowing alternative modes such as for day or night) will provide users with more control in addressing SVIs through enhanced content design.
Conference Paper
Full-text available
Smartphones are currently the most successful mobile devices. Through their touchscreens, they combine input and output in a single interface. A body of work has investigated interaction beyond direct touch. In particular, previous work proposed using the device's rear as an interaction surface and the grip of the hands that hold the device as a means of input. While previous work provides a categorization of grip styles, a detailed understanding of the fingers' preferred positions during different tasks is missing. This understanding is needed to develop ergonomic grasp-based and Back-of-Device interaction techniques. We report on a study to understand users' finger positions during three representative tasks. We highlight the areas that are already covered by the users' hands while using the on-screen keyboard, reading text, and watching a video. Furthermore, we present the position of each of the user's fingers during these tasks. From the results, we derive interaction possibilities from an ergonomic perspective.
Conference Paper
Full-text available
This paper evaluates how the "index finger zone" on the back of a smartphone could compensate for the limitations of the "thumb space." We conducted two experiments to investigate how to reach a distant point comfortably with one hand. First, we gave the participants four typical mobile phone tasks (tapping, texting, calling, and scrolling). In these situations, we measured the position of the index finger and the thumb with a natural hand posture. Consequently, the index finger was primarily positioned on the upper left side. Second, the main experiment was to determine how the touchable area could be extended using the index finger zone on the back side. 1) Randomly selected tiles were touched 84 times with a thumb and 42 times with an index finger. 2) Each tile's preference was evaluated on a 5-point Likert scale. As a result, we found that the "comfort zone" could be expanded by 15% by using the index finger zone.
Conference Paper
Full-text available
Although different types of touch surfaces have gained extensive attention in HCI, this is the first work to directly compare them for two critical factors: performance and ergonomics. Our data come from a pointing task (N=40) carried out on five common touch surface types: public display (large, vertical, standing), tabletop (large, horizontal, seated), laptop (medium, adjustably tilted, seated), tablet (seated, in hand), and smartphone (single- and two-handed input). Ergonomics indices were calculated from biomechanical simulations of motion capture data combined with recordings of external forces. We provide an extensive dataset for researchers and report the first analyses of similarities and differences that are attributable to the different postures and movement ranges.
Article
Full-text available
In this paper, we investigate the effects of encumbrance (carrying typical objects such as shopping bags during interaction) and walking on target acquisition on a touchscreen mobile phone. Users often hold objects and use mobile devices at the same time and we examined the impact encumbrance has on one- and two- handed interactions. Three common input postures were evaluated: two-handed index finger, one-handed preferred thumb and two-handed both thumbs, to assess the effects on performance of carrying a bag in each hand while walking. The results showed a significant decrease in targeting performance when users were encumbered. For example, input accuracy dropped to 48.1% for targeting with the index finger when encumbered, while targeting error using the preferred thumb to input was 4.2mm, an increase of 40% compared to unencumbered input. We also introduce a new method to evaluate the user's preferred walking speed when interacting - PWS&I, and suggest future studies should use this to get a more accurate measure of the user's input performance.
Conference Paper
Full-text available
Text entry on smartphones is far slower and more error-prone than on traditional desktop keyboards, despite sophisticated detection and auto-correct algorithms. To strengthen the empirical and modeling foundation of smartphone text input improvements, we explore touch behavior on soft QWERTY keyboards when used with two thumbs, an index finger, and one thumb. We collected text entry data from 32 participants in a lab study and describe touch accuracy and precision for different keys. We found that distinct patterns exist for input among the three hand postures, suggesting that keyboards should adapt to different postures. We also discovered that participants' touch precision was relatively high given typical key dimensions, but there were pronounced and consistent touch offsets that can be leveraged by keyboard algorithms to correct errors. We identify patterns in our empirical findings and discuss implications for design and improvements of soft keyboards.
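A hedged sketch of how a keyboard algorithm could exploit such consistent touch offsets: subtract a per-posture offset from each raw touch before key lookup. The posture labels and millimetre values below are illustrative placeholders, not the study's measured offsets:

```python
# Hypothetical per-posture touch offsets (dx, dy) in millimetres.
OFFSETS: dict[str, tuple[float, float]] = {
    "two_thumbs": (0.0, -1.8),
    "index_finger": (0.4, -1.2),
    "one_thumb": (-0.6, -2.1),
}

def correct_touch(x: float, y: float, posture: str) -> tuple[float, float]:
    """Remove the systematic offset for the detected hand posture
    before mapping the touch to a key."""
    dx, dy = OFFSETS.get(posture, (0.0, 0.0))
    return x - dx, y - dy
```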
Conference Paper
Full-text available
In this paper, we investigate how smartphone applications, in particular web browsers, are used on mobile phones. Using a publicly available widget for smartphones, we recorded app usage and the phones' acceleration and orientation from 1,330 devices. Combining app usage and sensor data, we derive the device's typical posture while different apps are used. Analyzing motion data shows that devices are moved more while messaging and navigation apps are used, as opposed to browsers and other common applications. The time distribution between landscape and portrait shows that most of the landscape-mode time is used for burst interaction (e.g., text entry), except for media apps, which are mostly used in landscape mode. Additionally, we found that over 31% of our users use more than one web browser. Our analysis reveals that the duration of mobile browser sessions is longer by a factor of 1.5 when browsers are explicitly started through the system's launcher in comparison to being launched from within another app. Further, users switch back and forth between apps and web browsers, which suggests that a tight and smooth integration of web browsers with native apps could improve the overall usability. From our findings we derive design guidelines for app developers.
Article
Full-text available
The lack of tactile feedback on touch screens makes typing difficult, a challenge exacerbated when situational impairments like walking vibration and divided attention arise in mobile settings. We introduce WalkType, an adaptive text entry system that leverages the mobile device's built-in tri-axis accelerometer to compensate for extraneous movement while walking. WalkType's classification model uses the displacement and acceleration of the device, and inference about the user's footsteps. Additionally, WalkType models finger-touch location and finger distance traveled on the screen, features that increase overall accuracy regardless of movement. The final model was built on typing data collected from 16 participants. In a study comparing WalkType to a control condition, WalkType reduced uncorrected errors by 45.2% and increased typing speed by 12.9% for walking participants.
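As a sketch of the kind of per-tap feature vector a WalkType-style classifier consumes, combining device motion with touch geometry (the field names are assumptions, not the system's actual feature set):

```python
from dataclasses import dataclass

@dataclass
class TapFeatures:
    """Illustrative features around one key press."""
    accel_xyz: tuple[float, float, float]  # tri-axis acceleration at touch-down
    displacement: float                    # device displacement around the tap
    touch_x: float                         # finger-touch location on screen
    touch_y: float
    finger_travel: float                   # distance the finger slid on screen

def to_vector(f: TapFeatures) -> list[float]:
    """Flatten the features for a classifier that picks the intended key."""
    return [*f.accel_xyz, f.displacement, f.touch_x, f.touch_y, f.finger_travel]
```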
Article
Full-text available
The purpose of this research is to develop a simulator which can evaluate the stability and ease of a human's grasp of handheld information appliances, such as digital cameras, without real subjects and physical mockups. In the simulator, we integrate 3D digital hand models with the 3D CAD models of the appliance to realize virtual grasping assessment. The features of the simulator are that 1) geometrically accurate 3D digital hand models with a rich variety of Japanese sizes are used for the assessment, 2) a semi-automatic grasp planning function is installed to efficiently find an appropriate grasp posture for the exterior housing geometries of appliances, 3) the stability of grasping can be quantitatively evaluated based on the "force-closure" and "grasp quality" indices, and 4) the ease of grasping can be qualitatively evaluated based on a "comfort database" constructed from a PCA of measurements of the finger joint angles of real subjects.
Conference Paper
Full-text available
This paper presents a novel user interface for handheld mobile devices by recognizing hand grip patterns. Particularly, we consider the scenario where the device is provided with an array of capacitive touch sensors underneath the exterior cover. In order to provide the users with intuitive and natural manipulation experience, we use pattern recognition techniques for identifying the users' hand grips from the touch sensors. Preliminary user studies suggest that filtering out unintended user hand grip is one of the most important issues to be resolved. We discuss the details of the prototype implementation, as well as engineering challenges for practical deployment.
Conference Paper
Full-text available
As mobile and tangible devices are getting smaller and smaller, it is desirable to extend the interaction area to their whole surface area. The HandSense prototype employs capacitive sensors for detecting when it is touched or held against a body part. HandSense is also able to detect in which hand the device is held, and how. The general properties of our approach were confirmed by a user study. HandSense was able to correctly classify over 80 percent of all touches, discriminating six different ways of touching the device (hold left/right, pick up left/right, pick up at top/bottom). This information can be used to implement or enhance implicit and explicit interaction with mobile phones and other tangible user interfaces. For example, graphical user interfaces can be adjusted to the user's handedness.
Conference Paper
Full-text available
It is generally assumed that touch input cannot be accurate because of the fat finger problem, i.e., the softness of the fingertip combined with the occlusion of the target by the finger. In this paper, we show that this is not the case. We base our argument on a new model of touch inaccuracy. Our model is not based on the fat finger problem, but on the perceived input point model. In its published form, this model states that touch screens report touch location at an offset from the intended target. We generalize this model so that it represents offsets for individual finger postures and users. We thereby switch from the traditional 2D model of touch to a model that considers touch a phenomenon in 3-space. We report a user study in which the generalized model explained 67% of the touch inaccuracy that was previously attributed to the fat finger problem. In the second half of this paper, we present two devices that exploit the new model in order to improve touch accuracy. Both model touch on a per-posture and per-user basis in order to increase accuracy by applying the respective offsets. Our RidgePad prototype extracts posture and user ID from the user's fingerprint during each touch interaction. In a user study, it achieved 1.8 times higher accuracy than a simulated capacitive baseline condition. A prototype based on optical tracking achieved even 3.3 times higher accuracy. The increase in accuracy can be used to make touch interfaces more reliable, to pack up to 3.3^2 > 10 times more controls into the same surface, or to bring touch input to very small mobile devices.
Conference Paper
When people hold their smartphone in landscape orientation, they use their thumbs for input on the frontal touchscreen, while their remaining fingers rest on the back of the device (BoD) to stabilize the grip. We present BackXPress, a new interaction technique that lets users create BoD pressure input with these remaining fingers to augment their interaction with the touchscreen on the front: Users can apply various pressure levels with each of these fingers to enter different temporary "quasi-modes" that are only active as long as that pressure is applied. Both thumbs can then interact with the frontal screen in that mode. We illustrate the practicality of BackXPress with several sample applications, and report our results from three user studies: Study 1 investigated which fingers can be used to exert BoD pressure and found index, middle, and ring finger from both hands to be practical. Study 2 revealed how pressure touches from these six fingers are distributed across the BoD. Study 3 examined user performance for applying BoD pressure (a) during single touches at the front and (b) for 20 seconds while touching multiple consecutive frontal targets. Participants achieved up to 92% pressure accuracy for three separate pressure levels above normal resting pressure, with the middle fingers providing the highest accuracy. BoD pressure did not affect frontal touch accuracy. We conclude with design guidelines for BoD pressure input.
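A minimal sketch of the quasi-mode idea described above: map a back-of-device pressure reading to one of three levels above resting pressure. The thresholds below are invented for illustration; they are not BackXPress's calibrated values:

```python
def quasi_mode(pressure: float, resting: float) -> int:
    """Map a finger's pressure reading to quasi-mode 1-3 above the
    normal resting pressure; 0 means no mode is active. Thresholds
    (arbitrary sensor units) are illustrative only."""
    delta = pressure - resting
    if delta < 0.5:
        return 0
    if delta < 1.5:
        return 1
    if delta < 2.5:
        return 2
    return 3
```

The mode would stay active only while the pressure is held, matching the temporary, self-releasing character of a quasi-mode.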
Conference Paper
In this paper we present an investigation into how hand usage is affected by different mobile phone form factors. Our initial (qualitative) study explored how users interact with various mobile phone types (touchscreen, physical keyboard and stylus). The analysis of the videos revealed that each type of mobile phone affords specific handgrips and that the user shifts these grips, and consequently the tilt and rotation of the phone, depending on the context of interaction. In order to further investigate the tilt and rotation effects, we conducted a controlled quantitative study in which we varied the size of the phone and the type of grip (Symmetric bimanual, Asymmetric bimanual with finger, Asymmetric bimanual with thumb and Single handed) to better understand how they affect tilt and rotation during a dual pointing task. The results showed that the size of the phone does have an effect and that the distance needed to reach action items affects the phone's tilt and rotation. Additionally, we found that the amount of tilt, rotation and reach required corresponded with the participant's grip preference. We finish the paper by discussing design lessons for mobile UI and proposing design guidelines and applications for these insights.
Conference Paper
In this paper we investigate the physical interaction between the hand and three types of mobile device interaction: touchscreen, physical keyboard and stylus. Through a controlled study using video observational analysis, we observed firstly, how the participants gripped the three devices and how these grips were device dependent. Secondly we looked closely at these grips to uncover how participants performed what we call micro-movements to facilitate a greater range of interaction, e.g. reaching across the keyboard. The results extend current knowledge by comparing three handheld device input methods and observing the movements, which the hand makes in five grips. The paper concludes by describing the development of a conceptual design, proposed as a provocation for the opening of dialogue on how we conceive hand usage and how it might be optimized when designed for mobile devices.
Conference Paper
Touchscreens continue to advance including progress towards sensing fingers proximal to the display. We explore this emerging pre-touch modality via a self-capacitance touchscreen that can sense multiple fingers above a mobile device, as well as grip around the screen's edges. This capability opens up many possibilities for mobile interaction. For example, using pre-touch in an anticipatory role affords an "ad-lib interface" that fades in a different UI--appropriate to the context--as the user approaches one-handed with a thumb, two-handed with an index finger, or even with a pinch or two thumbs. Or we can interpret pre-touch in a retroactive manner that leverages the approach trajectory to discern whether the user made contact with a ballistic vs. a finely-targeted motion. Pre-touch also enables hybrid touch + hover gestures, such as selecting an icon with the thumb while bringing a second finger into range to invoke a context menu at a convenient location. Collectively these techniques illustrate how pre-touch sensing offers an intriguing new back-channel for mobile interaction.
Conference Paper
In order to reach targets with one hand on common large mobile touch displays, users tilt and shift the device in their hand. In this work, we use this grip change as a continuous information stream for detecting where the user will touch while their finger is still en route. We refer to this as in-the-air prediction. We show that grip change detected using standard mobile motion sensors produces in-the-air touch point predictions similar to those of techniques that use auxiliary sensor arrays, even in varying physical scenarios such as interacting in a moving vehicle. Finally, our model that combines grip change and the resulting touch point predicted where users intended to land, lowering error rates by 41%.
Conference Paper
As large-screen smartphones are trending, they bring a new set of challenges such as acquiring unreachable screen targets using one hand. To understand users' touch behavior on large mobile touchscreens, we conducted an empirical experiment to discover their usage patterns of tilting devices toward their thumbs to touch screen regions. Exploiting this natural tilting behavior, we designed three novel mobile interaction techniques: TiltSlide, TiltReduction, and TiltCursor. We conducted a controlled experiment to compare our methods with other existing methods, and then evaluated them in real mobile phone scenarios such as sending an e-mail and web surfing. We constructed a design space for one-hand targeting interactions and proposed design considerations for one-hand targeting in real mobile phone circumstances.
Article
The purpose of this study was to determine how smartphone use posture affects biomechanical variables and muscle activities. Eleven university students (age: 22.2±2.6 yrs, height: 176.6±4.7 cm, weight: 69.5±7.5 kg) who had no musculoskeletal disorders and more than one year of experience using a smartphone were recruited as subjects. Angular velocity, muscle activity, and thumb finger pressure were determined for each trial. For each dependent variable, a one-way analysis of variance (ANOVA) with repeated measures was performed to test whether a significant difference existed among the three different conditions (p
Article
We present BackPat: a technique for supporting one-handed smartphone operation. Using pats of the index finger, middle finger, or thumb on the back or side of the device, the user can extend one-handed use in a variety of difficult tasks. We explain the principle behind the technique and make a first attempt at examining its usability and versatility by implementing it in four applications, covering text selection, reaching distant targets, multiple file selection, and map and image zoom. An initial user study showed a high degree of acceptance, verified the interaction logic, and highlighted improvements in task-completion time over non-enhanced interaction. We hope this encourages discussion about the technique's usefulness and potential.
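One plausible way to detect back-of-device pats like those BackPat relies on is a simple accelerometer spike detector; the sketch below is our assumption about how such sensing could work, and the threshold and debounce values are invented for illustration.

```python
# Naive pat detector: a pat on the back or side of the phone shows up as a
# short acceleration spike. Threshold and debounce values are assumptions.

PAT_THRESHOLD = 2.5   # m/s^2 above the running baseline (assumed)
DEBOUNCE_S = 0.15     # minimum time between pats, in seconds (assumed)

class PatDetector:
    def __init__(self):
        self.baseline = 9.81  # gravity magnitude, tracked slowly
        self.last_pat = -1.0

    def update(self, t, accel_magnitude):
        """Feed one accelerometer sample; return True when a pat is detected."""
        spike = accel_magnitude - self.baseline
        self.baseline += 0.01 * spike  # slow exponential baseline tracking
        if spike > PAT_THRESHOLD and t - self.last_pat > DEBOUNCE_S:
            self.last_pat = t
            return True
        return False
```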
Article
We demonstrate that front-of-screen targeting on mobile phones can be predicted from back-of-device grip manipulations. Using simple, low-resolution capacitive touch sensors placed around a standard phone, we outline a machine learning approach to modelling the grip modulation and inferring front-of-screen touch targets. We experimentally demonstrate that grip is a remarkably good predictor of touch, and we can predict touch position 200 ms before contact with an accuracy of 18 mm.
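The abstract above describes a learned mapping from back-of-device capacitive readings to front-of-screen touch targets. A minimal stand-in (not the authors' model) could use nearest-neighbour regression over the raw sensor vector; the pad count and k are assumptions.

```python
# Minimal stand-in for the grip-to-touch mapping: nearest-neighbour
# regression from a low-resolution capacitive frame to a screen point.
# The sensor layout (number of pads) and k are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def fit_grip_model(cap_frames, touch_points, k=5):
    """cap_frames: (n, n_pads) raw capacitance; touch_points: (n, 2) pixels."""
    return KNeighborsRegressor(n_neighbors=k).fit(
        np.asarray(cap_frames), np.asarray(touch_points))

def predict_touch(model, cap_frame):
    """Predict (x, y); could be called ahead of contact, as in the paper."""
    return model.predict(np.asarray(cap_frame)[None, :])[0]
```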
Article
We present a predictive model for the functional area of the thumb on a touchscreen surface: the area of the interface reachable by the thumb of the hand that is holding the device. We derive a quadratic formula by analyzing the kinematics of the gripping hand. Model fit is high for the thumb-motion trajectories of 20 participants. The model predicts the functional area for a given 1) surface size, 2) hand size, and 3) position of the index finger on the back of the device. Designers can use this model to ensure that a user interface is suitable for interaction with the thumb. The model can also be used inversely, that is, to infer the grips users assume for a given user interface layout.
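To make the quadratic model concrete, the reach boundary can be treated as a quadratic curve in screen coordinates, and a target tested against it. The coefficients below are placeholders; per the paper, they would be fit from thumb-motion trajectories given surface size, hand size, and index-finger position, and the "below the curve" orientation is our layout assumption.

```python
# Illustrative use of a quadratic functional-area model: the outer boundary
# of thumb reach is a quadratic in screen coordinates. Coefficients a, b, c
# are placeholders the paper would fit from hand size, surface size, and
# the index finger's position on the back of the device.

def reachable(target_x, target_y, a, b, c):
    """True if (target_x, target_y) lies inside the quadratic reach boundary.

    The boundary is y = a*x**2 + b*x + c; with the grip at the lower corner,
    points below the curve count as reachable (a layout assumption of ours).
    """
    return target_y <= a * target_x**2 + b * target_x + c
```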
Article
We present iRotate, an approach to automatically rotating screens on mobile devices to match users' face orientation. Current approaches to automatic screen rotation are based on gravity and device orientation. Our survey of 513 users shows that 42% currently experience auto-rotation that leads to an incorrect viewing orientation several times a week or more, and 24% find the problem to be very serious to extremely serious. iRotate augments the gravity-based approach by using front cameras on mobile devices to detect users' faces and rotate screens accordingly. It requires no explicit user input and supports different user postures and device orientations. We have implemented iRotate to run in real time on iPhone and iPad, and we assess its accuracy and limitations through a 20-participant feasibility study.
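A hedged sketch of the core iRotate decision, assuming a hypothetical front-camera face detector that reports the face's roll angle relative to the device; the snapping logic is our simplification.

```python
# Simplified iRotate-style logic: snap the screen to the 90-degree rotation
# closest to the detected face orientation. The face-roll input is assumed
# to come from a hypothetical front-camera face detector.

ORIENTATIONS = [0, 90, 180, 270]  # candidate screen rotations, degrees

def choose_rotation(face_roll_deg):
    """Pick the screen rotation that best matches the user's face roll."""
    def circular_distance(o):
        d = abs(face_roll_deg - o) % 360
        return min(d, 360 - d)
    return min(ORIENTATIONS, key=circular_distance)
```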
Conference Paper
Despite the increasing popularity of handheld touchscreen devices, little research has been conducted on how to evaluate and design one-handed thumb-tapping interactions. In this paper, we present a study that examined three issues related to these interactions: 1) whether it is necessary to evaluate these interactions with both the preferred and the non-preferred hand; 2) whether participants evaluating these interactions should be asked to stand and walk during evaluations; 3) whether targets on the edge of the screen enable participants to select more accurately than targets not on the edge. Half of the forty participants in the study used their non-preferred hand and half used their preferred hand. Each participant conducted half of the tasks while walking and half while standing. We used 25 different target positions (16 on the edge of the screen) and five different target sizes. Participants who used their preferred hand completed tasks more quickly and accurately than participants who used their non-preferred hand, with the differences being large enough to suggest it is necessary to evaluate this type of interaction with both hands. We did not find differences in performance between walking and standing, suggesting it is not necessary to include this as a variable in evaluations. In terms of target location, participants rated targets near the center of the screen as easier and more comfortable to tap, but the highest accuracy rates were for targets on the edge of the screen.
Article
Three studies of different mobile-device hand postures are presented. The first study measures the performance of postures in Fitts' law tasks using one and two hands, thumbs and index fingers, horizontal and vertical movements, and front- and back-of-device interaction. Results indicate that the index finger performs well on both the front and the back of the device, and that thumb performance on the front of the device is generally worse. Fitts' law models are created and serve as a basis for comparisons. The second study examines the orientation of shapes on the front and back of a mobile device. It shows that participants' expectations of visual feedback for finger movements on the back of a device reverse the direction of their finger movements to favor a "transparent device" orientation. The third study examines letter-like gestures made on the front and back of a device. It confirms the performance of the index finger on the front of the device, while showing limitations in the ability of the index finger on the back to perform complex gestures. Taken together, these results provide an empirical foundation upon which new mobile interaction designs can be based. A set of design implications and recommendations is given based directly on the findings presented.
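Since the study above builds Fitts' law models per posture, a brief worked example of the model may help: movement time grows with the index of difficulty, MT = a + b·log2(D/W + 1). The coefficients below are invented for illustration; the study fits separate values for each posture.

```python
# Worked Fitts' law example (Shannon formulation). Intercept a and slope b
# are invented for illustration; the study fits per-posture values.
import math

def fitts_mt(distance_mm, width_mm, a=0.2, b=0.15):
    """Predicted movement time (s) for a target at given distance and width."""
    index_of_difficulty = math.log2(distance_mm / width_mm + 1)
    return a + b * index_of_difficulty

# e.g. an 80 mm reach to an 8 mm target:
# fitts_mt(80, 8) -> 0.2 + 0.15 * log2(11) ≈ 0.72 s
```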
Article
Mobile device text messaging and other typing is rapidly increasing worldwide. A checklist was used to characterize joint postures and typing styles of individuals appearing to be of college age (n = 859) while they typed on their mobile devices in public. Gender differences were also ascertained. Almost universally, observed subjects had a flexed neck (91.0%, n = 782) and a non-neutral typing-side wrist (90.3%, n = 776). A greater proportion of males had protracted shoulders (p < 0.01, χ² test), while a greater proportion of females had a typing-side inner elbow angle of <90°, particularly while standing (p = 0.03, χ² test). 46.1% of subjects typed with both thumbs (two hands holding the mobile device). Just over one-third typed with their right thumb (right hand holding the mobile device). No difference in typing styles between genders was found. Future research should determine whether the non-neutral postures identified may be associated with musculoskeletal disorders.
Article
To work successfully, designers must understand the various body shapes and physical abilities of the population for which they design. The Measure of Man and Woman is an updated and expanded version of the landmark human factors book first published in 1959. It brings together a wealth of crucial information to help designers create products and environments that better accommodate human needs.
Dreyfuss, H. The Measure of Man: Human Factors in Design. Whitney Library of Design, 1967.
Kim, D.-S. Biomechanical analysis of a smartphone task with different postures.