Martin Weigel’s research while affiliated with Saarland University and other places


Publications (9)


DeformWear: Deformation Input on Tiny Wearable Devices
  • Article

June 2017 · 136 Reads · 46 Citations

Proceedings of the ACM on Interactive Mobile Wearable and Ubiquitous Technologies

Martin Weigel · Jürgen Steimle

Because of their small surfaces, wearable devices make existing touch input techniques very challenging to use. This paper proposes deformation input on a tiny and soft surface as an input modality for wearable computing devices. We introduce DeformWear, tiny wearable devices that leverage single-point deformation input on various body locations. Despite the small input surface, DeformWear enables expressive and precise input using high-resolution pressure, shear, and pinch deformations. We present a first set of interaction techniques for tiny deformation-sensitive wearable devices. They enable fluid interaction in a large input space by combining multiple dimensions of deformation. We demonstrate their use in seven application examples, showing DeformWear as a standalone input device and as a companion device for smartwatches, head-mounted displays, or headphones. Results from a user study demonstrate that these tiny devices allow for precise and expressive interactions on many body locations, in standing and walking conditions.
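To make the idea of combining multiple deformation dimensions concrete, here is a minimal sketch of how raw deformation readings might be mapped to discrete input events. The channel names, value ranges, and thresholds are assumptions for illustration only and do not reflect DeformWear's actual hardware or signal processing.

```python
from dataclasses import dataclass

# Hypothetical, normalized sensor channels; the real device's interface
# and signal ranges are not described here.
@dataclass
class DeformSample:
    pressure: float   # 0.0 (no force) .. 1.0 (full press)
    shear_x: float    # -1.0 (left)    .. 1.0 (right)
    shear_y: float    # -1.0 (down)    .. 1.0 (up)
    pinch: float      # 0.0 (open)     .. 1.0 (fully pinched)

# Illustrative thresholds, not values from the paper.
PRESS_THRESHOLD = 0.6
SHEAR_THRESHOLD = 0.4
PINCH_THRESHOLD = 0.5

def classify(sample: DeformSample) -> str:
    """Map one deformation sample to a discrete input event."""
    if sample.pinch > PINCH_THRESHOLD:
        return "pinch"
    if abs(sample.shear_x) > SHEAR_THRESHOLD or abs(sample.shear_y) > SHEAR_THRESHOLD:
        # Pick the dominant shear axis and its sign.
        if abs(sample.shear_x) >= abs(sample.shear_y):
            return "shear_right" if sample.shear_x > 0 else "shear_left"
        return "shear_up" if sample.shear_y > 0 else "shear_down"
    if sample.pressure > PRESS_THRESHOLD:
        return "press"
    return "idle"

print(classify(DeformSample(pressure=0.2, shear_x=0.7, shear_y=0.1, pinch=0.0)))  # shear_right
```

A real pipeline would add debouncing and continuous (not only discrete) mappings, but the sketch shows how pressure, shear, and pinch can be combined into one event space.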


SkinMarks: Enabling Interactions on Body Landmarks Using Conformal Skin Electronics

May 2017 · 524 Reads · 157 Citations

The body provides many recognizable landmarks due to the underlying skeletal structure and variations in skin texture, elasticity, and color. The visual and spatial cues of such body landmarks can help in localizing on-body interfaces, guide input on the body, and allow for easy recall of mappings. Our main contribution is SkinMarks, novel skin-worn I/O devices for precisely localized input and output on fine body landmarks. SkinMarks comprise skin electronics on temporary rub-on tattoos. They conform to fine wrinkles and are compatible with strongly curved and elastic body locations. We identify five types of body landmarks and demonstrate novel interaction techniques that leverage SkinMarks' unique touch, squeeze, and bend sensing with integrated visual output. Finally, we detail the conformality of SkinMarks and evaluate sub-millimeter electrodes for touch sensing. Taken together, SkinMarks expands the on-body interaction space to more detailed, highly curved, and challenging areas of the body.
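As an illustration of precisely localized landmark input, the following sketch maps touches on a hypothetical row of tattoo electrodes, one per finger segment, to commands, with bend sensing used as a modifier. The electrode layout, command mapping, and modifier behavior are assumptions for illustration, not details from the paper.

```python
# Hypothetical mapping from landmark electrodes (one per finger segment)
# to commands; indices and commands are illustrative only.
SEGMENT_COMMANDS = {
    0: "play/pause",       # electrode over the proximal segment
    1: "next track",       # electrode over the middle segment
    2: "previous track",   # electrode over the distal segment
}

def on_touch(electrode_index: int, bend_active: bool) -> str:
    """Resolve a touch on a landmark electrode, using bend as a modifier."""
    command = SEGMENT_COMMANDS.get(electrode_index, "ignored")
    if bend_active and command != "ignored":
        # Assumption: bending the finger while touching switches to a
        # secondary mapping (e.g., volume instead of track control).
        command = f"{command} (alternate mode)"
    return command

print(on_touch(1, bend_active=False))  # next track
print(on_touch(2, bend_active=True))   # previous track (alternate mode)
```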


On-Skin Interaction Using Body Landmarks

January 2017 · 61 Reads · 30 Citations

Computer

Jürgen Steimle · Joanna Bergström-Lehtovirta · Martin Weigel · [...] · Kasper Hornbæk

The human skin is a promising surface for input to computing devices but differs fundamentally from existing touch-sensitive devices. The authors propose the use of skin landmarks, which offer unique tactile and visual cues, to enhance body-based user interfaces.



iSkin: Stretchable On-Body Touch Sensors for Mobile Computing
  • Chapter
  • Full-text available

August 2015 · 145 Reads


iSkin: Flexible, Stretchable and Visually Customizable On-Body Touch Sensors

April 2015 · 508 Reads · 333 Citations

We propose iSkin, a novel class of skin-worn sensors for touch input on the body. iSkin is a very thin sensor overlay, made of biocompatible materials, and is flexible and stretchable. It can be produced in different shapes and sizes to suit various locations of the body such as the finger, forearm, or ear. Integrating capacitive and resistive touch sensing, the sensor is capable of detecting touch input with two levels of pressure, even when stretched by 30% or when bent to a radius of 0.5 cm. Furthermore, iSkin supports single or multiple touch areas of custom shape and arrangement, as well as more complex widgets, such as sliders and click wheels. Recognizing the social importance of skin, we show visual design patterns to customize functional touch sensors and allow for a visually aesthetic appearance. Taken together, these contributions enable new types of on-body devices. This includes finger-worn devices, extensions to conventional wearable devices, and touch input stickers, all fostering direct, quick, and discreet input for mobile computing.
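The two pressure levels can be illustrated with a small sketch that combines a capacitive channel (light contact) with a resistive channel (firm press). Channel names, normalization, and thresholds are assumptions for illustration and are not iSkin's actual signal processing.

```python
# Illustrative, normalized readings in [0, 1]; thresholds are assumptions.
CAP_TOUCH_THRESHOLD = 0.3    # capacitance change indicating light contact
RES_PRESS_THRESHOLD = 0.6    # resistance change indicating firm pressure

def pressure_level(cap_value: float, res_value: float) -> str:
    """Return 'none', 'touch', or 'press' from normalized channel readings."""
    if res_value > RES_PRESS_THRESHOLD:
        return "press"   # firm pressure registers on the resistive channel
    if cap_value > CAP_TOUCH_THRESHOLD:
        return "touch"   # light contact is visible capacitively
    return "none"

for cap, res in [(0.1, 0.0), (0.5, 0.1), (0.7, 0.8)]:
    print(pressure_level(cap, res))   # none, touch, press
```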


Figures: overview of user-defined gestures (Figure 3); input modalities used in the user-defined gestures, with aggregated means and 95% confidence intervals of perceived ease and comfort (Figure 4); user-defined set of skin-specific gestures (Figure 5).
More Than Touch: Understanding How People Use Skin as an Input Surface for Mobile Computing

April 2014 · 533 Reads · 119 Citations

This paper contributes results from an empirical study of on-skin input, an emerging technique for controlling mobile devices. Skin is fundamentally different from off-body touch surfaces, opening up a new and largely unexplored interaction space. We investigate characteristics of the various skin-specific input modalities, analyze what kinds of gestures are performed on skin, and study which input locations are preferred. Our main findings show that (1) users intuitively leverage the properties of skin for a wide range of more expressive commands than on conventional touch surfaces; (2) established multi-touch gestures can be transferred to on-skin input; (3) physically uncomfortable modalities are deliberately used for irreversible commands and expressing negative emotions; and (4) the forearm and the hand are the most preferred locations on the upper limb for on-skin input. We detail users' mental models and contribute a first consolidated set of on-skin gestures. Our findings provide guidance for developers of future sensors as well as for designers of future applications of on-skin input.


ProjectorKit: Easing Rapid Prototyping of Interactive Applications for Mobile Projectors

August 2013 · 29 Reads · 8 Citations

Researchers have developed interaction concepts based on mobile projectors. Yet pursuing work in this area - particularly building projector-based interaction techniques within an application - is cumbersome and time-consuming. To mitigate this problem, we contribute ProjectorKit, a flexible open-source toolkit that eases the rapid prototyping of mobile projector interaction techniques.


Citations (7)


... Moreover, specific body parts can be directly mapped to screen layouts on conventional digital devices, such as the palm [14,19,68] or forearm [6,48], and offer intuitive support for interactions in virtual reality environments [33]. Additionally, the spatial location of the body part receiving input can serve as an extra modality [24,45], while anatomical landmarks on the body, such as finger joints and nails, as well as personal landmarks, such as scars and tattoos, can be leveraged to further enhance the input efficiency of on-body input [58] and increase recall rates for virtual content mapped to the body [5]. We refer to Bergström and Hornbaek [4] and Villarreal-Narvaez et al. [64] for comprehensive overviews of on-skin, on-body, and whole-body user interfaces. ...

Reference:

Intermanual Deictics: Uncovering Users' Gesture Preferences for Opposite-Arm Referential Input, from Fingers to Shoulder
On-Skin Interaction Using Body Landmarks
  • Citing Article
  • January 2017

Computer

... The FoamSense sensor employed conductive ink and a porous soft object to measure the user's compression, bending, twisting, and shearing operations [29]. Weigel et al. reported a wearable mini-device, DeformWear, for single-point deformation interaction (e.g., squeezing, compression, and shearing) on a limited surface [30]. The above studies have demonstrated that flexible inputs can provide richer and more expressive user interactions than traditional touch interfaces. ...

DeformWear: Deformation Input on Tiny Wearable Devices
  • Citing Article
  • June 2017

Proceedings of the ACM on Interactive Mobile Wearable and Ubiquitous Technologies

... However, they relied on rigid enclosures and extruded profiles, which were not body-conformable. Researchers have since explored alternative wearable interfaces made from softer and more flexible materials, such as silicone [52,53,55,71,78,79,110,111], ink [16,91], flexible electronics [54], and textiles [132,133]. Textiles provide one of the most fundamental interfaces for wearable computing due to their ubiquity on and around the human body [99]. ...

SkinMarks: Enabling Interactions on Body Landmarks Using Conformal Skin Electronics
  • Citing Conference Paper
  • May 2017

... In this paper, we approach the design of more effective techniques by looking at the strengths and weaknesses of pico projections and search for new methods for providing and interacting with contextual information. Doing so, we particularly look into enhancing established flashlight metaphors like [1] [2] [3] [4] to explore information spaces, as these metaphors are well suited for pico projectors. In our approaches we combine zooming principles from proxemic interaction [5] with focus plus context visualization principles to extend the interaction possibilities in reflection to projector limitations. ...

From Focus to Context and Back: Combining Mobile Projectors and Stationary Displays
  • Citing Conference Paper
  • January 2013

... However, they relied on rigid enclosures and extruded profiles, which were not body-conformable. Researchers have since explored alternative wearable interfaces made from softer and more flexible materials, such as silicone [52,53,55,71,78,79,110,111], ink [16,91], flexible electronics [54], and textiles [132,133]. Textiles provide one of the most fundamental interfaces for wearable computing due to their ubiquity on and around the human body [99]. ...

iSkin: Flexible, Stretchable and Visually Customizable On-Body Touch Sensors
  • Citing Conference Paper
  • April 2015

... This design possibility involves using input modalities in ways that are inconsistent with their typical usage in the physical world. For example, gestures with low ergonomics, performed in relation to body parts that are difficult to reach, can prevent accidental input [90]. Eye gaze input using repetitive blinking, uncommon in everyday gaze, enable effective control of remote devices for users with upper-body motor impairments [20]. ...

More Than Touch: Understanding How People Use Skin as an Input Surface for Mobile Computing

... Motion-Beam [39] is a mobile projector that couples the movement of the projection to the imagery. ProjectorKit provides technical support for rapid prototyping of mobile projector interaction techniques [37]. ...

ProjectorKit: Easing Rapid Prototyping of Interactive Applications for Mobile Projectors
  • Citing Conference Paper
  • August 2013