Conference Paper
PDF Available

HapticDrone: An Encountered-Type Kinesthetic Haptic Interface with Controllable Force Feedback: Initial Example for 1D Haptic Feedback


Abstract

We present HapticDrone, a concept for generating controllable and comparable force feedback through direct haptic interaction with a drone. As a proof-of-concept study, this paper focuses on creating haptic feedback in one dimension (1D) only. To this end, an encountered-type, safe, and untethered haptic display is implemented. An overview of the system and details on how to control the force output of drones are provided. Our current prototype generates forces of up to 1.53 N upwards and 2.97 N downwards. This concept serves as a first step towards introducing drones as mainstream haptic devices.
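As a rough illustration of how a drone's vertical force output can be reasoned about, the following is a minimal sketch assuming a quadratic rotor-thrust model (T = k·ω²); the mass, thrust coefficient, and rotor speeds are hypothetical values chosen for illustration, not parameters of the HapticDrone prototype.

```python
# Minimal sketch: net vertical force a quadcopter can exert on a user.
# The quadratic thrust model and all numeric values are illustrative
# assumptions, not measurements from the HapticDrone prototype.

G = 9.81  # gravitational acceleration, m/s^2

def rotor_thrust(omega_rad_s: float, k_thrust: float = 1.2e-6) -> float:
    """Thrust of a single rotor under a simple quadratic model (N)."""
    return k_thrust * omega_rad_s ** 2

def net_vertical_force(rotor_speeds, mass_kg: float = 0.45) -> float:
    """Force the drone can apply along the vertical axis.

    Positive values push upwards (total thrust exceeds weight),
    negative values push downwards (weight exceeds thrust).
    """
    total_thrust = sum(rotor_thrust(w) for w in rotor_speeds)
    return total_thrust - mass_kg * G

if __name__ == "__main__":
    hover = [960.0] * 4          # rad/s, roughly cancels the weight
    pushing_up = [1150.0] * 4    # spin faster -> upward force on the hand
    pushing_down = [600.0] * 4   # spin slower -> net downward force
    for label, speeds in [("hover", hover), ("up", pushing_up), ("down", pushing_down)]:
        print(f"{label:>5}: {net_vertical_force(speeds):+.2f} N")
```

Under such a model the downward force can exceed the upward force, since gravity assists the downward direction while the rotors must first overcome the drone's own weight to push upwards, which is consistent with the asymmetry reported above.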
... Efforts addressing this challenge come mostly from ungrounded unmanned aerial vehicles (UAVs). UAV-based ETHDs have the advantage of acting as the shape display itself and of not having a constrained workspace [5], [15], [24]. However, shape rendering is mostly limited to the shape of the cage surrounding the UAV. ...
... This definition was also adapted to encompass wearable ETHDs, in which, unlike common wearable haptic displays, neither the display surface nor any other part of the device is in constant contact with the user. This definition also extends to more recent approaches such as mobile-platform ETHDs [28]-[32] and UAV-based ETHDs [9], [10], [24], [33]. ...
... Ungrounded ETHDs are not constrained to an anchor position to render haptic feedback. They comprise UAVs (such as drones) [5], [9], [10], [15], [24], [33], mobile-platform ETHDs [28]-[32], and wearable ETHDs [22], [23], [57]. ...
Thesis
Encountered-Type Haptic Displays (ETHDs) are robotic devices that follow the user's hand and position themselves at an encountered location when users want to touch objects in immersive virtual reality (VR). Despite these advantages, several challenges are yet to be solved in matters of usability and haptic feedback. This thesis presents a series of contributions to leverage ETHDs along research axes for both usability and haptic feedback. The first contribution in the usability axis studied the design of safety techniques for ETHDs based on visual feedback. Then, a series of interaction techniques for surface exploration with ETHDs is presented. These techniques explored several combinations of factors related to ETHD control to give users the sensation of touching a large surface in VR. Concerning the haptic feedback axis, we introduce an approach for rendering large, multi-textured surfaces. This approach is based on a rotating, multi-textured, cylindrical prop attached to an ETHD's end-effector. Finally, the thesis presents a contribution to object manipulation in VR using a detachable tangible object and an ETHD. This contribution permits creating, destroying, and reconfiguring tangible objects in immersive virtual environments.
... While audio and haptic feedback represent the center of interest in HCI research for BVI [29], the field has spread into utilizing innovative concepts like augmented reality (AR) or miniature drones to present further haptic and visual enhancements for impaired people [1,4,19,26]. Haptic and tactile interfaces augment the user's explore-by-touch approach [16,17,29] by generating new haptic cue sets to encode further localization information [30,35]. ...
... A recent research interest in HCI has been tangible guidance interfaces implemented using quadcopter drones [6,19,26]. Already widely used in virtual reality (VR) research for perceivable force-feedback objects and control inputs (e.g., Flyables, Thor's Hammer, Valkyrie, or VRHapticDrones) [1,19,32], new, more advanced application cases for haptic human-drone interaction (HDI) have started to emerge. ...
... With the exploration of human-drone interaction (HDI) for HCI, researchers adapted quadcopters for BVI aid research [4,6]. The benefit of using drones is their dynamic haptic force feedback and the possibility of serving as both input and output modalities in various HCI applications [1,19]. Freundschuh et al. [15] categorized the general human interaction tasks, which were extended by Nicholson et al. [34], into three spatial interaction spaces: (I) locomotor, (II) search, and (III) haptic. ...
... These allow for augmented reality scenarios without the need for head-mounted displays. HapticDrone (Abdullah et al. 2017) uses a quadcopter to create force feedback either upwards or downwards with about 1.5 N and 3 N, respectively. TactileDrones (Knierim et al. 2017) provides tactile feedback through small drones. By hitting the user with differently shaped tips, this approach can convey the impact of arrows or the sting of a bumblebee. ...
Article
Full-text available
Immersive analytics often takes place in virtual environments which promise the users immersion. To fulfill this promise, sensory feedback, such as haptics, is an important component, which is however not well supported yet. Existing haptic devices are often expensive, stationary, or occupy the user's hand, preventing them from grasping objects or using a controller. We propose PropellerHand, an ungrounded hand-mounted haptic device with two rotatable propellers, that allows exerting forces on the hand without obstructing hand use. PropellerHand is able to simulate feedback such as weight and torque by generating thrust up to 11 N in 2-DOF and a torque of 1.87 Nm in 2-DOF. Its design builds on our experience from quantitative and qualitative experiments with different form factors and parts. We evaluated our prototype through a qualitative user study in various VR scenarios that required participants to manipulate virtual objects in different ways, while changing between torques and directional forces. Results show that PropellerHand improves users' immersion in virtual reality. Additionally, we conducted a second user study in the field of immersive visualization to investigate the potential benefits of PropellerHand there.
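To illustrate how two hand-mounted propellers can be combined into both a force and a torque, here is a minimal sketch; the lever arm, propeller orientations, and thrust values are illustrative assumptions, not the actual PropellerHand geometry.

```python
# Minimal sketch: net force and torque from two propellers mounted either
# side of the hand. Geometry and thrusts are hypothetical, for illustration.
import numpy as np

def wrench_from_propellers(thrusts, directions, offsets):
    """Net force (N) and torque (Nm) about the hand's centre.

    thrusts    -- scalar thrust magnitudes, one per propeller
    directions -- unit vectors along which each propeller pushes
    offsets    -- propeller positions relative to the hand centre (m)
    """
    force, torque = np.zeros(3), np.zeros(3)
    for t, d, r in zip(thrusts, directions, offsets):
        f = t * np.asarray(d, dtype=float)
        force += f
        torque += np.cross(np.asarray(r, dtype=float), f)
    return force, torque

up, down = [0.0, 0.0, 1.0], [0.0, 0.0, -1.0]
offsets = [[0.09, 0.0, 0.0], [-0.09, 0.0, 0.0]]   # 9 cm either side of the hand

print(wrench_from_propellers([5.5, 5.5], [up, up], offsets))    # pure lift
print(wrench_from_propellers([5.0, 5.0], [up, down], offsets))  # mostly torque
```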
... This can typically be realized by wearable haptic devices, or through direct stimulation of the user's neurosensory mechanism [48]. Recently, a new class of methods has been developed which uses drones to flexibly generate tactile feedback [49], [50]. ...
Preprint
The next industrial revolution will reshape industries, as well as society at large, into a human-centric form. The presence and impact of human beings in industrial systems and processes will be magnified more than ever before. To cope with the emerging challenges raised by this revolution, 6G will massively deploy digital twins to merge the cyber-physical-human worlds; novel solutions for multi-sensory human-machine interfaces will play a key role in this strategy.
Conference Paper
Assistive rehabilitation devices have been developed for a number of years to help post-stroke patients recover and live independently. As a way to communicate with the physical world, force sensation is extremely helpful for rebuilding neuroplasticity [1] during the rehabilitation process. This paper presents a model and design of asymmetric vibrations to provide bidirectional force sensation, which can be beneficial for designing a portable rehabilitation haptic device. Users feel a directional cue generated by asymmetric vibrations when holding the device. This directional cue can guide users through rehabilitation training along with visual guidance and provide physical force sensations. The system consists of a current-driven single solenoid rigidly attached to a base. The system's model is verified through experiments at three different frequencies. Our analysis shows that by varying the signal's duty ratio, the direction of the peak acceleration changes from positive to negative. In addition, two other waveforms (saw-tooth and step-ramp) at several frequencies and different spring stiffnesses are also discussed to determine the ideal characteristics of the input signal for rehabilitation applications.
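The duty-ratio effect can be illustrated with a toy lumped model: a plunger treated as a mass-spring-damper driven by a pulsed force, where the duty ratio shifts how the positive and negative acceleration peaks are distributed. All physical parameters below are illustrative assumptions, not values from the paper.

```python
# Toy sketch: how a pulse train's duty ratio skews the peak accelerations
# of a spring-loaded solenoid plunger. Parameters are hypothetical.

def simulate_peak_accels(duty, freq_hz=40.0, force_n=2.0,
                         mass=0.02, stiffness=400.0, damping=0.6,
                         dt=1e-5, settle_cycles=10, measure_cycles=10):
    """Return (max positive, max negative) steady-state acceleration, m/s^2."""
    period = 1.0 / freq_hz
    x, v = 0.0, 0.0
    peak_pos, peak_neg = 0.0, 0.0
    t, t_end = 0.0, (settle_cycles + measure_cycles) * period
    while t < t_end:
        drive = force_n if (t % period) < duty * period else 0.0
        a = (drive - stiffness * x - damping * v) / mass
        if t > settle_cycles * period:      # ignore the start-up transient
            peak_pos, peak_neg = max(peak_pos, a), min(peak_neg, a)
        v += a * dt
        x += v * dt
        t += dt
    return peak_pos, peak_neg

for duty in (0.2, 0.5, 0.8):
    up, down = simulate_peak_accels(duty)
    print(f"duty {duty:.1f}: +{up:.1f} / {down:.1f} m/s^2")
```

With short pulses the positive (drive) peak dominates, while with long pulses the restoring spring produces the larger negative peak; the exact crossover depends on the chosen dynamics.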
Conference Paper
Full-text available
Head-mounted displays for virtual reality (VR) provide high-fidelity visual and auditory experiences. Other modalities are currently less supported. Current commercial devices typically deliver tactile feedback through controllers the user holds in the hands. Since both hands are occupied and tactile feedback can only be provided at a single position, research and industry have proposed a range of approaches to provide richer tactile feedback. Approaches such as tactile vests or electrical muscle stimulation were proposed, but require additional body-worn devices. This limits comfort and restricts the provided feedback to specific body parts. With this Interactivity installation, we propose quadcopters to provide tactile stimulation in VR. While the user is visually and acoustically immersed in VR, small quadcopters simulate bumblebees, arrows, and other objects hitting the user. The user wears a VR headset; mini-quadcopters, controlled by an optical marker tracking system, are used to provide tactile feedback.
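A minimal sketch of the control idea behind marker-tracked tactile quadcopters: a PD position controller steers the drone toward a contact point reported by the tracking system. The gains and interfaces below are illustrative placeholders, not the actual TactileDrones implementation.

```python
# Sketch of one control tick: drive a tracked quadcopter toward a contact
# point on the user's body. Gains and values are hypothetical.
import numpy as np

KP, KD = 1.8, 0.9   # hypothetical PD gains

def velocity_command(drone_pos, drone_vel, contact_point):
    """Velocity setpoint (m/s) that drives the drone onto the contact point."""
    error = np.asarray(contact_point, dtype=float) - np.asarray(drone_pos, dtype=float)
    return KP * error - KD * np.asarray(drone_vel, dtype=float)

pos, vel = np.array([0.2, 0.0, 1.1]), np.array([0.0, 0.0, 0.0])
target = np.array([0.5, 0.1, 1.3])   # e.g. a point on the user's arm
print(velocity_command(pos, vel, target))
```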
Conference Paper
Full-text available
Encountered-type haptic displays recreate realistic haptic sensations by producing physical surfaces on demand for a user to explore directly with his or her bare hands. However, conventional encountered-type devices are fixed in the environment, so the working volume is limited. To address this limitation, we investigate the potential of an unmanned aerial vehicle (drone) as a flying motion base for a non-grounded encountered-type haptic device. As a lightweight end-effector, we use a piece of paper hung from the drone to present the reaction force. Though the paper is limp, its shape is held stable by the strong airflow induced by the drone itself. We conduct two experiments to evaluate the prototype system. The first experiment evaluates the reaction force presentation by measuring the contact pressure between the user and the end-effector. The second experiment evaluates the usefulness of the system through a user study in which participants were asked to draw a straight line on a virtual wall represented by the device.
Conference Paper
Full-text available
The Rutgers Master II-ND glove is a follow-up on the earlier Rutgers Master II haptic interface. The redesigned glove has all the sensing placed on the palm support, avoiding routing wires to the fingertips. It uses custom pneumatic actuators arranged in a direct-drive configuration between the palm and the thumb, index, middle, and ring fingers. The supporting glove used in the RMII design is eliminated, so the RMII-ND can better accommodate varying hand sizes. The glove is connected to a haptic control interface that reads its sensors and servos its actuators. The interface's pneumatic pulse-width-modulated servo valves have higher bandwidth than those used in the earlier RMII, resulting in better force control. A comparison with the CyberGrasp commercial haptic glove is provided.
Conference Paper
We present BitDrones, a toolbox for building interactive real reality 3D displays that use nano-quadcopters as self-levitating tangible building blocks. Our prototype is a first step towards interactive self-levitating programmable matter, in which the user interface is represented using Catomic structures. We discuss three types of BitDrones: PixelDrones, equipped with an RGB LED and a small OLED display; ShapeDrones, augmented with an acrylic mesh spun over a 3D printed frame in a larger geometric shape; and DisplayDrones, fitted with a thin-film 720p touchscreen. We present a number of unimanual and bimanual input techniques, including touch, drag, throw and resize of individual drones and compound models, as well as user interface elements such as self-levitating cone trees, 3D canvases and alert boxes. We describe application scenarios and depict future directions towards creating high-resolution self-levitating programmable matter.
Article
AIREAL is a novel haptic technology that delivers effective and expressive tactile sensations in free air, without requiring the user to wear a physical device. Combined with interactive computer graphics, AIREAL enables users to feel virtual 3D objects, experience free-air textures, and receive haptic feedback on gestures performed in free space. AIREAL relies on air vortex generation directed by an actuated flexible nozzle to provide effective tactile feedback with a 75-degree field of view and an 8.5 cm resolution at 1 meter. AIREAL is a scalable, inexpensive, and practical free-air haptic technology that can be used in a broad range of applications, including gaming, mobile applications, and gesture interaction, among many others. This paper reports the details of the AIREAL design and control, experimental evaluations of the device's performance, as well as an exploration of the application space of free-air haptic displays. Although we used vortices, we believe that the results reported are generalizable and will inform the design of haptic displays based on alternative principles of free-air tactile actuation.
Conference Paper
We introduce UltraHaptics, a system designed to provide multi-point haptic feedback above an interactive surface. UltraHaptics employs focused ultrasound to project discrete points of haptic feedback through the display and directly on to users' unadorned hands. We investigate the desirable properties of an acoustically transparent display and demonstrate that the system is capable of creating multiple localised points of feedback in mid-air. Through psychophysical experiments we show that feedback points with different tactile properties can be identified at smaller separations. We also show that users are able to distinguish between different vibration frequencies of non-contact points with training. Finally, we explore a number of exciting new interaction possibilities that UltraHaptics provides.
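The underlying phased-array principle can be sketched briefly: each transducer is phase-shifted so its wavefront arrives at the focal point in phase with the others. The array layout and frequency below are illustrative assumptions, not the UltraHaptics hardware configuration.

```python
# Sketch: per-transducer phase offsets that focus an ultrasound array at a
# point in mid-air. Array geometry and frequency are hypothetical.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s, in air
FREQ = 40_000.0          # Hz, typical airborne ultrasound transducer frequency

def focusing_phases(transducer_positions, focal_point):
    """Phase offset (radians) for each transducer to focus at focal_point."""
    d = np.linalg.norm(np.asarray(transducer_positions) - np.asarray(focal_point), axis=1)
    wavelength = SPEED_OF_SOUND / FREQ
    # Emit earlier (negative phase) the farther a transducer is from the focus.
    return (-2.0 * np.pi * d / wavelength) % (2.0 * np.pi)

# Hypothetical 16x16 grid at 1 cm pitch, focusing 20 cm above its centre.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
grid = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
grid -= grid.mean(axis=0)                 # centre the array at the origin
print(focusing_phases(grid, [0.0, 0.0, 0.20])[:8])
```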
Conference Paper
This paper describes the PHANTOM haptic interface, a device which measures a user's fingertip position and exerts a precisely controlled force vector on the fingertip. The device has enabled users to interact with and feel a wide variety of virtual objects and will be used for control of remote manipulators. This paper discusses the design rationale, novel kinematics, and mechanics of the PHANTOM. A brief description of the programming of basic shape elements and contact interactions is also given.
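For point-based devices of this kind, contact with a virtual object is commonly rendered as a spring-damper force on the penetration depth. Below is a minimal sketch of that classic servo-loop step; the stiffness and damping values are illustrative, not the device's recommended parameters.

```python
# Sketch: rendering a virtual wall for a point-based haptic device.
# Stiffness/damping values are hypothetical.

WALL_HEIGHT = 0.0    # wall surface at z = 0, normal pointing up (+z)
STIFFNESS = 800.0    # N/m, hypothetical
DAMPING = 2.0        # Ns/m, hypothetical

def wall_force(tip_z: float, tip_vz: float) -> float:
    """Vertical force (N) pushing the fingertip out of the virtual wall."""
    penetration = WALL_HEIGHT - tip_z
    if penetration <= 0.0:
        return 0.0                          # not in contact: no force
    return STIFFNESS * penetration - DAMPING * tip_vz

# One servo-loop tick: read position/velocity, command the opposing force.
print(wall_force(tip_z=-0.003, tip_vz=-0.05))   # fingertip 3 mm inside the wall
```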
Geomagic. 2017. The Geomagic Touch (formerly Sensable Phantom Omni). http://www.geomagic.com/en/products/phantom-omni/overview.html