A Non-grounded and Encountered-type Haptic Display
Using a Drone
Kotaro Yamaguchi
Graduate School of Information Science and Technology, Osaka University, 1-5 Yamadaoka, Suita, Osaka 565-0871, Japan
yamaguchi.koutarou@lab.ime.cmc.osaka-u.ac.jp

Ginga Kato
Graduate School of Information Science and Technology, Osaka University, 1-5 Yamadaoka, Suita, Osaka 565-0871, Japan
katou.ginga@lab.ime.cmc.osaka-u.ac.jp

Yoshihiro Kuroda
Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan
ykuroda@bpe.es.osaka-u.ac.jp

Kiyoshi Kiyokawa
Cybermedia Center, Osaka University, 1-32 Machikaneyama, Toyonaka, Osaka 560-0043, Japan
kiyo@ime.cmc.osaka-u.ac.jp

Haruo Takemura
Cybermedia Center, Osaka University, 1-32 Machikaneyama, Toyonaka, Osaka 560-0043, Japan
takemura@ime.cmc.osaka-u.ac.jp
ABSTRACT
Encountered-type haptic displays recreate realistic haptic sensations by producing physical surfaces on demand for a user to explore directly with his or her bare hands. However, conventional encountered-type devices are fixed in the environment, and thus their working volume is limited. To address this limitation, we investigate the potential of an unmanned aerial vehicle (drone) as a flying motion base for a non-grounded encountered-type haptic device. As a lightweight end-effector, we use a piece of paper hung from the drone to represent the reaction force. Though the paper is limp, its shape is held stable by the strong airflow induced by the drone itself. We conduct two experiments to evaluate the prototype system. The first experiment evaluates the reaction force presentation by measuring the contact pressure between the user and the end-effector. The second experiment evaluates the usefulness of the system through a user study in which participants were asked to draw a straight line on a virtual wall represented by the device.
Keywords
virtual reality; haptic display; encountered-type device; non-grounded-type; unmanned aerial vehicle
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
SUI '16, October 15-16, 2016, Tokyo, Japan
© 2016 ACM. ISBN 978-1-4503-4068-7/16/10...$15.00
DOI: http://dx.doi.org/10.1145/2983310.2985746
Figure 1: An example of haptic feedback by our device in (a) the real and (b) virtual environments.
1. INTRODUCTION
Haptic feedback systems can be classified into three types according to their mechanical grounding configuration: the wearing type, the grip type, and the target type. Wearing-type haptic devices represent reaction force by attaching the device to the body of the user. Grip-type haptic devices perform as a haptic display by getting the user to
grip a tool-shaped device such as a pen. Target-type haptic devices represent the target surface by changing the shape, material properties, and/or location of the display itself. Wearing-type and grip-type haptic devices may cause a sense of incongruity in presence and impose a burden, because these devices are always in contact with the body of the user. On the other hand, with target-type haptic devices, the user rarely feels such a burden. However, it is difficult to present a virtual object in the real world with an arbitrary shape, material, and position.
One approach to realizing a target-type haptic device is called the encountered type. Encountered-type haptic displays present haptic feedback by moving the end-effector to the surface of the virtual object which the user is going to touch. Encountered-type haptic devices have the advantage that the user does not feel any contact force unless he or she touches a virtual object, because the devices themselves move. However, they have the disadvantage that the work space is limited, because conventional encountered-type haptic devices need to be grounded.
We intend to develop a novel encountered-type haptic device that does not limit the work space. In this study, we propose attaching a lightweight flat object to an unmanned aerial vehicle (drone), together with a force generation mechanism that uses the drone's own airflow, to realize a non-grounded encountered-type haptic display. Figure 1 shows an example of haptic feedback by our haptic device in the real environment (RE, (a)) and a virtual environment (VE, (b)).
2. RELATED WORK
In this section, we introduce conventional grounded-type haptic displays and some important related studies, as well as a work-support device using a drone. When a haptic device applies a force to a user to represent the sensation of an external force, a reaction force is also generated. Most conventional haptic devices are grounded to the environment to resist this reaction force from the user. However, grounded haptic displays have the disadvantage of a limited work space due to the fixed position of the device. Therefore, in recent years, numerous non-grounded haptic devices have been developed [1][2]. Rekimoto developed Traxion, which represents a virtual force by controlling an asymmetric waveform of vibration with an electromagnetic coil supported by leaf springs [3]. Traxion measures 7.5 mm × 35.0 mm × 5.0 mm and weighs 5.2 g. Yoshie et al. developed a device that performs non-grounded torque display using the gyro effect [4]. When a force that changes the rotation axis is applied to a disk rotating at high speed, a force perpendicular to the applied force is generated about the rotation axis. These devices can display a virtual force or an actual torque within a large work space. However, they cannot display actual translational forces.
Even when there is no contact with a virtual object, most haptic devices other than encountered-type devices cannot avoid displaying forces, because a part of the haptic display, e.g., an end-effector, is grasped or worn by the user. In contrast, an encountered-type haptic device realizes an ideal non-contact state, because no force arises unless the user touches a virtual object [5][6]. When the user is going to touch the object, the device moves toward the position of the virtual object surface in a forestalling manner so that a force is displayed to the user upon actual touch. Yokokohji et al. developed a trajectory planning system with an encountered-type haptic device to represent multiple virtual objects in three-dimensional space [7][8]. Yafune et al. proposed displaying multiple virtual objects at the same time using an encountered-type haptic device [9]. The above-mentioned conventional encountered-type haptic devices are grounded in space. Therefore, their work space is limited.
A drone is an unmanned aerial vehicle that can fly and move under remote control. It is attractive not only for aerial photography and transport, but also as an interaction medium with humans. Several devices using a drone have been developed to support a user's work. Agrawal et al. developed a drone to assist a person's daily life in a room [10]. A table attached to the top surface of the drone supports writing text, and the drone's bottom surface controls illumination in a room. Weise et al. used a commercial head-mounted display (HMD), the Oculus Rift, to enable the user to control a drone remotely from a first-person view [11]. The user can watch the remote scene captured by the camera attached to the drone through the HMD, and can operate the drone using a controller. These studies focus on positioning and the operability of the drone.
We focus on developing an encountered-type haptic display using a drone. A non-grounded mechanism can only display a force of limited magnitude due to its lack of fixation. However, we address this issue by taking advantage of the drone's own strong airflow to represent the reaction force from a virtual object, such as the virtual creature illustrated in Fig. 1.
3. DRONE-BASED NON-GROUNDED HAPTIC DISPLAY
A lightweight flat object is attached to a drone as an end-effector, as shown in Fig. 2(a). The reaction force is displayed parallel to the ground when a user touches the end-effector with a grasping object, as shown in Fig. 2(b). The strong airflow induced by the drone itself makes it possible to display a force even with a lightweight and flexible object, e.g., a sheet of paper. The user can then feel the contact from the virtual object.
When the user is about to touch the virtual wall, the proposed haptic display moves toward the corresponding position in a forestalling manner and displays the reaction force to the user upon actual touch. In our system, a user touches a virtual object with a grasping object (see Fig. 1(a) and Fig. 3) for safety. With a safer drone design, it would be possible to touch the end-effector directly. The system requires the position and orientation of both the haptic display and the grasping object. In the current implementation, a motion capture system is used to acquire this information and to synchronize their positions in the VE. The process flow is as follows:
1. Acquire the positions of the drone and the grasping
object from a motion capture system.
2. Update their positions in VE.
3. Estimate the contact position on the virtual wall where
the virtual grasping object might touch.
4. Move the drone to the corresponding estimated posi-
tion in RE. The user receives the reaction force when
44
drone
end-effector
(a)
airflow
drone
end-effector of
haptic display
grasping
object
haptic
feedback
(b)
Figure 2: Design and a principle of our haptic dis-
play: (a) an end-effector attached to a drone, (b)
force display using drone’s own airflow.
the collision between the grasping object and the end-
effector of the drone occurs.
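Step 3 of the process flow amounts to projecting the tracked tip of the grasping object onto the plane of the virtual wall. The following Python sketch illustrates one way to do this; the function name, the point-to-plane projection method, and the example coordinates are all our own illustrative assumptions, as the paper does not detail its estimator:

```python
# Sketch of step 3: estimate where the grasping object's tip would meet
# the virtual wall. Names and the projection method are illustrative
# assumptions; the paper does not specify how the contact is estimated.

def estimate_contact_point(tip, wall_point, wall_normal):
    """Project the tracked tip position onto the virtual wall plane.

    tip, wall_point, wall_normal: 3-tuples (x, y, z); wall_normal need
    not be unit length. Returns the closest point on the plane to tip.
    """
    # Signed distance from the tip to the plane, scaled by |normal|^2.
    d = sum((t - w) * n for t, w, n in zip(tip, wall_point, wall_normal))
    n2 = sum(n * n for n in wall_normal)
    k = d / n2
    # Move the tip back along the normal onto the plane.
    return tuple(t - k * n for t, n in zip(tip, wall_normal))

# Example: a wall 1 m in front of the user (plane x = 1.0), tip just short
# of it; the estimated contact lies on the wall plane.
contact = estimate_contact_point((0.9, 0.3, 1.2), (1.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

Because the wall in the user study is flat, this closest-point projection is a reasonable stand-in for the "forestalling" target position the drone should fly to.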
The AR.Drone 2.0 by Parrot is used as the drone in the proposed device. Unity 4 by Unity Technologies is used to construct the VE and to control the drone. The stereo images of the VE are sent to the HMD, an Oculus Rift DK1 by Oculus VR, through the Oculus Unity Integration. The positions of the drone and the grasping object are measured by the OptiTrack motion capture system with the Motive software by NaturalPoint. The measured positions are broadcast to local hosts and transmitted to Unity. The system controls the drone position using a Unity project, RiftDrone [11].
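Putting the four steps together, the system behaves as a polling loop: read tracked poses, update the VE, estimate the contact point, and command the drone. The sketch below is a minimal Python rendition of that loop; the tracker stub, all function names, and the simple proportional velocity command are our assumptions, not the actual Unity/RiftDrone code:

```python
# Illustrative sketch of the four-step process flow. The tracker stub and
# the proportional velocity command are assumptions; the real system uses
# OptiTrack for tracking and Unity 4 with RiftDrone for drone control.

def contact_point_on_wall(tip, wall_x):
    """Step 3: project the grasping object's tip onto the wall plane x = wall_x."""
    return (wall_x, tip[1], tip[2])

def drone_velocity_command(drone_pos, target, gain=1.5):
    """Step 4: a proportional command steering the drone toward the target."""
    return tuple(gain * (t - d) for t, d in zip(target, drone_pos))

def haptic_loop_step(tracker, wall_x):
    # Step 1: acquire positions from the motion capture system (stubbed here).
    drone_pos, tip_pos = tracker()
    # Step 2 would update the corresponding transforms in the VE (omitted).
    # Step 3: estimate the contact position on the virtual wall.
    target = contact_point_on_wall(tip_pos, wall_x)
    # Step 4: move the drone toward the estimated position in the RE.
    return target, drone_velocity_command(drone_pos, target)

# One iteration with a stub tracker: drone at the origin, tip approaching
# a wall at x = 1 m.
target, cmd = haptic_loop_step(lambda: ((0.0, 0.0, 0.0), (0.8, 0.2, 1.0)),
                               wall_x=1.0)
```

In the actual implementation this loop runs continuously, with the motion capture stream replacing the stub and the velocity command mapped onto the drone's flight controls.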
4. MEASUREMENT
We investigated whether or not the proposed haptic device can produce a force perceptible to a user. Specifically, we examined the effect of the drone's airflow, because the proposed mechanism uses the airflow to display the reaction force.
A triaxial force sensor (MFS20-025 by Nippon Liniax) was used to measure the force. Its maximum measurable force is 50 N. The obtained signal was amplified 20 times with an inverting amplifier circuit, because the range of the measured force is less than 1.0 N. The amplified signal was converted to a digital signal and recorded on a PC. The force sensor was attached to the grasping object. The drone was controlled at a fixed position, and the experimenter (one of the authors) pushed the end-effector of the proposed haptic device with the grasping object. The pushing distance was about 100 mm in depth from the initial surface position of the end-effector. The force perpendicular to the surface of the end-effector was measured at 100 Hz for a short period. The force was measured with and without the presence of the airflow. In the latter case, the drone was not flying and was placed on a tripod. We performed the pushing task 10 times in each condition.

Table 1: Average and standard deviation of the measured forces with and without airflow
airflow   average (N)    standard deviation (N)
ON        1.18 × 10^-1   3.6 × 10^-2
OFF       3.6 × 10^-2    1.3 × 10^-2

Figure 3: Experimental setup.
For calibration, the average of 100 consecutive samples was set as a baseline before each trial. A moving-average filter with a span of 5 samples was applied to the measured data. Table 1 shows the average forces over the 10 trials. The results show that the reaction force produced by the airflow of the proposed haptic device is 0.118 ± 0.036 N, which is weak but well perceptible. In contrast, the measured force without the airflow is 0.036 ± 0.012 N, which is too weak to perceive. A one-tailed t-test found that the force with the airflow was significantly greater than the force without it (t(11) = 6.80, p < 0.001).
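The calibration and filtering steps above can be sketched in a few lines of Python. The function names are our own; the parameters follow the text (a 100-sample baseline and a span-5 moving average):

```python
# Sketch of the described signal processing: baseline subtraction followed
# by a 5-sample moving average. Function names are our own illustrative
# assumptions; parameters follow the text.
from statistics import mean

def remove_baseline(samples, n_baseline=100):
    """Subtract the mean of the first n_baseline samples from the rest."""
    baseline = mean(samples[:n_baseline])
    return [s - baseline for s in samples[n_baseline:]]

def moving_average(samples, span=5):
    """Average over a sliding window of `span` consecutive samples."""
    return [mean(samples[i:i + span]) for i in range(len(samples) - span + 1)]

# Example: a flat 0.01 N baseline followed by a steady 0.13 N push reduces
# to a corrected, smoothed signal of about 0.12 N.
trial = [0.01] * 100 + [0.13] * 20
smoothed = moving_average(remove_baseline(trial))
```

With the per-trial baseline removed, the ON/OFF averages in Table 1 can then be compared directly with the one-tailed t-test reported above.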
5. SUBJECTIVE STUDY
5.1 Experimental conditions
We investigated whether or not a user can stably draw a straight line in midair using the proposed haptic display. Four participants (all right-handed males; age M = 24.0, SD = 2.16) were recruited for the experiment. They wore the HMD and held the grasping object. They were asked to move the grasping object following a marker on a horizontal straight line, 1 m in length, drawn on a virtual wall presented about 1 m in front of the user. The marker moved at a constant speed (10 cm/s) from left to right, or from right to left. In a session they repeated this line-drawing task 10 times for each direction in a randomized order, and they repeated the session twice, with and without the proposed haptic display, in a randomized order. The position and orientation of the grasping object were reflected at 60 frames per second in the VE, and its tip position was used to draw a line on the virtual wall. These positions were recorded 10 times a second for analysis.
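The per-sample deviation underlying Fig. 4 can be computed directly from the recorded tip positions. A minimal sketch, under the assumption (ours, not stated in the paper) that the virtual wall is the plane x = wall_x and that deviation is the absolute distance from that plane:

```python
# Sketch of the analysis implied by Fig. 4: per-sample deviation of the
# recorded tip positions from the virtual wall plane, summarized as a
# mean and standard error. The function name and the wall-as-plane
# assumption are our own; the paper does not spell out the computation.
from statistics import mean, stdev

def deviation_stats(tip_positions, wall_x):
    """Mean and standard error of |x - wall_x| over recorded tip positions."""
    devs = [abs(p[0] - wall_x) for p in tip_positions]
    se = stdev(devs) / len(devs) ** 0.5 if len(devs) > 1 else 0.0
    return mean(devs), se

# Example: four recorded tips straddling a wall at x = 1.0 m.
m, se = deviation_stats([(1.02, 0.1, 0.0), (0.98, 0.2, 0.0),
                         (1.04, 0.3, 0.0), (0.96, 0.4, 0.0)], wall_x=1.0)
```

Aggregating these statistics per direction and per display condition yields the four bars compared in the ANOVA below.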
Figure 4: Average and standard error of deviation of
the tip of the grasping object from the surface of the
virtual wall; rightward lines with (a) and without
(b) the haptic display, leftward lines with (c) and
without (d) the haptic display. (**: p<0.01, ***:
p<0.001)
5.2 Results and Discussions
Figure 4 shows the average and standard error of the deviation of the tip of the grasping object from the surface of the virtual wall for each of the four combinations of conditions. A two-way ANOVA found main effects of both the haptic display condition (F(1, 15996) = 215.18, p < 0.001) and the line direction (F(1, 15996) = 143.58, p < 0.001). A post-hoc analysis using Bonferroni correction revealed that the haptic display was helpful for drawing a straight line in midair regardless of the direction (rightward: p < 0.001; leftward: p < 0.01). We also found that rightward lines were far more accurate than leftward ones (p < 0.001) when the haptic display was used. This is presumably because drawing a
rightward line is more natural given our writing habits and the fact that all the participants were right-handed; thus the physical support of the tool tip on the virtual wall was more helpful. Drawing a leftward line was probably difficult even with the physical support. It is also worth noting that the precision of positioning the drone was insufficient, which may also have contributed to the amount of deviation when the haptic display was used.
6. CONCLUSIONS
In this study, we investigated the potential of an unmanned aerial vehicle (drone) as a flying motion base for a non-grounded encountered-type haptic device. We proposed a force generation mechanism that uses a sheet of paper as a lightweight end-effector together with the airflow of the drone itself. The force measurements showed that a perceptible force was successfully presented with the drone's airflow by the proposed method; the airflow increased the displayed force with statistical significance. We confirmed the effectiveness of our proposed device and force generation mechanism for an encountered-type haptic display. The results of the subjective line-drawing study suggested that the haptic display was effective in supporting drawing a line in midair. Future work includes improving the drone's position control for more stable and accurate haptic feedback, and conducting further user studies involving more complicated drawing tasks. Self-localization of the drone using visual simultaneous localization and mapping (SLAM) methods will also be investigated, which would eliminate the need for a dedicated tracking system.
Acknowledgments
This work was supported by JSPS KAKENHI Grant Num-
bers JP15K12082 and JP15K12083.
7. REFERENCES
[1] S. Kamuro, K. Minamizawa, N. Kawakami, and S. Tachi. An
Ungrounded Pen-Shaped Kinesthetic Display: Device
construction and Applications. In Proceedings of the IEEE
World Haptics, pages 557–562, 2011.
[2] M. Hirose, K. Hirota, T. Ogi, N. Kakehi, M. Saito, and
M. Nakashige. Hapticgear: the Development of a Wearable
Force Display System for Immersive Projection Displays. In
Proceedings of the IEEE Virtual Reality Conference 2001,
pages 123–129, 2001.
[3] J. Rekimoto. Traxion: a Tactile Interaction Device with
Virtual Force Sensation. In Proceedings of the ACM
SIGGRAPH 2014 Emerging Technologies, page 25, 2014.
[4] H. Yano, M. Yoshie, and H. Iwata. Development of a
Non-Grounded Haptic Interface Using the Gyro Effect. In
Proceedings of the 11th Symposium on Haptic Interfaces for
Virtual Environment and Teleoperator Systems (HAPTICS
'03), pages 32–39, 2003.
[5] T. Furukawa, K. Inoue, T. Takubo, and T. Arai.
Encountered-type Visual Haptic Display Using Flexible Sheet.
In Proceedings of the IEEE International Conference on
Robotics and Automation 2007, pages 479–484, 2007.
[6] S. Nakagawara, H. Kajimoto, N. Kawakami, S. Tachi, and
I. Kawabuchi. An Encounter-type Multi-Fingered Master Hand
Using Circuitous Joints. In Proceedings of the IEEE
International Conference on Robotics and Automation 2005,
pages 2667–2672, 2005.
[7] Y. Yokokohji, N. Muramori, Y. Sato, and T. Yoshikawa.
Designing an Encountered-type Haptic Display for Multiple
Fingertip Contacts based on the Observation of Human
Grasping Behaviors. In Proceedings of the 12th International
Symposium on Haptic Interfaces for Virtual Environment
and Teleoperator Systems 2004 (HAPTICS ’04), pages 66–73,
March 2004.
[8] Y. Yokokohji, J. Kinoshita, and T. Yoshikawa. Path Planning
for Encountered-type Haptic Devices That Render Multiple
Objects in 3D Space. In Proceedings of the IEEE Virtual
Reality 2001, pages 271–278, 2001.
[9] M. Yafune and Y. Yokokohji. Haptically Rendering Different
Switches Arranged on a Virtual Control Panel by Using an
Encountered-type Haptic Device. In Proceedings of the IEEE
World Haptics Conference (WHC) 2011, pages 551–556, June
2011.
[10] H. Agrawal, S. Leigh, and P. Maes. L’evolved: Autonomous
and Ubiquitous Utilities as Smart Agents. In Proceedings of
the ACM UbiComp 2015, pages 487–491, 2015.
[11] M. Weise. RiftDrone.
https://github.com/scopus777/RiftDrone. accessed 1 July
2016.
... In the case of industrial applications, these devices are considered for virtual prototyping that requires to have haptic feedback in several locations in order to recreate workspaces or objects to be manipulated. In the case of entertainment, ETHDs are used to recreate elements that can come in contact with the users when interacting with a ludic VE [10]. In the case of medicine, ETHDs are often used for remote body-palpation and surgery practice [6]. ...
... This definition was also adapted for comprehending wearable ETHDs, that unlike common wearable haptic displays in which the surface display, nor any other part of the device, is in constant contact with the user. This definition also adapts to more recent approaches such as mobile platform ETHDs [28]- [32] and UAV-based ETHDs [9], [10], [24], [33]. ...
... In addition, progress in actuator technologies yielded the use of a larger assort of devices used as ETHDs for more complex tasks such as juggling [40], medical palpation [6], [41] and surgical simulations [42]. UAV technologies brought new possibilities, leading research towards to developing ungrounded ETHDs starting with the work of Yamaguchi et al. [10]. Multi-sided end-effectors were introduced in the work of Araujo et al. [20]. ...
Thesis
Encountered-Type Haptic Displays (ETHDs) are robotic devices that follow the users' hand and locate themselves in an encountered position when users want to touch objects in immersive virtual reality (VR). Despite these advantages, several challenges are yet to be solved in matters of usability and haptic feedback. This thesis presents a series of contributions to leverage ETHDs through research axes for both usability and haptic feedback.The first contribution in the usability axis studied the design of safety techniques for ETHDs based on visual feedback. Then, a series of interaction techniques for surface exploration with ETHDs is presented. These techniques explored several combinations of factors related to ETHD control to give users the sensation of touching a large surface in VR.Concerning the haptic feedback axis, we introduce an approach for large, multi-textured surface rendering. This approach is based on a rotating, multi-textured, cylindrical prop attached to an ETHD's end-effector. Finally, the thesis presents a contribution to object manipulation in VR using a detachable tangible object and an ETHD. This contribution permits creating, destroying and reconfiguring tangible objects in immersive virtual environments.
... Ever since the WYSIWYF, various terminologies emerged: "shape display" [59], "dynamic shape display" [47], "encounteredtype haptic display" [20,53], "encountered-type of haptic device" [9], and depend on the community designing them. Yet, all of these terms actually depict the same concept, from a different perspective. ...
... Yet, all of these terms actually depict the same concept, from a different perspective. In these regards, the haptics and VR community provide a focus on the provided haptic feedback, with the generation of "physical characteristics, such as shape and rigidity, of three-dimensional (3D) virtual objects" [53] "directly explored with his or her bare-hands" [20,59]. The HCI community focuses on enabling the "sensation of voluntarily eliciting haptic feedback with the environment at a proper time and location" [41]. ...
... A recent taxonomy provides an overview of the available ETHD technologies [41]; which can be used by novice designers to identify the most adequate solutions. Authors distinguish grounded solutions (Robotic arm [33] or Fixed platform [28]), or ungrounded solutions (drone [59], mobile platform [60], on-demand handheld [15,25]). ...
Conference Paper
Full-text available
Encountered-type of Haptic devices (ETHD) are robotic interfaces physically overlaying virtual counterparts prior to a user interaction in Virtual Reality. They theoretically reliably provide haptics in Virtual environments, yet they raise several intrinsic design challenges to properly display rich haptic feedback and interactions in VR applications. In this paper, we use a Failure Mode and Effects Analysis (FMEA) approach to identify, organise and analyse the failure modes and their causes in the different stages of an ETHD scenario and highlight appropriate solutions from the literature to mitigate them. We help justify these interfaces' lack of deployment, to ultimately identify guidelines for future ETHD designers.
... Three examples of ungrounded ETHF. Left: A proxy drone (extracted from[Yamaguchi et al., 2016]; © 2016 ACM). Center: HapticDrone (extracted from [Abdullah etal., 2017]; © 2017 Abdullah and co-authors). ...
Thesis
This thesis advances haptic feedback for Virtual Reality (VR). Our work is guided by Sutherland's 1965 vision of the ultimate display, which calls for VR systems to control the existence of matter. To push towards this vision, we build upon proxy-based haptic feedback, a technique characterized by the use of passive tangible props. The goal of this thesis is to tackle the central drawback of this approach, namely, its inflexibility, which yet hinders it to fulfill the vision of the ultimate display. Guided by four research questions, we first showcase the applicability of proxy-based VR haptics by employing the technique for data exploration. We then extend the VR system's control over users' haptic impressions in three steps. First, we contribute the class of Dynamic Passive Haptic Feedback (DPHF) alongside two novel concepts for conveying kinesthetic properties, like virtual weight and shape, through weight-shifting and drag-changing proxies. Conceptually orthogonal to this, we study how visual-haptic illusions can be leveraged to unnoticeably redirect the user's hand when reaching towards props. Here, we contribute a novel perception-inspired algorithm for Body Warping-based Hand Redirection (HR), an open-source framework for HR, and psychophysical insights. The thesis concludes by proving that the combination of DPHF and HR can outperform the individual techniques in terms of the achievable flexibility of the proxy-based haptic feedback.
... More complex systems apply additional haptic proxies attached to the drone for kinesthetic feedback, such as lightweight haptic extentions for passive and active feedback scenarios developed by Hoppe et al. [10]. Many applications for haptic drones were suggested by Yamaguchi et al. [11] in Virtual Reality (VR) scenario of interaction with a virtual sword and Abtahi et al. [12] in VR scenario of a virtual wardrobe. All the mentioned above scenarios allowed users to experience tactile interactions over a large area freely. ...
Preprint
Full-text available
To achieve high fidelity haptic rendering of soft objects in a high mobility virtual environment, we propose a novel haptic display DandelionTouch. The tactile actuators are delivered to the fingertips of the user by a swarm of drones. Users of DandelionTouch are capable of experiencing tactile feedback in a large space that is not limited by the device's working area. Importantly, they will not experience muscle fatigue during long interactions with virtual objects. Hand tracking and swarm control algorithm allow guiding the swarm with hand motions and avoid collisions inside the formation. Several topologies of impedance connection between swarm units were investigated in this research. The experiment, in which drones performed a point following task on a square trajectory in real-time, revealed that drones connected in a Star topology performed the trajectory with low mean positional error (RMSE decreased by 20.6\% in comparison with other impedance topologies and by 40.9\% in comparison with potential field-based swarm control). The achieved velocities of the drones in all formations with impedance behavior were 28\% higher than for the swarm controlled with the potential field algorithm. Additionally, the perception of several vibrotactile patterns was evaluated in a user study with 7 participants. The study has shown that the proposed combination of temporal delay and frequency modulation allows users to successfully recognize the surface property and motion direction in VR simultaneously (mean recognition rate of 70\%, maximum of 93\%). DandelionTouch suggests a new type of haptic feedback in VR systems where no hand-held or wearable interface is required.
... In the near future, it is expected that this application of the Tombo propeller will become more obvious when UAV delivery services take place in a more complex environments (buildings, residential areas, and so on) that require a high level of safety. In addition, drones with Tombo propellers can be widely used in the entertainment field, such as in drone shows, where drones fly close one to another that may increase the likelihood of mutual collision, or in the dronehuman interaction scenario [49], [50]. Moreover, we would also like to adopt this biomimetic approach for applications in other sectors, such as small-scaled wind power generation propellers (reducing bird-strike risk), agricultural machine cutting blade (limiting damage of collision with rocks or broken branches), or ship propellers (reducing entanglement with marine litter, fish-strike risk, or even dangerous accidents to divers) toward sustainable solution for the nature. ...
Article
Full-text available
There is a growing need for vertical takeoff and landing vehicles, including drones, which are safe to use and can adapt to collisions. The risks of damage by collision, to humans, obstacles in the environment, and drones themselves, are significant. This has prompted a search into nature for a highly resilient structure that can inform a design of propellers to reduce those risks and enhance safety. Inspired by the flexibility and resilience of dragonfly wings, we propose a novel design for a biomimetic drone propeller called Tombo propeller. Here, we report on the design and fabrication process of this biomimetic propeller that can accommodate collisions and recover quickly, while maintaining sufficient thrust force to hover and fly. We describe the development of an aerodynamic model and experiments conducted to investigate performance characteristics for various configurations of the propeller morphology and related properties, such as generated thrust force, thrust force deviation, collision force, recovery time, lift-to-drag ratio, and noise. Finally, we design and showcase a control strategy for a drone equipped with Tombo propellers that collides in midair with an obstacle and recovers from collision continuing flying. The results show that the maximum collision force generated by the proposed Tombo propeller is less than two-thirds that of a traditional rigid propeller, which suggests the concrete possibility to employ deformable propellers for drones flying in a cluttered environment. This research can contribute to the morphological design of flying vehicles for agile and resilient performance.
... Some related work uses drones to provide haptic feedback for objects that can move around the user. Yamaguchi et al. (2016) propose an approach where a piece of paper, fixed to a drone and stabilized by its airflow, serves as an interaction surface. BitDrones (Gomes et al. 2016) uses drones that are equipped with either an RGB LED, a shape formed by an acrylic mesh and a frame, or a small touch screen. ...
Article
Immersive analytics often takes place in virtual environments that promise users immersion. To fulfill this promise, sensory feedback such as haptics is an important component, yet one that is not well supported. Existing haptic devices are often expensive, stationary, or occupy the user’s hand, preventing them from grasping objects or using a controller. We propose PropellerHand, an ungrounded hand-mounted haptic device with two rotatable propellers that allows exerting forces on the hand without obstructing hand use. PropellerHand can simulate feedback such as weight and torque by generating thrust up to 11 N in 2-DOF and a torque of 1.87 Nm in 2-DOF. Its design builds on our experience from quantitative and qualitative experiments with different form factors and parts. We evaluated our prototype through a qualitative user study in various VR scenarios that required participants to manipulate virtual objects in different ways while changing between torques and directional forces. Results show that PropellerHand improves users’ immersion in virtual reality. Additionally, we conducted a second user study in the field of immersive visualization to investigate PropellerHand’s potential benefits there.
Article
Modern aerial robots, in particular drones, are developing at a rapid pace. Drones are a promising area of robotics, performing dangerous tasks during search and rescue operations as well as practical applications such as photography and cinematography. An urgent task is to ensure drone safety against mechanical damage when interacting with the external environment, as well as the safety of people in case of contact with drones. To address this problem, it is advisable to use tensegrity drones with a deformable structure and the ability to adapt to changing environmental parameters, taking into account obstacles encountered in flight. These drones can perform controlled deformation of their fuselage in flight, making them more mobile in difficult environments. A method was previously proposed to plan such trajectories based on solving an optimization problem with linear matrix inequalities; however, the numerical properties of the method remained unexplored. The problem of planning the tensegrity drone flight was considered, and numerical experiments were carried out. It was established that the geometry of the surrounding space had an insignificant effect on task feasibility but significantly affected the computational complexity and elapsed processor time.
Article
Immersive environments offer new possibilities for exploring three-dimensional volumetric or abstract data. However, typical mid-air interaction offers little guidance to the user in interacting with the resulting visuals. Previous work has explored the use of haptic controls to give users tangible affordances for interacting with the data, but these controls have either been limited in their range and resolution, been spatially fixed, or required users to manually align them with the data space. We explore the use of a robot arm with hand tracking to align tangible controls under the user's fingers as they reach out to interact with data affordances. We begin with a study evaluating the effectiveness of a robot-extended slider control compared to a large fixed physical slider and a purely virtual mid-air slider. We find that the robot slider has similar accuracy to the physical slider but is significantly more accurate than mid-air interaction. Further, the robot slider can be arbitrarily reoriented, opening up many new possibilities for tangible haptic interaction with immersive visualisations. We demonstrate these possibilities through three use-cases: selection in a time-series chart; interactive slicing of CT scans; and finally exploration of a scatter plot depicting time-varying socio-economic data.
Conference Paper
Ubiquitous computing has focused on creating smart agents embedded in everyday environments; however, recent developments in physical computing demand a shift from calm computing to a physically engaging form. Computing is no longer limited to increasing our comfort through passive, pervasive deployment; it can now be actively and physically intermeshed with our tasks. We present L'evolved, autonomous ubiquitous utilities that assist in user tasks through active physical participation. They not only dynamically adapt to individual user needs and actions, but also work in close tandem with the users. Among explorations of potential applications, we harness drone technology to realize the design and implementation of example utilities that afford free motion and computational control. Through various use scenarios of these exemplary utilities, we show how this new form of smart agent promises new ways of interacting with our physical environments. We also discuss design implications and technical details of our implementations.
Conference Paper
This paper introduces a new mechanism to induce a virtual force based on human illusory sensations. An asymmetric signal is applied to a tactile actuator consisting of an electromagnetic coil, a metal weight, and a spring, such that the user feels that the device is being pulled (or pushed) in a particular direction, although it is not supported by any mechanical connection to other objects or the ground. The proposed tactile device is smaller (35.0 mm x 5.0 mm x 7.5 mm) and lighter (5.2 g) than any previous force-feedback devices, which have to be connected to the ground with mechanical links. This small form factor allows the device to be implemented in several novel interactive applications, such as a pedestrian navigation system that includes a finger-mounted tactile device or an (untethered) input device that features virtual force. Our experimental results indicate that this illusory sensation actually exists and the proposed device can switch the virtual force direction within a short period. We combined this new technology with visible light transmission via a digital micromirror device (DMD) projector and developed a position guiding input device with force perception.
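The asymmetric-signal principle can be sketched as a drive waveform whose impulse integrates to zero over each period (so there is no net sustained force) while remaining perceptually biased: a short, strong pulse in the pull direction followed by a long, weak return phase. The half-sine shape and parameters below are illustrative assumptions, not the authors' published drive signal:

```python
import math


def asymmetric_waveform(samples=200, pull_fraction=0.25, peak=1.0):
    """One period of an asymmetric acceleration signal.

    A short, strong half-sine pulse in the pull direction is followed by
    a longer, weaker half-sine in the opposite direction, scaled so the
    two impulses cancel exactly. The perceptual asymmetry between the
    strong and weak phases yields the illusory directional pull.
    """
    n_pull = int(samples * pull_fraction)
    n_ret = samples - n_pull
    # Strong, brief pulse in the pull (+) direction.
    pulse = [peak * math.sin(math.pi * i / n_pull) for i in range(n_pull)]
    # Weaker, longer return phase in the (-) direction.
    ret = [-math.sin(math.pi * i / n_ret) for i in range(n_ret)]
    # Scale the return phase so the total impulse over the period is zero.
    scale = sum(pulse) / -sum(ret)
    return pulse + [scale * r for r in ret]
```

Reversing the sign of the waveform switches the perceived pull direction, which matches the abstract's claim that the device can switch the virtual force direction within a short period.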
Conference Paper
In this paper, a virtual prototype system for control panels is presented using an encountered-type haptic device. The system can render different switches arranged on a virtual control panel with a single haptic device. A motion planning algorithm for the encountered-type haptic device is proposed so that the device is properly encountered by the user at the switch locations while avoiding unintended collisions with the user elsewhere. To reduce the burden of device motion, the weighting function in the algorithm is improved. Control panel surface coordinates are newly introduced into the motion planning so that unintended collisions of the device with the user are completely avoided in free space. The proposed motion planning algorithm and its improvements are verified by numerical simulations.
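A weighted motion-planning cost of the kind described can be sketched as follows: the device moves toward the switch the user is most likely to touch next, trading off hand proximity against device travel. The specific weighting function and weights here are illustrative assumptions, not the paper's algorithm:

```python
import math


def plan_device_target(hand_pos, switch_positions, device_pos=(0.0, 0.0),
                       w_hand=1.0, w_travel=0.5):
    """Pick the switch the encountered-type device should move to.

    The cost weighs (a) how close the user's hand is to each switch
    (the likely next contact point) against (b) how far the device must
    travel to reach it. w_hand and w_travel are tuning weights.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    return min(switch_positions,
               key=lambda s: w_hand * dist(hand_pos, s)
                             + w_travel * dist(device_pos, s))


# With the hand approaching the switch at (1, 0), the planner sends the
# device there even though another switch is nearer to the device itself.
target = plan_device_target((0.9, 0.1), [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)])
```

Avoiding unintended collisions in free space, as the paper does via control-panel surface coordinates, would add constraints on the path the device takes to this target, not shown here.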
Conference Paper
In this paper, we implement an ungrounded pen-shaped kinesthetic display and construct a three-dimensional (3D) haptic interaction system. The ungrounded pen-shaped kinesthetic display provides kinesthetic sensations to a user's fingers without the use of mechanical linkages. Therefore, the user can move his/her hand freely in the air and interact with virtual environments with a sensation of touch. We verified the ability of the device to provide forces and to represent a virtual surface for 3D input. We then constructed a 3D haptic interaction system in which the user can directly touch virtual objects displayed as 3D images. We also applied the device as an interface for 3D modeling and constructed a 3D haptic modeling system in which the user can create touchable 3D models by sketching two-dimensional (2D) figures in the air.
Conference Paper
We have developed a new type of master hand to control a dexterous slave robot hand for telexistence. Our master hand has two features. One is a compact exoskeleton mechanism called a “circuitous joint,” which covers a wide workspace of the operator’s finger. The other is encounter-type force feedback, which enables unconstrained motion of the operator’s finger and a natural contact sensation. In this paper, the mechanism and control method of our master hand are introduced, and experimental master-slave finger control is conducted.
Conference Paper
An encountered-type visual haptic display using a flexible sheet is proposed; it allows users to feel as if they are seeing and pushing virtual soft objects directly. Both edges of a translucent flexible sheet, such as rubber, are attached to two manipulators, which apply bias tension to the sheet by pulling it from both sides, thus varying the sheet compliance. A user can feel the compliance of a virtual soft object by pushing the sheet directly with a finger. The motion of the fingertip is measured by stereo cameras. The manipulators also change the pose of the sheet along the object's surface to follow the fingertip motion, so the user can touch different points of the object and feel as if stroking it. The sheet is also used as a rear-projection screen. From the measured fingertip position, the deformation of the object is calculated by FEM. An LCD projector projects the CG image of the deformed object stereoscopically onto the sheet from behind. The user can see the 3D image through stereoscopic glasses and touch the image directly. A method of correcting the CG image distortion caused by the movement of the sheet and the depression of the pushed sheet is proposed. A virtual soft cylinder is expressed by a prototype display: the stroking of the cylinder and the correction of its 3D image are evaluated.
Article
Unlike conventional haptic devices, an encountered-type device is not held by a user all the time. Instead, the device remains at the location of a virtual object and waits for the user to encounter it. In this paper, we extend this concept to fingertip contacts and design an encountered-type haptic display for multiple fingertip contacts to simulate tasks of grasping an object with any shape and size. Before designing the device, we intensively observed human grasping behaviors. This observation was very helpful to determine the mechanism of the device. An encountered-type device for three-fingered grasping was actually prototyped based on our design.
Conference Paper
Unlike conventional haptic devices, an encountered-type device is not held by a user all the time. Instead, the device stays at the location of a virtual object and waits for the user to encounter it. In this paper, we extend this concept to fingertip contacts and design an encountered-type haptic display for multiple fingertip contacts to simulate tasks of grasping an object with any shape and size. Before designing the device, we intensively observed human grasping behaviors. This observation was very helpful to determine the mechanism of the device. An encountered-type device for three-fingered grasping was actually prototyped based on our design.
Conference Paper
In this paper, we propose a technique capable of generating haptic sensations in a human-scale virtual reality environment or in an augmented reality environment. To realize such a large working volume and the ability to move easily within it, we have developed a non-grounded haptic interface using the gyro effect. The interface consists of a flywheel equipped with motor-controlled gimbals that tilt the flywheel. We also propose three haptic rendering methods that can provide information regarding territories, target direction, and trajectory. The effectiveness of our haptic interface was tested by conducting a series of experiments.
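The gyro effect such an interface exploits follows from rigid-body dynamics: tilting a spinning flywheel with angular momentum L = I·ω at gimbal rate Ω produces a reaction torque τ = Ω × L. A minimal sketch with illustrative values (assumed for the example, not the paper's hardware parameters):

```python
def cross(a, b):
    """3-D vector cross product a x b."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])


def gyroscopic_torque(inertia, spin_rate, gimbal_rate):
    """Reaction torque from tilting a spinning flywheel: tau = Omega x L.

    inertia:     flywheel moment of inertia about its spin axis (kg*m^2)
    spin_rate:   flywheel angular velocity vector (rad/s)
    gimbal_rate: gimbal angular velocity vector Omega (rad/s)
    """
    L = tuple(inertia * w for w in spin_rate)  # angular momentum L = I * omega
    return cross(gimbal_rate, L)


# Illustrative: flywheel spinning about z at 500 rad/s with I = 1e-3 kg*m^2,
# gimbal tilting about x at 2 rad/s, yields a ~1 N*m torque about -y.
tau = gyroscopic_torque(1e-3, (0.0, 0.0, 500.0), (2.0, 0.0, 0.0))
```

Because the torque appears only while the gimbal is moving, such displays render transient directional cues rather than sustained forces, which fits the territory/direction/trajectory rendering methods the abstract describes.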