This project concerns the design, development, and control of micro-surgical instruments. These tools are intended for performing minimally invasive surgery via tele-operation. At present, the tools developed in this work are aimed at vocal cord surgery.
The main obstacles to widespread adoption of advanced manipulation systems in industry are their complexity, fragility, lack of strength, and difficulty of use. This project pursues a path of disruptive innovation for the development of simple, compliant, yet strong, robust, and easy-to-program manipulation systems. The idea is Soft Manipulation (SOMA). SOMA explores a new avenue of robotic manipulation with the environment, as opposed to manipulation of or in the environment. In our approach, the physical constraints imposed by objects in the environment and by the manipulandum itself are not regarded as obstacles, but rather as opportunities to guide functional hand pre-shaping, adaptive grasping, and affordance-guided manipulation of objects. The exploitation of these opportunities, which we refer to as environmental constraints (EC), enables robust grasping and manipulation in dynamic, open, and highly variable environments. The key ingredient for the exploitation of EC is softness of hands, i.e., their embodied ability to comply and adapt to features of the environment. The traditional paradigm for robotic manipulation is upended by this shift of focus: state-of-the-art grasp planners target rigid hands and objects, and attempt to find algorithmic solutions to inherently complex, often ill-posed problems. Further complicating matters, the requirement of planning for soft, uncertain interactions between hand and environment is entirely beyond the state of the art. However, this is how humans most often use their hands, and it is how we plan to change robotic manipulation. Scientific objective: Our consortium will develop Soft Manipulation into a proven approach to robotic grasping and manipulation. We will design capable soft hands for the versatile and competent exploitation of EC. We will develop control methodology tailored to these hands.
And we will devise approaches for perception and planning for Soft Manipulation in dynamic, real-world environments. Taken together, these components deliver versatile, robust, cost-effective, and safe robotic grasping and manipulation capabilities. Industrial objective: We will apply the developed Soft Manipulation technology to an open manipulation problem in the food and agriculture industry: the handling of irregularly shaped, flexible, and easily damageable goods, such as fruit and vegetables. This Soft Manipulation system prototype will be demonstrated in an operational industrial environment (Technology Readiness Level TRL 7).
- Jan 2019
- ROMANSY 22 – Robot Design, Dynamics and Control
In this study, we present an efficient mathematical representation of soft robotic fingers based on screw theory. We then show how the model and its main properties can be exploited in the design phase, and we present an application to an under-actuated tendon-driven gripper with a modular structure, in which the joint stiffness values are chosen to obtain the desired equivalent stiffness and kinematic manipulability at the fingertips. The distribution of stiffness through the flexible parts of the gripper is fundamental to characterizing its overall behavior: the introduced mathematical model enables the gripper designer to analyse how a specific property, e.g., a desired trajectory of the fingertips, a desired overall stiffness, a distribution of contact forces, etc., is influenced by the stiffness of its passive joints, and, vice versa, to evaluate the joint stiffness values that yield the desired properties. Joints with different stiffness values can be obtained by regulating 3D-printing parameters and material properties in the manufacturing process. It is thus possible to design modular grippers with the same mechanical structure but different behaviours, i.e., different fingertip trajectories, equivalent fingertip stiffness ellipsoids, etc. Gripper modules can be easily assembled and disassembled while maintaining the same base, to adapt them to different tasks. The presented model and design guidelines are a first step towards soft grippers that can be optimized for a specific problem.
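The link between joint stiffness and fingertip behaviour discussed above can be illustrated with the standard compliance transformation: for a planar two-joint finger with Jacobian J and joint stiffnesses k1, k2, the Cartesian stiffness at the fingertip is K = (J diag(1/k1, 1/k2) Jᵀ)⁻¹. The sketch below is a generic illustration of that mapping in plain Python (values and function names are not taken from the paper's actual model):

```python
import math

def jacobian(q1, q2, l1, l2):
    """Jacobian of a planar two-joint finger (link lengths l1, l2; joint angles q1, q2)."""
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def fingertip_stiffness(q1, q2, l1, l2, k1, k2):
    """Cartesian stiffness K = (J diag(1/k1, 1/k2) J^T)^-1 at the fingertip."""
    J = jacobian(q1, q2, l1, l2)
    # Cartesian compliance C = J * diag(1/k1, 1/k2) * J^T
    C = [[J[i][0] * J[j][0] / k1 + J[i][1] * J[j][1] / k2
          for j in range(2)] for i in range(2)]
    det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
    # Invert the 2x2 compliance matrix to obtain the stiffness matrix
    return [[ C[1][1] / det, -C[0][1] / det],
            [-C[1][0] / det,  C[0][0] / det]]

def stiffness_ellipsoid_axes(K):
    """Principal stiffnesses: eigenvalues of the symmetric 2x2 matrix K."""
    tr = K[0][0] + K[1][1]
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    r = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    return (tr - r) / 2.0, (tr + r) / 2.0
```

Scaling all joint stiffnesses by a factor scales the fingertip stiffness matrix by the same factor, which is the kind of design relation the abstract describes: choose the passive-joint stiffnesses to shape the equivalent stiffness ellipsoid at the fingertip.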
This paper presents a comparison between two different approaches to controlling human walking cadence, with the further aim of assessing whether users can synchronize to the suggested rhythm with low effort while performing other tasks. Elastic haptic bands are used to suggest the walking pace during an exercise aimed at reproducing a real industrial or human-robot cooperation task. The proposed system consists of two wearable interfaces for providing timing information to the users, and a pressure sensor to estimate the real gait pattern, thus resulting in a combination of walking-state monitoring and vibro-tactile stimuli to regulate the walking pace. Vibrational stimuli with a constant presentation interval are alternately and repeatedly applied to the right and left sides of the body, in accordance with the desired walking cadence. We tested two different interface placements: wrists and ankles. The guidance system has been evaluated under mental and manual workload using an additional task: balancing a small sphere in the center of a flat surface. Experimental results revealed that subjects prefer the ankle placement in terms of wearability, comfort, and ease of task execution. Examples of the proposed approach in daily use are training and coaching in sports, rehabilitation, and human-robot cooperation and interaction.
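The constant-interval, alternating left/right stimulation scheme described above can be sketched as a simple cue scheduler. This is an illustrative reconstruction (the function name and the cadence values are assumptions, not taken from the paper):

```python
def cadence_schedule(steps_per_min, n_steps, start_side="right"):
    """Cue times (s) and body sides for alternating vibrotactile bursts
    at a constant presentation interval derived from the target cadence."""
    interval = 60.0 / steps_per_min  # seconds between consecutive steps
    other = {"right": "left", "left": "right"}
    side = start_side
    schedule = []
    for i in range(n_steps):
        schedule.append((round(i * interval, 3), side))
        side = other[side]  # alternate right/left, as in the described system
    return schedule
```

For example, a target cadence of 120 steps per minute yields a 0.5 s presentation interval, with cues alternating between the right and left interface.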
- Jun 2018
- Haptics: Science, Technology, and Applications
This paper presents a pneumatic force sensor used to measure the force generated at the tip of a surgical instrument during robot-assisted minimally invasive surgery (RMIS). Despite the achievements of robotic surgery, the lack of haptic feedback to the surgeon is still a great limitation, since through palpation the physician can distinguish the consistency of tissues and detect the presence of an abnormal mass. Although a great effort has been made by researchers to develop novel haptic interfaces able to provide force feedback to the operator, far fewer works exist regarding the design of sensing systems for robotic surgery. In this respect, we propose a new force measurement method based on the relation between the air pressure variation inside a pneumatic balloon and the interaction force due to the contact between the balloon and an object. A performance comparison with a fine-resolution commercial force sensor proves the feasibility and effectiveness of the proposed approach.
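The pressure-to-force relation mentioned above can be approximated, for illustration, as an affine mapping calibrated against a reference force sensor. The least-squares fit below is a generic sketch under that linearity assumption, not the paper's actual calibration procedure:

```python
def fit_affine(delta_p, force):
    """Least-squares affine fit force ~ a * delta_p + b from calibration
    samples taken against a reference force sensor."""
    n = len(delta_p)
    mx = sum(delta_p) / n
    my = sum(force) / n
    sxx = sum((x - mx) ** 2 for x in delta_p)
    sxy = sum((x - mx) * (y - my) for x, y in zip(delta_p, force))
    a = sxy / sxx          # slope: force per unit pressure variation
    b = my - a * mx        # offset: e.g. pre-load on the balloon
    return a, b

def force_from_pressure(delta_p, a, b):
    """Estimate the tip contact force from the balloon pressure variation."""
    return a * delta_p + b
```

In practice the balloon response may be nonlinear over its full range, in which case a piecewise or polynomial calibration would replace the affine fit.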
- Apr 2018
- Extended Abstracts of the 2018 CHI Conference
This workshop aims to generate an interdisciplinary research agenda for digital touch communication that effectively integrates technological progress with robust investigations of the social nature and significance of digital touch. State-of-the-art touch-based technologies have the potential to supplement, extend or reconfigure how people communicate through reshaping existing touch practices and generating new capacities. Their possible impact on interpersonal intimacy, wellbeing, cultural norms, ways of knowing and power relations is far-reaching and under-researched. Few emerging devices and applications are embedded into everyday touch practices, limiting empirical exploration of the implications of digital touch technologies in everyday communication. There is, thus, a real need for methodological innovation and interdisciplinary collaboration to critically examine digital touch communication across social contexts and technological domains, to better understand the social consequences of how touch is digitally remediated. This agenda-setting workshop will bring together HCI researchers and designers with colleagues from sociology, media & communications, arts & design to address key research challenges and build the foundations for future collaborations.
- Mar 2018
- Wearable Exoskeleton Systems: Design, control and applications
The intrinsic soft nature of compliant supernumerary limbs and exosuits makes them appealing candidates for assisting human movements, with potential applications in healthcare, human augmentation and logistics. In the following chapter, we describe the technology used in exosuits and supernumerary limbs for assistance in activities of daily living, with emphasis on aiding grasping and flexion/extension of the elbow joint. We discuss the mechanical design principles of such devices, detail the control paradigms that can be used for intention detection and present the design and evaluation of cutaneous interfaces used for force feedback rendering. Tests on healthy and impaired subjects highlight that exosuits and supernumerary limbs are potentially cost-effective and intrinsically safe solutions for increasing the capabilities of healthy subjects and improving the quality of life of subjects suffering from motor disorders.
Haptic interfaces are mechatronic devices designed to render tactile sensations; although they are typically based on robotic manipulators external to the human body, recently, interesting wearable solutions have been presented. Towards a more realistic feeling of virtual and remote environment interactions, we propose a novel wearable skin stretch device for the upper limb called "hBracelet." It consists of two main parts coupled with a linear actuator. Each part contains two servo actuators that move a belt. The device is capable of providing distributed mechanotactile stimulation on the arm by controlling the tension and the distance of the two belts in contact with the skin. When the motors spin in opposite directions, the belt presses into the user's arm, while when they spin in the same direction, the belt applies a shear force to the skin. Moreover, the linear actuator exerts longitudinal cues on the arm by moving the two parts of the device. In this work we illustrate the mechanical structure, working principle, and control strategies of the proposed wearable haptic display. We also present a qualitative experiment in a teleoperation scenario as a case study to demonstrate the effectiveness of the proposed haptic interface and to show how a human can take advantage of multiple haptic stimuli provided at the same time and on the same body area. The results show that the device is capable of successfully providing information about forces acting at the remote site, thus improving telepresence.
Underactuated and compliant hands, frequently referred to as soft hands, have been recently proposed to overcome common issues of multi-fingered robotic hands. Although several prototypes have been developed, there is still a lack of systematic ways to model and control these devices to get grasps exploiting their intrinsic features. Classical tools can hardly be applied when contact surfaces are deformable and hand kinematics is not uniquely defined due to underactuation. In this work, we propose a method to model underactuated compliant hands. The model captures how the hand closes analyzing the motion of suitable reference points defined on the hand. In particular, we present a procedure to compute the preferred grasping direction of a given hand, namely the closure signature (CS), and then we use this information to plan power grasps. The feasibility of the proposed method has been validated by performing experiments with a soft hand fixed to a robotic arm. The use of the closure signature proved to increase the performance of the planner. The proposed method can easily be extended to other underactuated compliant hands.
Among the most promising fields of application of wearable robotics are rehabilitation and support in activities of daily living (ADL) for impaired people. In this paper, we propose two possible designs of a robotic extra finger, the Robotic Sixth Finger, for grasping compensation in patients with reduced hand mobility, such as post-stroke patients. The idea is to let patients grasp an object by taking advantage of the wearable device, worn on the paretic limb by means of an elastic band. The Robotic Sixth Finger and the paretic hand work jointly to hold an object. Adding a robotic opposing finger is a promising approach that can significantly improve functional grasping compensation in different typologies of patients during everyday life activities.
This work presents a cutaneous haptic device able to provide navigation cues to the forearm through lateral skin stretch haptic feedback. Four cylindrical rotating end effectors, placed on the forearm of the human user, can generate independent skin stretches at the palmar, dorsal, ulnar, and radial sides of the arm. When all the end effectors rotate in the same direction, the cutaneous device is able to provide cutaneous cues about a desired pronation/supination of the forearm. When two opposite end effectors rotate in different directions, the cutaneous device is able to provide cutaneous cues about a desired translation of the forearm. To evaluate its effectiveness in providing navigation information, we carried out two experiments of haptic navigation. In the first experiment, subjects were asked to translate and rotate the forearm toward a target position and orientation, respectively. In the second experiment, subjects were asked to control the motion and orientation of a 6-DoF robotic manipulator to grasp and lift a target object. Our haptic device improved the performance of both tasks with respect to providing no haptic feedback. Moreover, it shows similar performance with respect to sensory substitution via visual feedback, without overloading the visual channel.
We present a novel 3RRS wearable fingertip device for the rendering of stiffness. It is composed of a static upper body and a mobile end-effector. The upper body is located on the nail side of the finger, supporting three servo motors, and the mobile end-effector is in contact with the finger pulp. The two parts are connected by three articulated legs, actuated by the motors. The end-effector can move toward the user's fingertip and rotate to simulate contacts with arbitrarily-oriented surfaces. Moreover, a vibrotactile motor placed below the end-effector conveys vibrations to the fingertip. The proposed device weighs 25 g and measures 35×50×48 mm. To test the effectiveness of our wearable haptic device and its level of wearability, we carried out two experiments. The first experiment tested the capability of our device to differentiate stiffness information, while the second focused on evaluating its applicability in an immersive virtual reality scenario. Results showed the effectiveness of the proposed wearable solution, with a just-noticeable difference (JND) for stiffness of 208.5 ± 17.2 N/m. Moreover, all subjects preferred the virtual interaction experience when provided with wearable cutaneous feedback, even though results also showed that subjects still found our device a bit difficult to use.
- Jul 2017
- International AsiaHaptics conference
We present a novel 3RRS wearable fingertip device. It is composed of a static upper body and a mobile end-effector. The upper body is located on the nail side of the finger, supporting three small servo motors, and the mobile end-effector is in contact with the finger pulp. The two parts are connected by three articulated legs, actuated by the motors. The end-effector can move toward the user's fingertip and rotate to simulate contacts with arbitrarily-oriented surfaces. Moreover, a vibrotactile motor placed below the end-effector conveys vibrations to the fingertip. The axes of the two revolute joints of each leg are parallel, so that each leg constitutes a 2-DoF planar articulated mechanism, constraining the motion of the center of each spherical joint to a plane fixed with respect to the upper body. The mobile end-effector therefore has 3-DoF with respect to the body. The proposed device weighs 25 g and measures 35×50×48 mm.
This paper presents a wearable haptic ring called “hRing” to control a soft extra robotic finger and to render haptic information through cutaneous stimuli. The “hRing” is a novel wearable cutaneous device for the proximal finger phalanx, while the extra robotic finger is a device to be used by chronic stroke patients to compensate for the missing hand function of their paretic limb. The hRing consists of two servo motors, a vibrotactile motor, two pairs of push buttons, and a belt. The servo motors move the belt placed in contact with the user’s finger skin. When the motors spin in opposite directions, the belt presses into the user’s finger, while when the motors spin in the same direction, the belt applies a shear force to the skin. The soft finger is an underactuated modular robotic finger worn on the paretic forearm by means of an elastic band; the device and the paretic hand/arm act like the two parts of a gripper working together to hold an object, and it has been designed to be wearable. Two chronic stroke patients took part in our experimental evaluation of how the proposed integrated robotic system can be used for hand-grasping compensation. The hRing enabled the patients to easily control the motion of the robotic finger while being provided with haptic feedback about the status of the grasping action. The patients found the system useful for ADL tasks, the hRing easy to use, and the haptic feedback very informative.
- Jul 2017
- 2017 IEEE International Conference on Advanced Intelligent Mechatronics (AIM)
This paper presents a robotic microsurgical forceps in a master-slave configuration, capable of performing micro tissue manipulation. The master, a 7-degree-of-freedom (DoF) device (Sigma.7), tele-operates the slave device, which is a combination of a 6-DoF serial robotic arm and a 1-DoF (open/close) microsurgical forceps. The serial robotic arm is used for positioning and orienting the slave device, which is integrated with a force/torque sensor for tissue grip-force measurement. This integrated system is analyzed for its (i) functional, (ii) usability, and (iii) haptic performance through user trials. The proposed system offers improved tool placement, enhanced tissue perception, safety, and accuracy with respect to the state of the art. This study demonstrates the feasibility of replacing traditional manual forceps with easy-to-use and ergonomic robot-assisted devices.
Many common activities of daily living, like opening a door or filling a glass of water, which most of us take for granted, can be an insuperable problem for people who have limited mobility or impairments. For years, the only alternative for overcoming this limitation was asking for human help. Nowadays, thanks to recent studies and technological developments, having an assistive device to compensate for the loss of mobility is becoming a real opportunity. Off-the-shelf assistive robotic manipulators have the capability to improve the lives of people with motor impairments. Robotic lightweight arms represent one of the most widespread solutions; in particular, some of them are designed specifically to be mounted on wheelchairs to assist users in performing manipulation tasks. On the other hand, their control interfaces usually rely on joysticks and buttons, making them very challenging to use for people with impaired motor abilities. In this paper, we present a novel wearable control interface for users with limb mobility impairments. We make use of residual muscle motion capabilities, captured through a Body-Machine Interface (BMI) based on a combination of head-tilt estimation and electromyography signals. The proposed BMI is completely wearable and wireless and does not require frequent, long calibrations. Preliminary experiments showed the effectiveness of the proposed system for subjects with motor impairments, allowing them to easily control a robotic arm for activities of daily living.
Supernumerary robotic limbs are a recently introduced class of wearable robots that, differently from traditional prostheses and exoskeletons, aim at adding extra effectors (i.e., arms, legs, or fingers) to the human user, rather than substituting or enhancing the natural ones. However, it is still unknown whether the use of supernumerary robotic limbs leads to specific neural modifications in brain dynamics. The illusion of owning a body part has already been demonstrated in many experimental settings, such as those relying on multisensory integration (e.g., the rubber hand illusion), prostheses, and even virtual reality. In this paper we present a description of a novel MRI-compatible supernumerary robotic finger together with preliminary observations from two functional magnetic resonance imaging (fMRI) experiments, in which brain activity was measured before and after a period of training with the robotic device, and during the use of the novel MRI-compatible version of the supernumerary robotic finger. Results showed that use of the MR-compatible robotic finger is safe and does not produce artifacts on MRI images. Moreover, training with the supernumerary robotic finger recruits a network of motor-related cortical regions (i.e., primary and supplementary motor areas), hence the same motor network as fully physiological voluntary motor gestures.
In this work, we propose a method to design tendon-driven underactuated hands whose fingertips track a predefined trajectory when actuated. We focus on passively compliant hands composed of deformable joints and rigid links. We first introduce a procedure to determine suitable joint stiffnesses and tendon routing, and then show a possible realization of a robotic underactuated finger. The kinematic and kinetostatic analysis of a tendon-driven robotic finger is necessary to define the overall stiffness values of the finger joints. A structural analysis of the element constituting each passive joint allowed us to relate the stiffness to the joint's main dimensional and material properties. We validated the proposed framework both in simulation and in experiments, using the robotic Soft-SixthFinger as a case study. The Soft-SixthFinger is a wearable robot for grasping compensation in patients with a paretic hand. We demonstrated that different fingertip trajectories can be achieved when joint stiffness and tendon routing are properly designed. Moreover, we demonstrated that the device can grasp a wider set of objects when a specific finger flexion trajectory is designed. The proposed framework is general and can be applied to robotic hands with an arbitrary number of fingers and joints per finger. The modular approach furthermore allows the user to easily customize the hand according to specific tasks or trajectories.
The human hand is a complex, fascinating system with highly sensitive sensory capabilities and dexterous grasping and manipulation functionalities. Consequently, estimating the hand pose while at the same time providing haptic feedback in a wearable way may benefit areas such as rehabilitation, human-robot interaction, gaming, and many more. Existing solutions allow accurate measurement of the hand configuration and provide effective force feedback to the user, but they have limited wearability/portability. In this paper, we present the wearable sensing/actuation system GESTO (Glove for Enhanced Sensing and TOuching). It is based on inertial and magnetic sensors for hand tracking, coupled with cutaneous devices for force-feedback rendering. Unlike vision-based tracking systems, the sensing glove does not suffer from occlusion problems or lighting conditions. We designed the cutaneous devices to reduce possible interference with the magnetic sensors, and we performed an experimental validation on ten healthy subjects. To measure the estimation accuracy of GESTO, we used a high-precision optical tracker. A comparison between using the glove with and without the haptic devices shows that their presence does not induce a statistically significant increase in the estimation error. Experimental results revealed the effectiveness of the proposed approach. The accuracy of our system, with a 3.32-degree mean estimation error in the worst case, is comparable with the human ability to discriminate finger joint angles.
Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human–robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions.
In this paper we lay the foundation of the first heterogeneous multi-robot system of the Multiple Aerial-Ground Manipulator System (MAGMaS) type. A MAGMaS consists of a ground manipulator and a team of aerial robots, each equipped with a simple gripper, that manipulate the same object. The idea is to benefit from the advantages of both kinds of platforms, i.e., physical strength versus large workspace. The dynamic model of such robotic systems is derived, and its characteristic structure is exhibited. Based on the dynamical structure of the system, a nonlinear control scheme augmented with a disturbance observer is proposed to perform trajectory-tracking tasks in the presence of model inaccuracies and external disturbances. The system redundancy is exploited by solving an optimal force/torque allocation problem that takes into account the heterogeneous system constraints and maximizes the force manipulability ellipsoid. Simulation results validated the proposed control scheme for this novel heterogeneous robotic system. We finally present a prototypical mechanical design and a preliminary experimental evaluation of a MAGMaS composed of a KUKA LWR4 and a quadrotor-based aerial robot.
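The force-allocation idea can be illustrated in a deliberately simplified one-dimensional form: distribute a required lift among heterogeneous agents in proportion to chosen weights while respecting each agent's capacity. The paper's actual formulation (a force/torque allocation maximizing the force manipulability ellipsoid) is much richer than this sketch, and the function and values below are assumptions for illustration only:

```python
def allocate_lift(total, caps, weights):
    """Split a required lift force among heterogeneous agents in proportion
    to the given weights, clamping each agent at its capacity and
    redistributing the excess among the remaining agents. If total exceeds
    the summed capacities, the leftover simply stays unassigned."""
    alloc = [0.0] * len(caps)
    active = set(range(len(caps)))
    remaining = total
    while active and remaining > 1e-9:
        wsum = sum(weights[i] for i in active)
        # Agents whose proportional share would exceed their residual capacity
        clamped = {i for i in active
                   if remaining * weights[i] / wsum >= caps[i] - alloc[i]}
        if not clamped:
            for i in active:
                alloc[i] += remaining * weights[i] / wsum
            remaining = 0.0
        else:
            for i in clamped:
                remaining -= caps[i] - alloc[i]
                alloc[i] = caps[i]
                active.discard(i)
    return alloc
```

For instance, a strong ground manipulator with a large weight takes the bulk of the load, while low-capacity aerial robots saturate at their thrust limits and the remainder is redistributed.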
In this work, we propose a method to compute the stiffness of flexible joints, and its physical realization, so as to let the fingers track a predefined trajectory. We refer to tendon-driven, underactuated, passively compliant hands composed of deformable joints and rigid links. Specific stiffness values and pre-formed shapes can be assigned to the finger joints such that a single-cable actuation can be used. We first define a procedure to determine suitable joint stiffnesses, and then we propose a possible realization of soft joints using rapid-prototyping techniques. The stiffness computation leverages the mechanics of tendon-driven hands and of compliant systems, while beam theory has been exploited for its implementation. We validate the proposed framework both in simulation and in experiments, using as a case study the robotic Soft-SixthFinger, a wearable robot for grasping compensation in patients with a paretic hand. The proposed framework can be used to design the stiffness of the passive joints in several models of underactuated tendon-driven soft hands so as to improve their grasping capabilities.
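The beam-theory step mentioned above can be illustrated with the standard small-deflection formula for a thin flexure hinge of length L, width b, and thickness h in a material of Young's modulus E: k = E·I/L, with second moment of area I = b·h³/12. The numbers used below are illustrative, not the Soft-SixthFinger's actual dimensions or material:

```python
def flexure_joint_stiffness(E, b, h, L):
    """Rotational stiffness (N*m/rad) of a thin flexure hinge modeled as a
    uniform beam in small-deflection bending: k = E * I / L, where
    I = b * h**3 / 12 is the second moment of area of the cross-section."""
    I = b * h ** 3 / 12.0
    return E * I / L
```

Because k scales with h³, small changes in hinge thickness (e.g., via 3D-printing parameters) give large changes in joint stiffness, which is what makes this a practical tuning knob: doubling h stiffens the joint eightfold.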
In the last decade, we have witnessed a drastic change in the form factor of audio and vision technologies, from heavy and grounded machines to lightweight devices that naturally fit our bodies. However, only recently, haptic systems have started to be designed with wearability in mind. The wearability of haptic systems enables novel forms of communication, cooperation, and integration between humans and machines. Wearable haptic interfaces are capable of communicating with the human wearers during their interaction with the environment they share, in a natural and yet private way. This paper presents a taxonomy and review of wearable haptic systems for the fingertip and the hand, focusing on those systems directly addressing wearability challenges. The paper also discusses the main technological and design challenges for the development of wearable haptic interfaces, and it reports on the future perspectives of the field. Finally, the paper includes two tables summarizing the characteristics and features of the most representative wearable haptic systems for the fingertip and the hand.
This paper presents the design, analysis, fabrication, experimental characterization, and evaluation of two prototypes of robotic extra fingers that can be used as grasp-compensatory devices for a hemiparetic upper limb. The devices are the result of experimental sessions with chronic stroke patients and consultations with clinical experts. Both devices share a common working principle, which consists in opposing the device to the paretic hand or wrist so as to restrain the motion of an object. They can be used by chronic stroke patients to compensate for grasping in several activities of daily living (ADLs), with a particular focus on bimanual tasks. The robotic extra fingers are designed to be extremely portable and wearable. They can be wrapped as bracelets when not in use, to further reduce encumbrance. Both devices are intrinsically compliant and driven by a single actuator through a tendon system. The motion of the robotic devices can be controlled using an electromyography-based interface embedded in a cap. The interface allows the user to control the device motion by contracting the frontalis muscle. The performance characteristics of the devices have been measured experimentally, and their shape adaptability has been confirmed by grasping various objects with different shapes. We tested the devices through qualitative experiments based on ADLs involving five chronic stroke patients. The prototypes successfully enabled the patients to complete various bimanual tasks. Results show that the proposed robotic devices improve the autonomy of patients in ADLs and allow them to complete tasks that were previously impossible to perform.
Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday lives. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress", or the Google Translate real-time sign-interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all the considered tasks. Moreover, subjects significantly preferred the conditions providing wearable haptic feedback.
In this paper, we present the combination of our soft supernumerary robotic finger, the Soft-SixthFinger, with a commercially available zero-gravity arm support, the SaeboMAS. The overall proposed system can provide the assistance needed during paretic upper limb rehabilitation involving both grasping and arm mobility to solve task-oriented activities. The Soft-SixthFinger is a wearable robotic supernumerary finger designed to be used as an active assistive device by post-stroke patients to compensate for the paretic hand grasp. The device works jointly with the paretic hand/arm to grasp an object, similarly to the two parts of a robotic gripper. The SaeboMAS is a commercially available mobile arm support that neutralizes gravity effects on the paretic arm, specifically designed to facilitate and challenge the weakened shoulder muscles during functional tasks. The proposed system has been designed to be used during the rehabilitation phase, when the arm is potentially able to recover its functionality but the hand is still not able to perform a grasp due to the lack of an efficient thumb opposition. The overall system also acts as a motivational tool for patients to perform task-oriented rehabilitation activities. With the aid of the proposed system, the patient can closely simulate the desired motion with the non-functional arm for rehabilitation purposes, while performing a grasp with the help of the Soft-SixthFinger. As a pilot study, we tested the proposed system with a chronic stroke patient to evaluate how the mobile arm support, in conjunction with a robotic supernumerary finger, can help in performing tasks that require manipulating a grasped object through the paretic arm. In particular, we performed the Frenchay Arm Test (FAT) and the Box and Block Test (BBT). The proposed system successfully enabled the patient to complete tasks that were previously impossible to perform.
- Mar 2017
Background: While the role of beta (∼20 Hz), theta (∼5 Hz) and alpha (∼10 Hz) oscillations in the motor areas has been repeatedly associated with defined properties of motor performance, gamma (∼40-90 Hz) oscillatory activity is a more recent and still not fully understood component of motor control physiology, despite its potential clinical relevance for motor disorders. Objective/hypothesis: We implemented an online neuromodulation paradigm based on transcranial alternating current stimulation (tACS) of the dominant motor cortex during a visuo-motor coordination task. This approach allows a better understanding of the role of gamma activity, as well as that of other oscillatory bands, and their chronometry throughout the task. Methods: We tested the effects of 5 Hz, 20 Hz, 60 Hz (mid-gamma), 80 Hz (high-gamma) and sham tACS on the performance of a sample of right-handed healthy volunteers during a custom-made unimanual tracking task addressing several randomly occurring components of visuo-motor coordination (i.e., constant-velocity or acceleration pursuits, turns, loops). Results: Data showed a significant enhancement of motor performance during high-gamma stimulation, as well as a trending effect for mid-gamma, with the effect being prominent between 200 and 500 ms after rapid changes in tracking trajectory. No other effects during acceleration or steady pursuit were found. Conclusions: Our findings posit a role for high-frequency motor cortex gamma oscillations during complex visuo-motor tasks involving the sudden rearrangement of motor plan/execution. Such a "prokinetic" effect of high-gamma stimulation might be worth testing in motor disorders, such as Parkinson's disease, where the switching between different motor programs is impaired.
- Feb 2017
Background: Haptic feedback has been proven to play a key role in enhancing the performance of teleoperated medical procedures. However, due to safety issues, commercially available medical robots do not currently provide the clinician with haptic feedback. Methods: This work presents the experimental evaluation of a teleoperation system for robot-assisted medical procedures able to provide magnified haptic feedback to the clinician. Forces registered at the operating table are magnified and provided to the clinician through a 7-DoF haptic interface. The same interface is also used to control the motion of a 6-DoF slave robotic manipulator. The safety of the system is guaranteed by a time-domain passivity-based control algorithm. Results: Two experiments were carried out on stiffness discrimination (during palpation and needle insertion) and one experiment on needle guidance. Conclusions: Our haptic-enabled teleoperation system improved performance with respect to direct hand interaction by 80%, 306%, and 27% in stiffness discrimination through palpation, stiffness discrimination during needle insertion, and guidance, respectively.
Untethered miniature robots have recently shown promising results in several scenarios at the microscale, such as targeted drug delivery, microassembly, and biopsy procedures. However, the vast majority of these small-scale robots have very limited manipulation capabilities, and none of the steering systems currently available enables humans to intuitively and effectively control dexterous miniaturized robots in a remote environment. In this paper, we present an innovative microteleoperation system with haptic assistance for the intuitive steering and control of miniaturized self-folding soft magnetic grippers in 2-D space. The soft grippers can be wirelessly positioned using weak magnetic fields and opened/closed by changing their temperature. An image-guided algorithm tracks the position of the controlled miniaturized gripper in the remote environment. A haptic interface provides the human operator with compelling haptic sensations about the interaction between the gripper and the environment, and enables the operator to intuitively control the target position and grasping configuration of the gripper. Finally, magnetic and thermal control systems regulate the position and grasping configuration of the gripper. The viability of the proposed approach is demonstrated through two experiments involving 26 human subjects. Providing haptic stimuli elicited statistically significant improvements in the performance of the considered navigation and micromanipulation tasks. Note to Practitioners—The ability to accurately and intuitively control the motion of miniaturized grippers in remote environments can open new exciting possibilities in the fields of minimally invasive surgery, micromanipulation, biopsy, and drug delivery. This paper presents a microteleoperation system with haptic assistance through which a clinician can easily control the motion and open/close capability of miniaturized wireless soft grippers.
It introduces the underlying autonomous magnetic and thermal control systems, their interconnection with the master haptic interface, and an extensive evaluation in two real-world scenarios: 1) following a predetermined trajectory and 2) a pick-and-place task with a microscopic object.
Human motion models are finding an increasing number of novel applications in many different fields, such as building design, computer graphics and robot motion planning. The Social Force Model is one of the most popular alternatives for describing the motion of pedestrians. By resorting to a physical analogy, individuals are modeled as point-wise particles subject to social forces that drive their dynamics. Such a model implicitly assumes that humans move isotropically. On the contrary, empirical evidence shows that people do have a preferred direction of motion, walking forward most of the time. Lateral motions are observed only in specific circumstances, such as when navigating in overcrowded environments or avoiding unexpected obstacles. In this paper, the Headed Social Force Model is introduced in order to improve the realism of the trajectories generated by the classical Social Force Model. The key feature of the proposed approach is the inclusion of the pedestrians' heading in the dynamic model used to describe the motion of each individual. The force and torque representing the model inputs are computed as suitable functions of the force terms resulting from the traditional Social Force Model. Moreover, a new force contribution is introduced to model the behavior of people walking together as a single group. The proposed model features high versatility, being able to reproduce both the unicycle-like trajectories typical of people moving in open spaces and the point-wise motion patterns occurring in high-density scenarios. Extensive numerical simulations show an increased regularity of the resulting trajectories and confirm a general improvement of the model's realism.
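To make the underlying mechanics concrete, the classical Social Force Model that the Headed variant builds on combines a goal-directed driving force with exponentially decaying repulsive forces. The following is a minimal illustrative sketch of one integration step, not the paper's implementation; the function name and parameter values are assumptions chosen for readability:

```python
import numpy as np

def social_force_step(pos, vel, goal, obstacles, dt=0.05,
                      v_des=1.3, tau=0.5, A=2.0, B=0.3):
    """One Euler step of a minimal Social Force Model.

    pos, vel, goal : (2,) arrays for a single pedestrian.
    obstacles      : list of (2,) positions of other agents/obstacles.
    v_des, tau     : desired speed [m/s] and relaxation time [s].
    A, B           : repulsion strength and range (illustrative values).
    """
    # Driving force: relax toward the desired velocity pointing at the goal.
    direction = (goal - pos) / np.linalg.norm(goal - pos)
    f_drive = (v_des * direction - vel) / tau

    # Repulsive social forces decaying exponentially with distance.
    f_rep = np.zeros(2)
    for obs in obstacles:
        diff = pos - obs
        dist = np.linalg.norm(diff)
        f_rep += A * np.exp(-dist / B) * diff / dist

    vel = vel + dt * (f_drive + f_rep)
    pos = pos + dt * vel
    return pos, vel
```

The Headed extension would additionally carry a heading state and map these force terms into a force/torque pair acting on a unicycle-like model, which is what suppresses the unrealistic lateral motions.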
Human guidance in situations where users cannot rely on their main sensory modalities, such as assistive or search-and-rescue scenarios, is a challenging task. In this paper, we address the problem of guiding users along collision-free paths in dynamic environments, assuming that they cannot rely on their main sensory modalities. In order to safely guide the subjects along collision-free paths, we adapt Optimal Reciprocal Collision Avoidance to our specific problem. The proposed algorithm takes into account the stimuli that can be displayed to the users and the users' motion uncertainty when reacting to the provided stimuli. The proposed algorithm was evaluated in three different dynamic scenarios. A total of 18 blindfolded human subjects were asked to follow haptic cues in order to reach a target area while avoiding real static obstacles and moving users. Three metrics (time to reach the goal, length of the trajectories, and minimal distance from the obstacles) were used to compare the results obtained with this approach against experiments performed without visual impairment. Experimental results reveal that blindfolded subjects were able to avoid collisions and safely reach the target in all the performed trials. Although in this work we display directional cues via haptic stimuli, we believe that the proposed approach is general and can be tuned to work with different haptic interfaces and/or feedback modalities.
In this paper, we present an electromyographic (EMG) control interface for a supernumerary robotic finger. This novel wearable robot can be used to compensate for missing grasping abilities in chronic stroke patients, or to augment the healthy human hand by enhancing its grasping capabilities and workspace. The proposed EMG interface controls the motion of the robotic extra finger and its joint compliance. In particular, we use a commercial EMG armband for gesture recognition, associated with the motion control of the robotic device, and a single-channel surface EMG electrode interface to regulate the compliance of the robotic device. We also present an updated version of the robotic extra finger, in which the adduction/abduction motion is realized through a ball-bearing and spur-gear mechanism. We validated the proposed interface with two sets of experiments, related to compensation and augmentation. In the first set, different bimanual tasks were performed with the help of the robotic device while simulating a paretic hand. In the second set, the robotic extra finger was used to enlarge the workspace and manipulation capability of healthy hands. In both sets, the same EMG control interface was used. The obtained results demonstrate that the proposed control interface is intuitive and can successfully be used for both compensation and augmentation purposes. The proposed approach can also be exploited for the control of other wearable devices that have to actively cooperate with the human limbs.
Novel wearable tactile interfaces offer the possibility to simulate tactile interactions with virtual environments directly on our skin. However, unlike kinesthetic interfaces, for which haptic rendering is a well-explored problem, they pose new questions about the formulation of the rendering problem. In this work, we propose a formulation of tactile rendering as an optimization problem, which is general for a large family of tactile interfaces. Based on an accurate simulation of contact between a finger model and the virtual environment, we pose tactile rendering as the optimization of the device configuration, such that the contact surface between the device and the actual finger matches as closely as possible the contact surface in the virtual environment. We describe the optimization formulation in general terms, and we also demonstrate its implementation on a thimble-like wearable device. We validate the tactile rendering formulation by analyzing its force error, and we show that it outperforms other approaches.
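The optimization view described above can be sketched as a least-squares fit of the device configuration to the virtual contact. The cost function, solver choice, and all names below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np
from scipy.optimize import minimize

def render_tactile(target_contact, device_contact_model, q0):
    """Pick a device configuration q minimizing the mismatch between the
    contact the device would produce on the finger and the contact computed
    in the virtual environment (hypothetical least-squares formulation).

    target_contact       : (n, 3) contact points from the VE simulation.
    device_contact_model : callable q -> (n, 3) predicted contact points.
    q0                   : initial device configuration (array).
    """
    def cost(q):
        # Sum of squared distances between predicted and target contacts.
        err = device_contact_model(q) - target_contact
        return np.sum(err ** 2)

    # Derivative-free solver, since the contact model may not be smooth.
    res = minimize(cost, q0, method="Nelder-Mead")
    return res.x
```

In practice the contact model would come from a finger/device contact simulation; here any callable mapping configurations to predicted contact points can be plugged in.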
In this paper, we present the combination of the Soft-SixthFinger, a wearable robotic extra finger designed to be used by chronic stroke patients to compensate for missing hand function, with a robotic arm used as an assistive device to support the patient's arm. The extra finger is a tendon-driven modular structure worn on the paretic forearm. The robotic extra finger is used jointly with the paretic hand/arm to grasp an object, similarly to the two parts of a robotic gripper. The flexion/extension of the robotic finger is controlled by the patient using an electromyography (EMG) interface embedded in a cap. The robotic arm is controlled to partially compensate for the weight of the paretic arm, while not interfering with the user's arm motion. The system has been designed as a tool that chronic stroke patients can use to compensate for grasping in many activities of daily living (ADLs). We performed a pilot test demonstrating that the proposed system can significantly improve performance and autonomy in ADLs.
- Oct 2016
- IROS 2016
In this paper, we propose a bilateral teleoperation scheme for cooperative aerial manipulation in which a human operator drives a team of Vertical Take-Off and Landing (VTOL) aerial vehicles that have grasped an object beforehand, and receives force feedback depending on the states of the system. For application scenarios in which dexterous manipulation by each robot is not necessary, we propose using a rigid tool attached to the vehicle through a passive spherical joint, equipped with a simple adhesive mechanism at the tool tip that can stick to the grasped object. With more than two robots, we use the extra degrees of freedom to find the optimal force allocation in terms of minimum power and force smoothness. The human operator commands a desired trajectory for the robot team through a haptic interface to a pose controller, and the output of the pose controller, along with system constraints (e.g., limited VTOL forces and contact maintenance), defines the feasible set of forces. Then, an online optimization allocates forces by minimizing a cost function of the forces and their variation. Finally, propeller thrusts are computed by a dedicated attitude and thrust controller in a decentralized fashion. A human/hardware-in-the-loop simulation study shows the efficiency of the proposed scheme and the importance of haptic feedback in achieving better performance.
In this paper, we present a novel cooperative navigation control for human–robot teams. Assuming that a human wants to reach a final location in a large environment with the help of a mobile robot, the robot must steer the human from the initial to the target position. The challenges posed by cooperative human–robot navigation are typically addressed by using haptic feedback via physical interaction. In contrast with that, in this paper, we describe a different approach, in which the human–robot interaction is achieved via wearable vibrotactile armbands. In the proposed work, the subject is free to decide her/his own pace. A warning vibrational signal is generated by the haptic armbands when a large deviation with respect to the desired pose is detected by the robot. The proposed method has been evaluated in a large indoor environment, where 15 blindfolded human subjects were asked to follow the haptic cues provided by the robot. The participants had to reach a target area, while avoiding static and dynamic obstacles. Experimental results revealed that the blindfolded subjects were able to avoid the obstacles and safely reach the target in all of the performed trials. A comparison is provided between the results obtained with blindfolded users and experiments performed with sighted people.
- Oct 2016
Bilateral telemanipulation refers to frameworks in which a human operator manipulates a master robotic interface and a slave robotic device emulates the behavior of the master, while haptic feedback is provided to the operator. By multi-contact bilateral teleoperation we mean master and slave systems that can establish multiple contact points with the user and with the environment; a paradigmatic example is a multi-fingered robotic hand teleoperated by the human hand. Two of the most critical issues in this context are: (i) how to provide haptic feedback on multiple points of the human hand; and (ii) how to solve the correspondence problem between the human hand and the robotic slave device. In this work, we propose finger-worn devices able to apply a three-dimensional force vector at a specific contact point to solve the multi-contact feedback problem. For the correspondence problem, we propose an object-based mapping procedure. The approach is based on two virtual objects, defined at both the master and slave sides, to capture the human hand motion and to compute the related force feedback. The proposed approach has been tested in a telemanipulation framework in which the master side was composed of a Leap Motion sensor used to track the hand plus three wearable haptic devices, while a robotic hand/arm system performed a manipulation task as the slave.
- Oct 2016
- 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Underactuated and synergy-driven hands are gaining attention in the grasping community, mainly due to their simple kinematics, intrinsic compliance, and versatility in grasping objects even in unstructured scenarios. The evaluation of the grasping capabilities of such hands is a challenging task. This paper revisits some traditional quality measures developed for multi-fingered, fully actuated hands and applies them to the case of underactuated hands. The extension of quality metrics developed for synergy-driven hands to the case of underactuated grasping is also presented. The performance of both types of measures is evaluated with simulated examples, concluding with a comparative discussion of their main features.
The availability of grasp quality measures is fundamental for grasp planning and control, and also for guiding designers in the definition and optimization of robotic hands. This work investigates grasp robustness and quality indexes that can be applied to power grasps with underactuated and compliant hands. When dealing with such hands, an evaluation method is needed that takes into account which forces can actually be controlled by the hand, depending on its actuation system. In this paper we study the Potential Contact Robustness (PCR) and the Potential Grasp Robustness (PGR) indexes. Both consider the main grasp properties (contact points, friction coefficient, etc.), but also the hand's degrees of freedom and, consequently, the directions of controllable contact forces. The PCR comes directly from classical grasp theory and can be easily evaluated, but often leads to overly conservative solutions, particularly when the grasp has many contacts. The PGR is more complex and computationally heavier, but gives a more realistic, if still conservative, estimate of the overall grasp robustness, also in power grasps. We evaluated the indexes for various simulated grasps performed with underactuated and compliant hands, and analyzed their variations with respect to the main grasp parameters.
- Sep 2016
- Smart Cities Conference (ISC2), 2016 IEEE International
We compare two different solutions for guiding an older adult along a safe path using a robotic walking assistant (the c-Walker). The two solutions are based on tactile and acoustic stimuli, respectively, and suggest a direction of motion that the user is free to take of her own accord. We describe the technological basis for the hardware components and show specialized path-following algorithms for each of the two solutions. The paper reports an extensive user validation activity, with a quantitative and qualitative analysis.
We propose a novel bilateral telemanipulation framework to handle master and slave devices with different structures. This condition applies to multi-contact teleoperation scenarios where the number of contact points on the slave side and the number of interaction points on the master side differ. An example is a master device interacting with the thumb and index fingertips of the human operator and, as the slave device, a robotic arm with a multi-fingered robotic hand. In the case of a manipulation task, it is not straightforward to transmit motion commands and reflect the forces arising from the interaction with the environment. A general telemanipulation framework that does not depend on the specific kinematics of the devices involved is needed. The main idea of this work is to take advantage of a virtual object as a mediator between the master and slave sides. The resulting forward and backward mapping algorithms are able to relate the motions and exerted forces of very dissimilar systems. The approach has been evaluated in a case study consisting of two haptic interfaces, used both to track the index and thumb motions and to render forces on the master side, and a robotic arm with a multi-fingered hand as end-effector on the slave side. The results presented in this paper can be extended to cooperative grasping scenarios where multiple robots tele-manipulate the same object.
- Aug 2016
- RO-MAN 2016
Human-robot teams can operate efficiently in several scenarios, including Urban Search and Rescue (USAR). Robots can access areas too small or deep for a person, can survey larger areas that people are not permitted to enter, and can carry sensors and instruments. One important aspect of this cooperative framework is the way robots and humans communicate during rescue operations. Vision and audio modalities may be ineffective in case of reduced visibility or high noise. A promising way to guarantee effective communication between robot and human in a team is the exploitation of haptic signals. In this work, we present a possible solution to let a robot guide the position of a human operator's hand using vibrations. We demonstrate that an armband embedding four vibrating motors is enough to guide the wrist of an operator along a predefined path or to a target location. These results can be exploited in human-robot teams. For instance, when the robot detects the position of a target of interest, it can guide the operator's wrist to that position along an optimal path.
- Aug 2016
- Engineering in Medicine and Biology Society
In this paper, a novel, motorized, multi-degrees-of-freedom (DoF) microsurgical forceps tool is presented, based on a master-slave teleoperation architecture. The slave device is a 7-DoF manipulator with: (i) 6-DoF positioning and orientation; (ii) a 1-DoF open/close gripper; and (iii) an integrated force/torque sensor for tissue grip-force measurement. The master device is a 7-DoF haptic interface that teleoperates the slave device and provides haptic feedback through its gripper interface. The combination of the device and the surgeon interface replaces the manual, hand-held tool, providing easy-to-use and ergonomic tissue control and simplifying the surgical tasks. This makes the system suitable for real surgical scenarios in the operating room (OR). The performance of the system was analyzed through the evaluation of teleoperation control and the characterization of gripping force. The new system offers an overall positioning error of less than 400 μm, demonstrating its safety and accuracy. Improved system precision, usability, and ergonomics point to the potential suitability of the device for the OR and its ability to advance haptic-feedback-enhanced transoral laser microsurgeries.
Microrobotic systems are showing promising results in several applications and scenarios, such as targeted drug delivery and screening, biopsy, environmental control, surgery, and assembly. While most of the systems presented in the literature rely on autonomous techniques, there is growing interest in human-in-the-loop approaches. For reasons of responsibility, safety, and public acceptance, it is in fact beneficial to provide a human with intuitive and effective means for directly controlling these microrobotic systems. In this respect, haptic feedback is widely believed to be a valuable tool in human-in-the-loop teleoperation systems. This article presents a review of the literature on haptic feedback systems for microrobotics, categorized according to the type of haptic technology employed. In particular, we considered both tethered and untethered systems, including applications of micropositioning, microassembly, minimally invasive surgery, delivery of objects, micromanipulation, and injection of cells. One of the main challenges for an effective implementation is stability control: the high scaling factors introduced to match variables in the macro and micro worlds may introduce instabilities. Another challenge lies in the measurement of position and force signals in the remote environment, since the integration of micro-sized sensors may significantly increase the complexity and cost of tool fabrication. To overcome the lack of force sensing, vision seems a promising solution. Finally, although the literature on haptic feedback for untethered microrobotics is still quite small, we foresee great development in this field of research, thanks to its flexible applications in biomedical engineering scenarios.
Current wireless small-scale robots have restricted manipulation capabilities and limited intuitive tools to control their motion. This paper presents a novel teleoperation system with haptic feedback for the control of untethered soft grippers. The system is able to move and open/close the grippers by regulating the magnetic field and temperature in the workspace. Users can intuitively control the grippers using a grounded haptic interface, which is also capable of providing compelling force feedback as the gripper interacts with the environment. The magnetic closed-loop control algorithm is designed starting from a Finite Element Model analysis. The electromagnetic model used is validated by measuring the magnetic field with a resolution of 0.1 mT and a sampling rate of 6.8×10⁶ samples/m². The system shows an accuracy in positioning the gripper of 0.08 mm at a velocity of 0.81 mm/s. The robustness of the control and tracking algorithms is tested by spraying the workspace with water drops that cause glare and related disturbances of up to 0.41 mm.
The lack of haptic feedback during laser surgery prevents surgeons from accurately discerning the depth of the incisions they perform. In this paper, we introduce a novel teleoperated surgical platform that employs a commercial haptic device to convey information about the laser incision depth to the surgeon. The incision depth is estimated by a feed-forward model that maps the laser parameters selected by the surgeon and the total time of laser exposure to the resulting ablation depth. An experiment was conducted to evaluate the effectiveness of the proposed system in enabling precise laser ablation. The experiment involved ten human subjects who were asked to complete a single-point laser ablation task. Results show that haptic feedback can significantly improve the level of surgical precision of laser interventions.
- Apr 2016
- Haptic Symposium 2016
Fingertip contact forces are of utmost importance in evaluating the quality of the human grasp. However, measuring such forces during object manipulation is not a trivial task. In this paper, we propose a novel method to estimate the fingertip contact forces when grasping deformable objects of known shape and stiffness matrix. The proposed approach uses a sensing glove instrumented with inertial and magnetic sensors. Data obtained from the accelerometers and gyroscopes placed on the distal phalanges are used to determine when the fingers establish contact with the object. The sensing glove is used to estimate the configuration of the hand and the deformation of the object at each contact with the fingertips. The force exerted by each fingertip is obtained by multiplying the stiffness matrix of the object by the vector of the object's local deformation at the contact point. Extensive simulations have been performed to evaluate the robustness of the proposed approach to noisy measurements and uncertainties in the human hand model. To validate the proposed approach, experiments with a virtual object were performed. A haptic device was used to generate the contact forces with the virtual object and to accurately measure the forces exerted by the users during the interaction.
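The core estimate described above, multiplying the object's stiffness matrix by the local deformation vector, reduces to a single matrix-vector product under a linear-elastic assumption. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def fingertip_force(stiffness, rest_point, contact_point):
    """Estimate the fingertip contact force as the object's stiffness
    matrix times the local deformation at the contact (linear elasticity).

    stiffness     : (3, 3) object stiffness matrix [N/m].
    rest_point    : (3,) undeformed surface point [m].
    contact_point : (3,) estimated fingertip position on the deformed surface.
    """
    deformation = rest_point - contact_point  # local object deformation [m]
    return stiffness @ deformation            # force vector [N]
```

For an isotropic object with stiffness 500 N/m, a 2 mm indentation along z yields a 1 N normal force, which matches the magnitudes a haptic device can reproduce for validation.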
Data compression techniques enable the transmission of highly informative data using little bandwidth. Examples of popular compression formats are Google's VP9 and MPEG's MP3 for video and audio data, respectively. Recently, researchers have focused on the applicability of compression techniques to haptic data as well. One of these approaches, the deadband compression approach, transmits a new haptic stimulus to the receiver side only when the user can actually perceive the change of the stimulus with respect to the previously transmitted one. However, no deadband compression approach had been presented and evaluated for cutaneous stimuli. In this work, we extend the deadband approach to cutaneous haptic data. A force-controlled cutaneous device provides the human operator with cutaneous feedback from a virtual environment. A new cutaneous stimulus is applied at the master side only if the human operator is able to sense the change with respect to the previous one; this perceptual threshold is the just noticeable difference (JND). Results show an average bit-rate reduction of 61.7% with no performance degradation.
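The deadband idea itself is simple to state in code: keep the last transmitted sample and send a new one only when the change exceeds the perceptual threshold. Below is a minimal sketch using a Weber-fraction threshold (an assumption for illustration; the paper's JND model may differ):

```python
def deadband_transmit(samples, jnd=0.1):
    """Perceptual deadband compression of a 1-D haptic signal.

    A sample is transmitted only if it differs from the last transmitted
    one by more than jnd * |last transmitted| (Weber-fraction deadband).
    """
    sent = []
    last = None
    for s in samples:
        # Always send the first sample; afterwards, send only
        # perceptually distinguishable changes.
        if last is None or abs(s - last) > jnd * abs(last):
            sent.append(s)
            last = s
    return sent
```

With `samples = [1.0, 1.05, 1.2, 1.21, 2.0]` and `jnd = 0.1`, only three of the five samples are transmitted; the receiver holds the last received value during the gaps.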
The wearable electronics business generated over $14 billion in 2014 and is estimated to exceed $70 billion by 2024. However, commercially available wearable devices still provide very limited haptic feedback, mainly focusing on vibrotactile sensations. Toward a more realistic feeling of interacting with virtual and remote objects, we propose a novel wearable cutaneous device for the proximal finger phalanx, called the "hRing". It consists of two servo motors that move a belt placed in contact with the user's finger skin. When the motors spin in opposite directions, the belt presses into the user's finger; when they spin in the same direction, the belt applies a shear force to the skin. Its positioning on the proximal finger phalanx improves the capability of the device to be used together with unobtrusive hand-tracking systems, such as the Leap Motion controller and the Kinect sensor. The viability of the proposed approach is demonstrated through a pick-and-place experiment involving seven human subjects. Providing cutaneous feedback through the proposed device improved the performance and the perceived effectiveness of the considered task by 20% and 47%, respectively, with respect to not providing any force feedback. All subjects found no difference in the quality of the tracking when carrying out the task wearing the device versus barehanded.
This paper presents a novel cutaneous device capable of providing independent skin stretches at the palmar, dorsal, ulnar, and radial sides of the arm. It consists of a lightweight bracelet with four servo motors. Each motor actuates a cylindrical end-effector that rotates to generate skin stretch stimuli. To understand how to control and wear the device on the forearm so as to evoke the most effective cutaneous sensations, we carried out perceptual experiments evaluating its absolute and differential thresholds. Finally, we carried out a haptic navigation experiment to assess the effectiveness of our device as a navigation feedback system to guide a desired rotation and translation of the forearm. Results demonstrate an average rotation error of 1.87° and an average translation error of 2.84 mm. Moreover, all the subjects found our device easy to wear and comfortable, and nine out of ten found it effective in transmitting navigation information to the forearm.
- Apr 2016
- 2016 IEEE Haptics Symposium (HAPTICS)
Recent advances in tactile rendering span, among others, wearable cutaneous interfaces, tactile rendering algorithms, and nonlinear soft skin models. However, the adoption of these advances for multi-finger tactile rendering of dexterous grasping and manipulation is hampered by the computational cost incurred by nonlinear skin models when applied to the full hand. We have observed that classic constrained dynamics solvers, typically designed for contact mechanics, fail to perform efficiently on the deformation constraints of nonlinear skin models. In this paper, we propose a novel constrained dynamics solver designed to perform well with highly nonlinear deformation constraints. In practice, we achieve more than 10x speed-up over previous approaches, and as a result we enable multi-finger tactile rendering of manipulation actions that capture the nonlinearity of skin.
In this demo, we present two possible control interfaces for a robotic extra finger called the Robotic Sixth Finger. One interface is an instrumented glove able to measure the human hand posture. The aim is to integrate the motion of the robotic finger with that of the human hand so as to achieve complex manipulation skills. The second interface is a ring with an embedded push button that implements a simple and intuitive control. The presence of an extra robotic finger on the human hand enlarges the workspace, increases the grasping capabilities, and improves manipulation dexterity. We propose a series of grasping and manipulation tasks to be performed with the help of the robotic sixth finger and the corresponding interface, so as to prove their effectiveness in augmenting the human hand capabilities.
- Feb 2016
- Human and Robot Hands
Throughout this book, we have described how neuroscientific findings on the synergistic organization of the human hand can be used to devise guidelines for the design and control of robotic and prosthetic hands as well as for sensing devices (see Chaps. 8, 10, 11 and 15). However, the development of novel robotic devices opens the issue of how to generalize these outcomes to different architectures. In this chapter, we describe a mapping strategy to transfer human hand synergies onto robotic hands with dissimilar kinematics. The algorithm is based on the definition of two virtual objects that are used to abstract from the specific structures of the hands. The proposed mapping strategy overcomes the problems of defining synergies for robotic hands by computing a PCA over a grasp dataset obtained empirically by closing the robot hand upon different objects. The developed mapping framework has been implemented using the SynGrasp Matlab toolbox. This tool includes functions for the definition of the hand kinematic structure, the contact points with a grasped object, the coupling between joints induced by a synergistic control, and compliance at the contact, joint, and actuator levels. Its analysis functions can be used to investigate the main grasp properties: controllable forces and object displacements, manipulability analysis, and grasp quality measures. Furthermore, functions for the graphical representation of the hand, the object, and the main analysis results are provided.
This paper presents the Soft-SixthFinger, a wearable robotic extra finger designed to be used by chronic stroke patients to compensate for the missing hand function of their paretic limb. The extra finger is an underactuated modular structure worn on the paretic forearm by means of an elastic band. The device and the paretic hand/arm act like the two parts of a gripper working together to hold an object. The patient can control the flexion/extension of the robotic finger through the eCap, an electromyography-based (EMG) interface embedded in a cap. The user can control the device by contracting the frontalis muscle; such a contraction can be achieved by simply moving the eyebrows upwards. The Soft-SixthFinger has been designed as a tool that chronic stroke patients can use to compensate for grasping in many Activities of Daily Living (ADL). It can be wrapped around the wrist and worn as a bracelet when not in use. The light weight and the completely wireless connection with the EMG interface guarantee high portability and wearability. We tested the device with qualitative experiments involving six chronic stroke patients. Results show that the proposed system significantly improves the performance of the patients in the proposed tests and, more generally, their autonomy in ADL.
A novel solution to compensate for lost hand grasping abilities in chronic stroke patients is proposed. The goal is to provide the patients with a wearable robotic extra finger that can be worn on the paretic forearm by means of an elastic band. The proposed prototype, the Robotic Sixth Finger, is a modular articulated device that can adapt its structure to the shape of the grasped object. The extra finger and the paretic hand act like the two parts of a gripper cooperatively holding an object. We evaluated the feasibility of the approach with four chronic stroke patients performing a qualitative test, the Frenchay Arm Test. In this proof-of-concept study, the use of the Robotic Sixth Finger increased the total score of the patients by 2 points on a 5-point scale. The subjects were able to perform the two grasping tasks included in the test that were not possible without the robotic extra finger. Adding a robotic opposing finger is a very promising approach that can significantly improve the functional compensation of chronic stroke patients during everyday life activities.
The term ‘synergy’ – from the Greek synergia – means ‘working together’. The concept of multiple elements working together towards a common goal has been extensively used in neuroscience to develop theoretical frameworks, experimental approaches, and analytical techniques to understand neural control of movement, and for applications for neuro-rehabilitation. In the past decade, roboticists have successfully applied the framework of synergies to create novel design and control concepts for artificial hands, i.e., robotic hands and prostheses. At the same time, robotic research on the sensorimotor integration underlying the control and sensing of artificial hands has inspired new research approaches in neuroscience, and has provided useful instruments for novel experiments.
Blindness dramatically limits an individual's quality of life and has profound implications both for the person affected and for society as a whole. Physical mobility and exercise are strongly encouraged as ways to maintain health and well-being. Such activities can be equally important for people with disabilities, and increasing them is paramount for well-being and for the assistive care system. In this work, we aim at improving the communication between the instructor and a visually impaired subject during skiing. Up to now, only the auditory channel has been used to communicate basic commands to the skier. We introduce a novel use of haptic feedback in this context. In particular, the skier can receive directional information through two vibrating bracelets worn on the forearms. Haptic interaction has been proven to be processed faster by the brain, demanding less cognitive effort than the auditory modality. The connection between the instructor and the skier relies on the Bluetooth protocol. We tested different guiding modalities: only audio commands, audio and haptic commands, and only haptic commands. Preliminary results on the use of the system revealed the haptic channel to be a promising way to guide blind people in winter sports.
In this paper, we propose several solutions to guide an older adult along a safe path using a robotic walking assistant (the c-Walker). We consider four different ways to execute the task. One of them is mechanical, with the c-Walker playing an active role in setting the course. The others are based on tactile or acoustic stimuli and suggest a direction of motion that the user is expected to follow of their own will. We describe the technological basis for the hardware components implementing the different solutions, and show specialized path following algorithms for each of them. The paper reports an extensive user validation activity with a quantitative and qualitative analysis of the different solutions. In this work, we test our system with young participants only, in order to establish a safe methodology that will be used in future studies with older adults.
This chapter introduces fundamental models of grasp analysis. The overall model is a coupling of models that define contact behavior with widely used models of rigid-body kinematics and dynamics. The contact model essentially boils down to the selection of components of contact force and moment that are transmitted through each contact. Mathematical properties of the complete model naturally give rise to five primary grasp types whose physical interpretations provide insight for grasp and manipulation planning. After introducing the basic models and types of grasps, this chapter focuses on the most important grasp characteristic: complete restraint. A grasp with complete restraint prevents loss of contact and thus is very secure. Two primary restraint properties are form closure and force closure. A form closure grasp guarantees maintenance of contact as long as the links of the hand and the object are well-approximated as rigid and as long as the joint actuators are sufficiently strong. As will be seen, the primary difference between form closure and force closure grasps is the latter’s reliance on contact friction. This translates into requiring fewer contacts to achieve force closure than form closure. The goal of this chapter is to give a thorough understanding of the all-important grasp properties of form and force closure. This will be done through detailed derivations of grasp models and discussions of illustrative examples. For an in-depth historical perspective and a treasure-trove bibliography of papers addressing a wide range of topics in grasping, the reader is referred to [38.1].
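To make the role of friction concrete: in the planar two-contact case, force closure reduces to a well-known antipodal condition (due to Nguyen), which a few lines of code can check. The geometry and parameter names below are illustrative, not taken from the chapter.

```python
import math

def antipodal_force_closure(p1, n1, p2, n2, mu):
    """Planar two-contact force-closure test (Nguyen's condition, sketch).

    p1, p2: contact points; n1, n2: inward unit normals; mu: friction
    coefficient. Force closure holds iff the segment joining the contacts
    lies inside both friction cones, i.e. the angle between each inward
    normal and the connecting direction does not exceed atan(mu).
    """
    half_angle = math.atan(mu)

    def angle_to(v, n):
        norm = math.hypot(*v)
        dot = (v[0] * n[0] + v[1] * n[1]) / norm
        return math.acos(max(-1.0, min(1.0, dot)))

    d12 = (p2[0] - p1[0], p2[1] - p1[1])   # direction contact 1 -> contact 2
    d21 = (-d12[0], -d12[1])
    return angle_to(d12, n1) <= half_angle and angle_to(d21, n2) <= half_angle

# Two opposing contacts on a unit-width object, normals facing each other:
# force closure holds for any mu > 0, with only two contacts, whereas form
# closure of a planar object would require at least four frictionless contacts.
```

This illustrates the chapter's point that relying on contact friction lets force closure be achieved with fewer contacts than form closure.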
Haptics provides sensory stimuli that represent the interaction with a virtual or tele-manipulated object, and it is considered a valuable navigation and manipulation tool during tele-operated surgical procedures. Haptic feedback can be provided to the user via cutaneous information and kinesthetic feedback. Sensory subtraction removes the kinesthetic component of the haptic feedback, so that only the cutaneous component is provided to the user. Such a technique guarantees a stable haptic feedback loop while keeping the transparency of the tele-operation system high, meaning that the system faithfully renders back the user's directives. This work focuses on checking whether the interaction forces during a bench-model neurosurgery operation fall within the range of the solely cutaneous perception of the human finger pads. If this assumption holds, it would be possible to exploit sensory subtraction techniques to provide surgeons with feedback from neurosurgery. We measured the forces exerted on surgical tools by three neurosurgeons performing typical actions on a brain phantom, using contact force sensors, while the forces exerted by the tools on the phantom tissue were recorded using a load cell placed under the brain phantom box. The measured surgeon–tool contact forces were 0.01–3.49 N for the thumb and 0.01–6.6 N for the index and middle fingers, whereas the measured tool–tissue interaction forces were six to eleven times smaller than the contact forces, i.e., 0.01–0.59 N. The measured contact forces fit the range of cutaneous sensitivity of the human finger pad; thus, in a tele-operated robotic neurosurgery scenario, it would be possible to render forces at the fingertip level by conveying haptic cues solely through the cutaneous channel of the surgeon's finger pads. This approach would allow high transparency and high stability of the haptic feedback loop in a tele-operation system.
Cutaneous haptic feedback can be used to enhance the performance of robotic teleoperation systems while guaranteeing their safety. Delivering ungrounded cutaneous cues to the human operator conveys in fact information about the forces exerted at the slave side and does not affect the stability of the control loop. In this work we analyze the feasibility, effectiveness, and implications of providing solely cutaneous feedback in robotic teleoperation. We carried out two peg-in-hole experiments, both in a virtual environment and in a real (teleoperated) environment. Two novel 3-degree-of-freedom fingertip cutaneous displays deliver a suitable amount of cutaneous feedback at the thumb and index fingers. Results assessed the feasibility and effectiveness of the proposed approach. Cutaneous feedback was outperformed by full haptic feedback provided by grounded haptic interfaces, but it outperformed conditions providing no force feedback at all. Moreover, cutaneous feedback always kept the system stable, even in the presence of destabilizing factors such as communication delays and hard contacts.
Self-propelled microrobots have recently shown promising results in several scenarios at the microscale, such as targeted drug delivery and micromanipulation of cells. However, none of the steering systems available in the literature enable humans to intuitively and effectively control these microrobots in the remote environment, which is a desirable feature. In this paper we present an innovative teleoperation system with force reflection that enables a human operator to intuitively control the positioning of a self-propelled microjet. A particle-filter-based visual tracking algorithm tracks at runtime the position of the microjet in the remote environment. A 6-degrees-of-freedom haptic interface then provides the human operator with compelling haptic feedback about the interaction between the controlled microjet and the environment, as well as enabling the operator to intuitively control the target position of the microjet. Finally, a wireless magnetic control system regulates the orientation of the microjet to reach the target point. The viability of the proposed approach is demonstrated through two experiments enrolling twenty-eight subjects. In both experiments providing haptic feedback significantly improved the performance and the perceived realism of the considered tasks.
In this paper, we present a haptic guidance policy to steer the user along predefined paths, and we evaluate a predictive approach to compensate for the actuation delays that humans exhibit when they are guided along a given trajectory via sensory stimuli. The proposed navigation policy exploits the nonholonomic nature of human locomotion along goal-directed paths, which leads to a very simple guidance mechanism. The proposed method has been evaluated in a real scenario where seven human subjects were asked to walk along a set of predefined paths while being guided via vibrotactile cues. Their poses, as well as the related distances from the path, were recorded using an accurate optical tracking system. Results revealed that an average error of 0.24 m is achieved using the proposed haptic policy, and that the predictive approach does not bring significant improvements to the path following problem as far as the distance error is concerned. On the contrary, the predictive approach achieved a significantly shorter activation time of the haptic interfaces.
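The kind of simple guidance mechanism such a policy affords can be sketched as a bearing-error rule for a unicycle-like walker: vibrate the left or right bracelet only when the heading error to the next waypoint exceeds an activation threshold. The threshold, cue names, and structure below are our illustration, not the paper's exact policy.

```python
import math

# Sketch of a vibrotactile guidance rule for nonholonomic (unicycle-like)
# human locomotion; threshold and cue names are illustrative assumptions.

def guidance_cue(pose, waypoint, threshold=math.radians(15)):
    """pose = (x, y, heading in rad); waypoint = (x, y). Returns the cue."""
    dx, dy = waypoint[0] - pose[0], waypoint[1] - pose[1]
    bearing = math.atan2(dy, dx)
    # heading error wrapped to [-pi, pi]
    err = math.atan2(math.sin(bearing - pose[2]),
                     math.cos(bearing - pose[2]))
    if err > threshold:
        return "left"     # vibrate the left bracelet: turn left
    if err < -threshold:
        return "right"    # vibrate the right bracelet: turn right
    return None           # on course: no stimulus

# Walker at the origin heading along x, waypoint 45 degrees to its left:
# guidance_cue((0, 0, 0.0), (1, 1)) issues a "left" cue.
```

A predictive variant would evaluate the cue at a pose extrapolated one reaction delay ahead, which mainly shortens how long the actuators stay active.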
SynGrasp is a MATLAB toolbox for grasp analysis of fully or underactuated robotic hands with compliance. Compliance can be modeled at contact points, in the joints or in the actuation system including transmission. It is possible to use a Graphical User Interface or directly assemble and modify the available functions to exploit all the toolbox features. Grasps can be described either using the provided grasp planner or directly defining contact points on the hand with the respective contact normal directions. Several analysis functions have been developed to investigate the main grasp properties: controllable forces and object displacement, manipulability analysis, grasp stiffness and grasp quality measures. Functions for the graphical representation of the hand, the object and the main analysis results are provided. The toolbox is freely available at http://syngrasp.dii.unisi.it.
Wearable robotics is strongly human-oriented. New applications for wearable robots are encouraged by the lightness and portability of new devices and by the progress in human-robot cooperation strategies. In this paper, we propose design guidelines for a robotic extra finger for human grasping enhancement. These guidelines were followed in the realization of three prototypes obtained using rapid prototyping techniques, i.e., a 3D printer and an open hardware development platform. Both fully actuated and underactuated solutions have been explored. In the proposed wearable design, the robotic extra finger can be worn as a bracelet in its rest position. The availability of a supplementary finger enlarges the workspace of the human hand, improving grasping and manipulation capabilities. This preliminary work is a first step towards the development of robotic extra limbs able to increase human workspace and dexterity.
- Aug 2015
- Applications in Electronics Pervading Industry, Environment and Society
The c-Walker is a smart rollator that provides physical support to people with mobility difficulties, together with cognitive support to overcome disabilities related to declining sensory abilities. The proposed system consists of a conventional walker equipped with a variety of sensors, actuators, user interfaces, and computing units. Various algorithms monitor environmental data; the system processes them to define the safest path for the user and transmits useful navigation information to the assisted person via multiple interfaces. Moreover, the system can take control of the direction to avoid hazards. To design and develop the c-Walker, we adopted state-of-the-art design methodologies that assist the designers in the integration phase. In this work we describe the hardware and software components included in the prototype device.
This paper presents a wearable robotic extra finger used by chronic stroke patients to compensate for the missing hand functions of the paretic limb. The extra finger is worn on the paretic forearm by means of an elastic band, and it is coupled with a vibrotactile ring interface worn on the healthy hand. The robotic finger and the paretic hand act like the two parts of a gripper working together to hold an object. The human user is able to control the flexion/extension of the robotic finger through a switch placed on the ring, while being provided with vibrotactile feedback about the forces exerted by the robotic finger on the environment. To understand how to control the vibrotactile interface to evoke the most effective cutaneous sensations, we carried out perceptual experiments to evaluate its absolute and differential thresholds. Finally, we performed a qualitative experiment, the Frenchay Arm Test, with a chronic post-stroke patient presenting a partial loss of sensitivity on the paretic limb. Results show that the proposed system significantly improves the performance in the considered test.
The complexity of the world around us is creating a demand for novel interfaces that will simplify and enhance the way we interact with the environment. The recently unveiled Android Wear operating system addresses this demand by providing a modern system for the many companies now developing wearable devices, also known as "wearables". Wearability of robotic devices will enable novel forms of human intention recognition through haptic signals and novel forms of communication between humans and robots. Specifically, wearable haptics will enable devices to communicate with humans during their interaction with the environment they share. Wearable haptic technology was introduced into our everyday life by Sony: in 1997, its DualShock controller for the PlayStation revolutionized the gaming industry by introducing simple but effective vibrotactile feedback. More recently, Apple unveiled the Apple Watch, which embeds a linear actuator that can make the watch vibrate. It is used whenever the wearer receives an alert or notification, or to communicate with other Apple Watch owners.
We introduce a novel method to improve the performance of passive teleoperation systems with force reflection. It consists of integrating kinesthetic haptic feedback provided by common grounded haptic interfaces with cutaneous haptic feedback. The proposed approach can be used on top of any time-domain control technique that ensures a stable interaction by scaling down kinesthetic feedback when this is required to satisfy stability conditions (e.g., passivity) at the expense of transparency. Performance is recovered by providing a suitable amount of cutaneous force through custom wearable cutaneous devices. The viability of the proposed approach is demonstrated through an experiment of perceived stiffness and an experiment of teleoperated needle insertion in soft tissue.
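The compensation idea above can be sketched as a simple split of the desired force, under the assumption that the stability layer exposes its scaling factor; the variable names are ours, and a real controller computes the scaling online from a passivity condition.

```python
# Sketch: when a passivity controller scales the kinesthetic force by
# alpha <= 1 to keep the loop stable, the lost portion is rendered
# cutaneously by the wearable device. Names are illustrative assumptions.

def split_feedback(f_desired, alpha):
    """Return (kinesthetic, cutaneous) force components.

    alpha: stability-driven scaling factor in [0, 1] supplied by the
    passivity layer (alpha = 1 means full kinesthetic transparency).
    """
    f_kin = alpha * f_desired        # what the grounded interface renders
    f_cut = f_desired - f_kin        # remainder recovered cutaneously
    return f_kin, f_cut

# e.g. split_feedback(2.0, 0.7) splits 2.0 N into roughly 1.4 N
# kinesthetic plus 0.6 N cutaneous, so the perceived total is preserved.
```

The point of the split is that the cutaneous channel is ungrounded and cannot destabilize the loop, so restoring the residual force there recovers transparency without violating the stability condition.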
A teleoperation system for bevel-tipped flexible needle steering has been evaluated. Robotic systems have been exploited as the main tool to achieve high accuracy and reliability. However, for reasons of safety and acceptance by the surgical community, keeping the physician tightly in the loop is preferable. The system uses ultrasound imaging, path planning, and control to compute the desired needle orientation during the insertion and intuitively passes this information to the operator, who teleoperates the motion of the needle's tip. Navigation cues about the computed orientation are provided through haptic and visual feedback to the operator to steer the needle. The targeting accuracy of several co-manipulation strategies was studied in four sets of experiments involving human subjects with clinical backgrounds. Experimental results show that receiving feedback regarding the desired needle orientation improves the targeting accuracy by a factor of 9 with respect to manual insertions. Copyright © 2015 John Wiley & Sons, Ltd.
Despite the expected clinical benefits, current teleoperated surgical robots do not provide the surgeon with haptic feedback, largely because grounded forces can destabilize the system's closed-loop controller. This article presents an alternative approach that enables the surgeon to feel fingertip contact deformations and vibrations while guaranteeing the teleoperator's stability. We implemented our cutaneous feedback solution on an Intuitive Surgical da Vinci Standard robot by mounting a SynTouch BioTac tactile sensor to the distal end of a surgical instrument and a custom cutaneous display to the corresponding master controller. As the user probes the remote environment, the contact deformations, DC pressure, and AC pressure (vibrations) sensed by the BioTac are directly mapped to input commands for the cutaneous device's motors using a model-free algorithm based on look-up tables. The cutaneous display continually moves, tilts, and vibrates a flat plate at the operator's fingertip to optimally reproduce the tactile sensations experienced by the BioTac. We tested the proposed approach by having eighteen subjects use the augmented da Vinci robot to palpate a heart model with no haptic feedback, only deformation feedback, and deformation plus vibration feedback. Fingertip deformation feedback significantly improved palpation performance by reducing the task completion time, the pressure exerted on the heart model, and the subject's absolute error in detecting the orientation of the embedded plastic stick. Vibration feedback significantly improved palpation performance only for the seven subjects who dragged the BioTac across the model, rather than pressing straight into it.
- Jul 2015
- Ambient Assisted Living (Italian Forum 2014)
Large and crowded public places can easily disorientate elderly people. The EU FP7 project Devices for Assisted Living (DALi) aims at developing a robotic wheeled walker able to assist people with moderate cognitive problems to navigate in complex indoor environments where other people, obstacles and multiple points of interest may confuse or intimidate the users. The walking assistant, called c-Walker, is designed to monitor the space around the user, to detect possible hazards and to plan the best route towards a given point of interest. In this chapter, an overview of the system and some of its most important functions are described.
Wearable technologies have gained great popularity in recent years. The demand for lightweight and compact devices challenges researchers to pursue innovative solutions that make existing technologies more portable and wearable. In this paper we present a novel wearable cutaneous fingertip device with 3 degrees of freedom. It is composed of two parallel platforms: the upper body is fixed on the back of the finger and houses three small servo motors, while the mobile end-effector is in contact with the volar surface of the fingertip. The two platforms are connected by three articulated legs, actuated by the motors in order to move the mobile platform toward the user's fingertip and re-angle it to simulate contacts with arbitrarily oriented surfaces. Each leg is composed of two rigid links, connected to each other and then to the platforms according to an RRS (Revolute-Revolute-Spherical) kinematic chain. With respect to similar cable-driven devices presented in the literature, this device solves the indeterminacy due to the underactuation of the platform. This work presents the main design steps for the development of the wearable display, along with its kinematics, quasi-static modeling, and control. In particular, we analyzed the relationship between the device's performance and its main geometrical parameters. A perceptual experiment shows that the cutaneous device is able to effectively render different platform configurations.
Telerobotic systems enable humans to explore and manipulate remote environments for applications such as surgery and disaster response, but few such systems provide the operator with cutaneous feedback. This article presents a novel approach to remote cutaneous interaction; our method is compatible with any fingertip tactile sensor and any mechanical tactile display device, and it does not require a position/force or skin deformation model. Instead, it directly maps the sensed stimuli to the best possible input commands for the device's motors using a data set recorded with the tactile sensor inside the device. As a proof of concept, we considered a haptic system composed of a BioTac tactile sensor, in charge of measuring contact deformations, and a custom 3-DoF cutaneous device with a flat contact platform, in charge of applying deformations to the user's fingertip. To validate the proposed approach and discover its inherent tradeoffs, we carried out two remote tactile interaction experiments. The first one evaluated the error between the tactile sensations registered by the BioTac in a remote environment and the sensations created by the cutaneous device for six representative tactile interactions and 27 variations of the display algorithm. The normalized average errors in the best condition were 3.0% of the BioTac's full 12-bit scale. The second experiment evaluated human subjects' experiences for the same six remote interactions and eight algorithm variations. The average subjective rating for the best algorithm variation was 8.2 out of 10, where 10 is best.
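The data-driven mapping described above can be caricatured as a nearest-neighbor lookup over the recorded (sensation, command) pairs: a new remote reading is mapped to the motor command whose recorded sensation is closest. This sketch is our simplification, not the article's actual display algorithm, and the calibration data are invented.

```python
# Sketch of a model-free sensor-to-actuator mapping: choose the motor
# command whose recorded sensation (captured with the tactile sensor held
# inside the display) best matches the current remote reading.
# Data and names are illustrative assumptions.

def nearest_command(reading, dataset):
    """dataset: list of (sensation_vector, motor_command) pairs."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(dataset, key=lambda pair: dist2(pair[0], reading))
    return best[1]

# Toy calibration set: 2-D "sensation" vectors -> 1-D motor position
table = [((0.0, 0.0), 0), ((0.5, 0.1), 40), ((1.0, 0.3), 90)]
# A remote reading of (0.6, 0.15) falls closest to the middle entry,
# so the display is driven to command 40.
```

A real implementation would interpolate between neighbors and weight the sensor channels, which is where the article's 27 algorithm variations come in.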
Wearable robots have been mostly designed as exoskeletons, with segments and joints corresponding to those of the person they are coupled with. Exoskeletons are mainly employed to augment human body force and precision capabilities, or for rehabilitation purposes. More recently, new wearable robots resembling additional robotic limbs have been developed thanks to progress in the miniaturization and efficiency of mechanical and sensing components. However, the wearable robotic extra limbs presented in the literature lack effective haptic feedback systems. In this paper, we present a robotic extra finger coupled with a vibrotactile ring interface. The human user controls the motion of the robotic finger through a switch placed on the ring, while being provided with vibrotactile feedback about the forces exerted by the robotic finger on the environment. To understand how to control the vibrotactile interface to evoke the most effective cutaneous sensations, we executed perceptual experiments to evaluate its absolute and differential thresholds. We also carried out a pick-and-place experiment with ten subjects. Haptic feedback significantly improved task performance in terms of completion time, exerted force, and perceived effectiveness. All subjects preferred the experimental conditions employing haptic feedback over those not providing any force feedback.
In this paper we present a mathematical framework to describe the interaction between compliant hands and environmental constraints during grasping tasks. In the proposed model, we consider compliance at the wrist, joint, and contact levels. We model the general case in which the hand is in contact with both the object and the surrounding environment; all other contact cases can be derived from the proposed system of equations. We performed several numerical simulations using the SynGrasp Matlab Toolbox to prove the consistency of the proposed model, testing different combinations of compliance as well as different reference inputs for the considered hand/arm system. This work is intended as a tool for compliant hand designers, since it allows tuning compliance at different levels before the actual hand is built. Furthermore, the same framework can be used for compliant hand simulation, in order to study the interaction with environmental constraints and to plan complex manipulation tasks.
In this paper a novel teleoperation framework for aerial robots that physically interact with the environment is presented. This framework allows teleoperating the robot both in contact-free flight and in physical contact with the environment, e.g., to apply desired forces on objects in the environment. The framework is built upon an impedance-like indirect interaction force controller that allows standard underactuated aerial robots to be used as force effectors. Haptic feedback on the master side enables the user to feel the contact forces exerted by the robot. An automatic potential-field-based slowing-down policy is used by the robot to ensure a smooth transition between the contact-free motion phase and the force interaction phase. The effectiveness of the approach has been shown in extensive human-in-the-loop simulations, including remotely pressing buttons on a surface and pushing a cart until it touches a wall.
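A one-dimensional caricature of indirect force control via an impedance law: commanding a virtual reference pose beyond the contact surface makes the impedance behavior exert a desired steady-state force. The gains and names below are our illustration, not the controller identified in the paper.

```python
# Sketch of impedance-like indirect force control in 1-D; gains are
# illustrative assumptions, not the paper's tuned values.

def impedance_force(x, v, x_ref, k=50.0, d=5.0, v_ref=0.0):
    """Impedance law: f = k (x_ref - x) + d (v_ref - v)."""
    return k * (x_ref - x) + d * (v_ref - v)

def reference_for_force(x_surface, f_desired, k=50.0):
    """Place the reference inside the surface to obtain f_desired at rest."""
    return x_surface + f_desired / k

# At rest against a surface at x = 0 (v = 0), a reference pushed
# f_desired / k = 2 cm past the surface yields 1 N of contact force:
# impedance_force(0.0, 0.0, reference_for_force(0.0, 1.0))
```

This is what lets an underactuated aerial vehicle act as a force effector: the position controller never needs a direct force loop, only a suitably displaced reference.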
- May 2015
In this paper we present a study of the human hand during digital handwriting on a tablet. Two cases are considered: writing with the finger and writing with a stylus. We adopt an approach based on the biomechanics of the human hand to compare the two input methods. Performance is evaluated using metrics originally introduced and developed in robotics, such as manipulability indexes. Analytical results show that writing with the finger is better suited to large but less accurate motions, while writing with the stylus yields higher precision and more isotropic motion performance. We then carried out two digital handwriting experiments to support the approach and contextualize the results.
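The robotics metrics mentioned above can be illustrated with their generic definitions (not the paper's specific hand model): Yoshikawa's manipulability measure and an isotropy index derived from the Jacobian's singular values.

```python
import numpy as np

# Generic definitions, with a made-up toy Jacobian for illustration.
def manipulability(J):
    """Yoshikawa's manipulability measure: volume of the velocity ellipsoid."""
    return np.sqrt(np.linalg.det(J @ J.T))

def isotropy(J):
    """Ratio of smallest to largest singular value; 1.0 = isotropic motion."""
    s = np.linalg.svd(J, compute_uv=False)
    return s.min() / s.max()

J = np.array([[2.0, 0.0], [0.0, 1.0]])  # toy fingertip Jacobian
w = manipulability(J)  # larger w: larger achievable motions
k = isotropy(J)        # low k: performance varies strongly with direction
```

In these terms, the paper's finding reads as: the finger configuration yields a larger `w` (large motions), while the stylus yields a `k` closer to 1 (more isotropic, more precise motion).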
The Devices for Assisted Living (DALi) project is a research initiative sponsored by the European Commission under the FP7 programme, aiming at the development of a robotic device to assist people with cognitive impairments in navigating complex environments. The project revisits the popular paradigm of the walker, enriching it with sensing abilities (to perceive the environment), cognitive abilities (to decide the best path across the space), and mechanical, visual, acoustic, and haptic guidance devices (to guide the person along the path). In this paper, we offer an overview of the developed system and describe some of its most important technological aspects in detail.
In this paper we present an off-line Kalman filter approach to remove transcranial magnetic stimulation (TMS)-induced artifacts from electroencephalographic (EEG) recordings. Two dynamic models describing EEG and TMS signal generation are identified from data, and the Kalman filter is applied to the linear system arising from their combination. The keystone of the approach is the use of time-varying covariance matrices, suitably tuned on the physical parameters of the problem, that allow us to model the non-stationary components of the EEG/TMS signal neglected by conventional stationary filters. The approach guarantees efficient deletion of TMS-induced artifacts while preserving the integrity of EEG signals around TMS impulses. Experimental results show that the Kalman filter achieves a significant performance improvement over standard stationary filters.
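The role of the time-varying covariances can be illustrated with a minimal scalar Kalman filter; the paper's actual EEG/TMS models are identified from data, so the state model `A`, `H`, `R`, and the `Q` schedule below are made-up placeholders:

```python
# Minimal scalar Kalman filter with a time-varying process covariance Q_k.
# Inflating Q around the (assumed) TMS pulse lets the filter track the fast
# non-stationary transient instead of smoothing it away.
def kalman_filter(y, A=1.0, H=1.0, R=0.5, Q=None):
    x, P = 0.0, 1.0
    estimates = []
    for k, yk in enumerate(y):
        Qk = Q[k] if Q is not None else 0.01
        # predict
        x, P = A * x, A * P * A + Qk
        # update
        K = P * H / (H * P * H + R)
        x = x + K * (yk - H * x)
        P = (1 - K * H) * P
        estimates.append(x)
    return estimates

Q_schedule = [0.01] * 100
for k in range(48, 55):       # hypothetical TMS pulse window
    Q_schedule[k] = 10.0
est = kalman_filter([1.0] * 100, Q=Q_schedule)
```

Inside the inflated-Q window the Kalman gain approaches 1, so the estimate follows the measurement almost exactly; outside it, the small Q restores strong smoothing.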
This paper investigates the geometric and structural characteristics involved in the control of rather general manipulation systems, consisting of multiple cooperating linkages interacting with a reference member of the mechanism (the "object") by means of contacts on any available part of their links. Object grasp and manipulation by the human hand is taken as a paradigmatic example of this class of manipulators. The paper reports some recent results on the analysis and control of these mechanisms, based on a geometric analysis of a local approximation of the system dynamics. Special attention is devoted to force control via algebraic output feedback.
There are several scenarios where microrobots can be beneficial, especially in the field of medicine. The use of microdevices can in fact enable clinicians to perform less invasive diagnostic, therapeutic and surgical interventions, thanks to the fact that these robots can provide new ways of accessing areas of the patient's body that are hard to reach (e.g., deeply-located tumors). This paper presents an innovative teleoperation system with force reflection for steering self-propelled microjets in 2-dimensional space, shown in Fig. 1. The propulsion of these microjets is based on the catalytic decomposition of hydrogen peroxide by thin layers of platinum, which generates bubbles and leads to the fast forward jet motion of the microtube. The proposed teleoperation system enables the human operator to intuitively and accurately control the motion of a microjet in the remote environment while providing him/her with compelling haptic force feedback. A novel particle-filter-based visual tracking algorithm tracks the position of the microjet in the remote environment at runtime. A 6-degrees-of-freedom (6-DoF) haptic interface then provides the human operator with haptic feedback about the interaction between the controlled microjet and the environment, as well as enabling the operator to intuitively control the target position of the microjet. Finally, a wireless magnetic control system regulates the orientation of the microjet to reach the target point. Figure 2 shows how the tracking, haptic, and control systems are interconnected. 1) Tracking System: A high-resolution camera is placed above the Petri dish hosting the environment. The camera has an adjustable zoom with a maximum of 24X, and is mounted on a linear stage to enable precise focusing. Each frame registered by the camera is first filtered by a Laplacian of Gaussian (LoG) filter, which is used to find areas of rapid change (edges) in the image.
Subsequently, the tracker selects the target object based on shape, size, and temporal consistency, and then estimates its position. Finally, to robustly track inconsistent shapes (e.g., the bubble trails of the microjets) and to effectively reject other microjets that we do not want to control, we use a particle filter. The tracker uses the estimated position to weight the particles of the particle filter. After the weighting, the particles are also used for position estimation in the next frame: they are resampled based on their weights and translated based on the measured object velocity. Experiments showed that the tracker is able to track microjets in 2-D with an average precision of 90.4 µm. 2) Haptic System: The haptic feedback system is composed of a 6-DoF Omega haptic interface (Force Dimension, Switzerland). We measure the position of the Omega's end-effector, controlled by the human operator, to set the reference target position of the microjet. At the same time, through the same end-effector, we provide the operator with force feedback from the remote environment (see Sec. II). 3) Control System: Given the current position of the microjet, as estimated by the tracking algorithm, and the commanded reference position, as set by the operator through the haptic interface, the control system regulates the orientation of the microjet toward the reference point.
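One cycle of the particle-filter tracking loop described above (weight, resample, propagate with the measured velocity) can be sketched as follows; for simplicity this sketch is one-dimensional and all kernel widths, noise levels, and particle counts are illustrative assumptions:

```python
import random

# One weight/resample/propagate cycle of a simple particle filter.
def pf_step(particles, measurement, velocity, noise=1.0):
    # weight: particles closer to the detected position get higher weight
    weights = [1.0 / (1e-6 + abs(p - measurement)) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # resample according to the weights
    resampled = random.choices(particles, weights=weights, k=len(particles))
    # propagate with the measured object velocity plus diffusion noise
    return [p + velocity + random.gauss(0.0, noise) for p in resampled]

random.seed(0)
particles = [random.uniform(0.0, 100.0) for _ in range(200)]
for _ in range(20):
    particles = pf_step(particles, measurement=50.0, velocity=0.0, noise=0.5)
estimate = sum(particles) / len(particles)  # concentrates near the target
```

The diffusion noise keeps the particle cloud from collapsing to a single point, which is what lets the filter absorb inconsistent shapes such as bubble trails from frame to frame.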
- Oct 2014
In this paper we present a new visuo-haptic interaction mechanism for human-robot formation control. The formation consists of a human leader and multiple follower robots. The mobile robots are equipped only with RGB-D cameras, and they must maintain a desired distance and orientation to the leader at all times. Mechanical limitations common to all the robots restrict the trajectories that the human can take. In this regard, vibrotactile feedback provided by a haptic bracelet guides the human along trajectories that are feasible for the team, by warning her/him when the formation constraints are being violated. Psychophysical tests on the bracelet, together with real-world experiments conducted with a team of Pioneer robots, show the effectiveness of the proposed visuo-haptic paradigm for the coordination of mixed human-robot teams.
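A formation-constraint check of the kind that would trigger the bracelet can be sketched as below; the function name, thresholds, and geometry conventions are assumptions for illustration, not the paper's controller:

```python
import math

# Hypothetical constraint check: the follower should keep a desired distance
# and bearing to the leader; a violation would trigger the vibrotactile cue.
def formation_violation(leader_xy, follower_xy, d_des, d_tol,
                        bearing_des, b_tol):
    dx = leader_xy[0] - follower_xy[0]
    dy = leader_xy[1] - follower_xy[1]
    dist = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # wrap the bearing error into (-pi, pi] before comparing
    bearing_err = abs(math.atan2(math.sin(bearing - bearing_des),
                                 math.cos(bearing - bearing_des)))
    return abs(dist - d_des) > d_tol or bearing_err > b_tol

# Follower 3 m behind a leader that expects it at 1.5 m: cue fires.
vibrate = formation_violation((0.0, 0.0), (3.0, 0.0), d_des=1.5, d_tol=0.3,
                              bearing_des=math.pi, b_tol=0.4)
```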
- Sep 2014
- Proceedings of the 5th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob)
This paper introduces ModGrasp, an open-source virtual and physical rapid-prototyping framework for the design, simulation, and control of low-cost sensorised modular hands. By combining the rapid-prototyping approach with the modular concept, different manipulator configurations can be modelled, with a real-time one-to-one correspondence between the virtual and physical prototypes. Different control algorithms can be implemented for the models. The low-cost sensing approach enables torque sensing at the joint level, sensitive collision detection, and joint compliance control. A 3-D visualization environment provides the user with intuitive visual feedback. As a case study, a three-fingered modular manipulator is presented, and related simulations are carried out to validate the efficiency and flexibility of the proposed rapid-prototyping framework.
Needle insertion in soft tissue is a minimally invasive surgical procedure that demands high accuracy. In this respect, robotic systems with autonomous control algorithms have been exploited as the main tool for achieving high accuracy and reliability. However, for reasons of safety and responsibility, autonomous robotic control is often not desirable, so it is also necessary to develop techniques enabling clinicians to directly control the motion of the surgical tools. In this work we address that challenge and present a novel teleoperated robotic system able to steer flexible needles. The proposed system tracks the position of the needle using an ultrasound imaging system and computes the needle's ideal position and orientation to reach a given target. The master haptic interface then provides the clinician with mixed kinesthetic-vibratory navigation cues to guide the needle toward the computed ideal position and orientation. Twenty participants carried out a teleoperated needle-insertion experiment on a soft-tissue phantom under four experimental conditions: participants were provided with either mixed kinesthetic-vibratory feedback or mixed kinesthetic-visual feedback, and the ideal position and orientation of the needle were computed either with or without set-points. Vibratory feedback was found to be more effective than visual feedback in conveying navigation cues, with a mean targeting error of 0.72 mm when using set-points and 1.10 mm without set-points.
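Mixed kinesthetic-vibratory cues of the kind described above can be sketched as a spring-like force pulling the master toward the ideal needle position, plus a vibration triggered by orientation error; the gains and thresholds are assumptions, not the paper's values:

```python
# Hedged sketch of mixed navigation cues (k_p and angle_thr are made up):
# a proportional force attracts the master device toward the ideal position,
# while vibration warns about orientation error above a threshold.
def navigation_cues(pos, ideal_pos, angle_err, k_p=50.0, angle_thr=0.1):
    force = [k_p * (i - p) for p, i in zip(pos, ideal_pos)]  # kinesthetic cue
    vibrate = abs(angle_err) > angle_thr                     # vibratory cue
    return force, vibrate

force, vibrate = navigation_cues(pos=[0.01, 0.0, 0.02],
                                 ideal_pos=[0.0, 0.0, 0.02],
                                 angle_err=0.25)
```

Separating the two channels lets the position error be rendered continuously through the kinesthetic channel while the orientation warning stays an unambiguous binary cue.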