Towards A Robot Hardware Abstraction Layer (R-HAL) Leveraging
the XBot Software Framework
Giuseppe F. Rigano1, Luca Muratore1,2, Arturo Laurenzi1, Enrico Mingo Hoffman1, and Nikos G. Tsagarakis1
Abstract— In this paper we present a new Robot Hardware Abstraction Layer (R-HAL) that permits to seamlessly program and control any robotic platform powered by the XBot control software framework. The implementation details of the R-HAL are introduced. The R-HAL is extensively validated through simulation trials and experiments with a wide range of dissimilar robotic platforms, among them the COMAN and WALK-MAN humanoids, the KUKA LWR and the CENTAURO upper body. The results attained demonstrate in practice the gained benefits in terms of code compatibility, reuse and portability, and finally unified application programming even for robots with significantly diverse hardware. Furthermore, it is shown that the implementation and integration of the R-HAL within the XBot framework does not generate additional computational overhead for the robot computational units.
I. INTRODUCTION
The Hardware Abstraction Layer (HAL) is a software
component that can be incorporated to mask the physical
robot hardware differences and limitations (e.g. data link
layer, kinematic model, sensors, etc.) in order to provide a
relatively uniform abstraction layer able to assure portability
and code-reuse for any robot platform.
The RT XBotCore [1] platform was initially designed to provide only a basic level of robot hardware abstraction, restricted to EtherCAT based robots, which made interfacing with any new robot hardware really difficult. Despite the EtherCAT abstraction, XBotCore remained highly tied to the low level software. In the work presented herein we show the design choices taken for the implementation of the robot hardware abstraction layer and the capabilities of this software component, which enables us to efficiently port and run the same control software modules on different robots, both in simulation and on the real hardware platforms.
II. R-HAL
The lack of a HAL made XBotCore an embedded component inside the low level software, relying on the Template Pattern to become part of the control loop. The purpose of the proposed Robot HAL (R-HAL) design is to provide a middleware with autonomous threading capabilities and with no dependencies on the low level hardware/software. The R-HAL achieves this goal by providing three abstract methods.
1Advanced Robotics Department (ADVR), Istituto Italiano di Tecnologia,
Genova, Italy
2School of Electrical and Electronic Engineering, The University of
Manchester, M13 9PL, UK
{giuseppe.rigano, luca.muratore, arturo.laurenzi,
enrico.mingo, nikolaos.tsagarakis}@iit.it
Figure 1. The proposed XBot R-HAL: it assures high
flexibility towards any type of robotic platform or simulation
environment.
In detail, adopting a master-slave approach, the R-HAL lifecycle is described by the following methods:
- init(): used in the initialization phase (opening connections, initialization of data structures, etc.);
- recvFromSlave(): designed to communicate with the slaves and fill the data structures;
- sendToSlave(): deals with the communication needed to send the reference signals to the slaves.
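A minimal sketch of what such an abstract interface could look like in C++ follows. Apart from the three lifecycle methods named above, all class and method names here are illustrative stand-ins, not the actual XBot API:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical sketch of the R-HAL abstract interface exposing the three
// lifecycle methods described in the text. Real XBot class names may differ.
class RHal {
public:
    virtual ~RHal() = default;
    virtual bool init() = 0;           // open connections, set up data structures
    virtual void recvFromSlave() = 0;  // read the current state from the slaves
    virtual void sendToSlave() = 0;    // send reference signals to the slaves
};

// Toy implementation standing in for e.g. XBotEcat or XBotGazebo.
class DummyHal : public RHal {
public:
    bool init() override { state_["joint1"] = 0.0; return true; }
    void recvFromSlave() override { /* would read the fieldbus here */ }
    void sendToSlave() override { state_["joint1"] = reference_; }

    void setReference(double ref) { reference_ = ref; }
    double position() const { return state_.at("joint1"); }

private:
    std::map<std::string, double> state_;
    double reference_ = 0.0;
};

// A controller sees only the abstract interface, so the same control code
// runs unchanged on any platform that implements these three methods.
void controlStep(RHal& hal) {
    hal.recvFromSlave();               // 1. acquire the robot state
    /* ... compute the control law ... */
    hal.sendToSlave();                 // 2. dispatch the references
}
```

The point of the sketch is that `controlStep` never names a concrete platform; swapping `DummyHal` for an EtherCAT or Gazebo implementation requires no change to the control code.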
Figure 2. R-HAL components interaction.
In particular, we provide the following implementations out-of-the-box: XBotEcat, XBotEthernet, XBotKuka and XBotGazebo, respectively for EtherCAT based robots, Ethernet based robots, the KUKA LWR 4 and Gazebo based simulators. Each implementation has to provide the behaviour of the XBotJoint, XBotFT, XBotIMU and XBotHand interfaces. However, the described design does not prevent the user from adding new behaviours, since it is only required to implement another set of interfaces. All the R-HAL implementations are built as shared libraries loaded at runtime according to what is specified in a configuration file. In particular, the factory design pattern has been adopted to load/unload the several implementations, whose symbols are resolved immediately to avoid slowdowns during the RT execution. In order to provide more flexibility, a Controller interface is used to represent a generic controller behaviour. The XBotCore class acts as a specific controller implementation and interacts with the R-HAL by calling the suitable methods, as shown in Fig. 2. The described abstraction layer makes it easy to switch between simulation and the real robot, as shown in Fig. 1.
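The factory idea can be sketched as follows. This is an illustrative in-process registry, not the XBot implementation: in the real framework each implementation lives in a shared library opened at runtime (e.g. via dlopen with RTLD_NOW, so that all symbols are bound at load time rather than lazily inside the RT loop), and the key comes from the configuration file:

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <memory>
#include <string>
// #include <dlfcn.h>  // the real framework would resolve creators via dlopen()/dlsym()

// Minimal base type standing in for a loadable R-HAL implementation.
struct HalBase {
    virtual ~HalBase() = default;
    virtual std::string name() const = 0;
};

// Factory mapping the implementation name found in the configuration
// file to a creator function.
class HalFactory {
public:
    using Creator = std::function<std::unique_ptr<HalBase>()>;

    void registerHal(const std::string& key, Creator c) {
        creators_[key] = std::move(c);
    }

    // Returns nullptr when the configured implementation is unknown.
    std::unique_ptr<HalBase> create(const std::string& key) const {
        auto it = creators_.find(key);
        return it == creators_.end() ? nullptr : it->second();
    }

private:
    std::map<std::string, Creator> creators_;
};

// Toy implementations named after the out-of-the-box variants.
struct EcatHal : HalBase { std::string name() const override { return "XBotEcat"; } };
struct GazeboHal : HalBase { std::string name() const override { return "XBotGazebo"; } };
```

Selecting the implementation through a string key is what lets a single configuration-file entry switch the whole stack between, say, the EtherCAT backend and the Gazebo simulator.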
A. Mixing RT-NRT controlled hardware
To support mixing NRT and RT controlled hardware, a software design that exploits the IPC mechanisms of Xenomai1 has been provided. A NRT thread manages the robotic hardware by exploiting the XDDP communication and running the low level end-effector control modules. The interaction between the RT and NRT domains takes place inside one of the possible HAL implementations, allowing us to load at runtime the specific RT-based or NRT-based control implementation.
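Xenomai's XDDP protocol exposes a datagram-socket API that bridges the RT (primary) and NRT (Linux) domains. The following stand-in shows the same producer/consumer message shape using a plain AF_UNIX datagram socketpair; it is NOT real Xenomai code, which would instead create an AF_RTIPC socket with IPCPROTO_XDDP, and the JointReference payload is an assumption for illustration:

```cpp
#include <cassert>
#include <sys/socket.h>

// Illustrative payload crossing the RT/NRT boundary; field names are
// hypothetical, not the actual XBot message layout.
struct JointReference {
    double position;
    double torque;
};

// "RT" side: push the latest reference across the domain boundary.
// With real XDDP the fd would belong to an AF_RTIPC/IPCPROTO_XDDP socket.
void sendReference(int fd, const JointReference& ref) {
    send(fd, &ref, sizeof(ref), 0);
}

// "NRT" side: the end-effector control thread pops one reference datagram.
JointReference receiveReference(int fd) {
    JointReference ref{};
    recv(fd, &ref, sizeof(ref), 0);
    return ref;
}
```

The datagram semantics matter here: each reference is delivered as a whole message, so the NRT consumer never observes a half-written reference even though the two sides run at different rates.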
III. EXPERIMENTS
To validate and evaluate the performance of the XBotCore
software platform, we performed a set of experiments on
different robotic platforms as described in the following
section.
A. Experimental Setup
The proposed software architecture has been validated on
the following robotic platforms (Fig. 1):
- WALK-MAN robot, a full-size humanoid [2];
- COMAN robot, a 95 cm tall humanoid;
- CENTAURO upper body prototype [3];
- KUKA LWR 4+, using the FRI (Fast Research Interface).
All the robotic platforms were tested both in the Gazebo environment and on the real hardware, by running a generic XBot plugin that performs a circular trajectory. In particular, the OpenSoT control framework [4] was used to solve the whole-body control and inverse kinematics problems. The software has been tested on a Xenomai patched Linux OS in combination with RTnet.
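As a sketch of the kind of reference such a circular-trajectory plugin could generate at each control tick, the function below returns a point moving on a circle of radius r around (cx, cy) in the end-effector plane. The function name and parameters are illustrative, not the actual XBot plugin API:

```cpp
#include <cassert>
#include <cmath>
#include <utility>

// Hypothetical reference generator for a circular end-effector trajectory:
// given the elapsed time t and the period of one revolution, return the
// (x, y) target on a circle of radius r centred at (cx, cy).
std::pair<double, double> circularReference(double t, double period,
                                            double cx, double cy, double r) {
    const double theta = 2.0 * M_PI * t / period;  // phase along the circle
    return { cx + r * std::cos(theta), cy + r * std::sin(theta) };
}
```

Called once per control period (1 ms at the 1 kHz rate used in the experiments), this yields a smooth reference that the inverse-kinematics solver can track.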
B. Results
In Fig. 3 we report the data of the experiments on the four robotic platforms in terms of control period. It is important to note that we requested a control frequency of 1 kHz (i.e. a 1 ms control period) for all the RT controlled platforms, except for the KUKA LWR 4+, which was controlled in NRT mode with a 5 ms control period. It is evident that no overhead has been introduced by the R-HAL compared to the former XBot software design.
1https://xenomai.org/
Figure 3. Control period comparison between the XBot architecture with the R-HAL and with no HAL.
IV. CONCLUSIONS
The rapid proliferation of robotic hardware of different forms poses a significant challenge to robot software developers, who have to deal with robotic systems with incompatible hardware, making code development, porting and reuse highly inefficient. To address this barrier, we presented the concept of the R-HAL, a new Robot Hardware Abstraction Layer leveraging the XBot software framework. We developed and validated the R-HAL on a range of different robotic platforms, showing that the proposed software component does not affect the performance of the XBot architecture. The autonomous threading capability, together with the R-HAL interface, makes it easy to adopt the XBot framework as a middleware for several different robots by exploiting the power of software polymorphism. Moreover, an example of integrating the RT and NRT domains to support hybrid RT-NRT hardware has been shown, leading to a very high flexibility towards any type of low-level interface. The XBot software platform is released free and open source2.
ACKNOWLEDGMENTS
The authors would like to thank Luka Peternel for the support provided during the experiments. The research leading to these results has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreements No 644839 (CENTAURO) and No 644727 (CogIMon).
REFERENCES
[1] L. Muratore, et al., “XBotCore: A Real-Time Cross-Robot Software
Platform,” in IEEE International Conference on Robotic Computing,
2017.
[2] N. G. Tsagarakis et al., "WALK-MAN: A High Performance Humanoid Platform for Realistic Environments," Journal of Field Robotics (JFR), 2016.
[3] L. Baccelliere et al., "Development of a high performance bi-manual platform for realistic heavy manipulation tasks," in IEEE/RSJ IROS (to appear), 2017.
[4] A. Rocchi et al., "OpenSoT: a whole-body control library for the compliant humanoid robot COMAN," in 2015 IEEE International Conference on Robotics and Automation (ICRA), 2015, pp. 6248–6253.
2https://github.com/ADVRHumanoids/XBotCore