International Journal of Humanoid Robotics
World Scientific Publishing Company

The design of the iCub humanoid robot

A. Parmiggiani et al.

Robotics, Brain and Cognitive Sciences Department, Istituto Italiano di Tecnologia, Via Morego 30, 16163 Genova, Italy
Advanced Robotics Department, Istituto Italiano di Tecnologia, Via Morego 30, 16163 Genova, Italy
Institute of Systems and Robotics, Instituto Superior Técnico, Av. Rovisco Pais, 1049-001 Lisboa, Portugal
Telerobot OCEM s.r.l., Via Semini 28C, 16163 Genova, Italy
DIST, University of Genoa, Viale Causa 13, 16145 Genova, Italy
Received 2nd August 2010
Revised 1st July 2011

Abstract
This article describes the hardware design of the iCub humanoid robot. The iCub is an open-source humanoid robotic platform designed explicitly to support research in embodied cognition. This paper covers the mechanical and electronic design of the first release of the robot. A series of upgrades developed for the second version of the robot (iCub2), aimed at improving its mechanical and sensing performance, are also described.
Keywords: humanoid robotics; open source; cognitive system.
1. Introduction
In recent years there has been growing worldwide attention to the development of humanoid robots. Although these robots are intended for real-world applications, most of them are at the moment research prototypes addressing the problems of mobility,1,2 entertainment,3,4,5 and service robotics,6,7,8 to cite a few. Humanoid robots are also often used as models to study human behaviour.9,10,11,12 The iCub (shown in Fig.1) can be considered a member of the latter category.
Fig. 1. The iCub. The figure shows a photograph of the iCub robot (a). The overall robot dimen-
sions are shown in (b).
2. RobotCub: open source robotics
Open source robotics, especially in its recent evolution, comes in two flavors, covering respectively the software components required to operate a robot platform (or a set of robot platforms) and the mechanical hardware. Examples of the first category are Orocos,13 OpenRTM14 and the Robot Operating System (ROS),15 a recent attempt at standardizing middleware for mobile robotics. A slightly older example of the latter is the Japanese open source robot Pino.16 There is also notable activity in the creation of open source electronic designs, exemplified by the OpenCores project.
The iCub is one of the results of the RobotCub project, an EU-funded endeavor to create a common platform for researchers interested in embodied artificial cognitive systems.17 The RobotCub project took a strong stance towards open source by releasing all of the Consortium's work under the GPL, FDL or LGPL: this includes the mechanical and electronics design together with the software infrastructure of the iCub. The software infrastructure is based on an open source middleware called YARP, which compiles cleanly on a number of operating systems (Linux, Windows and MacOS are supported) using a well-established set of libraries.
For the design of the electronics and mechanics we were forced to use proprietary CAD tools, owing to the absence of professional open source counterparts. This is an unfortunate situation, but there is no practical alternative at the moment.
Free-of-charge viewers are available for all file formats employed by the project. This does not, however, prevent the copying or reproduction of the iCub components, since 2D drawings or Gerber files suffice for manufacturing parts and printed circuit boards (PCBs).
In support of our open source stance, considerable effort was devoted to creating appropriate documentation for the robot. The current iCub documentation covers nearly all aspects of the robot design, from the mechanical hardware to the operating software. For RobotCub, it was decided to release all the CAD files under the GPL. The associated documentation was also licensed under the GPL. The YARP middleware is licensed either as GPL or LGPL.
(a) (b)
Fig. 2. The iCub kinematic structure. The figure shows a CAD representation of the iCub (a) and
of its kinematic structure (b). For visual clarity the representation of the eyes and hand joints has
been omitted.
The CAD models of the robot are available at
The documentation of the robot can be consulted at
3. Mechanical design of the iCub
This section describes the details of the mechanical design of the iCub robot. In
its final release at the end of the RobotCub project, the iCub is approximately
1[m] tall (see Fig.1(b)), has 53 active degrees of freedom (DOF) and has a mass of
approximately 24[kg].
3.1. Design specifications
The initial specifications for the design of the robot aimed at replicating the size
of a three-year-old child19. In particular, it was required that the robot be capable
of crawling on all fours and possess fine manipulation abilities. For a motivation
of why these features are important, the interested reader is referred to Metta
et al.17. The initial dimensions, kinematic layout and ranges of movement were
drafted by considering biomechanical models and anthropometric tables.17,20 Rigid body simulations allowed us to determine the crucial kinematic features of the human body to be replicated in order to perform the set of desired tasks and motions.17,21 These simulations also provided joint torque requirements: these data were then used as a baseline for the selection of the robot's actuators. The final kinematic structure of the robot is shown in Fig.2(b). The iCub kinematic structure has several peculiar features which are rarely found in humanoid robots. The waist features a three DOF torso which considerably increases the robot's mobility. Moreover, the three DOF shoulder joint is constructed such that its three axes of rotation always intersect at a single point. The main DOF of the iCub robot are listed in Table 1; for more detailed information the reader should refer to the official iCub documentation.
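As a small illustration of how the ranges in Table 1 can be used in software, the limits may be stored in a lookup table and commanded angles saturated against them. The snippet below transcribes a few Table 1 entries; the dictionary and function names are our own, not part of the iCub software:

```python
# Joint limits transcribed from Table 1 (degrees); a partial,
# illustrative subset with naming of our own choosing.
JOINT_LIMITS = {
    "shoulder_pitch": (-95.0, 10.0),
    "shoulder_roll": (0.0, 160.0),
    "elbow_flexion": (5.0, 105.0),
    "knee_flexion": (0.0, 130.0),
}

def clamp_command(joint: str, angle_deg: float) -> float:
    """Saturate a commanded angle to the joint's mechanical range."""
    lo, hi = JOINT_LIMITS[joint]
    return min(max(angle_deg, lo), hi)
```

A command of 200 deg on the shoulder roll, for instance, would be saturated to the mechanical limit of 160 deg.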
3.2. Actuators
To match the aforementioned torque requirements several actuator technologies
were considered22,23. Among the various alternatives rotary electric motors coupled
with speed reducers were preferred because of their higher robustness and reliability.
In total, three modular motor groups with different characteristics were developed; this allowed their reuse throughout the main joints of the robot. All of them comprise a Kollmorgen-DanaherMotion RBE series frameless brushless motor and a CSD series frameless Harmonic Drive flat speed reducer (see Fig. 3). Brushless motors have a
very good power density and generally outperform conventional brushed DC motors.
Harmonic Drive speed reducers are very light, have practically no backlash, and
allow very high reduction ratios in small spaces. The use of frameless components
Kollmorgen DanaherMotion product website:
Harmonic Drive product website:
allows further optimization of space and avoids the unnecessary weight of the housings.

degree of freedom              range of motion [deg]
shoulder  pitch                -95 to +10
          roll                   0 to +160
          yaw                  -37 to +80
elbow     flexion/extension     +5 to +105
          pronation/supination -30 to +30
wrist     flexion/extension    -90 to +90
          abduction/adduction  -90 to +90
waist     roll                 -90 to +90
          pitch                -10 to +90
          yaw                  -60 to +60
hip       flexion/extension   -120 to +45
          abduction/adduction  -30 to +45
          rotation             -90 to +30
knee      flexion/extension      0 to +130
ankle     flexion/extension    -60 to +70
          abduction/adduction  -25 to +25
neck      pan                  -90 to +90
          tilt                 -80 to +90
          roll                 -45 to +45

Table 1. The joints of the iCub. The table lists the main joints of the iCub robot and their respective ranges of motion.

The characteristics of the actuator modules are the following:

- the high-power motor group: capable of delivering 40Nm of torque, it is based on the RBE 01211 motor and a CSD-17-100-2A Harmonic Drive, and has, roughly, a diameter of 60mm and a length of 50mm;
- the medium-power motor group: capable of delivering 20Nm of torque, it is based on the RBE 01210 motor and a CSD-14-100-2A Harmonic Drive, and has, roughly, a diameter of 50mm and a length of 50mm;
- the low-power motor group: capable of delivering up to 11Nm, it is based on the RBE 00513 motor and a CSD-14-100-2A Harmonic Drive, and has, approximately, a diameter of 40mm and a length of 82mm.
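As a rough worked example of the sizing logic above, the motor torque required upstream of a speed reducer is the joint torque divided by the reduction ratio and the transmission efficiency. The 100:1 ratio matches the CSD Harmonic Drives named above; the 0.8 efficiency is an illustrative assumption, not a datasheet value:

```python
def required_motor_torque(joint_torque_nm: float,
                          reduction_ratio: float = 100.0,
                          efficiency: float = 0.8) -> float:
    """Continuous motor torque needed upstream of the speed reducer.

    The 100:1 ratio matches the CSD Harmonic Drives cited in the text;
    the 0.8 efficiency is an illustrative assumption, not a datasheet value.
    """
    return joint_torque_nm / (reduction_ratio * efficiency)

# e.g. the 40 Nm high-power group would need about 0.5 Nm at the motor
```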
3.3. Cable drives
In the design of the robot cable drive transmissions are widely employed. Cable
transmissions can be used to efficiently transmit power from an actuator to a driven
link whose range of rotation is limited. Cable drives allow the transmission of power
between bodies rotating along different axes with driven pulleys, stepped pulleys,
Fig. 3. Motor group cross section. The figure shows a cross section of an iCub motor group, in which the output shaft, wave generator, motor rotor, motor stator, circular spline and magnetic position sensor are labeled. The Harmonic Drive and Kollmorgen brushless motor are clearly visible.
pinions, and idle pulleys. Whenever space is limited they are a good alternative to geared transmissions. Although this kind of transmission generally has a lower mechanical stiffness than gears, if properly designed it achieves higher efficiencies. Moreover, cable drives can be used to construct epicyclic transmission mechanisms similar to the one introduced by Salisbury et al.24,25 and refined by Townsend.26 In conventional "serial" manipulators all the motors and speed reduction units are mounted directly on the joints, thus increasing the inertial loads on the motors. By using coupled cable transmissions, instead, the joints can be driven remotely: motors can thus be mounted proximally rather than on the joint itself. A mechanism of this kind has several advantages, among which are a more compact size and lower weights and inertias compared to standard serial designs. Another advantage is that this kind of transmission generally yields larger workspaces. However, it is generally affected by some drawbacks, such as higher mechanical complexity (therefore higher manufacturing costs and longer assembly times) and less intrinsic robustness. Examples of the implementation of cable drives can be found in the shoulder, elbow, torso, hip and ankle joints.
Let us consider, for illustrative purposes, the assembly of the two-stage shoulder roll joint, represented in Fig.4 and Fig.5. The output shaft of the motor block comprises a pulley that is connected to an idle pulley coaxial with the main motor group of the shoulder joint. This connection is obtained with two high-resistance steel cables (1.5[mm] cross-sectional diameter), manufactured by Carl Stahl and represented in pink and purple in Fig.5(a). Because cables can only transmit forces through tension, two of them are always necessary to obtain forward and backward motion. Two other cables (represented in red and blue in Fig.5(a)) connect the idle pulley to the output assembly through pulleys whose axes intersect at a 90[deg] angle, thus constituting the second stage of the transmission. Since the two cables cannot be wound on the same cylindrical surface (as suggested by Townsend25), a stepped pulley is employed to allow the correct cable routing.

Manufacturer's website:
3.4. Materials selection
The total weight specification was particularly difficult to meet: special care had to be taken in the design of structural elements to avoid adding mass. As far as materials are concerned, the majority of the parts of the robot were fabricated from the Al6082 aluminum alloy. With an ultimate tensile strength (UTS) of 310[MPa] and roughly the typical density of aluminum, 2700[kg/m3], Al6082 is among the best materials in the 6000 alloy series. For these reasons it was widely employed for all the parts that did not require particular strength characteristics. Another material that was employed is the Al7075 aerospace aluminum alloy, chosen for its excellent strength-to-weight ratio. The use of zinc as the primary alloying element results in a strong material with good fatigue strength and average machinability. Al7075 has a density of 2810[kg/m3], slightly higher than ordinary aluminum; its UTS of 524[MPa] is comparable with that of medium-quality steels and makes it one of the toughest aluminum alloys currently available. Components with more demanding mechanical requirements were therefore manufactured from this material. Finally, highly stressed parts (such as joint shafts) were obtained from the high-strength 39NiCrMo3 steel. This material, known in the AISI standard as AISI 9840, is a nickel-chromium-molybdenum steel that exhibits a good combination of strength, fatigue resistance, toughness and wear resistance. Its UTS is high, around 1.2[GPa].
3.5. The arm and elbow assemblies
The iCub arm has two joints: a three DOF proximal “shoulder” joint and a rota-
tional distal “elbow” joint (see Fig.4). The shoulder movements are obtained by
means of a cable driven epicyclic transmission of the kind described in section 3.3
which is shown in Fig.5(a). The three motors driving the shoulder are housed in the
upper-torso aluminum frame. The first motor directly actuates the shoulder pitch joint, whereas the second and third motors actuate two pulleys that are coaxial with the first motor. These pulleys have slightly different pitch diameters, thus producing a transmission reduction equal to the ratio of their diameters. The pulley motion is then transmitted to the shoulder roll and yaw joints through a second set of idle pulleys (see Fig.5(a) and Fig.5(b)). As a result the shoulder joint has its
Al6082 Matweb datasheet: print.aspx?matguid=fad29be6e64d4e95a241690f1f6e1eb7.
Al7075 Matweb datasheet: print.aspx?matguid=9852e9cdc3d4466ea9f111f3f0025c7d.
39NiCrMo3 Matweb datasheet: print.aspx?matguid=697130f21da64542a68bf61911f2f495.
three axes of rotation intersecting at a single point (which is a typical characteristic
of robotic wrist mechanisms) thus allowing “quasi”-spherical movements.
yaw axis
elbow joint
roll axis
pitch axis
Fig. 4. The iCub arm. The figure represents a CAD view of the arm of the iCub and its three
DOF shoulder joint and one DOF elbow joint.
Fig. 5. The shoulder joint. The figure shows a CAD view of the shoulder joint mechanism with a
highlight of the cable transmission system (a). The figure also shows a photo of a bottom view of
the shoulder joint (b).
This unconventional construction introduces kinematic couplings between the different motions; because of this coupling the relation between the displacements and the torques at motor and joint level is not straightforward. The technique outlined by Tsai for robotic wrist mechanisms27 is particularly convenient for the analysis of complex epicyclic transmissions and allows these relations to be derived for the iCub shoulder mechanism.28
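The essence of this kind of analysis is that, for a coupled transmission, motor and joint displacements are related by a constant coupling matrix, q = T θm, and static torques map through its transpose, τm = Tᵀ τj. A minimal sketch follows, with a hypothetical matrix in place of the actual iCub transmission ratios:

```python
# Hypothetical 3x3 coupling matrix T for a differential shoulder:
# joint displacements q = T * theta_m. The entries are illustrative
# placeholders, not the real iCub transmission ratios.
T = [
    [1.0,  0.0, 0.0],
    [-1.0, 1.0, 0.0],
    [0.0, -1.0, 1.0],
]

def mat_vec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def joint_from_motor(theta_m):
    """Joint displacements induced by motor displacements: q = T theta_m."""
    return mat_vec(T, theta_m)

def motor_torque_from_joint(tau_j):
    """Static torque mapping through the transpose: tau_m = T^T tau_j."""
    T_t = [list(col) for col in zip(*T)]
    return mat_vec(T_t, tau_j)
```

The transpose relation follows from power balance across the (ideal, lossless) transmission, which is why the same matrix governs both mappings.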
The one DOF elbow joint is rather simple in its design. The output link is
driven through a pulley system which transmits the power from the motor group.
The motor is housed at the center of the assembly, oriented at 90[deg] with respect to the axis of rotation of the elbow. A six-axis force-torque sensor is mounted at the
interface between the shoulder and elbow assembly.
3.6. The forearm and hand groups
The hand of the iCub has been designed to enable dexterous manipulation as this
capability is crucial for natural grasping behaviours (which are in turn fundamental
for our research in cognitive systems). The hand of the iCub has 19 joints but is driven by only 9 motors: this implies that groups of joints are under-actuated, and their movement is obtained through mechanical couplings. Similarly to the human hand, most of the hand actuation is located in the forearm subsection. In particular, seven of the nine motors driving the hand joints are placed in the forearm assembly. Given the limited amount of space available, 0.36 to 2.57[W] brushed DC electric motors
were employed. These electric motors are coupled to multistage planetary speed
reducers (whose reduction ratios vary from 159:1 to 256:1) to obtain the desired
torques. The output shaft of the motors is connected to capstans which wind and
unwind the steel cables that drive the movements of the phalanges.
The tendon arrangement is extremely critical; the cable routing therefore had to be done with extreme care and neatly organized according to specified guidelines (see Fig.6(b)). Moreover, each tendon has to be tensioned properly: this was achieved by inserting double-screwed tensioners along each cable.
The wrist is driven by a differential transmission mechanism of the type described in section 3.3. The flexing of the fingers, on the other hand, is directly driven by the motors, while their extension relies on a spring-return mechanism, thus reducing the overall complexity of the device. The motions of the proximal phalanx and of the medial and distal phalanges are independent for the thumb, the index and the middle finger. The ring and little finger motions are coupled and driven by a single motor. Finally, two motors, placed directly inside the hand assembly, are used for the adduction/abduction movements of the thumb and of the index, ring and little fingers. The position of each phalanx is sensed by 17 small custom magnetic position sensors.
The overall size of the hand is extremely compact: with its 50[mm] length, 60[mm] width and 25[mm] thickness it is one of the smallest and most dexterous of its kind. The design of the iCub hand has been addressed in greater detail in a recent paper by Schmitz et al.,29 to which the reader should refer for additional information.
The documentation can be consulted at
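The coupling of under-actuated joints can be pictured as one capstan rotation being distributed over several phalanges in fixed proportions. The weights below are hypothetical placeholders, not the iCub's actual pulley geometry:

```python
# Illustrative sketch of tendon-driven under-actuation: one motor
# capstan drives several coupled phalanges. The coupling weights are
# hypothetical, not the iCub's actual pulley radii.
COUPLING = {
    "ring_and_little": {"ring_proximal": 0.4, "ring_distal": 0.1,
                        "little_proximal": 0.4, "little_distal": 0.1},
}

def phalanx_angles(motor_group: str, capstan_angle_deg: float) -> dict:
    """Distribute one capstan rotation over the coupled phalanges."""
    weights = COUPLING[motor_group]
    return {joint: w * capstan_angle_deg for joint, w in weights.items()}
```

In a real under-actuated hand the effective distribution also depends on contact: once one phalanx is blocked by an object, the remaining tendon travel flexes the others, which is what the fixed weights above deliberately ignore.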
Fig. 6. The right forearm and hand of the iCub. The figure shows a photo of the right hand of the
iCub (a) and of the right forearm (b). The motor-capstan arrangement and the cable tensioning
devices, described in the main text can clearly be seen in (b).
3.7. The lower body and the torso
The preliminary phases of the design process described in section 3.1 suggested that
for effective crawling a two DOF waist/torso mechanism is adequate. However, a
three DOF waist was preferred to increase the range and flexibility of motions of
the upper body. As a result the robot can lean sideways, forwards and backwards, and rotate about its vertical axis.
The torso mechanism is also based on the differential epicyclic transmission
described in section 3.3. In this case, however, the two base motors drive a third motor group, whose axis is orthogonal to those of the first two. The first two motors jointly actuate the pitch and roll axes, whereas the third motor drives the yaw joint.
The torso subassembly is shown in Fig.7.
Fig. 7. The torso of the iCub. The figure shows a photo of the 3DOF torso mechanism and in
particular the construction of the differential cable drive transmission.
Since space and size constraints were not particularly critical for the legs, the lower body was designed with a more conventional "serial" configuration. The legs of
the iCub comprise a three DOF joint at the hip. In this joint the first DOF is driven
remotely by means of a cable drive actuated by a motor which is located in the lower
torso assembly (see Fig.8(a)). The leg includes a one DOF knee joint, actuated
by the knee flexion/extension motor, and a two DOF ankle (see Fig.8(b)). Each
ankle is actuated by a frameless brushless motor housed in the lower leg segments
which drives the flexion/extension movement and by a smaller motor group for the
abduction/adduction movement which is placed directly on the foot.
Fig. 8. Details of the legs. The figure shows detail photographs of the legs of the iCub robot. The first two axes of the 3DOF hip are shown in (a). The 2DOF ankle is shown in (b).
3.8. The head
The primary function of the head assembly is to move the cameras in order to quickly observe the environment. Two small video cameras are therefore mounted in the iCub's eyes (contained entirely inside the eyeballs). These cameras are moved by a three DOF eye mechanism which allows both tracking and vergence behaviors. The compact neck mechanism has three additional DOF arranged in a serial pitch, roll and yaw configuration (see Fig.1(c)). The three neck joints are driven by brushed DC motors coupled with low-backlash Gysin speed reducers, to avoid problems
when performing visual tasks. The eye movements are also achieved with three brushed DC motors, which drive the eyes through toothed belts (see Fig. 9). The belts can be tensioned by means of dedicated tensioning guides included in the mechanism. Besides the video cameras and the video processing boards, the head also contains the following elements:
- an Xsens MTx inertial sensor, which measures the three components of linear acceleration and angular velocity;
- a PC104 CPU, which is used for high-level motor control (see section 4.1);
- two small omnidirectional microphones, for auditory input;
- an MCP and two MC4 boards (see section 4.2), which are used to control the neck and eye motors;
- the facial expression boards, which control a set of LEDs that represent the facial expressions.
Fig. 9. Eyes mechanism. The figure shows a photo of the eyes mechanism; in particular the toothed
belt transmissions of the eyes tilt and pan axes can be seen.
4. Electronics and sensors
4.1. PC104
The PC104 card is used for the bidirectional communication of the iCub with the external control station. It is based on an Intel Core 2 Duo processor running at 2.16[GHz], has 1GB of RAM, and hosts the sensor acquisition and control electronics. The data to and from the different robot parts are transferred over several CAN bus lines. As the gradual improvements and the addition of sensors (see the following sections) required an increase in data throughput, in its latest revision the board interfaces with ten CAN bus ports.
4.2. Motor Control boards
The arms’ brushless motors are controlled with the BLL (BrushLess Logic) and
the BLP (BrushLess Power) electronic boards shown in Fig.10(a). The BLL board
processes the various signals provided by the sensors and generates the control
signals that govern the motion of the motors. These signals are then passed to the
BLP board which contains the actuator power drivers: the voltages applied to the
three phases are controlled by the amplifiers with pulse width modulation (PWM).
BLP boards can provide up to 20[A] at 48[V]. Similar but smaller boards have been developed to drive small, low-power DC motors. The power board and the controller board (which drives four motors independently) are conventionally called MCP and MC4 respectively. An MCP and three MC4 boards (see Fig.10(b)) can be used to control up to 12 DC motors, delivering up to 1[A] at 12[V] to each motor. The electronics are placed on board, near the motor-joint assemblies. Data to and from the BLL and MC4 boards are exchanged through CAN bus interfaces.
Fig. 10. Motor control boards. The figure shows the custom motor control boards developed for the RobotCub project. The BLP and BLL boards for high power motor groups are shown in (a). The low power MCP and MC4 boards are shown in (b).
4.3. Joint position sensors
As far as position sensing is concerned, each actuator unit contains three Hall-effect sensors integrated in the motor stator that can be used as an incremental rotary position sensor. This provides a low-resolution 48 cpr (counts per revolution) rotor position measurement which can be used for trapezoidal phase commutation. Moreover, every joint angular position is sensed with an absolute 12-bit angular encoder (employing the AS5045 chip from Austria Microsystems).
Fig. 11. AEA board. The figure shows a photograph of the 12 bit AEA encoder board.
In most cases there is no room to fit a position sensor in the frontal part of
the motor groups since all parts move with respect to the frame. For this reason it
was necessary to locate the position sensor at the rear of the motor. To do so, the movement of the output link is transmitted through the hollow shaft of the motor's rotor by a thin shaft that carries the sensor magnet: this arrangement can be seen in Fig.3.
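Converting the two position measurements to angles is straightforward; the sketch below assumes only the sensor counts stated in the text (48 cpr Hall commutation, 4096-count AS5045) and nothing about the actual firmware:

```python
AS5045_COUNTS = 4096   # 12-bit absolute encoder, counts per revolution
HALL_CPR = 48          # Hall-effect commutation counts per rotor revolution

def joint_angle_deg(abs_counts: int) -> float:
    """Absolute joint angle from the 12-bit AS5045 reading."""
    return 360.0 * (abs_counts % AS5045_COUNTS) / AS5045_COUNTS

def rotor_angle_deg(hall_counts: int) -> float:
    """Coarse incremental rotor angle from the Hall-sensor counts."""
    return 360.0 * hall_counts / HALL_CPR
```

At 4096 counts per revolution the absolute encoder resolves roughly 0.09[deg] at the joint, while the Hall sensors only give the coarse rotor position needed for commutation.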
Fig. 12. Six-axis force-torque sensor. The figure shows photos of the sensors’ three-spoke structure
(a), and the integrated electronic board (b).
4.4. Six-axis force-torque sensor
The iCub arm also comprises a six-axis force-torque sensor.21 The sensor load cell is based on a three-spoke structure machined from stainless steel (Fig.12(a)). On each side of each spoke a semiconductor strain gage is mounted: opposite strain gages are connected in a half Wheatstone bridge configuration. The sensor integrates an electronic board for data acquisition and signal conditioning (Fig.12(b)). The board samples six analog channels with an INA118 instrumentation amplifier: each input is connected to one of the six aforementioned half bridges. The analog-to-digital conversion is performed by an AD7685 converter on the multiplexed signals of the six channels. An offset can also be added by means of a DAC. The board additionally allows the installation of thermal compensation resistors that minimize the thermal drift of the semiconductor strain gages. All the operations are managed by a 16-bit DSP from Microchip (dsPIC30F4013), which also provides digital signal filtering and the linear transformation needed to project the strain gage signals into force/torque space. The data are finally broadcast through a CAN bus interface at a frequency of 1[kHz].
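The linear projection mentioned above has the form w = C(s − s0), with C a 6×6 calibration matrix identified experimentally for each sensor. A minimal sketch, in which the identity matrix of the example is purely illustrative:

```python
# Sketch of the on-board linear projection from the six strain-gage
# channels to force/torque space: w = C (s - s0). The 6x6 calibration
# matrix C is sensor-specific and identified experimentally; the
# identity matrix used below is purely illustrative.
def wrench_from_gages(raw, offsets, calib):
    """Return [Fx, Fy, Fz, Tx, Ty, Tz] from six gage readings."""
    s = [r - o for r, o in zip(raw, offsets)]
    return [sum(c_ij * s_j for c_ij, s_j in zip(row, s)) for row in calib]

identity6 = [[1.0 if i == j else 0.0 for j in range(6)] for i in range(6)]
```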
4.5. Pressure sensors for tactile feedback
Humanoid robots are required to sustain increasingly complex forms of interaction
(e.g. whole hand or whole arm grasping and manipulation30, etc.). In these cases the
location and the characteristics of the contact cannot be exhaustively predicted or
modeled in advance. Skin-like sensors and sensing methods are therefore required
for processing distributed tactile information. The problem is not new and some
pressure sensing technologies for humanoid robots were studied recently31,32.
We have equipped the hands of the iCub with a distributed pressure sensing system based on capacitive technology.33,34 This technology is based on modules which yield 12 independent measurements from 12 corresponding pressure sensing elements, called taxels in the following. The basis of each taxel is a round metal pad obtained on a flexible PCB. Flexible PCBs can be bent to cover generic curved surfaces, and their shape can be engineered to optimize the covering or the curvature of the robot surface. The flexible PCBs are then covered with a thin layer of soft silicone foam, roughly 2.5[mm] thick. This silicone layer acts as the dielectric medium of a capacitor. The foam is covered by an outer layer, which can be obtained either from conductive Lycra or from conductive silicone.
This layer is connected to ground and enables the sensor to respond to objects ir-
respective of their electrical properties (unlike consumer electronics products based
on the same technology). In addition, this layer reduces electric noise from the environment. When pressure is applied to the sensor, this conductive layer gets closer to the round pads on the PCB, thereby changing their capacitance. We use this change in capacitance as an estimate of the pressure applied to the sensor surface. The taxels are connected to a capacitance-to-digital converter chip (AD7147 from Analog Devices), which sends the measurements over an I2C serial bus. The data of up to
sixteen modules (for a total of 192 taxels) are collected by a small micro-controller
board, which can then relay them to the main CPU via CAN bus.
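A minimal sketch of the read-out principle for one 12-taxel module follows; the baseline subtraction and gain are our own illustration of the calibration a real system would need, not the iCub firmware:

```python
# Minimal sketch of the capacitive sensing principle: pressing the
# conductive top layer compresses the ~2.5 mm silicone dielectric,
# raising each taxel's capacitance. Baseline subtraction turns the raw
# counts into a per-taxel pressure estimate; the gain is a hypothetical
# constant that a real system would obtain by calibration.
def taxel_pressure(raw, baseline, gain=1.0):
    """Relative pressure estimate from a capacitance-to-digital reading."""
    return max(0.0, (raw - baseline) * gain)

def scan_module(readings, baselines):
    """Convert the 12 raw channels of one module into pressure estimates."""
    return [taxel_pressure(r, b) for r, b in zip(readings, baselines)]
```

Clamping at zero reflects the fact that the foam can only be compressed; readings below the rest baseline are treated as noise.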
Fig. 13. Tactile pressure sensors. The figure shows a photograph of the iCub palm with embedded
capacitive pressure sensors (a), and a detail of the pressure sensing fingertip (b).
In the first version of iCub these pressure sensors have been embedded in the
palm, the fingertips and the forearm (as shown in Fig.16(a)), to enhance the manip-
ulation capabilities of the robot. In particular, the skin of the palm incorporates four
triangular modules (see Fig.13(a)), each of the five fingertips comprises one mod-
ule (see Fig.13(b)), and the forearm covers contain 23 modules. This arrangement
results in a total of 384 independent sensitive elements per arm.
4.6. Communication bus
Control cards, skin sensors and force-torque sensors communicate over several 1 Mbit/s CAN bus lines. The network is characterized by a star-like topology, with all branches converging on the PC104 CPU. Ten CAN sub-networks (roughly one for each body segment, plus two dedicated to the skin sensors) join at a central node which
Fig. 14. iCub bus diagram. The figure shows a diagram of the current communication bus arrangement of the iCub. The control boards are labeled DEV in the figure.
is constituted by the PC104 described in section 4.1. This network architecture is
represented in Fig.14.
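To get a feel for the throughput budget of one such sub-network, the load of periodic traffic on a 1 Mbit/s line can be estimated from the frame size and message rate. The 130-bit frame length approximates a worst-case stuffed 8-byte CAN 2.0A frame; the rate in the example is illustrative:

```python
def can_bus_load(frames_per_s: int, bits_per_frame: int = 130,
                 bitrate: float = 1e6) -> float:
    """Fraction of a CAN line's bandwidth consumed by periodic traffic.

    ~130 bits approximates a standard 8-byte CAN 2.0A frame with
    worst-case bit stuffing; the message rates are illustrative.
    """
    return frames_per_s * bits_per_frame / bitrate

# e.g. one sensor broadcasting 8-byte frames at 1 kHz uses about 13%
# of a 1 Mbit/s sub-network, which is why the sensors are spread over
# ten separate lines rather than a single shared bus.
```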
5. iCub2
The first version of the iCub robot was developed and constructed four years ago. Since then it has been used intensively by several partners and institutions: this allowed several critical aspects of its initial design to be revealed. We therefore began the development of a new version of the iCub (tentatively called iCub2), which is now almost complete. The most relevant improvements are described in the following sections.
5.1. Joint torque sensing
The requirement for the robot to interact safely and robustly with humans and its surrounding environment is particularly difficult to fulfill. To achieve this, joint torque feedback is essential. We therefore developed torque sensors for the main joints of the iCub. All the sensors are based on the piezo-resistive effect of semiconductor strain gages (SSG). When loads are applied, the sensing elements, and the SSGs attached to them, deform. This deformation is accompanied by a change in resistance which is proportional to the applied torque. The signal conditioning is performed by microcontroller boards similar to the one described in section 4.4. The sensors allow the measurement of joint torques with 16 bits of resolution at a frequency of 1[kHz]. As the constraint was to maintain all the functional dimensions of the iCub unchanged, the development of the sensor for the
A comprehensive and up-to-date list can be found on the website
shoulder joint (shown in Fig.15(a)) was particularly complicated.35 For the lower body, instead, it was possible to develop a sensor with radial, controlled-deformation spoke features that can be seamlessly integrated in the motor groups (see Fig.15(b)). Finite element structural simulations were employed to optimize the final sensor geometries. Although the current loop frequency is limited by the CAN bus network throughput, in the near future it will be possible to close torque feedback loops at 3[kHz] by relying on a new control card design (see section 5.6). The
(a) (b)
Fig. 15. Joint torque sensing. The figure shows a photograph of the iCub shoulder joint torque
sensors (a), and the “modular” joint torque sensor developed for the lower body (b).
addition of joint torque sensing also required significant upgrades to the firmware
and software currently used to control the robot36.
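The resistance-to-torque conversion performed by the conditioning boards can be sketched as follows. This is a minimal illustration only: the calibration constants, the mid-scale offset handling and the function name are hypothetical, not taken from the iCub firmware.

```python
# Sketch: convert a raw 16-bit strain-gauge ADC sample to a joint torque.
# CAL_GAIN and CAL_OFFSET are hypothetical per-joint calibration constants
# (obtained, e.g., by loading the joint with a set of known torques).

ADC_BITS = 16
ADC_MID = 2 ** (ADC_BITS - 1)   # zero-torque reading for a bipolar sensor

CAL_GAIN = 0.00125              # [Nm/count], hypothetical
CAL_OFFSET = 128                # [counts], hypothetical zero drift

def counts_to_torque(raw_counts: int) -> float:
    """Map a raw ADC sample (0..65535) to a signed torque in Nm."""
    signed = raw_counts - ADC_MID - CAL_OFFSET
    return CAL_GAIN * signed

# At 1 kHz the torque loop would call this once per sample:
print(counts_to_torque(ADC_MID + CAL_OFFSET))   # zero load -> 0.0
```

In a real implementation the calibration would of course be identified per sensor and stored on the conditioning board.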
5.2. Extensive tactile feedback
Besides joint torque sensing, the sense of touch is among the principal sensing modalities required to work closely and interact safely with humans and, more generally, with the environment. Touch can provide a reliable source of information to guide exploratory behaviors, as required for example in machine learning. For this reason the exterior surfaces of iCub2 have been extensively covered with the pressure sensitive elements described in section 4.5. As shown in Fig.16, the “skin” tactile sensors will be embedded in the fingertips, the palms, the forearms, the upper arm segments, the torso, the upper leg segments, the knees, the lower leg segments and the feet, for a total of approximately 4200 taxels. This, to our knowledge, makes iCub2 the humanoid robot with the highest number of pressure sensitive points. The processing of the vast amount of data streaming from these sensors will be an interesting technological challenge which is currently being addressed in the context of the RoboSKIN FP7 European project.
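One elementary use of such a taxel array is localizing a contact on a skin patch. The sketch below computes a pressure-weighted contact centroid; the taxel coordinates, the pressure threshold and the function name are hypothetical illustrations, not the iCub skin software.

```python
# Sketch: estimate the contact centroid on a skin patch from taxel pressures.
# A real patch would use the calibrated 3D taxel positions of the cover it
# is mounted on; here taxels are idealized as (x, y, pressure) triples.

def contact_centroid(taxels, threshold=5.0):
    """Return (cx, cy), the pressure-weighted centroid of the taxels whose
    pressure exceeds the threshold, or None if there is no contact."""
    active = [(x, y, p) for (x, y, p) in taxels if p > threshold]
    if not active:
        return None
    total = sum(p for (_, _, p) in active)
    cx = sum(x * p for (x, _, p) in active) / total
    cy = sum(y * p for (_, y, p) in active) / total
    return (cx, cy)

patch = [(0.0, 0.0, 0.0), (1.0, 0.0, 10.0), (1.0, 1.0, 10.0)]
print(contact_centroid(patch))   # -> (1.0, 0.5)
```

With roughly 4200 taxels, per-patch aggregation of this kind is one way to keep the data rate on the bus manageable.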
(a) (b)
Fig. 16. Pressure sensing surfaces of the iCub. The figure shows the covers of the iCub with embedded pressure sensors. The surfaces colored in blue (a) show the pressure sensing elements of the first version of the iCub. The surfaces colored in yellow (b) show the skin coverage of iCub2. The per-segment taxel counts indicated in the figure are: torso 552, upper arm 96 (top) and 348 (bottom), forearm 84 (top) and 192 (bottom), palm 56, fingertip 12 each, upper leg 432*, knee 72, lower leg 360*, foot sole 150*. The numbers with an asterisk are to be intended as approximate, as the design of those surfaces is currently being finalized.
5.3. Head and eyes redesign
It was noticed that in particular operating conditions the neck motors would overheat quickly, indicating that they were probably under-dimensioned. A first revision of the neck mechanism, still based on the serial joint configuration, was proposed by Rodriguez37. The proposed solution was based on the use of Harmonic Drive speed reducers and four-bar linkages as key elements of the transmission. However, the design was not entirely compatible with the size and space constraints, and was therefore modified. This variant is based on a “parallel” actuation scheme with cable drives. The new solution is partly inspired by the design of the robot COG by Brooks et al.38, which has also been employed in the construction of the MERTZ robot head39. As described in section 3.3, epicyclic transmissions are a very effective way to reduce the driven masses and inertias. In the final design the new head assembly weighs approximately 1.05[kg] less than the first version of the head. With its 8[Nm] peak torque on the pitch and roll axes, the new neck mechanism also allows a threefold increase of the delivered output torque.
The eye mechanism has also been revised. The original design was found to be problematic in two respects:
• rapid eye movements were obtained with brushed DC motors which employed low reduction ratio planetary speed reducers. This introduced significant backlash in the transmission, thus complicating the control and, in general, the achievement of visual tasks.
• the tensioning of the toothed belt transmission had to be performed manually. This was problematic in terms of accuracy of the camera positioning.

(a) (b) (c)
Fig. 17. Design revision of the iCub head. The figure shows two CAD views of the new version of the iCub head and eyes (a) and (b), and a photo of the head without the electronics (c).
The first issue is commonly solved by employing zero-backlash Harmonic Drive speed reducers, as done also by Asfour et al.40 in the tilt joint of the ARMAR-III humanoid head. Rodriguez suggested this solution37 as well; moreover, he proposed to solve the second issue by replacing the transmission belts with low-play, rigid four-bar linkages. Since the elegant solution proposed by Rodriguez37 introduced a mechanical coupling between the eye pan and tilt motions, we preferred to maintain the current eye mechanism configuration while improving the precision by the addition of Harmonic Drive gears. A CAD view of the new iCub head is shown in Fig.17. The new eye design also features a two-piece eyeball whose outer part can easily be removed to fine-tune the positioning of the cameras.
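The neck torque figure quoted above can be related to the motor torque through the usual speed reducer relation, output torque = efficiency × ratio × motor torque. The sketch below illustrates this; the motor torque, reduction ratio and efficiency are hypothetical illustration values, not taken from the iCub neck design.

```python
# Sketch: peak output torque through a speed reducer.
# tau_motor [Nm], ratio (dimensionless), efficiency in (0, 1] are all
# hypothetical example values here.

def output_torque(tau_motor, ratio, efficiency):
    """Torque delivered at the joint side of the reducer, in Nm."""
    return tau_motor * ratio * efficiency

# e.g. a 0.1 Nm motor through a 100:1 Harmonic Drive at 80% efficiency
# gives 8 Nm, the order of magnitude of the new neck's peak torque:
print(output_torque(0.1, 100, 0.8))
```

The same relation explains why moving to a higher reduction ratio (or a more efficient transmission) raises the deliverable joint torque without changing the motor.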
5.4. Optical encoders
Currently the configuration of the robot is measured by means of the magnetic joint position sensors described in section 4.3. The brushless motors are instead driven with a “standard” trapezoidal PWM profile strategy based on the feedback of three digital Hall effect sensors placed in the motor stator. These sensors provide a low resolution 48cpr (counts per revolution) signal, which causes slight vibrations when driving the motor at low speeds. To solve this issue and to implement more advanced field oriented control (FOC) strategies, we developed and tested an extremely compact custom optical encoder with 8192cpr resolution. We successfully completed the
preliminary testing and are now integrating this subsystem in all the major joints of the robot.

Fig. 18. Ethernet bus diagram. The figure shows a diagram of the new Ethernet based network architecture; the DEVs are connected with CAT5 cables.
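The benefit of the new encoder is easy to quantify from the two resolutions quoted above. A small sketch (the 1[kHz] sampling rate used for the velocity figure matches the sensor rate quoted in section 5.1; the function names are illustrative):

```python
# Sketch: angular resolution of the 48 cpr Hall feedback vs. the 8192 cpr
# custom optical encoder, and the resulting velocity quantization of a
# one-sample finite-difference estimator at a fixed sampling rate.

def angle_step_deg(cpr):
    """Smallest resolvable angular increment, in degrees."""
    return 360.0 / cpr

def velocity_lsb_deg_s(cpr, sample_hz=1000.0):
    """Velocity quantization of a one-sample finite difference, in deg/s."""
    return angle_step_deg(cpr) * sample_hz

print(angle_step_deg(48))     # -> 7.5 deg per count
print(angle_step_deg(8192))   # -> ~0.044 deg per count
```

The roughly 170x finer position signal is what removes the low-speed vibration and makes field oriented control practical.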
5.5. Control boards revision
The control boards were improved in several ways. In more detail, the first release of the BLL boards, described in section 4.2, had several issues concerning the phase current measurement system. This system has been improved and is now capable of providing a reliable, high bandwidth, 13-bit current measurement. Moreover, as new sensors were added to the robot (e.g. see section 5.4), the board I/O ports had to be revised, although without substantial changes. Finally, the board firmware has been thoroughly optimized with respect to its first stable release.
5.6. Ethernet bus
Besides the aforementioned upgrades, the whole sensory-motor architecture is being deeply revised. Since the current network (based on CAN buses) limits the data throughput, a new Ethernet based network has been developed. The new architecture will be configured hierarchically, with “mid-level” control boards (called DEV in the following) supervising the operation of “low-level” boards (as represented in Fig.18). The higher layer of the architecture will communicate on an Ethernet bus, whereas the lower level boards will employ an efficient and robust CAN transmission protocol. The solution which we are currently investigating is based on a flexible Ethernet design, where the DEVs present two plugs for connection with a CAT5 cable. The DEVs can thus be connected either in a daisy chain, in a point-to-point configuration, or in a mixture of the two.
6. Conclusions
The first part of this article presented the development of the iCub robot, which is currently being used in several robotics laboratories worldwide for research in embodied cognitiono. In the second part, the most relevant upgrades which are being integrated in the second version of the robot (namely iCub2) have been described. The robot features a combination of various technologies which makes it unique; among these, full joint torque feedback, extensive pressure sensing, and open hardware and software can be cited as the most important. We hope that these features will make iCub2 the platform of choice for the emerging fields of artificial intelligence, motor control and developmental cognition.
Acknowledgments
This work has been supported by the European Commission RobotCub IST-FP6-004370, CHRIS IST-FP7-215805 and RoboSKIN ICT-FP7-231500 projects.
We would like to thank and to acknowledge the contributions to this project
of Mattia Salvi, Diego Torazza, Fabrizio Larosa, Marco Accame, Claudio Lorini,
Bruno Bonino, Andrea Menini, Davide Gandini, Emiliano Barbieri, Roberto Puddu,
Charlie Sanguineti, Marco Pinaffo and all the people who have contributed to the
construction, maintenance and design of the iCub, whose help has been essential to
the completion of this work.
References
1. Masato Hirose and Kenichi Ogawa. Honda humanoid robots development. Philosoph-
ical Transactions of the Royal Society, 365(1850):11–19, 2007.
2. I.W. Park, J.Y. Kim, J. Lee, and J.H. Oh. Mechanical design of the humanoid robot
platform HUBO. Advanced Robotics, 21(11):1305–1322, 2007.
3. Tatsuzo Ishida, Yoshihiro Kuroki, and Jinichi Yamaguchi. Mechanical system of a
small biped entertainment robot. In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots
and Systems (IROS), pages 1129–1134, October 2003.
4. D. Gouaillier, V. Hugel, P. Blazevic, C. Kilner, J. Monceaux, P. Lafourcade,
B. Marnier, J. Serre, and B. Maisonnier. Mechatronic design of NAO humanoid. In
Proc. IEEE Int. Conf. on Robotics and Automation (ICRA), pages 2124–2129, 2009.
5. K. Kaneko, F. Kanehiro, M. Morisawa, K. Miura, S. Nakaoka, and S. Kajita. Cybernetic human HRP-4C. In Proc. IEEE/RAS Int. Conf. on Humanoid Robots (HUMANOIDS), pages 7–14, 2009.
6. T. Asfour, K. Regenstein, P. Azad, J. Schroder, A. Bierbaum, N. Vahrenkamp, and R. Dillmann. ARMAR-III: An integrated humanoid platform for sensory-motor control. In IEEE/RAS Int. Conf. on Humanoid Robots (HUMANOIDS), pages 169–175, 2006.
7. C. Ott, O. Eiberger, W. Friedl, B. Bauml, U. Hillenbrand, C. Borst, A. Albu-Schaffer, B. Brunner, H. Hirschmuller, S. Kielhofer, et al. A humanoid two-arm system for dexterous manipulation. In IEEE/RAS Int. Conf. on Humanoid Robots (HUMANOIDS), pages 276–283, 2006.
o For an up-to-date list of the demonstrations of the robot please consult the website:
8. Kenji Kaneko, Kensuke Harada, Fumio Kanehiro, Gou Miyamori, and Kazuhiko
Akachi. Humanoid robot HRP-3. In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots
and Systems (IROS), pages 2471–2478, 2008.
9. Christopher G. Atkeson, Joshua G. Hale, Frank Pollick, Marcia Riley, Shinya Koto-
saka, Stefan Schaal, Tomohiro Shibata, Gaurav Tevatia, Ales Ude, Sethu Vijayaku-
mar, and Mitsuo Kawato. Using humanoid robots to study human behavior. IEEE
Intelligent Systems, 15(4):46–56, 2000.
10. Akihiko Nagakubo, Yasuo Kuniyoshi, and Gordon Cheng. The ETL-humanoid system
- a high-performance full-body humanoid system for versatile real-world interaction.
Advanced robotics, 17(2):149–164, 2003.
11. T. Minato, Y. Yoshikawa, T. Noda, S. Ikemoto, H. Ishiguro, and M. Asada. CB2: A
child robot with biomimetic body for cognitive developmental robotics. In IEEE/RAS
Int. Conf. on Humanoid Robots (HUMANOIDS), 2007.
12. G. Cheng, Hyon Sang-Ho, A. Ude, J. Morimoto, J.G. Hale, J. Hart, J. Nakan-
ishi, D. Bentivegna, J. Hodgins, C. Atkeson, M. Mistry, S. Schaal, and M. Kawato.
CB: Exploring neuroscience with a humanoid research platform. Advanced robotics,
21(10):1097–1114, 2007.
13. Herman Bruyninckx, Peter Soetens, and Bob Koninckx. The real-time motion control
core of the Orocos project. In Proc. IEEE Int. Conf. on Robotics and Automation
(ICRA), pages 2766–2771, 2003.
14. —. OpenRTM.
15. Morgan Quigley, Brian Gerkey, Ken Conley, Josh Faust, Tully Foote, Jeremy Leibs,
Eric Berger, Rob Wheeler, and Andrew Y. Ng. ROS: an open-source robot operating
system. In Open-Source Software workshop at the Int. Conf. on Robotics and Automa-
tion (ICRA), 2009.
16. F. Yamasaki, T. Matsui, T. Miyashita, and H. Kitano. Pino the humanoid: a basic
architecture. RoboCup 2000: Robot Soccer World Cup IV, pages 269–278, 2001.
17. Giorgio Metta, Giulio Sandini, David Vernon, Darwin Caldwell, Nikolaos Tsagarakis,
Ricardo Beira, Jose Santos Victor, Auke Ijspeert, Ludovic Righetti, Giovanni Cap-
piello, Giovanni Stellin, and Francesco Becchi. The robotcub project: an open frame-
work for research in embodied cognition. In Proc. IEEE/RAS Int. Conf. on Humanoid
Robots (HUMANOIDS), pages 13–32, 2004.
18. P. Fitzpatrick, G. Metta, and L. Natale. Towards long-lived robot genes. Robotics and
Autonomous Systems, 56:29–45, 2008.
19. Giorgio Metta, David Vernon, and Giulio Sandini. Deliverable 8.1. Initial specification of the iCub open system.
terscience, 2002.
21. N.G. Tsagarakis, G. Metta, G. Sandini, D. Vernon, R. Beira, F. Becchi, L. Righetti, J. Santos-Victor, A.J. Ijspeert, M.C. Carrozza, and D.G. Caldwell. iCub: the design and realization of an open humanoid platform for cognitive and neuroscience research. Advanced Robotics, 21(10):1151–1175, 2007.
22. John M. Hollerbach, Ian W. Hunter, and John Ballantyne. A comparative analysis of actuator technologies for robotics, pages 299–342. MIT Press, Cambridge, MA, USA.
23. Paolo Dario. Deliverable 7.2. Analysis and pre-selection of the sensor and actuator technologies, 2004. http://www.robot- 7 2 pdf.
24. J. Kenneth Salisbury, William T. Townsend, David M. DiPietro, and Brian S. Eberman. Compact cable transmission with cable differential. Patent, Feb 1991. US 4 903
25. William T. Townsend. The effect of transmission design on force-controlled manipulator performance. PhD thesis, Massachusetts Institute of Technology, 1988.
26. William T. Townsend and J. Kenneth Salisbury. Mechanical design for whole-arm manipulation. In Paolo Dario, Giulio Sandini, and Patrick Aebischer, editors, Robots and biological systems: towards a new bionics?, pages 153–164. Springer, 1993.
27. Lung-Wen Tsai. Robot analysis. Wiley interscience, 1999.
28. Alberto Parmiggiani. Torque control: a study on the iCub humanoid robot. PhD thesis, Università degli Studi di Genova, 2010.
29. A. Schmitz, U. Pattacini, F. Nori, L. Natale, G. Metta, and G. Sandini. In IEEE/RAS
Int. Conf. on Humanoid Robots (HUMANOIDS), pages 186–191, 2010.
30. Y. Ohmura and Y. Kuniyoshi. Humanoid robot which can lift a 30kg box by whole
body contact and tactile feedback. In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots
and Systems (IROS), 2007.
31. O. Kerpa, K. Weiss, and H. Worn. Development of a flexible tactile sensor system for
a humanoid robot. In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems
(IROS), volume 1, 2003.
32. Y. Ohmura, Y. Kuniyoshi, and A. Nagakubo. Conformable and scalable tactile sen-
sor skin for curved surfaces. In Proc. IEEE Int. Conf. on Robotics and Automation
(ICRA), pages 1348–1353, 2006.
33. G. Cannata, M. Maggiali, G. Metta, and G. Sandini. An embedded artificial skin
for humanoid robots. In IEEE Int. Conf. on Multisensor Fusion and Integration for
Intelligent Systems, pages 434–438, 2008.
34. A. Schmitz, M. Maggiali, M. Randazzo, L. Natale, and G. Metta. A prototype fingertip with high spatial resolution pressure sensing for the robot iCub. In IEEE/RAS Int. Conf. on Humanoid Robots (HUMANOIDS), pages 423–428, 2008.
35. Alberto Parmiggiani, Marco Randazzo, Lorenzo Natale, Giorgio Metta, and Giulio Sandini. Joint torque sensing for the upper-body of the iCub humanoid robot. In Proc. IEEE/RAS Int. Conf. on Humanoid Robots (HUMANOIDS), pages 15–20, 2009.
36. G. Metta, P. Fitzpatrick, and L. Natale. YARP: Yet another robot platform. International Journal of Advanced Robotic Systems, 3(1):43–48, 2006.
37. Nestor Eduardo Nava Rodríguez. Design issue of a new iCub head sub-system. Robotics
and Computer-Integrated Manufacturing, 26(2):119–129, 2010.
38. Rodney A. Brooks, Cynthia Breazeal, M. Marjanovic, Brian Scassellati, and Matthew Williamson. The Cog project: Building a humanoid robot, pages 52–87. Springer, New York, 1999. Lecture Notes in Artificial Intelligence 1562.
39. L. Aryananda and J. Weber. MERTZ: A quest for a robust and scalable active vision humanoid head robot. In IEEE/RAS Int. Conf. on Humanoid Robots (HUMANOIDS), pages 513–532, 2004.
40. T. Asfour, K. Welke, P. Azad, A. Ude, and R. Dillmann. The Karlsruhe humanoid head. In IEEE/RAS Int. Conf. on Humanoid Robots (HUMANOIDS), pages 447–453, 2008.
... While the unsophisticated robots can generate basic emotions like happiness, anger, sadness and surprise, the most advanced social robots can express a greater variety of emotions. For example, the iCub robot ( [25], [26]) is a versatile humanoid robot which was designed by the RobotCub Consortium of several European universities. It was built by the Italian Institute of Technology (IIT) as part of the EU project RobotCub and subsequently adopted by more than 20 laboratories worldwide. ...
Full-text available
The paper presents an initial step towards employing the advantages of educational theatre and implementing them into social robotics in order to enhance the emotional skills of a child and at the same time to augment robots with actors’ emotional talent. Emotional child-robot interaction helps to catch quickly a child’s attention and enhance information perception during learning and verbalization in children with communication disorders. An innovative approach for learning through art by transferring actors’ emotional and social talents to socially assistive robots is presented and the technical and artistic challenges of tracking and translating movements expressing emotions from an actor to a robot are considered. The goal is to augment the robot intervention in order to enhance a child’s learning skills by stimulating attention, improving timing of understanding emotions, establishing emotional contact and teamwork. The paper introduces a novel approach to capture movements and expressions of a human head, to process data from brain and inertial tracking devices and to transfer them into a socially assistive robot.
... Moreover, each experiment started with a calibration phase for the joystick. Also shown in Figure 1 is the humanoid robot "icub" (Metta et al., 2008(Metta et al., , 2010Parmiggiani et al., 2012;Fischer et al., 2018). Its design resembles human face attributes and it features speech and motor behaviors, which is why the icub robot has been employed in numerous HRI and social robotics studies (Anzalone et al., 2015;Li, 2015). ...
Full-text available
A key goal in human-robot interaction (HRI) is to design scenarios between humanoid robots and humans such that the interaction is perceived as collaborative and natural, yet safe and comfortable for the human. Human skills like verbal and non-verbal communication are essential elements as humans tend to attribute social behaviors to robots. However, aspects like the uncanny valley and different technical affinity levels can impede the success of HRI scenarios, which has consequences on the establishment of long-term interaction qualities like trust and rapport. In the present study, we investigate the impact of a humanoid robot on human emotional responses during the performance of a cognitively demanding task. We set up three different conditions for the robot with increasing levels of social cue expressions in a between-group study design. For the analysis of emotions, we consider the eye gaze behavior, arousal-valence for affective states, and the detection of action units. Our analysis reveals that the participants display a high tendency toward positive emotions in presence of a robot with clear social skills compared to other conditions, where we show how emotions occur only at task onset. Our study also shows how different expression levels influence the analysis of the robots' role in HRI. Finally, we critically discuss the current trend of automatized emotion or affective state recognition in HRI and demonstrate issues that have direct consequences on the interpretation and, therefore, claims about human emotions in HRI studies.
... A reduced subset of 32 (or sometimes 26) DoFs for the neck, torso, legs, and arms, which are most relevant for locomotion purposes, are considered in this thesis. Each of these DoFs is electrically actuated with Brushless DC (BLDC) motors and a harmonic drive transmission (Parmiggiani et al. (2012)). This subset of Figure 1.3: iCub v2.5, additionally equipped with a Realsense camera and a Vicon marker mount on its waist, making them base-collocated sensors. ...
Full-text available
The future where the industrial shop-floors witness humans and robots working in unison and the domestic households becoming a shared space for both these agents is not very far. The scientific community has been accelerating towards that future by extending their research efforts in human-robot interaction towards human-robot collaboration. It is possible that the anthropomorphic nature of the humanoid robots could deem the most suitable for such collaborations in semi-structured, human-centered environments. Wearable sensing technologies for human agents and efficient human-aware control strategies for the humanoid robot will be key in achieving a seamless human-humanoid collaboration. This is where reliable state estimation strategies become crucial in making sense of the information coming from multiple distributed sensors attached to the human and those on the robot to augment the feedback controllers designed for the humanoid robot to aid their human counterparts. In this context, this thesis investigates the theory of Lie groups for designing state estimation techniques aimed towards humanoid locomotion and human motion estimation. [continued]
... The behavior-based robotic (BBR) has been a common approach for emotion-aware robots, which can use emotions as internal variables, which drive their external actions, mostly by correcting their operations according to the signals gained from their sensors(Arkin 2005). BBR ideas stimulated the design of robots capable to express emotional cues, e.g.Kismet, Mexi, iCub, Emys (Breazeal 2004a;Parmiggiani et al. 2012;Esau et al. 2003; Kędzierski et al. 2013). However, the mechanical expression of physical cues is just a preliminary step for the successful modeling of emotions, thus emotionally capable cognitive architectures are necessary for enhancing the implementation of believable, autonomous, adaptive, and context-aware artificial agents(Hudlicka 2011). ...
The current state of the art in cognitive robotics, covering the challenges of building AI-powered intelligent robots inspired by natural cognitive systems. A novel approach to building AI-powered intelligent robots takes inspiration from the way natural cognitive systems—in humans, animals, and biological systems—develop intelligence by exploiting the full power of interactions between body and brain, the physical and social environment in which they live, and phylogenetic, developmental, and learning dynamics. This volume reports on the current state of the art in cognitive robotics, offering the first comprehensive coverage of building robots inspired by natural cognitive systems. Contributors first provide a systematic definition of cognitive robotics and a history of developments in the field. They describe in detail five main approaches: developmental, neuro, evolutionary, swarm, and soft robotics. They go on to consider methodologies and concepts, treating topics that include commonly used cognitive robotics platforms and robot simulators, biomimetic skin as an example of a hardware-based approach, machine-learning methods, and cognitive architecture. Finally, they cover the behavioral and cognitive capabilities of a variety of models, experiments, and applications, looking at issues that range from intrinsic motivation and perception to robot consciousness. Cognitive Robotics is aimed at an interdisciplinary audience, balancing technical details and examples for the computational reader with theoretical and experimental findings for the empirical scientist.
... An open source humanoid robotic platform called iCub, designed explicitly to support research in embodied cognition is shown in [150]. The robot has 3 DOFs eyeballs-the cameras are located in the eyeballs, and brushed Faulhaber DC motors via toothed belts actuate them, while the eyebrows and mouth are displayed using LEDs allowing basic facial expressions; in addition, the robot has vestibular, auditory, and haptic sensory capabilities. ...
Full-text available
This paper shows the structure of a mechanical system with 9 DOFs for driving robot eyes, as well as the system’s ability to produce facial expressions. It consists of three subsystems which enable the motion of the eyeballs, eyelids, and eyebrows independently to the rest of the face. Due to its structure, the mechanical system of the eyeballs is able to reproduce all of the motions human eyes are capable of, which is an important condition for the realization of binocular function of the artificial robot eyes, as well as stereovision. From a kinematic standpoint, the mechanical systems of the eyeballs, eyelids, and eyebrows are highly capable of generating the movements of the human eye. The structure of a control system is proposed with the goal of realizing the desired motion of the output links of the mechanical systems. The success of the mechanical system is also rated on how well it enables the robot to generate non-verbal emotional content, which is why an experiment was conducted. Due to this, the face of the human-like robot MARKO was used, covered with a face mask to aid in focusing the participants on the eye region. The participants evaluated the efficiency of the robot’s non-verbal communication, with certain emotions achieving a high rate of recognition.
... The iCub is an open-source humanoid robotic platform designed explicitly to support research in embodied cognition [83][84][85]. iCub has been used as a social robot in [86,87]. The robot contains motor amplifiers, a set of DSP controllers, a PC104-based PC, and analog to digital conversion cards. ...
Currently, most social robots interact with their surroundings and humans through sensors that are integral parts of the robots, which limits the usability of the sensors, human-robot interaction, and interchangeability. A wearable sensor garment that fits many robots is needed in many applications. This article presents an affordable wearable sensor vest, and an open-source software architecture with the Internet of Things (IoT) for social humanoid robots. The vest consists of touch, temperature, gesture, distance, vision sensors, and a wireless communication module. The IoT feature allows the robot to interact with humans locally and over the Internet. The designed architecture works for any social robot that has a general-purpose graphics processing unit (GPGPU), I2C/SPI buses, Internet connection, and the Robotics Operating System (ROS). The modular design of this architecture enables developers to easily add/remove/update complex behaviors. The proposed software architecture provides IoT technology, GPGPU nodes, I2C and SPI bus mangers, audio-visual interaction nodes (speech to text, text to speech, and image understanding), and isolation between behavior nodes and other nodes. The proposed IoT solution consists of related nodes in the robot, a RESTful web service, and user interfaces. We used the HTTP protocol as a means of two-way communication with the social robot over the Internet. Developers can easily edit or add nodes in C, C++, and Python programming languages. Our architecture can be used for designing more sophisticated behaviors for social humanoid robots.
... The iCub is an open-source humanoid robotic platform designed explicitly to support research in embodied cognition [38][39][40]. iCub has been used as social robot in [41,42]. The robot contains motor amplifiers, a set of DSP controllers, a PC104-based PC, and analog to digital conversion cards. ...
Social robots are essential for healthcare applications as assistive devices or behavior-based intervention systems. Social interactions, robotic hands and multimodal control are three core aspects of social robots, which are investigated in this dissertation. First, we present a wearable sensor vest and an open-source software architecture with the Internet of Things (IoT) for social robots. The IoT feature allows the robot to interact with local humans and other humans over the Internet. The designed architecture is demonstrated in a humanoid robot, and it works for any social robot that has general-purpose graphics processing unit (GPGPU), I2C/SPI buses, Internet connection, and Robot Operating System. The modular design of this architecture enables developers to easily add/remove/update complex behaviors. The proposed software architecture provides IoT technology, GPGPU nodes, I2C and SPI bus mangers, audio-visual interaction nodes, and isolation between behavior nodes and other nodes. Second, our humanoid robot uses novel actuators, called twisted and coiled polymer (TCP) actuators/artificial muscles, to move its fingers. Classical controllers and fuzzy-based controllers are examined for force control of these actuators. It was noted that disturbance and noise are major challenges in system identification and control of TCPs. In a short term, the muscles behave like a first-order, linear time-invariant system when the input is voltage square and the output is either force or displacement. However, the behaviors and parameters of the polymer muscles slowly change. An on-policy adaptive controller is designed for regulating force of the muscles that is optimized by stochastic hill-climbing and a novel associated search element. The third part is multimodal control of robotic hands. Recent advancements in GPGPUs enable intelligent devices to run deep neural networks in real-time. 
Thus, state-of-the-art intelligent systems have rapidly shifted from the paradigm of composite subsystems optimization to the paradigm of end-to-end optimization. By taking advantages of GPGPU, we showed how to control robotic hands with raw electromyography signals and speech 2D features using deep learning and convolutional neural networks. The proposed convolutional neural networks are lightweight, such that it runs in real-time and locally in an embedded GPGPU.
Socially Assistive Robots (SARs) and immersive Virtual Reality (iVR) are interactive platforms that promote user engagement, which can motivate users to adhere to therapeutic frameworks. SARs use social presence to create affective relationships with users, leveraging the human tendency to be driven by social interactions. iVR uses spatial presence to provide an intense multisensory experience that submerges users in a virtual world. We adapted two such platforms – a SAR and an iVR – to deliver cognitive training (CT), by integrating established cognitive tasks in gamified environments that convey a strong sense of presence. Sixty-four participants underwent CT with both platforms. We tested: (1) their perception of both platforms; (2) whether they preferred one over the other in the short term; (3) their projected preferences for long-term training; and (4) whether their preferences correlated with personal characteristics. They preferred the virtual experience in the short term across age and gender. For long-term CT, there was equal projected preference for both platforms. It may be that a combination of social and spatial presence might yield engagement in long-term training.
Effective control design of flying vehicles requires a reliable estimation of the propellers' thrust forces to secure a successful flight. Direct measurements of thrust forces, however, are seldom available in practice, and on-line thrust estimation usually follows from the application of fusion algorithms that process on-board sensor data. This paper proposes a framework for the estimation of the thrust intensities on flying multibody systems that are not equipped with sensors for direct thrust measurement. The key ingredient of the proposed framework is the so-called centroidal momentum of a multibody system, which, combined with the propeller model, enables the design of Extended Kalman Filters (EKF) for on-line thrust estimation. The presented approach tackles the additional complexity in thrust estimation due to uncertainties in the propeller model and the possibly large number of degrees of freedom of the system. For instance, a covariance scheduling approach based on the turbines' RPM error is proposed to ensure a reliable estimation even in the case of turbine failures. Simulations are presented to validate the proposed algorithm during robot flight. Moreover, an experimental setup is designed to evaluate the accuracy of the estimation algorithm using iRonCub, a jet-powered humanoid robot, while standing on the ground.
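A scalar Kalman filter conveys the flavor of the recursive thrust estimation described above. This is a sketch under stated assumptions: the constant-thrust process model and the noise covariances below are hypothetical stand-ins, not the centroidal-momentum EKF of the paper.

```python
import random

# Scalar Kalman filter sketch for on-line thrust estimation from noisy
# measurements. The constant-thrust process model and the covariances q, r
# are hypothetical illustration values, not the paper's EKF formulation.

def kalman_thrust(measurements, q=1e-4, r=0.25):
    """Estimate a slowly varying thrust (N) from noisy scalar measurements."""
    x, p = 0.0, 1.0           # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                 # predict: thrust assumed roughly constant
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # correct with the new measurement
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

if __name__ == "__main__":
    random.seed(0)
    true_thrust = 25.0  # N, hypothetical
    noisy = [true_thrust + random.gauss(0.0, 0.5) for _ in range(500)]
    est = kalman_thrust(noisy)
    print(est[-1])
```

The covariance scheduling mentioned in the abstract would correspond here to increasing r when the turbines' RPM error grows, so that suspect measurements are weighted less.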
This paper describes the design and integration of three joint torque sensors on the arm of the iCub platform [1], [2]. The objective is to enhance the robot arm with joint torque control capability. This activity is part of a general upgrade of the humanoid robot to provide the entire 53-degree-of-freedom robot with low-level joint torque control. The shoulder is particularly challenging because of its complex and compact mechanism. We first modeled the behaviour of the sensors with analytical equations, and the sensor geometry was subsequently optimized using finite element structural simulations. The sensors were then constructed and integrated in the arm assembly. Finally, we present preliminary experiments to validate the design.
Tactile feedback is of crucial importance for object manipulation in unknown environments. In this paper we describe the design and realization of a fingertip which includes a capacitive pressure sensor with 12 sensitive zones. It is naturally shaped and small enough to be mounted on the fingers of the humanoid robot iCub. It also embeds the electronic device which performs A/D conversion; this is beneficial for the signal-to-noise ratio and reduces the number of wires required to connect the fingertip to the robot. The fingertip is made of silicone, which makes its surface and inner structure compliant and flexible. We present preliminary experiments performed with the first prototype.
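After A/D conversion, capacitive taxel readings are typically baseline-compensated and scaled before use. The sketch below illustrates that step for a 12-zone fingertip; the raw counts and the counts-per-kPa sensitivity are hypothetical values, not calibration data from the actual sensor.

```python
# Baseline-compensation sketch for a 12-zone capacitive fingertip.
# The raw A/D counts and the counts-per-kPa sensitivity are hypothetical
# illustration values, not calibration data from the real device.

NUM_TAXELS = 12
COUNTS_PER_KPA = 40.0  # hypothetical sensitivity

def calibrate_baseline(idle_frames):
    """Average several no-contact frames to obtain a per-taxel baseline."""
    n = len(idle_frames)
    return [sum(frame[i] for frame in idle_frames) / n for i in range(NUM_TAXELS)]

def to_pressure(raw, baseline):
    """Convert one frame of raw counts to per-taxel pressures (kPa)."""
    return [max(0.0, (r - b) / COUNTS_PER_KPA) for r, b in zip(raw, baseline)]

if __name__ == "__main__":
    idle = [[1000 + i for i in range(NUM_TAXELS)] for _ in range(4)]
    base = calibrate_baseline(idle)
    frame = [b + 80 for b in base]      # uniform contact of 80 counts
    print(to_pressure(frame, base)[0])  # -> 2.0
```

Performing the A/D conversion inside the fingertip, as the abstract notes, means only these digital counts travel over the wires, which is where the signal-to-noise benefit comes from.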
This paper presents a humanoid two-arm system developed as a research platform for studying dexterous two-handed manipulation. The system is based on the modular DLR-Lightweight-Robot-III and the DLR-Hand-II. Two arms and hands are combined with a three degrees-of-freedom movable torso and a visual system to form a complete humanoid upper body. In this paper we present the design considerations and give an overview of the different sub-systems. Then, we describe the requirements on the software architecture. Moreover, the applied control methods for two-armed manipulation and the vision algorithms used for scene analysis are discussed.
We present the design and realization of a conformable tactile sensor skin (patent pending). The skin is organized as a network of self-contained modules consisting of tiny pressure-sensitive elements which communicate through a serial bus. By adding or removing modules it is possible to adjust the area covered by the skin as well as the number (and density) of tactile elements. The skin is therefore highly modular and thus intrinsically scalable. Moreover, because the substrate on which the modules are mounted is sufficiently pliable to be folded and stiff enough to be cut, it is possible to freely distribute the individual tactile elements. A tactile skin composed of multiple modules can also be installed on curved surfaces. Because of this easy configurability we call our sensors "cut-and-paste tactile sensors." We describe a prototype implementation of the skin on a humanoid robot.
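The module-based organization can be sketched as a simple addressing scheme: each taxel is identified by a (module, taxel) pair, so modules can be added or removed without renumbering the whole skin. The taxels-per-module count and the function names below are assumptions for illustration, not the patented design.

```python
# Addressing sketch for a modular tactile skin: taxels are identified by
# (module_id, taxel_index) so modules can be added or removed independently.
# The 12-taxels-per-module layout is an assumed value for illustration.

TAXELS_PER_MODULE = 12

def flat_index(module_id, taxel_index):
    """Map a (module, taxel) pair to a flat array index for a full-skin map."""
    if not 0 <= taxel_index < TAXELS_PER_MODULE:
        raise ValueError("taxel index out of range")
    return module_id * TAXELS_PER_MODULE + taxel_index

def unflatten(index):
    """Inverse mapping, e.g. for interpreting serial-bus traffic."""
    return divmod(index, TAXELS_PER_MODULE)

if __name__ == "__main__":
    print(flat_index(3, 5))  # -> 41
    print(unflatten(41))     # -> (3, 5)
```

With this scheme, cutting the substrate and removing a module simply leaves a gap in the flat map; no other module's addresses change.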
This article presents the mechatronic design of the autonomous humanoid robot called NAO that is built by the French company Aldebaran-Robotics. With a height of 0.57 m and a weight of about 4.5 kg, this innovative robot is lightweight and compact. It distinguishes itself from existing humanoids thanks to its pelvis kinematics design, its proprietary actuation system based on brush DC motors, and its electronic, computer, and distributed software architectures. This robot has been designed to be affordable without sacrificing quality and performance. It is an open and easy-to-handle platform. The comprehensive and functional design is one of the reasons NAO was selected to replace the AIBO quadrupeds in the 2008 RoboCup standard league.
This paper presents a new research platform, CB², a child robot with a biomimetic body for cognitive developmental robotics, developed by the Socially-Synergistic Intelligence (hereafter, Socio-SI) group of the JST ERATO Asada Project. The Socio-SI group has focused on the design principles of communicative and intelligent machines and on human social development, by building a humanoid robot that has physical and perceptual structures close to ours and that enables safe and close interactions with humans. For this purpose, CB² was designed specifically to establish and maintain long-term social interaction between humans and the robot. The most significant features of CB² are a whole-body soft skin (a silicone surface with many tactile sensors underneath) and flexible joints (51 pneumatic actuators). Its fundamental capabilities and preliminary experiments are shown, and future work is discussed.
Abstract—This paper gives an overview of ROS, an open-source robot operating system. ROS is not an operating system in the traditional sense of process management and scheduling; rather, it provides a structured communications layer above the host operating systems of a heterogeneous compute cluster. In this paper, we discuss how ROS relates to existing robot software frameworks, and briefly overview some of the available application software which uses ROS.
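The "structured communications layer" described above is, at its core, anonymous publish/subscribe over named topics. The toy message bus below mimics that idea in plain Python; it is a minimal sketch of the pattern, not the actual ROS API.

```python
from collections import defaultdict

# Toy publish/subscribe bus illustrating the topic-based communication
# model that ROS provides. A plain-Python sketch, not the ROS API:
# real ROS nodes run in separate processes and exchange typed messages.

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback for every message published on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver `message` to all subscribers of `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)

if __name__ == "__main__":
    bus = MessageBus()
    received = []
    bus.subscribe("/chatter", received.append)
    bus.publish("/chatter", "hello")
    print(received)  # -> ['hello']
```

The key property, which ROS shares, is that publishers and subscribers are decoupled: neither knows how many peers exist on a topic, so nodes can be added or replaced without touching the rest of the system.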
This paper describes the performance requirements and mechanical design of an arm designed and built at MIT for whole-arm manipulation. Whole-arm manipulation began as a research objective to explore the benefits of manipulating objects with all surfaces of a robotic manipulator — not just the fingertips of an attached robotic hand. The need for robust environment contact by all surfaces of the robotic hardware prompted a re-evaluation of traditional manipulator design requirements and spurred the invention of new transmission mechanisms for robots.
This paper describes the current state, and the ongoing developments, of the hard real-time motion control core of the OROCOS project (Open Robot Control Software). This core is a component-based, distributed, and configurable software framework. The presentation discusses both the design and the current implementation. The design separates the structure of the control system from its functionality. The hard real-time core provides a generic control structure, with plug-in facilities for customization.