Development of a Nursing-Care Assistant Robot RIBA
That Can Lift a Human in Its Arms
Toshiharu Mukai, Shinya Hirano, Hiromichi Nakashima, Yo Kato,
Yuki Sakaida, Shijie Guo and Shigeyuki Hosoe
Abstract In aging societies, there is a strong demand for
robotics to tackle problems caused by the aging population.
Patient transfer, such as lifting and moving a bedridden
patient from a bed to a wheelchair and back, is one of the
most physically challenging tasks in nursing care, the burden
of which should be reduced by the introduction of robot
technologies. We have developed a new prototype robot named
RIBA with human-type arms that is designed to perform heavy
physical tasks requiring human contact, and we succeeded in
transferring a human from a bed to a wheelchair and back.
To use RIBA in changeable and realistic environments, coop-
eration between the caregiver and the robot is required. The
caregiver takes responsibility for monitoring the environment
and determining suitable actions, while the robot undertakes
hard physical tasks. The instructions can be intuitively given
by the caregiver to RIBA through tactile sensors using a newly
proposed method named tactile guidance. In the present paper,
we describe RIBA’s design concept, its basic specifications, and
the tactile guidance method. Experiments including the transfer
of humans are also reported.
I. INTRODUCTION

With the advent of an aging society, the demand for
human-interactive robots that can help on-site caregivers
by playing a part in nursing humans, particularly the el-
derly, is increasing. For this purpose, many robots have
been proposed, for example, robots for feeding people who
are paralyzed [1], mental commitment robots dedicated to
mental healing [2], and smart wheelchairs [3]. There are
also wearing-type robots [4] that can support a caregiver’s
or patient’s motion.
Tasks involving the transfer of patients, such as lifting
and moving a bedridden patient from a bed to a wheelchair
and back, are among the most physically challenging tasks
in nursing care. Although patient-lifting devices have been
developed and commercialized, they are not widely used
in nursing-care facilities in Japan. According to [5], the
proportion of caregivers in Japanese nursing-care facilities
always or sometimes using portable patient lifts is limited to
14.8%. The reasons for this include the long time required
for their use, the difficulty of attaching slings, the risk of
dropping a patient, and the mental and physical discomfort
of the patient. In addition, it was reported in [6] that the
physical burden of the caregiver is not reduced in many cases
by using patient lifts.
T. Mukai, S. Hirano, H. Nakashima, Y. Sakaida, and S. Hosoe are with
RIKEN RTC, 2271-130, Anagahora, Shimoshidami, Moriyama-ku, Nagoya
463-0003, Japan {tosh,hirano,nakas,sakaida,hosoe}
Y. Kato and S. Guo are with SR Laboratory, Tokai Rubber Industries, 1, Higashi 3-chome, Komaki, Aichi 485-8550, Japan
Under this situation, robotics is required to help with
patient-transferring tasks. Daihen Corporation has developed
a patient-transfer apparatus named C-Pam [7] that can trans-
fer a patient between a bed and a stretcher. It consists of a
flat board covered with motorized endless belts, and gently
crawls under the patient who is lying on the bed. Panasonic
developed Transfer Assist Robot [8] that has flat board-type
arms with motorized endless belts, and can transfer a patient
from a bed to an almost flat wheelchair with a reclining
function. However, these robots cannot transfer a patient
from a bed to a wheelchair without a reclining function. The
long time taken to use these devices is another difficulty.
Another approach to assisting with transfer tasks by robotics
is the use of wearing-type robots [4]. A robot of this type is
worn by the caregiver, and assists with his or her motion. In
nursing-care facilities, however, caregivers have to perform
many other tasks in addition to patient transfer, and wearing
such a robot may interfere with these tasks.
We consider that robots for performing patient-transfer
tasks between a bed and a wheelchair are needed in nursing-
care facilities and hospitals, and we are developing prototype
robots for this purpose. In 2006, we presented a robot named
RI-MAN [9], [10], which succeeded in lifting a dummy
human. However, RI-MAN had several serious limitations for
use in realistic situations. First, its mechanical structure
was not satisfactory regarding payload, motion accuracy,
and ranges of joint movement. The weight of the lifted
dummy was no more than 18.5 kg, and because of the
limited ranges of joint movement, it could not put the dummy
down. Second, it could not deal with various and changeable
situations, and its working environment had to be carefully
controlled. For example, the dummy had to be set in the
predetermined position and posture before lifting. Third, it
did not have sufficient safety for handling a human.
To cope with these difficulties, we have developed a new
robot named RIBA (Robot for Interactive Body Assistance).
It adopts a new human-robot interface, tactile guidance,
based on tactile sensors, and it has sufficient power,
manipulability, and safety to handle a human. RIBA succeeded in
transferring a human between a bed and a wheelchair, using
human-type arms. These arms give RIBA the ability to adapt
its lifting motion to different situations, which include lifting
to and from a wheelchair without a reclining function.
In the remainder of this paper, we first describe the design
concept of RIBA, then outline its basic specifications. Next,
we explain the concept of tactile guidance. Next, experiments
including the lifting and putting down of humans are de-
scribed, and finally we conclude this paper.
The 2010 IEEE/RSJ International Conference on
Intelligent Robots and Systems
October 18-22, 2010, Taipei, Taiwan
II. DESIGN CONCEPT

A. Patient Transfer Using Human-Type Arms
To deal with a bedridden person using robots, several
methods have been proposed.
1) The use of human-type arms for transferring a person
(our method).
2) The use of board-type arms with endless conveyer belts
that enable frictionless insertion under a bedridden
person before transferring the person [8].
3) Part of the patient’s bed is detached and joined to the
robot arms, and the patient is lifted with the detached
part [11].
4) The use of a simple and small bed fixed on the robot
arms. A bedridden person must first be transferred
from the normal bed to the small bed [12].
5) Part of the bed is transformed into a smart wheelchair.
There are some advantages of our method of using human-
type arms. First, the robot can be applied to various types of
lifting and other tasks such as assisting with rehabilitation
training. RIBA succeeded in transferring a patient between
a bed and a normal wheelchair without a reclining function,
which cannot be achieved by the other methods. Second, if
a robot lifts a patient and does not put him or her down for
a long time, it is occupied by that patient and cannot be
used for other patients; this is the standard usage in methods
4) and 5). In our method, the robot can be shared
among many patients. The third merit is that the human-
type arms can be inserted into small spaces under a patient
lying on a bed, which results in insertion taking less time
than that using endless belts in 2). If the caregiver makes
a small space under a patient lying on a bed, for example,
by bending the patient’s knees, the robot can insert its arm
under the knees. Similarly, the robot can insert its other arm
under the patient’s back after the caregiver has slightly raised
the upper body of the patient.
For the above reasons, we have adopted a method of
patient transfer using human-type arms. Its disadvantages
include the increased cost and probability of malfunction
caused by the complexity of the arm structure, and the danger
of dropping the lifted patient, which should be overcome by
further research.
B. Trade-Off among Size, Speed, and Payload
To use robots for patient transfer in nursing-care facilities
and hospitals, they must be able to go through doors and
move into narrow spaces between beds. On the other hand,
they must be able to support the weight of a human, and
the lifting motion must be acceptably quick for caregivers
and patients. These conditions have a trade-off relation, but
all conditions must be satisfied to an acceptable level. In the
design of RIBA, we set the following priorities in decreasing
order of importance: i) a payload that enables the robot to
lift a human (over 60 kg), ii) a size that allows RIBA to
move into small spaces (width less than 80 cm), iii) a joint
speed as high as possible while satisfying conditions i) and ii).
C. Whole Body Manipulation
We have adopted whole body manipulation [14], in which
not the end effectors but the entire body can be considered as
the contact area between the robot and the manipulated ob-
ject. When the object is a human, whole body manipulation
means that many areas of the robot body may come in contact
with the human, and thus the state of the robot surface
is important for the safety and comfort of the human. We
decided to embed all cables in the body, and we designed a
soft and smooth outer shell without projections. In particular,
the joints are covered and isolated to prevent fingers or hair
from being trapped in their gears.
D. Cooperation between the Caregiver and the Robot
Even using state-of-the-art technology, it is almost im-
possible to build fully autonomous robots that can perform
patient-transfer tasks in environments such as nursing-care
facilities and hospitals. It is very difficult for robots to detect
human positions and postures in various environments, to
plan a suitable lifting motion from the detected information,
and to understand the patient’s physical and mental condition
from the patient’s facial expression and body posture to
determine whether the patient is ready to be lifted. It is also
difficult to detect dangerous and unexpected situations from
sensor data. Furthermore, it is necessary to clarify where
the responsibility lies regarding the determination of robot actions.
To realize patient-transfer robots under these conditions,
we have adopted a system based on the cooperation of the
caregiver and the robot, where the caregiver becomes the
operator and takes charge of recognizing the environment
and deciding the lifting procedure, while the robot undertakes
physically hard tasks. The robot operates autonomously
when safety is guaranteed, whereas at times when more
complex decision making is required, such as when a large
force is to be exerted on a human for lifting, it leaves the
final decision to the accompanying caregiver.
It is desirable that the interface for transmitting instruc-
tions from the caregiver to the robot should be simple and
usable without additional devices. In addition, its use should
allow easy and intuitive control of the many degrees of
freedom (d.o.f.) of the robot. To this end, we developed a
new method named ‘tactile guidance’ for controlling robots
by touching and leading the robot motion. Details will
be described in Section IV. Using tactile guidance, the
caregiver can control the robot by touching it with one hand,
while performing delicate jobs unsuitable for high-power
robots, such as lifting up the patient’s head, using the other
hand. The caregiver can remain close to the patient while
controlling the robot, which we believe is more relaxing for
the patient.
In nursing-care facilities, a bedridden patient is usually
transferred by two or more caregivers. If the role of the
caregiver with the greater physical load can be replaced by
a robot, it will save manpower and allow him or her to
concentrate on tasks requiring mental rather than physical effort.
III. BASIC SPECIFICATIONS

Fig. 1. RIBA (Robot for Interactive Body Assistance) and its joint configuration

TABLE I. Basic specifications of RIBA

Dimensions: width 750 mm (when arms are folded), depth 840 mm, height 1,400 mm
Weight (incl. batteries): 180 kg
D.o.f.: head 3 (only 1 in current use); arm 7 each; waist 2; cart 3 (with 4 motored wheels)
Base movement: omnidirectional with omnidirectional wheels
Actuator type: DC motors
Payload: 63 kg (tested value)
Operation time: 1 hour in standard use
Power: NiMH batteries
Sensors: vision: 2 cameras; audio: 2 microphones; tactile: upper arm (128 pts. each), forearm (94 pts. each), hand (4 pts. each), shoulder pad (8 pts. each)
The developed robot RIBA and its joint configuration are
shown in Fig. 1, and its basic specifications are given in
Table I. The link length, joint configuration, and movable
ranges of the joints were determined by performing a com-
puter simulation of lifting a human and from our experience
based on our previous robot RI-MAN. We adopted a coupled
drive [15] mechanism that uses a pair of motors for providing
2 d.o.f. in the joint pairs (0,1), (2,3), (4,5), (7,8), (9,10), and
(11,12). This mechanism allows the output of the two motors
to be concentrated at one joint if the other joint in the pair
is not required to move. This enables the robot to realize a
high payload with thin and light arms.
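The effect of the coupled drive can be illustrated with a toy torque-mapping sketch. The coupling matrix and the function below are hypothetical illustrations, not the actual gearing of [15]: driving the two motors in phase concentrates their combined output on one joint of the pair, while driving them in opposition concentrates it on the other.

```python
import numpy as np

# Hypothetical differential coupling: each joint torque is a sum or
# difference of the two motor torques, so both motors can contribute
# to a single joint when the other joint of the pair is held still.
C = np.array([[1.0, 1.0],
              [1.0, -1.0]])  # joint torques = C @ motor torques

def joint_torques(motor_torques):
    """Map the pair of motor torques to the pair of joint torques."""
    return C @ np.asarray(motor_torques, dtype=float)

# Motors in phase: the whole output concentrates on joint 0.
print(joint_torques([1.0, 1.0]))   # joint 0 gets 2.0, joint 1 gets 0.0

# Motors in opposition: the whole output concentrates on joint 1.
print(joint_torques([1.0, -1.0]))  # joint 0 gets 0.0, joint 1 gets 2.0
```

This is the sense in which the mechanism "allows the output of the two motors to be concentrated at one joint."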
The width of 750 mm in Table I is that when the arms
are fully folded. When the arms are straight, as in Fig. 1,
the robot width is 1200 mm. Omni-directional wheels are
adopted so that the cart can freely move around in narrow
spaces such as the space between beds.
RIBA has speech recognition ability so that it can under-
stand voice commands. It also has face recognition and sound
source localization functions to find the operating caregiver.
As tactile sensors, we developed a flexible tactile sheet with
8×8 semiconductor pressure sensors and a readout circuit
embedded in an elastic material [16]. This type of tactile
Fig. 2. Tactile sensors on the upper arm: (a) without cover; (b) with cover
sensor is mounted on the upper arms (Fig. 2) and forearms.
The numbers of sensing points on the upper arm and the
forearm are 128 and 94, respectively. RIBA also has tactile
sensors in its hands and shoulder pads that are made of
pressure-sensitive rubber. These tactile sensors are used for
motion adjustment, tactile feedback, communication, and
ensuring safety.
Basic trajectories for motions are given to RIBA in advance.
One of the motions is selected by the operator using
voice commands such as ‘Lift up from the bed’. The selected
trajectory is modified using tactile guidance when necessary.
RIBA can operate as a stand-alone robot with all the
processors and batteries inside it. The main PC (CPU: Intel
Core Duo, 2 GHz) and more than 20 local processing boards
(CPU: Microchip dsPIC33F) for the sensors and motor
controllers constitute the distributed information-processing
network in RIBA. This distributed network contributes to
reducing the computational load of the main PC, decreasing
the number of cables in RIBA, and reducing the sensor noise
by shortening the analog transmission length. RIBA can be
accessed via wireless LAN when necessary.
To ensure safety in the case of unexpected contact, and
the stability and comfort of the lifted patient by increasing
the contact area, the entire body of RIBA including its
joints is covered with soft materials such as polyurethane
foam and a silicone elastomer. We adopted a clean and
friendly appearance for RIBA, similar to that of a giant white
teddy bear, because a mechanical appearance would not have
suited nursing-care situations and a humanoid appearance
may cause psychological discomfort to the patient.
RIBA has succeeded in lifting a human from a bed, placing
a human on a bed, lifting a human from a wheelchair, putting
a human down on a wheelchair, and moving with a human in
its arms. The current maximum weight of the lifted person
is 63 kg. Fig. 3 shows RIBA lifting a human in its arms.
IV. TACTILE GUIDANCE

Many methods have been proposed for instructing robots,
for example, through the use of a remote controller, voice
commands, motion capture, and force/torque sensors. How-
ever, these methods are unsatisfactory for controlling RIBA.
In human-robot cooperative patient transfer, we consider
that the caregiver should be able to operate the robot using
one hand, in order to use the other hand for adjusting the
patient’s posture and skin contact. The above methods do
not satisfy our requirements, because some of the methods
require additional devices, some have insufficient recognition
accuracy, some are unsuitable for instructing a robot to
assume a posture determined by multiple degrees of freedom,
Fig. 3. RIBA lifting a human in its arms
and some require the operator to remain in a specific region
to grasp the control devices.
For use during human-robot cooperative patient transfer,
we have developed a direct and intuitive ‘tactile guidance’
method, where the caregiver indicates the position, direction,
and/or speed of the desired motion by directly touching the
robot on the part that is concerned with the motion. This
method was inspired by the way a teacher instructs the
motion of a student through touch and directly guides the
student’s motion when teaching sport or dance.
RIBA has tactile sensors covering the entire area on
the arms except the joints. By touching these sensors, the
caregiver gives instructions to the robot. The wide area of
the control interface results in little constraint on the position
of the caregiver, and the caregiver can operate the robot
using one hand. Another advantage is that the point where
force is applied coincides with the point where it is detected,
which enables the detection of small forces that force/torque
sensors mounted at the base of a link would miss. The
tactile sensors provide two-dimensional pattern information,
to which pattern processing techniques can be applied. For
example, it is possible to detect sliding motion on the tactile
sensor, to cancel the output caused by the load of the lifted
person while detecting an instruction at another contact point,
and, for robustness, to count the number of active elements
so that output from a single element can be ignored.
In the current RIBA, we have incorporated three tactile
guidance modes, a cart control mode, a posture-forming
mode, and a motion-adjusting mode. The switching between
these modes is initiated by a voice command. The pushing
force or sliding distance is used as the operation input.
Operation is activated only when the sensor is being touched
and stops when the touching finishes.
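The activation rule described above, where operation happens only while the sensor is being touched, can be sketched roughly as follows (the threshold, function name, and mode names are hypothetical, not RIBA's actual implementation):

```python
# Sketch of the "active only while touched" rule for tactile guidance.
# TOUCH_THRESHOLD is an assumed activation threshold on the summed force.
TOUCH_THRESHOLD = 0.5

def operation_input(pressure_values, mode):
    """Return a command magnitude while touched, or None when released."""
    force = sum(pressure_values)  # pushing force ~ sum of element outputs
    if force < TOUCH_THRESHOLD:
        return None  # contact lost: operation stops immediately
    if mode == "push":
        return force  # pushing force used as the operation input
    elif mode == "slide":
        # sliding distance would require tracking the contact position
        raise NotImplementedError("sliding distance needs contact tracking")

print(operation_input([0.5, 0.25], "push"))  # touched: returns 0.75
print(operation_input([0.1], "push"))        # released: returns None
```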
For cart control, we use three Regions A, B, and C
assigned on the outer side (the side opposite the lifted
patient) of the upper arms, as shown in Fig. 4. By touching
Region A and sliding the touching position forward or
backward, the cart moves forward or backward, respectively.
The sliding distance determines the speed of motion. We
assigned Region B for side translation and Region C for
gyration. The direction of side translation or gyration is
determined by whether the left or right side of the arm is
pushed and the speed is determined by the pushing force.
Fig. 4. Regions A–F on an arm for tactile guidance
Fig. 5. Concept of motion adjustment by tactile guidance: the joint-angle trajectory appropriate to the lifted person, q = f(t, p), is obtained by changing p between the trajectory for a tall person, q = f(t, 1), and that for a short person, q = f(t, 0)
Both arms can be used for these operations.
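As a rough illustration of the cart-control mapping described above (the gains, sign conventions, and function name are my own assumptions, not RIBA's actual control code):

```python
# Sketch of cart control by tactile guidance.
# Region A: sliding forward/backward on the upper arm sets forward speed.
# Region B: pushing the left/right side sets side-translation speed.
# Region C: pushing the left/right side sets gyration (turning) speed.
K_SLIDE = 0.05   # assumed m/s per unit sliding distance
K_PUSH = 0.01    # assumed m/s (or rad/s) per unit pushing force

def cart_velocity(region, sliding_distance=0.0, pushing_force=0.0, side=+1):
    """Return (vx, vy, omega) for the omnidirectional cart.
    side = +1 for the left side of the arm, -1 for the right (assumed)."""
    vx = vy = omega = 0.0
    if region == "A":                 # forward/backward translation
        vx = K_SLIDE * sliding_distance
    elif region == "B":               # side translation
        vy = side * K_PUSH * pushing_force
    elif region == "C":               # gyration
        omega = side * K_PUSH * pushing_force
    return vx, vy, omega

print(cart_velocity("A", sliding_distance=2.0))  # (0.1, 0.0, 0.0)
```

Speed scales with sliding distance in Region A and with pushing force in Regions B and C, matching the behavior described in the text.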
In the posture-forming mode, all the 16 joints in both
arms and the waist are controlled by touching the arms.
This mode may be used for forming the motion of RIBA by
directly touching the robot. For example, elbow extension
or flexion is realized by pushing the inner or outer side
of the forearm, respectively, and elbow rotation is realized
by sliding the contact point along a circular direction. The
motion of other joints is also assigned to different positions
or touching conditions (pushing or sliding), so as to be
intuitively understandable to the operator.
The motion-adjusting mode is used during lifting the
patient up and down. The concept of adjusting the motion
is shown in Fig. 5. The trajectories for lifting a tall or a
short person with different distances between both arms are
given by the designer in advance, and the trajectory suitable
to the present patient is made by interpolating these two
trajectories. The interpolation is adjusted by the parameter
p (0 ≤ p ≤ 1), and the progress of the motion is controlled
by the time parameter t. Tactile guidance is used to change
the parameters p and t. The adjustment of t is assigned to
the outer side (the opposite side from the lifted patient) of
the forearms. Pushing Region E or D in Fig. 4 changes the
time t forward or backward, respectively, and the pushing
force determines the changing rate of t. In preliminary
experiments, it was found that the lifted person sometimes
caused a load on the side of the forearms (Region F); hence,
we decided to omit the side regions and use only the central
regions of the outer side of the forearms for instructions. The
adjustment of the parameter p is assigned to the grasping of
the left or right half of RIBA’s hand.
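The interpolation of Fig. 5 can be sketched as linear blending between the two designer-given trajectories, assuming p = 1 corresponds to the tall-person trajectory and p = 0 to the short-person one; the trajectories below are arbitrary placeholders, not RIBA's actual ones:

```python
def interpolate_trajectory(f_short, f_tall, p, t):
    """q = f(t, p): blend the short-person (p=0) and tall-person (p=1)
    trajectories; the caregiver adjusts p and t by tactile guidance."""
    assert 0.0 <= p <= 1.0
    return (1.0 - p) * f_short(t) + p * f_tall(t)

# Hypothetical joint-angle trajectories (degrees) for illustration only.
f_short = lambda t: 10.0 + 20.0 * t   # trajectory for a short person
f_tall = lambda t: 20.0 + 30.0 * t    # trajectory for a tall person

# Halfway between the two at t = 1.0: (30.0 + 50.0) / 2
print(interpolate_trajectory(f_short, f_tall, p=0.5, t=1.0))  # 40.0
```

Pushing Region E or D then advances or rewinds t, while grasping either half of the hand nudges p.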
Fig. 6. Payload test posture (joint angles −4, 42, and 43 deg; a weight of 30 kg for each arm)
V. EXPERIMENTS

A. Handling Heavy Objects
The principal objective of RIBA is to transfer a human
weighing 60 kg or more from a bed to a wheelchair, and vice
versa. To confirm RIBA’s ability to handle heavy objects, we
tested the power of the elbow-bending joint (J11), shoulder-
rotating joint (J8), and the joint allowing the waist to bend
back and forth (J14) when the forearm was almost level as
shown in Fig. 6. This posture is similar to that when starting
to lift a human. We placed a weight of 30 kg on the joint
between the forearm and the hand of each arm so that RIBA
bore a total weight of 60 kg.
As the desired trajectory, a 0.2 Hz sinusoidal wave was
applied from the above posture to the elbow, shoulder, and
waist joints. The amplitude of the sinusoidal wave was de-
termined so as not to exceed the maximum angular velocity
of each joint, and was 10 deg for the elbow, 7.5 deg for
the shoulder, and 5 deg for the waist. The actual trajectories
with and without the weight were recorded. The results are
shown in Fig. 7.
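The desired trajectories used in this test can be reproduced with a simple sinusoidal reference (a sketch; the offset corresponding to the starting posture is taken as zero here):

```python
import math

FREQ_HZ = 0.2  # frequency of the desired sinusoidal wave
AMPLITUDE_DEG = {"elbow": 10.0, "shoulder": 7.5, "waist": 5.0}

def desired_angle(joint, t):
    """Desired joint-angle offset (deg) from the test posture at time t (s)."""
    return AMPLITUDE_DEG[joint] * math.sin(2.0 * math.pi * FREQ_HZ * t)

# One period lasts 1 / 0.2 = 5 s; the elbow peaks at a quarter period.
print(desired_angle("elbow", 1.25))  # ~10.0 deg (peak)
```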
In all cases, the actual trajectory was in satisfactory agree-
ment with the desired trajectory. During lifting in patient
transfer, the patient is usually mounted on the middle of the
forearms, which gives a payload margin. The main structure
of the arms was designed to support 120 kg in principle,
but for safety reasons, we have so far limited the weight to
63 kg.
B. Motion Control by Touching
To test the controllability of tactile guidance, we con-
ducted experiments in which the elbow-bending joint (J11)
or elbow-rotating joint (J12) was moved 30 deg using the
posture-forming mode of tactile guidance when RIBA was
in the posture shown in Fig. 6. The operator was instructed
to move the joint of the left arm so as to follow the
corresponding joint of the right arm that was used to show
the desired angles. In addition to the visual cue, sound was
also used for indicating that the actual joint angle was within
±1 deg, over 1 deg above, or over 1 deg below the desired angle.
The results are shown in Fig. 8. In addition to the desired
and actual joint angles, the pushing force or sliding distance
that was used as the operation input for tactile guidance is
indicated. In the elbow-bending experiment, the operation
input was the pushing force calculated as the sum of the
outputs of all pressure-sensing elements on the tactile sensor.
Fig. 7. Experimental results of the payload test: (a) elbow and (b) shoulder (0 kg vs. 30 kg), (c) waist (0 kg vs. 60 kg); joint angle over 10 s
The unit was approximately 0.1 N. In the case of elbow
rotation, the operation input was the sliding distance of the
contact point, where the unit was the pitch of the pressure-
sensing elements (21.5 mm). When the operation input was
the pushing force, fine adjustments were performed. For
example, motion was slowed down when the joint angle was
near the desired value. On the other hand, when the operation
input was the sliding distance, adjustment was not frequent
and the actual angle was sometimes moved past the desired
angle. From these experiments, we can conclude that force
is more suitable than sliding distance as the operation input
for motion that requires fine adjustment.
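The two operation inputs compared in this experiment can be computed from an 8×8 tactile frame roughly as follows (a sketch under stated assumptions: the pushing force is the sum of element outputs, the sliding distance is tracked from the pressure-weighted centroid using the 21.5 mm element pitch, and the choice of axis along the arm is my own):

```python
import numpy as np

PITCH_MM = 21.5  # pitch of the pressure-sensing elements

def pushing_force(frame):
    """Pushing force: sum of all pressure elements (unit ~ 0.1 N)."""
    return float(np.sum(frame))

def contact_centroid(frame):
    """Pressure-weighted centroid of the contact, in element coordinates."""
    frame = np.asarray(frame, dtype=float)
    total = frame.sum()
    rows, cols = np.indices(frame.shape)
    return (rows * frame).sum() / total, (cols * frame).sum() / total

def sliding_distance_mm(frame_start, frame_now):
    """Sliding distance of the contact point along the arm axis
    (assumed to be the column direction), in millimetres."""
    _, c0 = contact_centroid(frame_start)
    _, c1 = contact_centroid(frame_now)
    return (c1 - c0) * PITCH_MM

# Toy 8x8 frames: a single contact that slides by two element pitches.
a = np.zeros((8, 8)); a[3, 2] = 1.0
b = np.zeros((8, 8)); b[3, 4] = 1.0
print(pushing_force(a))           # 1.0
print(sliding_distance_mm(a, b))  # 43.0
```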
C. Patient Transfer
We evaluated the patient-transfer ability of RIBA using
10 adults (1 male and 9 females). The sequence of lifting
from the bed is shown in Fig. 9 and that of lifting from the
wheelchair is shown in Fig. 10. Putting a human down on
the bed or the wheelchair is also possible by applying the
reverse motion.
In both cases, the caregiver made fine adjustments of
RIBA’s position and motion according to the patient’s
position and posture by touching RIBA’s arms. The caregiver
used one hand to raise the head of the patient (Fig. 9(c))
when lifting him or her from the bed and to raise the legs
(Figs. 10(d) and (e)) when the patient was lifted from the
wheelchair, while using the other hand to operate RIBA.
Each lifting took approximately 40 s. Lifting was stable and
no danger of dropping the patient was observed.
VI. CONCLUSION

We have developed a prototype robot, RIBA, to assist with
patient transfer. RIBA succeeded in transferring a human
from a bed to a wheelchair and back using its human-type
arms. To the best of our knowledge, RIBA is the first robot
that can transfer a human between a bed and a wheelchair
without a reclining function using human-type arms. RIBA
Fig. 8. Results of tactile guidance experiments: (a) elbow bending (operation input: pushing force); (b) elbow rotation (operation input: sliding distance)
Fig. 9. Lifting from the bed, frames (a)–(f)
weighs 180 kg and can lift a patient with a weight of up to
63 kg; the payload to weight ratio is as high as 0.35.
We developed a tactile guidance method in which the care-
giver adjusts robot motion through tactile sensors, enabling
it to cope with changeable situations. This also allows the
caregiver to remain close to the patient, which we believe
is less stressful for the patient. The aims of our future work
include increasing the payload, ensuring the comfort of the
lifted person, and reinforcing safety.
REFERENCES

[2] K. Wada, T. Shibata, T. Saito, and K. Tanie, “Psychological and
Social Effects in Long-Term Experiment of Robot Assisted Activity
to Elderly People at a Health Service Facility for the Aged,” in Proc.
Fig. 10. Lifting from the wheelchair, frames (a)–(i)
IEEE/RSJ International Conference on Intelligent Robots and Systems
(IROS), pp.3068–3073, 2004.
[3] C. Mandel, T. L¨uth, T. Laue, T. R¨ofer, A. Gr¨aser, and B. Krieg-
Br¨uckner, “Navigating a Smart Wheelchair with a Brain-Computer
Interface Interpreting Steady-State Visual Evoked Potentials,” in Proc.
IEEE/RSJ International Conference on Intelligent Robots and Systems
(IROS), pp.1118–1125, 2009.
[4] T. Hayashi, H. Kawamoto, and Y. Sankai, “Control Method of Robot
Suit HAL Working as Operator’s Muscle Using Biological and Dy-
namical Information,” in Proc. IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS), pp.3455–3460, 2005.
[5] K. Iwakiri, M. Takahashi, M. Sotoyama, M. Hirata, and N. Hisanaga,
“Usage Survey of Care Equipment in Care Service Facilities for
the Elderly,” J. Occupational Health, Vol.49, pp.12–20, 2007 (in Japanese).
[6] A. Garg, B. Owen, D. Beller, and J. Banaag, “A Biomechanical
and Ergonomic Evaluation of Patient Transferring Tasks: Bed to
Wheelchair and Wheelchair to Bed,” Ergonomics, Vol. 34, No.3,
pp.289–312, 1991.
[7] H. Wang and F. Kasagami, “A Patient Transfer Apparatus Between
Bed and Stretcher,” IEEE Trans. on Systems, Man, and Cybernetics-
Part B (Cybernetics), Vol.38, No.1, pp.60–67, 2008.
[8] Y. Kume and H. Kawakami, “Development of Power-Motion Assist
Technology for Transfer Assist Robot,” Matsushita Technical Journal,
Vol.54, No.2, pp.50–52, 2008 (in Japanese).
[9] M. Onishi, Z. W. Luo, T. Odashima, S. Hirano, K. Tahara, and T.
Mukai, “Generation of Human Care Behaviors by Human-Interactive
Robot RI-MAN,” in Proc. IEEE International Conference on Robotics
and Automation (ICRA), pp. 3128–3129, 2007.
[10] T. Mukai, M. Onishi, T. Odashima, S. Hirano, and Z. W. Luo,
“Development of the Tactile Sensor System of a Human-Interactive
Robot ‘RI-MAN’,” IEEE Trans. on Robotics, Vol.24, No.2, pp.505–
512, 2008.
[11] S. Hashino, T. Iwaki, C. T. Wang, and E. Nakano, “Control of Patient Care Robot ‘MELKONG’,” J. Society of Biomechanisms Japan,
Vol.10, pp.259–269, 1990 (in Japanese).
[14] K. Harada and M. Kaneko, “Whole Body Manipulation,” in Proc.
IEEE International Conference on Robotics, Intelligent Systems and
Signal Processing, Vol.1, pp.190–195, 2003.
[15] S. Hirose and M. Sato, “Coupled Drive of the Multi-DOF Robot,”
in Proc. IEEE International Conference on Robotics and Automation
(ICRA), Vol. 3, pp.1610–1616, 1989.
[16] T. Mukai and Y. Kato, “1 ms Soft Areal Tactile Giving Robots Soft
Response,” J. Robotics and Mechatronics, Vol.20, No.3, pp.473–480.
... AIBO (Robins, Dautenhahn, Nehaniv, et al., 2005) Actroid-F (Yoshikawa, Matsumoto, Sumitani, & Ishiguro, 2011) ASIMO (Sakagami et al., 2002) Charlie (Jayawardena et al., 2010) COCO (Racky et al., 2011) Cody (Chen, King, Thomaz, & Kemp, 2011) Huggable Kaspar (Amirabdollahian, Robins, Dautenhahn, & Ji, 2011;Dautenhahn et al., 2009) Keepon Nabaztag (Klamer & Ben Allouch, 2010) Nao Paro (Wada, Shibata, Musha, & Kimura, 2005) RIBA (Mukai et al., 2010) Robota (Robins, Dautenhahn, Te Boekhorst, & Billard, 2005) Twendy-One (Iwata & Sugano, 2009) Zeno (Ranatunga, Rajruangrabin, Popa, & Makedon, 2011) ...
Full-text available
Despite there being much work bringing the vision of socially interactive robots to life, it is still a challenge to program them. There are two key reasons for this. First, many robot programming tools express primitives for programming social-interaction at low abstraction levels. Second, the usability of methods used to combine primitives, regardless of their abstraction levels, into social-interactions are not well understood. To address these problems, this thesis presents the iterative design and evaluation of application programming interfaces (APIs) for programming socially interactive robots. An iterative design-based research method was used throughout this thesis. During each research iteration, an API was designed with user-centred design principles to address particular aspects of the social robot programming research problem. Each API was then evaluated with a user study and the data analysed with the Cognitive Dimensions of Notations. The first API iteration began exploring what abstraction level is appropriate for programming social robot applications, through an evaluation of an API with high-level, domain-specific primitives. The results of the evaluation showed that the chosen abstraction level had positive effects on usability, however, a stakeholder interview suggested that finer control of social-interaction was needed. This evaluation also explored how primitives, regardless of their abstraction level, should be orchestrated to create higher level social-interaction. This was done by evaluating an imperative finite state machine API for authoring robot dialogue; the results showed that they have poor usability when used in this context. This is important because imperative finite state machines are a common means of orchestrating robot behaviour in the robotics community. The second API iteration further refined what abstraction level is appropriate for programming social robot applications. 
Based on the results of a stakeholder interview on the previous API, its primitives were refactored to a slightly lower abstraction level, providing finer control of social-interaction. A user evaluation demonstrated the benefits of the refactoring, which retained the key advantages of the previous API, and also the trade-offs inherent in having a higher abstraction level. The positive effects include the hiding of lower-level implementation details, primitives having a close mapping to the social-interaction domain, and a structure that enables programmers to use their domain knowledge to understand the API. The main trade-off occurs when implementation details are hidden: this can make progressive evaluation difficult, because programmers are suddenly exposed to lower-level implementation details when debugging errors. In parallel with the iterative API design, I derived an expanded, implementation-independent taxonomy of the primitives required for programming robot social-interaction. The taxonomy defines robot actions, perceptions and the objects with which a social robot interacts. This is important for stakeholders who help create socially interactive robots. The last stage of the research was the design and implementation of an extensible, vendor-agnostic architecture to support the API. The architecture enables API users to define new entities (objects, people and robots) and create relationships between them. It also enables the API to interface with multiple social robot platforms through its vendor-agnostic action and perception interfaces. In summary, this thesis makes five main contributions. First, the implementation of an API with high-level, domain-specific primitives for programming socially interactive robots. Second, an in-depth understanding of what abstraction level is appropriate for programming social robot applications and the effects that different abstraction levels have on an API's usability.
Third, an exploration of how primitives should be orchestrated into higher-level social-interaction, in particular the effects that imperative finite state machines have on usability when used to author robot dialogue. Fourth, a taxonomy of primitives set at an appropriate abstraction level for programming robot social-interaction. Fifth, the design and implementation of an extensible, vendor-agnostic architecture to support social robot programming APIs with primitives at an appropriate abstraction level.
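The imperative finite-state-machine style critiqued in this thesis can be illustrated with a minimal sketch (not one of the APIs studied; the state names and the `hear` stub are hypothetical), showing the explicit state bookkeeping that even a two-turn dialogue requires:

```python
# Minimal imperative finite-state-machine sketch of a two-turn robot
# dialogue. All names (states, hear stub) are hypothetical and only
# illustrate the bookkeeping this style imposes on dialogue authors.

def run_dialogue(hear):
    """Drive a greeting dialogue; `hear` stands in for speech recognition."""
    state = "GREET"
    transcript = []
    while state != "DONE":
        if state == "GREET":
            transcript.append("robot: Hello! Would you like the news?")
            state = "AWAIT_ANSWER"
        elif state == "AWAIT_ANSWER":
            answer = hear()                      # blocking listen
            if answer == "yes":
                state = "READ_NEWS"
            elif answer == "no":
                state = "FAREWELL"
            else:                                # unrecognised: re-prompt
                transcript.append("robot: Sorry, yes or no?")
        elif state == "READ_NEWS":
            transcript.append("robot: Here are today's headlines...")
            state = "FAREWELL"
        elif state == "FAREWELL":
            transcript.append("robot: Goodbye!")
            state = "DONE"
    return transcript

# Simulate a user who first mumbles, then agrees.
replies = iter(["maybe", "yes"])
log = run_dialogue(lambda: next(replies))
```

Even this tiny interaction needs four states and a hand-written transition table, which is the kind of accidental complexity the thesis attributes to imperative state machines for dialogue.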
... However, recent studies neglect to consider a biomechanical evaluation that takes into account the synthesis of a caregiver's functional ability and ergonomic working and the use of assistance systems. In addition, there are limitations to scientific contributions that focus on the physical relief of nurses by robotic systems and the power of such systems regarding telepresence and telemanipulation in care [30][31][32][33] . For this reason, we developed a "Healthcare Prevention System" 34,35 that measures the kinematics, kinetics, and muscle activities during manual patient handling 29,30,[34][35][36] . ...
Manual patient handling is one of the most significant challenges leading to musculoskeletal burden among healthcare workers. Traditional working techniques could be enhanced by innovations that can be individually adapted to the physical capacity of nurses. We evaluated the use of a robotic system providing physical relief by collaboratively assisting nurses in manual patient handling tasks. By quantifying kinetic and muscle activity data, it was possible to distinguish two kinds of movement patterns. Highly asymmetric postures and movements corresponded to distinct extremes in lower limb and spine muscle activity data. The use of collaborative robotics significantly reduced maximum force exertion in the caregiving process by up to 51%. Lateral flexion and torsion of the trunk were reduced by up to 54% and 87%, respectively, leading to a significant reduction in mean spine muscle activity of up to 55%. These findings indicate the feasibility of collaborative robot-assisted patient handling and emphasize the need for future individual intervention programs to prevent physical burden in care.
... Further, few robots are strong enough to lift humans. For example, the RIBA robot [58] is able to lift a person, but it has not yet been widely used in care settings. Although robots have a long way to go, advancing capabilities make the vision of care robots much more within reach. ...
Robots hold significant promise to assist with providing care to an aging population and to help overcome increasing caregiver demands. Although a large body of research has explored robotic assistance for individuals with disabilities and age-related challenges, this past work focuses primarily on building robotic capabilities for assistance and has not yet fully considered how these capabilities could be used by professional caregivers. To better understand the workflows and practices of caregivers who support aging populations and to determine how robotic assistance can be integrated into their work, we conducted a field study using ethnographic and co-design methods in a senior living community. From our results, we created a set of design opportunities for robotic assistance, which we organized into three different parts: supporting caregiver workflows, adapting to resident abilities, and providing feedback to all stakeholders of the interaction.
... In addition, robots are known to help nurses. For instance, the Riba robot is programmed only to assist nurses in transferring and mobilizing patients (Mukai et al., 2010), and the Moxi robot is programmed to assist nurses for the logistics mobilization that they need (Rozga, 2018). ...
Aim: This research was planned to identify nurse managers' opinions on artificial intelligence and robot nurses. Background: As the concepts of artificial intelligence and robot nurses become widespread in Turkey, nurse managers are expected to guide and cooperate with nurses with regard to these technologies in the future. Methods: The sample of this cross-sectional descriptive study consisted of 326 nurse managers, who were reached via an online questionnaire between September and November 2021. The Nurse Managers Information Form and the Question Form on Artificial Intelligence and Robot Nurses were used to collect data. The descriptive statistics of the data were analyzed with numbers and percentages. The difference between knowledge of artificial intelligence and robot nurses and demographic characteristics was analyzed with the Chi-square test. Results: According to the findings, 66.9% of the nurse managers reported having heard of the concepts of artificial intelligence and robot nurses previously. 67.2% stated that they thought robot nurses would benefit the nursing profession, but 86.2% voiced disbelief that robots would replace nurses. Conclusions: The majority of the participating nurse managers reported that artificial intelligence and robot nurses would not replace nurses but would be beneficial to them and would reduce their workload. Implications for nursing management: It should be ensured that nurse managers plan the areas in the hospital where artificial intelligence and robot nurses will be used and determine the possible risks. Awareness should be increased with in-service training, and patient safety and ethical problems regarding the use of artificial intelligence and robot nurses should be identified.
Smart health care technologies (SHCTs) can assist older adults in managing their health and accessing convenient medical and caring services, supporting them in living healthy and independent lives. Currently, there are various kinds of SHCTs, such as smart wearable devices, smart health monitors, health care applications, and nursing and assistive robots. Nevertheless, due to declining physical and cognitive capabilities, older adults may encounter many difficulties and problems when using different types of SHCTs, which could affect their willingness to adopt them. However, it remains unclear how Hong Kong older adults actually use different types of SHCTs, whether they intend to adopt them, and which factors are relevant to adoption. Therefore, this study investigated Hong Kong older adults' actual use and adoption intention of different kinds of SHCTs and explored the possible factors behind SHCT adoption. We employed a structured interview to examine the actual use and adoption intention of four types of SHCTs and recruited eight Hong Kong older adults from a local community. The data analysis revealed the most popular and the least used types of SHCTs. The reasons for adopting SHCTs and the possible factors of adoption intention toward each kind of SHCT are discussed.
Digital media technologies have gradually been integrated into teaching activities over the past few years, providing teachers with more possibilities. This study examines the teaching effects of an interactive AI-based image-processing platform as a teaching aid in children's painting education. We compared the learning interest, learning attitude, and continuous learning intention of 96 children aged 5 to 13 during painting education. The subjects were divided into two groups: the experimental group used AI image processing for painting education, and the control group used traditional teaching methods. Results showed that the use of AI image-processing tools in painting education lowered girls' learning attitude and continuous learning intention while stimulating boys' learning interest.
Intelligent Robots as well as Affective (Emotional) Technologies are among the most important advances of the so-called era of the fourth industrial revolution, which arguably creates an ever-increasing fusion of the physical, digital, and biological worlds. But how is gender intertwined with these advances, as viewed under multiple lenses? More specifically, how are these advances promising to change the world of healthcare and wellbeing, and what research has taken place at the intersection of these advances with gender and healthcare? These are the main questions that will be tackled in this chapter, following a short historical introduction. A forward-looking discussion, integrating predicted developments with the concepts of the digital twin and the quantified self, illustrates a vision of a possible future—A future in which these technologies will catalyze an increasingly holistic and antireductionist understanding, simulation, and capacity for beneficial orchestrated interventions, not only at a single level but rather at the multiple levels, ranging from the biological and neural to the psychological and social and beyond, that constitute the foundations of health and the prerequisites for wellbeing.
Robots will play a part in all aspects of healthcare. The presence of service robots in healthcare demands special attention, whether it is in the automation of menial labour, prescription distribution, or offering comfort. In this chapter, we examine the several applications of healthcare-oriented robots in the acute, ambulatory and at-home settings. We discuss the role of robotics in reducing environmental dangers, as well as at the patient’s bedside and in the operating room, in the acute setting. We examine how robotics can protect and scale up healthcare services in the ambulatory setting. Finally, in the at-home scenario, we look at how robots can be employed for both rural/remote healthcare delivery and home-based care. In addition to assessing the current state of robotics at the interface of healthcare delivery, we describe critical problems for the future where such technology will be ubiquitous. Patients, health care workers, institutions, insurance companies, and governments will realize that service robots will deliver significant benefits in the future in terms of leverage and cost savings, while maintaining or improving access, equity, and high-quality health care.
Robotics is widely seen as a key enabling technology for the society of tomorrow. This review examines the role of robotics and intelligent medical devices in intensive care medicine. Demographics predict that more elderly patients will need to be treated by fewer healthcare personnel, calling for innovations in ICU management; robotics may help to smooth this foreseeable workload/manpower disparity in medicine. Studying the application of robotics in the ICU in a manner beneficial for patients and accepted by the intensive care team is therefore desirable. Financial sustainability is essential for the introduction of these new technologies to the ICU. This study therefore assesses the state of the art in robotics in intensive care medicine and identifies opportunities for progress, using an observational approach in a teaching-hospital ICU in combination with an in-depth review of the literature and a survey of the market. Tasks potentially amenable to robotics are identified, their acceptability to patients and caregivers is examined, and their quantitative contribution to the future management of an intensive care unit is assessed.
Tele-nursing robots provide a safe approach for patient-caring in quarantine areas. For effective nurse-robot collaboration, ergonomic teleoperation and intuitive interfaces with low physical and cognitive workload must be developed. We propose a framework to evaluate the control interfaces to iteratively develop an intuitive, efficient, and ergonomic teleoperation interface. The framework is a hierarchical procedure that incorporates general to specific assessment and its role in design evolution. We first present pre-defined objective and subjective metrics used to evaluate three representative contemporary teleoperation interfaces. The results indicate that teleoperation via human motion mapping outperforms the gamepad and stylus interfaces. The trade-off with using motion mapping as a teleoperation interface is the non-trivial physical fatigue. To understand the impact of heavy physical demand during motion mapping teleoperation, we propose an objective assessment of physical workload in teleoperation using electromyography (EMG). We find that physical fatigue happens in the actions that involve precise manipulation and steady posture maintenance. We further implemented teleoperation assistance in the form of shared autonomy to eliminate the fatigue-causing component in robot teleoperation via motion mapping. The experimental results show that the autonomous feature effectively reduces the physical effort while improving the efficiency and accuracy of the teleoperation interface.
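The EMG-based workload assessment mentioned above is not specified in detail here; a common proxy, sketched below under that assumption, is root-mean-square EMG amplitude normalized to a maximum voluntary contraction (%MVC). All numbers are illustrative, not from the study:

```python
import math

def rms(signal):
    """Root-mean-square amplitude of an EMG window."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def percent_mvc(window, mvc_amplitude):
    """Normalize a window's RMS to %MVC, a standard workload measure."""
    return 100.0 * rms(window) / mvc_amplitude

# Illustrative samples (mV): a precise-manipulation window versus a
# resting window, against a hypothetical MVC reference of 1.0 mV.
precise = [0.42, -0.38, 0.45, -0.40, 0.44, -0.41]
resting = [0.05, -0.04, 0.06, -0.05, 0.05, -0.04]
load_precise = percent_mvc(precise, 1.0)
load_resting = percent_mvc(resting, 1.0)
```

Comparing such normalized loads across interface conditions is one plausible way the fatigue-causing components of motion-mapping teleoperation could be localized.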
Conference Paper
Recently, active research has been performed to increase robots' intelligence so as to realize dexterous tasks in complex environments such as streets or homes. However, since skillful human-like task ability is difficult to formulate for a robot, not only analytical and theoretical control research but also a direct human-motion-mimetic approach is necessary. In this paper, we propose that to realize environmental interactive tasks, such as human care tasks, it is insufficient merely to replay human motion. We show a novel motion generation approach that integrates cognitive information into the mimicry of human motions so as to realize the final complex task with the robot.
Conference Paper
In order to allow severely disabled people who cannot move their arms and legs to steer an automated wheelchair, this work proposes the combination of a non-invasive EEG-based human-robot interface and an autonomous navigation system that safely executes the issued commands. The robust classification of steady-state visual evoked potentials in brain activity allows for the seamless projection of qualitative directional navigation commands onto a frequently updated route graph representation of the environment. The deduced metrical target locations are navigated to by the application of an extended version of the well-established nearness diagram navigation method. The applicability of the system proposed is demonstrated by a real-world pilot study in which eight out of nine untrained subjects successfully navigated an automated wheelchair, requiring only some ten minutes of preparation.
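A minimal sketch of the command-projection idea — mapping a classified SSVEP stimulus frequency to a qualitative direction and choosing the route-graph node that best matches it — might look as follows (the frequencies, graph, and geometry are hypothetical, not taken from the paper):

```python
import math

# Hypothetical mapping from classified SSVEP stimulus frequency (Hz)
# to a qualitative steering command; the paper's actual frequencies
# are not given here.
FREQ_TO_COMMAND = {8.0: "left", 10.0: "forward", 12.0: "right"}

def project_command(command, pose, heading, nodes):
    """Pick the route-graph node that best matches the commanded
    direction: rotate the current heading by the command, then choose
    the node with the smallest angular deviation from that bearing."""
    turn = {"left": math.pi / 2, "forward": 0.0, "right": -math.pi / 2}
    desired = heading + turn[command]
    def deviation(node):
        bearing = math.atan2(node[1] - pose[1], node[0] - pose[0])
        d = bearing - desired
        return abs(math.atan2(math.sin(d), math.cos(d)))  # wrap to [-pi, pi]
    return min(nodes, key=deviation)

# Wheelchair at the origin facing +x; three candidate graph nodes.
nodes = [(5.0, 0.0), (0.0, 5.0), (0.0, -5.0)]
target = project_command(FREQ_TO_COMMAND[8.0], (0.0, 0.0), 0.0, nodes)
```

The selected node would then be handed to the autonomous navigation layer (nearness-diagram navigation, in the paper's case) as a metrical target.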
The patient care robot "MELKONG" has been developed for lifting a patient from a hospital bed and transporting him/her in a semi-automatic way. MELKONG is composed of two main parts: a manipulator for lifting the patient, and a transportation mechanism capable of moving around in narrow spaces such as those in a hospital ward. The manipulator consists of two hydraulically driven arms in a pantograph link mechanism. In total, there are 7 DOFs, 3 for each arm and one for their common base, which can rotate about a vertical axis. Each arm is equipped with a pair of forks at its wrist, and each outer fork has two-dimensional force sensors at its tip. To lift a patient, the forks are inserted into specially designed beds, lifting the patient along with parts of the bed. The operator can move the two arms using three kinds of input method: keyboard, joystick, and voice control. Using one of these methods, the tips of the forks are positioned at the end of the bed. Insertion is carried out automatically using the sensor information, which is processed by two CPUs. ITRON, a real-time OS, and another CPU have been adopted to carry out real-time tasks.
Conference Paper
A long-term experiment on robot-assisted activity for elderly people has been conducted at a health service facility for the aged since August 2003. Two therapeutic seal robots, Paro, were introduced there. This paper is the first interim report of the experiment, covering three months. Face scales, which consist of illustrations of persons' faces, were used to evaluate the subjects' moods. In addition, geriatric depression scales were used to measure depression by questionnaire. As a result, the feelings of the elderly people were improved by interaction with the seal robots.
Conference Paper
A robot system design that minimizes robot weight while providing multi-degree-of-freedom (DOF) functioning is proposed. The method aims to mutually couple the degrees of freedom of the robot in such a way that, so far as possible, they can be jointly driven in the robot's most common modes of operation. The method is called coupled drive. As an evaluation function for the coupled drive, an actuation index, defined as the ratio of the power used for actuation to the whole power provided to the robot system, is introduced. A simulation experiment involving a quadruped walking robot with suckers ascending a vertical wall surface is carried out. The experiment uses linear programming to derive a walk that maximizes the robot's actuation index when specific configurations or gaits of the walking robot are given.
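As a rough numerical illustration of the actuation index (the joint torques, velocities, and input power below are made-up values, not taken from the simulation experiment):

```python
def actuation_index(joint_torques, joint_velocities, total_input_power):
    """Actuation index sketch: mechanical power delivered at the joints
    (sum of torque * angular velocity) divided by the whole power
    provided to the robot system. Values near 1 mean most of the
    supplied power goes into useful actuation."""
    output = sum(t * w for t, w in zip(joint_torques, joint_velocities))
    return output / total_input_power

# Illustrative numbers for four coupled joints.
idx = actuation_index(
    joint_torques=[2.0, 1.5, 1.0, 0.5],      # N*m
    joint_velocities=[1.0, 1.0, 2.0, 2.0],   # rad/s
    total_input_power=10.0,                  # W supplied to the system
)
```

Maximizing such a ratio over candidate gaits is what the paper's linear-programming step does, with the coupled drive constraining which joint combinations a single actuator can serve.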
Conference Paper
For assisting human motion, assistive devices working as muscles would be useful. A robot suit, HAL (hybrid assistive limb), has been developed as an assistive device for the lower limbs. Humans can appropriately produce muscle contraction torque and control joint viscoelasticity by muscle effort such as co-contraction. Thus, to implement functions equivalent to human muscles using HAL, it is necessary to control the viscoelasticity of HAL as well as to produce torque in accordance with the operator's intention. The purpose of this study is therefore to propose a control method for HAL using biological and motion information. In this method, HAL produces torque corresponding to muscle contraction torque by referring to the myoelectricity, the biological information with which the operator's muscles are controlled. In addition, the viscoelasticities of HAL are adjusted in proportion to the operator's viscoelasticity, which is estimated from motion information using an on-line parameter identification method. To evaluate the effectiveness of the proposed method, it was applied to a swinging motion of the lower leg. With the method applied, HAL could work like the operator's muscles in the swinging motion, and consequently the muscle activities of the operator were reduced. From this experiment, we confirmed the effectiveness of the proposed method.
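The described control scheme can be sketched as an impedance-style law: an EMG-driven torque term plus spring-damper terms standing in for the adjusted viscoelasticity. All gains and signal values below are illustrative, not the paper's identified parameters:

```python
def hal_assist_torque(emg, theta, theta_ref, omega,
                      gain=8.0, stiffness=2.0, damping=0.5):
    """Impedance-style sketch of the described HAL control: assist
    torque proportional to myoelectric activity, plus spring-damper
    terms whose coefficients would track the operator's estimated
    joint viscoelasticity. All gains here are illustrative."""
    muscle_term = gain * emg                         # EMG-driven torque
    elastic_term = -stiffness * (theta - theta_ref)  # joint stiffness
    viscous_term = -damping * omega                  # joint damping
    return muscle_term + elastic_term + viscous_term

# Lower leg swinging 0.3 rad past its reference angle at 1 rad/s while
# the operator's muscle is mildly active.
tau = hal_assist_torque(emg=0.2, theta=0.3, theta_ref=0.0, omega=1.0)
```

In the paper's method, `stiffness` and `damping` would be updated on-line from the identified operator viscoelasticity rather than held constant as here.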
A laboratory study was conducted in an effort to reduce back stress for nursing personnel while performing the patient handling tasks of transferring the patient from bed to wheelchair and wheelchair to bed. These patient handling tasks were studied using five manual techniques and three hoist-assisted techniques. The manual techniques involved one-person and two-person transfers. One manual technique involved a two-person lift of the patient under the arms; the others used a rocking and pulling action and included the use of assistive devices (a gait belt using a two-person transfer, a walking belt with handles using a one-person and a two-person transfer, and a patient handling sling with cutout areas to allow for a hand grip (Medesign) for a one-person transfer). The three mechanical hoists were Hoyer, Trans-Aid and Ambulift. Six female nursing students with prior patient transfer experience served both as nurses and as passive patients. Static biomechanical evaluation showed that pulling techniques, as compared to lifting the patient, required significantly lower hand forces and produced significantly lower erector spinae and compressive forces at the L5/S1 disc (P≤0.01). Shear force, trunk moments and the percentage of females who were capable of performing the transfers (based on static strength simulation) also favoured pulling methods. Perceived stress ratings for the shoulder, upper back, lower back and whole body were lower for pulling methods than for lifting the patient (P≤0.01). Patients found the pulling techniques, with the exception of the gait belt, more comfortable and more secure than the lifting method (P≤0.01). However, a number of subjects believed that the patient handling sling (Medesign) and the walking belt with one person making the transfer would not work for patients who could not bear weight or who were heavy, contracted or combative. A walking belt with two persons was the preferred manual method.
Two of the three hoists (the Hoyer lift and Trans-Aid) were perceived by the nurses to be as physically stressful as the manual methods. Patients found these two hoists more uncomfortable and felt less secure with them than with three of the five manual methods (the one- and two-person walking belts and Medesign). Ambulift was found to be the least stressful, the most comfortable, and the most secure of all eight methods. Pulling techniques and hoists took significantly longer to complete the transfer than manually lifting the patient (P≤0.01). The two-person walking belt using a pulling technique and the Ambulift are recommended for transferring patients from bed to wheelchair and wheelchair to bed. A large-scale field study is needed to verify these recommendations.
Musculoskeletal disorders (MSDs) have been increasing recently among care workers. Since providing care workers with appropriate equipment is effective for preventing MSDs, we conducted a questionnaire survey in two nursing homes and a healthcare facility for the elderly to clarify equipment usage, problems and points for improvement. A total of 81 care workers (average age 32.2 yr; 63 females, 18 males) participated in the survey. The average number of residents and the average resident care level were 70.0 and 3.6, respectively. Wheelchairs and height-adjustable beds were fully available and always used in all facilities. Portable lifts, ceiling lifts and transfer boards were, however, few in all three facilities, and their proportions of use were 14.8%, 16.0%, and 23.5%, respectively. Participants reported that it is time-consuming to move residents from place to place with lifts and that there is a danger of dropping a resident. Although approximately 90% of the care workers had received education and training on care techniques, the workload on the low back was found to be great. Therefore, care workers must consistently use care equipment. To achieve such increased usage, the usability of the equipment must be improved.
This paper presents a patient transfer apparatus for moving patients between a bed and a stretcher. The apparatus makes it possible for a nurse alone to move a weak, injured, or paralyzed patient from bed to stretcher, or vice versa. Moreover, the suffering, stress, and uneasiness of the patient can be alleviated. This paper describes the specification, mechanical design, control system, and motion control of the apparatus. A specially devised mechanism is developed, and a new servo system is used in the control system. The control principle and algorithm of the new servo system are proposed, and the motion-control method and safety functions of the apparatus are described. The experimental results and evaluation indicate the effectiveness of this system.
Conference Paper
This paper discusses the manipulation of an object using the whole body of a human-type robotic mechanism. To manipulate a large object placed on the floor, a human-type robot pushes it. For this problem, we extend the internal force in grasping by a multi-fingered hand to whole-body manipulation by a humanoid robot and introduce the "whole body internal force". To exert the pushing force effectively on the object, we obtain some conditions regarding the whole body internal force. We also obtain the region of foot positions enabling a robot to push an object without slipping.
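The no-slip requirement on the feet can be illustrated with a simple Coulomb-friction check — a planar simplification of the paper's whole-body analysis, with all numbers hypothetical:

```python
def push_without_slip(push_force, robot_weight, mu):
    """Coulomb-friction sketch of the foot no-slip condition when a
    robot pushes a floor-standing object: the horizontal reaction at
    the feet (equal in magnitude to the pushing force) must not exceed
    the friction limit mu * N, where the normal load N supports the
    robot's weight. A planar simplification; numbers are illustrative."""
    friction_limit = mu * robot_weight
    return push_force <= friction_limit

# A 600 N robot on a mu = 0.5 floor can push with up to 300 N
# horizontally before its feet slip.
ok = push_without_slip(push_force=250.0, robot_weight=600.0, mu=0.5)
slips = not push_without_slip(push_force=350.0, robot_weight=600.0, mu=0.5)
```

The paper's conditions on the whole body internal force play an analogous role in full 3-D: they bound how pushing force can be distributed so that no contact violates its friction constraint.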