Optical Guidance System for Multiple Mobile Robots
Igor E. Paromtchik, Hajime Asama
The Institute of Physical and Chemical Research (RIKEN), Advanced Engineering Center
Hirosawa 2-1, Wako-shi, Saitama 351-0198, Japan
This paper describes our research work towards the development of an optical guidance system for multiple mobile robots in an indoor environment. The guidance system operates with an environmental model, communicates with mobile robots and indicates their target positions by means of a light projection from a laser pointer onto the ground. Processing the image data from a CCD color camera allows the robot's on-board vision system to detect the light beacon on the ground and estimate its relative coordinates. The robot's control system ensures the accurate motion of the robot to the indicated target position. The guidance system subsequently indicates target positions corresponding to a desired route for a specified mobile robot in the fleet. The concept of the optical guidance system, its implementation and the experimental results obtained are discussed.
Guidance of a mobile robot involves its localization in the environment. Precise localization becomes especially relevant in the case of multiple mobile robots sharing a common environment. Various localization methods are known, from simple and widely used odometry and other dead-reckoning methods to active and passive range-sensing approaches; see a recent survey on laser range finders, triangulation range finders and passive stereo for mobile robots.
On-board sensing (e.g. odometry, vision) along with external means (landmarks, beacons) and fusion of sensor data are necessary in order to obtain the precise position and orientation of the robot in the environment and to update the environmental model. An increase in the discrepancy between the actual robot position and its estimate (e.g. if localization relies on a dead-reckoning method) can lead to inadequate motion planning and control, resulting in collisions with objects or other robots.
In order to deal with the localization problem, various optical guidance methods have been developed, such as using reflective beacons, tracking stationary light sources, tracking a guidance line on the floor or ceiling, or using a scanning laser on the mobile robot in order to measure distances to surrounding objects.
We propose an optical guidance system for mobile robots that makes use of a projected laser light. The guidance system operates with the environmental model and comprises a computer-controlled laser pointer with at least two degrees of freedom in order to direct a laser beam onto the desired positions on the ground. The guidance system communicates with the mobile robot, indicating its target position and subsequently checking whether the robot has attained this position.
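As a rough illustration of the pointing geometry, the two degrees of freedom can be treated as pan and tilt angles computed from the target's ground coordinates. The mounting arrangement below (pointer at height h above the origin, tilt measured from the vertical axis) is an assumption for the sketch, not a detail taken from the paper.

```python
import math

def pointer_angles(x, y, h):
    """Pan/tilt angles (radians) to aim a pointer mounted at height h
    above the origin at the ground point (x, y). Pan rotates about the
    vertical axis; tilt is the deflection from pointing straight down."""
    pan = math.atan2(y, x)                   # direction to the target in the ground plane
    tilt = math.atan2(math.hypot(x, y), h)   # larger ground distance -> larger deflection
    return pan, tilt
```

For example, a target one meter ahead of a pointer mounted one meter high requires a tilt of 45 degrees from the vertical.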
The key idea of the optical guidance system is to indicate the numerical coordinates of the target position for the mobile robot by projecting a laser light onto the ground. The on-board vision system of the robot processes the color images in order to detect the laser light beacon on the ground and evaluate its relative coordinates. This visual feedback ensures that the robot accurately follows the indicated positions.
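A minimal sketch of such beacon detection, assuming a red laser spot and purely illustrative RGB thresholds (the paper does not specify its image processing algorithm):

```python
import numpy as np

def detect_beacon(rgb, r_min=200, g_max=120, b_max=120):
    """Return the (row, col) centroid of bright-red pixels in an RGB image
    (H x W x 3 array), or None if no pixel passes the threshold."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = (r >= r_min) & (g <= g_max) & (b <= b_max)
    if not mask.any():
        return None                      # no beacon visible in this frame
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()      # centroid of the detected spot
```

The pixel centroid would then be mapped through the camera model to relative ground coordinates; in practice, robustness against ambient red objects would need additional cues (e.g. spot size or brightness).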
The main advantage of the proposed guidance system is its improved accuracy. The system also allows implicit localization of the mobile robot within the environment: when the robot has reached its indicated target position, an estimate of its coordinates in the environmental model is known. Since the robot's control system operates with the relative coordinates of target positions obtained from image processing, the transformation between the coordinate system of the environmental model (the "world" coordinate system) and that of the mobile robot becomes less relevant.
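For contrast, the explicit world-to-robot transformation that relative-coordinate control largely sidesteps is the standard 2D rigid transform (pose values and symbols below are chosen for illustration only):

```python
import math

def world_to_robot(xw, yw, rx, ry, rtheta):
    """Express the world point (xw, yw) in the frame of a robot located
    at (rx, ry) with heading rtheta: translate to the robot, then rotate
    by -rtheta."""
    dx, dy = xw - rx, yw - ry
    c, s = math.cos(rtheta), math.sin(rtheta)
    return c * dx + s * dy, -s * dx + c * dy
```

Any error in the estimated robot pose (rx, ry, rtheta) propagates directly into such a transform, which is exactly what camera-relative target coordinates avoid.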
Our paper focuses on the concept of the optical guidance system integrated into an environment with multiple mobile robots. The paper is organized as follows. The concept of the proposed optical guidance system is described in section 2. The mobile robot and its control architecture are presented in section 3. The operation of the guidance system is discussed in section 4. The implementation and our experimental results are presented in section 5. The conclusions are given in section 6.
Proceedings of the 2001 IEEE
International Conference on Robotics & Automation
Seoul, Korea • May 21-26, 2001
0-7803-6475-9/01/$10.00 © 2001 IEEE
Figure 9: Localization of projected laser light (lateral distance [m] vs. longitudinal distance [m]).
The software for the teleoperation and optical guidance system is developed in the Java language. The software of the robot's control system is implemented in the C language and runs under the VxWorks real-time operating system on a Pentium 200 MHz processor. The client-server communication between the teleoperation and optical guidance system (the "client") and the mobile robots (the "servers") is performed via a wireless Ethernet. Our video illustrates the experiments.
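The paper does not detail the message format used over this link; a hypothetical minimal exchange could encode each target-position command as one JSON line per message (newline-delimited messages suit a stream socket):

```python
import json

def encode_goto(robot_id, x, y):
    """Encode a hypothetical 'goto' command as one JSON line."""
    return json.dumps({"robot": robot_id, "cmd": "goto", "x": x, "y": y}) + "\n"

def decode_command(line):
    """Parse one received line back into (robot_id, x, y)."""
    msg = json.loads(line)
    if msg.get("cmd") != "goto":
        raise ValueError("unsupported command: %r" % msg.get("cmd"))
    return msg["robot"], msg["x"], msg["y"]
```

A robot-side server would read a line, decode it, drive to the relative target, and reply with an acknowledgement so the client can verify that the position was attained.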
The concept of an optical guidance system that makes use of a laser pointer was introduced. The features of the optical guidance system and its use for multiple mobile robots were discussed. The control architecture of the mobile robot was considered. The implementation and experimental results on the operation of the optical guidance system were presented. Our future work will deal with the improvement of the image processing algorithm, system integration and conducting experiments with multiple mobile robots.
[1] J. Borenstein, H. R. Everett, L. Feng, Navigating Mobile Robots – Systems and Techniques, A K Peters, Wellesley, MA, USA, 1996, 225 p.
[2] M. Hebert, Active and Passive Range Sensing for Robotics, Proc. of the IEEE Int. Conf. on Robotics and Automation, San Francisco, CA, USA, April 24-28, 2000, pp. 102-110.
[3] H. Asama et al., Development of an Omni-Directional Mobile Robot with 3 DOF Decoupling Drive Mechanism, Proc. of the IEEE Int. Conf. on Robotics and Automation, Nagoya, Japan, May 21-27, 1995, pp. 1925-1930.
[4] T. Suzuki et al., A Multi-Robot Teleoperation System Utilizing the Internet, Advanced Robotics, Vol. 11, No. 8, 1998, pp. 781-797.
[5] Y. Arai et al., Adaptive Behaviour Acquisition of Collision Avoidance among Multiple Autonomous Mobile Robots, Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, Grenoble, France, Sept. 7-11, 1997, pp. 1762-1767.
[6] I. E. Paromtchik, U. M. Nassal, Reactive Motion Control for an Omnidirectional Mobile Robot, Proc. of the Third European Control Conference, Roma, Italy, Sept. 5-8, 1995, pp. 3074-3079.
[7] I. E. Paromtchik, H. Asama, A Motion Generation Approach for an Omnidirectional Vehicle, Proc. of the IEEE Int. Conf. on Robotics and Automation, San Francisco, CA, USA, April 24-28, 2000.
[8] S. Okina et al., Self-Diagnosis System of an Autonomous Mobile Robot Using Sensory Information,
[9] A. Nakamura, T. Arai, T. Hiroki, J. Ota, Development of a Multiple Mobile Robot System Controlled by a Human – Realisation of Object Command Level, Distributed Autonomous Robotics Systems 4, L. E. Parker, G. Bekey Eds., Springer Verlag Tokyo, 2000, pp. 209-218.
[10] G. X. Ritter, J. N. Wilson, Handbook of Computer Vision Algorithms in Image Algebra, CRC Press, Boca Raton, USA, 1996.
[11] C. C. Yang, F. W. Ciarallo, Optimized Sensor Placement for Active Visual Perception, J. of Robotic Systems, Vol. 18, 2001, pp. 1-15.