Research Article
Symmetric Kullback-Leibler Metric Based Tracking
Behaviors for Bioinspired Robotic Eyes
Hengli Liu, Jun Luo, Peng Wu, Shaorong Xie, and Hengyu Li
School of Mechatronic Engineering and Automation, Shanghai University, Shanghai 200072, China
Correspondence should be addressed to Hengyu Li; lihengyu@shu.edu.cn
Received July ; Revised October ; Accepted October
Academic Editor: Cecilia Laschi
Copyright © 2015 Hengli Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
A symmetric Kullback-Leibler metric based tracking system, capable of tracking moving targets, is presented for a bionic spherical parallel mechanism to minimize a tracking error function and to simulate the smooth pursuit of human eyes. More specifically, we propose a real-time moving target tracking algorithm which utilizes spatial histograms together with the symmetric Kullback-Leibler metric. In the proposed algorithm, the key spatial histograms are extracted and taken into a particle filtering framework. Once the target is identified, an image-based control scheme is implemented to drive the bionic spherical parallel mechanism such that the identified target is tracked at the center of the captured images. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller inspired by the Vestibuloocular Reflex mechanism. The proposed tracking system is designed to make the robot track dynamic objects when the robot travels across transmittable terrains, especially bumpy environments. Experimental results under violent attitude variation in such bumpy environments demonstrate the effectiveness and robustness of our bioinspired tracking system using a bionic spherical parallel mechanism inspired by head-eye coordination.
1. Introduction
Robot vision systems are crucial for mobile robots to recognize and acquire surrounding information. Target tracking, target recognition, surrounding perception, robotic localization, and attitude estimation are among the most popular topics in robotics, and the target tracking function has emerged as a significant aspect of Human Robot Interaction (HRI), camera Motion-Disturbance Compensation (MDC), and tracking stabilization.
Robot motion information is commonly used to keep the camera stabilized and to compensate for small rotations or movements of the camera. Such systems use inertial sensors and visual cues to compute the motion information of the camera. Jung and Sukhatme [1] developed a Kanade-Lucas-Tomasi (KLT) based motion tracking system for a moving target using a single camera on a mobile robot. Hwangbo et al. [2, 3] also developed a gyro-aided KLT feature tracking method that remains robust under fast camera-ego rotation conditions. Park et al. [4] proposed an Extended Kalman Filter (EKF) based motion data fusion scheme for visual object tracking by autonomous vehicles. Jia et al. [5] also proposed a scheme joining visual features and the vehicle's inertial measurements for visual object identification and tracking. Hol et al. [6] used a multirate EKF, fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and vision, to estimate and predict the position and orientation (pose) of a camera for robust real-time tracking.
Recently, biomimetic systems have been extensively investigated by adopting the movement mechanics of the human eye. The development of the eyeball's neurophysiology provides a large amount of data and a theoretical foundation for building up a controlling model of eye movement. Among the several types of eye movements, smooth tracking and gaze stabilization play a fundamental role. Lenz et al. [7] developed an adaptive gaze stabilization controller inspired by the Vestibuloocular Reflex (VOR). It integrates inertial and visual information to drive the eyes in the direction opposite to head movement and thereby stabilizes the image on the retina under dynamic changes. Shibata's biological oculomotor system [8] used the human eye's VOR and Optokinetic Reflex
(OKR) to improve the gaze stabilization of the vision system. A chameleon-inspired binocular "negative correlation" visual system (CIBNCVS) with neck [9] was designed to achieve swift and accurate positioning and tracking. Avni et al. [10] also presented a biologically motivated approach to tracking with independent cameras, inspired by the chameleon-like visual system. Law et al. [11] described a biologically constrained architecture for developmental learning of eye-head gaze control on an iCub robot. Xie et al. [12] proposed a biomimetic control strategy for an on-board pan-tilt-zoom camera to stabilize visual tracking from a helicopter, based on the physiological neural path of eye movement control. Vannucci et al. [13] established an adaptive model for robotic control able to perform visual pursuit with prediction of the target motion. Falotico et al. [14] employed a "catch-up" saccade model to fixate the object of interest in the case of moving targets, in order to obtain a human-like tracking system. Compared with classical control methods, a bionic controller makes the robot adapt easily to transmittable terrains and track moving targets stably. Inspired by this excellent work, we tackle the turbulence problem of tracking when the robot travels across bumpy terrain using a tracking system with bumpy-resist capability.
Furthermore, with the development of the anatomy of the human eye, the movement mechanics of the human eye have aroused much interest in bionic engineering. The humanoid robot James [15, 16] was equipped with two artificial eyes, which can pan and tilt independently. The iCub [17, 18] also has two artificial eyes offering viewing and tracking motions. Wang et al. [19] devised a novel humanoid robot eye driven by six pneumatic artificial muscles (PAMs). Bioinspired actuators and mechanisms have been proposed to pan and tilt a camera with characteristics comparable to those of a human eye [20, 21]. A tendon-driven robot eye [22] was presented utilizing a mechanical base reflecting the geometry of the eye and of its actuation system behind the implementation of Listing's law. Gu et al. [23] presented an artificial eye implant with shape memory alloys (SMAs) driven by a small servomotor. A miniature artificial compound eye called the curved artificial compound eye (CurvACE) [24] was endowed with micromovements similar to those occurring in the fly's compound eye.
Many bionic eyes have been presented as mentioned above. A spherical parallel mechanism (SPM) has a compact structure, excellent dynamic performance, and high accuracy; in addition, a 3-DOF SPM is in line with the structural design of the bionic eye. 3-DOF SPMs attract a decent amount of interest for this reason, and a number of 3-DOF SPM bionic eyes have been proposed. An artificial eye [25, 26] for humanoid robots has been devised to be small in size and weight as well as to imitate the highly dynamic movements of the human eye. The "Agile Eye" [27] is a high-performance parallel mechanism which has the capability of orienting a camera-mounted end-effector within a workspace larger than that of the human eye and with velocities and accelerations larger than those of the human eye. Bang et al. [28] designed a 3-DOF anthropomorphic oculomotor system to match the performance capabilities of the human eye. Our mechanism platform is inspired by these excellent works and plays a vital role in tracking dynamic objects.
Tracking a dynamic object while a robot performs its normal motion is common in applications. To keep tracking moving objects smoothly, we develop in this paper a bioinspired tracking system for use when the robot works in a bumpy environment or under dynamic disturbance. With active robot vision, an image-based feedback tracking system is presented for our bionic SPM to minimize a tracking error function, capable of tracking a moving target while the robot moves across a bumpy environment. More specifically, we propose a real-time moving target tracking algorithm which utilizes spatial histograms and the symmetric Kullback-Leibler (SKL) metric integrated into a particle filtering framework to achieve automatic moving target tracking and gaze stabilization. In the proposed algorithm, the key spatial histograms are extracted and taken into the particle filtering framework. An image-based feedback control scheme is implemented to drive the bionic SPM such that the identified target is tracked at the center of the captured images. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller bioinspired by the VOR mechanism. To evaluate performance, we test our vision stabilization system under violent attitude variation when the robot works in a bumpy environment.
From a robotics point of view, our system is biologically inspired. While smooth tracking is employed to create a consistent perception of the surrounding world, the interaction with the environment is also used to adjust the control model involved in smooth tracking generation. Action and perception are tightly coupled in a bidirectional way: perception triggers an action, and the output of the action changes the perception. Meanwhile, the robot motion information is fed forward, inspired by the VOR mechanism, to stabilize smooth tracking.
The paper is organized as follows. Section 2 introduces bionic issues and the design of our bionic SPM. Section 3 proposes visual tracking based on the symmetric Kullback-Leibler metric on spatiograms. Our bionic eye plant control system is described in Section 4. Experimental results are shown in Section 5. Section 6 presents our conclusion.
2. Design of Human-Eye-Inspired
PTZ Platform
2.1. Human-Eye's Movement Mechanism. Each eye is controlled by three complementary pairs of extraocular muscles, as shown in Figure 1(a). The movement of each eye involves rotating the globe of the eye in the socket of the skull. Because of minimal translation during its movement, the eye can be regarded as a spherical joint with an orientation defined by three axes of rotation (horizontal, vertical, and torsional). In our implementation and development of a simulator, we therefore model eye movement with no translation for simplicity.

Figure 1: Development of the bionic eye plant. (a) Muscles of the eye: six muscles, arranged in three pairs, control the movements of the eye, shown in a cutaway view of the eye in its socket, or orbit. (b) Structure of our SPM prototype.

The medial rectus turns the eye inward and the lateral rectus turns it outward; together they form a pair that controls the horizontal position of the eye. In contrast to this pair, the actions of the other two
pairs of muscles are more complex. When the eye is centered in the orbit, the primary effect of the superior and inferior rectus is to rotate the eye up or down. However, when the eye is deviated horizontally in the orbit, these muscles also contribute to torsion, the rotation of the eye around the line of sight that determines the orientation of images on the retina.

The primary effect of the superior and inferior obliques is to turn the eye downward and upward when the eye does not deviate from the horizontal position, as do the superior and inferior rectus. In addition, these muscles also determine the vertical orientation of the eye.

Smooth pursuit eye movements slowly rotate the eyes to compensate for any motion of the visual target and thus act to minimize the blurring of the target's retinal image that would otherwise occur. We implement smooth target tracking, continuously adjusted by visual feedback about the target's image (retinal image).
The kinematic characteristics of an SPM and the mechanics of eye movements are very similar [29]. Both have 3-DOF spherical movement about the center of the sphere occupied by the rotating globe. An SPM also has a compact structure, excellent dynamic performance, and high accuracy, so a 3-DOF SPM is in line with the structural design of a bionic eye that replicates eye movement.

The eyeball is seen as a sphere with a fixed center of rotation when it rotates. Inspired by the mechanics of eye movements and active robotic vision, we present a new bionic eye prototype based on an SPM, which is made up of an eye-in-hand system as shown in Figure 1(b).
Because the eye is free to rotate in three dimensions, the eyeball can keep retinal images stable in the fovea when tracking a moving target. In our work, we propose two main structural requirements inspired by the human eye: (1) the camera center must be located at the center of the "eyeball" to ensure that the angle between the image planes at two different positions remains consistent with the rotation of the "eyeball"; (2) during eye movement, mechanical components other than the "eyeball" should not exceed the plane of the center of the sphere, as far as possible, to ensure that when the "eyeball" moves they do not block the sight of the camera and do not interfere with the robot face.
2.2. Oculomotor Plant Compensation of the VOR. In the human-eye VOR, a signal from the vestibular system related to head velocity, encoded by the semicircular ducts, is used to drive the eyes in the direction opposite to the head movement. The VOR operates in feedforward mode and as such requires calibration to ensure accurate nulling of head movement. The simplicity of this "three-neuron arc," together with the relatively straightforward mechanics of the eye plant, has long made the VOR an attractive model for experimental and computational neuroscientists seeking to understand cerebellar function. To abolish image motion across the retina, the vestibular signal must be processed by neural circuitry which compensates for the mechanical properties of the oculomotor plant. The VOR is therefore a particular example of motor plant compensation. Horizontal and vertical, angular and linear head movements drive the appropriate combinations of the six extraocular muscles in three dimensions.
2.3. Kinematics of the Human-Eye-Inspired PTZ Platform. The spherical parallel mechanism consists of an upper layer and a base, connected by three pairs of identical kinematic subchains as shown in Figure 2. In each chain, there is one fixed revolute joint and two free revolute joints connecting the proximal link to the distal link and the distal link to the upper layer, respectively. The axes of all revolute joints intersect at a common point O, which is referred to as the rotational center. The plane passing through this point and parallel to the base is called the sphere center plane, also seen as Listing's plane of the eyeball. $\alpha_1$, $\alpha_2$, $\beta_1$, $\beta_2$, and $\gamma$ are the parameters of this mechanism, where $\alpha_1$ and $\alpha_2$ are the structural angles of the lower link and upper link, $\beta_1$ and $\beta_2$ are the half-cone angles of the upper platform and the base, and $\gamma$ is the structural torsion angle between the initial states of the upper platform and the base, namely, the initial torsion angle.
Figure 2 demonstrates the kinematics of our SPM platform, and the kinematic equation of the SPM is given by [30]

$$\dot{\boldsymbol{\theta}} = J\left(\alpha_1, \alpha_2, \beta_1, \beta_2, \gamma, \theta_x, \theta_y, \theta_z\right)\boldsymbol{\omega}, \tag{1}$$
Figure 2: Kinematic sketch of a spherical parallel manipulator, showing the base, the proximal and distal links, the eye plant on the upper layer, the sphere center datum, and the structural angles $\alpha_1$, $\alpha_2$, $\beta_1$, and $\beta_2$.
where $\dot{\boldsymbol{\theta}} = (\dot{\theta}_1, \dot{\theta}_2, \dot{\theta}_3)^T$ is the angular velocity vector input of the motors, $\boldsymbol{\omega} = (\omega_x, \omega_y, \omega_z)^T$ is the angular velocity vector output of the upper platform, and $J$ is the Jacobian matrix, which is determined by the mechanical parameters ($\alpha_1$, $\alpha_2$, $\beta_1$, $\beta_2$, and $\gamma$) and the eyeball posture ($\theta_x$, $\theta_y$, and $\theta_z$). $\alpha_1$ and $\alpha_2$ are the structural angles of the lower link and upper link, respectively, and $\beta_1$ and $\beta_2$ are the half-cone angles of the base and upper platform. The proposed PTZ platform has similar kinematics to the human eye, as shown in Figure 3.
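As a minimal illustration of how (1) can be used in a controller, the sketch below maps a desired eyeball angular velocity to motor rate commands. It assumes the Jacobian has already been evaluated at the current posture from the mechanism parameters (its derivation is given in [30]); the identity matrix in the example is only a placeholder, not the real posture-dependent Jacobian.

```python
import numpy as np

def motor_rates(J: np.ndarray, omega: np.ndarray) -> np.ndarray:
    """Kinematic mapping (1): motor rates theta_dot = J @ omega.

    J     -- 3x3 SPM Jacobian at the current posture; it depends on
             alpha1, alpha2, beta1, beta2, gamma and theta_x/y/z [30].
    omega -- desired angular velocity (wx, wy, wz) of the upper platform.
    """
    return J @ omega

# Example: command a pure pan at 0.2 rad/s. The identity matrix is only
# a placeholder; the real Jacobian is posture-dependent.
J = np.eye(3)
theta_dot = motor_rates(J, np.array([0.2, 0.0, 0.0]))
```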
3. SKL-Based Particle Filter Visual Tracking
3.1. Spatial Histograms: Spatiogram. The color histogram is one of the most common target models; it simply records the proportion of each color over the entire picture, without concern for the spatial location of each color. It is therefore insensitive to rotation and suitable for modeling nonrigid or deformation-prone target objects. However, targets based on this model are vulnerable to backgrounds with similar color distributions and to other interference, which can cause target tracking failure. In this paper, we improve the particle filter algorithm based on a newer target model, the spatiogram [31], which adds pixel coordinate information to the traditional color histogram, and we use the SKL metric.
The second-order spatiogram can be described as follows:

$$h(b) = \left\{n_b, \mu_b, \Sigma_b\right\}, \quad b = 1, \ldots, B, \tag{2}$$

where $B$ is the total number of intervals and $n_b$, $\mu_b$, and $\Sigma_b$ are the probability of each interval, its coordinate mean, and its coordinate covariance matrix, respectively. They can be calculated as follows:

$$n_b = \frac{1}{N}\sum_{j=1}^{N}\delta_{jb}, \qquad \mu_b = \frac{1}{N n_b}\sum_{j=1}^{N}\mathbf{x}_j\,\delta_{jb}, \qquad \Sigma_b = \frac{1}{N n_b - 1}\sum_{j=1}^{N}\left(\mathbf{x}_j - \mu_b\right)\left(\mathbf{x}_j - \mu_b\right)^T\delta_{jb}. \tag{3}$$

$N$ is the total number of pixels within the target area, $\mathbf{x}_j = [x_j, y_j]^T$ is the coordinate position of the $j$th pixel, and $\delta_{jb} = 1$ denotes that the $j$th pixel is quantized to the $b$th interval, while $\delta_{jb} = 0$ indicates that the $j$th pixel is quantized to another interval.
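The following sketch computes such a second-order spatiogram for a single-channel image patch, following (2) and (3). The bin count and the small covariance regularizer are our own illustrative choices rather than values from the paper.

```python
import numpy as np

def spatiogram(patch: np.ndarray, bins: int = 16):
    """Second-order spatiogram of an 8-bit single-channel patch, per (2)-(3).

    Returns per interval b: probability n_b, coordinate mean mu_b (2-vector),
    and coordinate covariance Sigma_b (2x2). Bin count is illustrative.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    labels = (patch.ravel().astype(int) * bins) // 256  # quantize to B bins
    N = coords.shape[0]

    n = np.zeros(bins)
    mu = np.zeros((bins, 2))
    cov = np.tile(np.eye(2), (bins, 1, 1))  # benign default for empty bins
    for b in range(bins):
        pts = coords[labels == b]
        n[b] = len(pts) / N
        if len(pts) > 0:
            mu[b] = pts.mean(axis=0)
        if len(pts) > 1:
            cov[b] = np.cov(pts.T) + 1e-6 * np.eye(2)  # keep Sigma_b invertible
    return n, mu, cov
```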
3.2. SKL-Based Particle Filter. In order to apply the spatiogram to target tracking, we need a method to measure the similarity of the spatial histograms of the target and the candidate targets. We select an SKL-based similarity coefficient to measure the similarity between the target spatiogram $h(b) = \{n_b, \mu_b, \Sigma_b\}$ and the candidate target spatiogram $h'(b) = \{n'_b, \mu'_b, \Sigma'_b\}$.
Given a spatiogram $h(b) = \{n_b, \mu_b, \Sigma_b\}$, we use a Gaussian distribution to describe the spatial distribution of the pixels in each interval. The distribution of the $b$th interval can be described as

$$f_b(\mathbf{x}) = \frac{1}{2\pi\left|\Sigma_b\right|^{1/2}}\exp\left(-\frac{1}{2}\left(\mathbf{x}-\mu_b\right)^T\Sigma_b^{-1}\left(\mathbf{x}-\mu_b\right)\right), \tag{4}$$

where $\mu_b$ is the mean of the coordinates of the pixels in the $b$th interval and $\Sigma_b$ is the covariance matrix of those coordinates.
The KL distance between the Gaussian distribution $f_b(\mathbf{x})$ and the Gaussian distribution $f'_b(\mathbf{x})$ has the closed-form solution

$$\mathrm{KL}\left(f_b \,\|\, f'_b\right) = \frac{1}{2}\left[\log\frac{\left|\Sigma'_b\right|}{\left|\Sigma_b\right|} + \mathrm{Tr}\left(\left(\Sigma'_b\right)^{-1}\Sigma_b\right) - d + \left(\mu_b-\mu'_b\right)^T\left(\Sigma'_b\right)^{-1}\left(\mu_b-\mu'_b\right)\right], \tag{5}$$

where $d$ is the spatial dimension (for a spatiogram, $d = 2$).
Similarly, we can get the KL distance between the Gaussian distribution $f'_b(\mathbf{x})$ and the Gaussian distribution $f_b(\mathbf{x})$:

$$\mathrm{KL}\left(f'_b \,\|\, f_b\right) = \frac{1}{2}\left[\log\frac{\left|\Sigma_b\right|}{\left|\Sigma'_b\right|} + \mathrm{Tr}\left(\Sigma_b^{-1}\Sigma'_b\right) - d + \left(\mu'_b-\mu_b\right)^T\Sigma_b^{-1}\left(\mu'_b-\mu_b\right)\right]. \tag{6}$$
The SKL distance between the two Gaussian distributions $f_b(\mathbf{x})$ and $f'_b(\mathbf{x})$ is

$$\mathrm{SKL}\left(f_b, f'_b\right) = \frac{1}{2}\left[\mathrm{KL}\left(f_b \,\|\, f'_b\right) + \mathrm{KL}\left(f'_b \,\|\, f_b\right)\right] = \frac{1}{4}\left[\mathrm{Tr}\left(\left(\Sigma'_b\right)^{-1}\Sigma_b\right) + \mathrm{Tr}\left(\Sigma_b^{-1}\Sigma'_b\right) - 2d\right] + \frac{1}{4}\left(\mu_b-\mu'_b\right)^T\left(\Sigma_b^{-1}+\left(\Sigma'_b\right)^{-1}\right)\left(\mu_b-\mu'_b\right). \tag{7}$$

Figure 3: Analogue of camera rotation and eye movement. Our SPM prototype has similar kinematics to the human eye: (a) pan ($x$, angle $\alpha$), tilt ($y$, angle $\beta$), and roll ($z$, angle $\gamma$) axes of the camera; (b) the corresponding rotation axes of the eye.
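The closed form (7) transcribes directly into code; the sketch below assumes NumPy and the 2-D spatiogram Gaussians defined above.

```python
import numpy as np

def skl_gaussians(mu1, cov1, mu2, cov2) -> float:
    """Symmetric KL distance between two Gaussians, closed form (7)."""
    d = len(mu1)  # spatial dimension; d = 2 for spatiograms
    inv1, inv2 = np.linalg.inv(cov1), np.linalg.inv(cov2)
    diff = mu1 - mu2
    trace_term = np.trace(inv2 @ cov1) + np.trace(inv1 @ cov2) - 2 * d
    mean_term = diff @ (inv1 + inv2) @ diff
    return 0.25 * (trace_term + mean_term)
```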
Generally, the range of the similarity is $[0, 1]$, and the similarity $\rho_b$ of each pair of intervals of the spatiograms can be described as

$$\rho_b = \exp\left(-\mathrm{SKL}\left(f_b, f'_b\right)\right). \tag{8}$$

Thus, the similarity of the spatiograms based on the SKL distance can be calculated as

$$\rho\left(h, h'\right) = \sum_{b=1}^{B}\sqrt{n_b n'_b}\,\rho_b = \sum_{b=1}^{B}\sqrt{n_b n'_b}\exp\left(-\mathrm{SKL}\left(f_b, f'_b\right)\right). \tag{9}$$
According to (9), for identical spatiograms we get

$$\rho\left(h, h\right) = \sum_{b=1}^{B} n_b \exp\left(-\frac{1}{4}\left[\mathrm{Tr}\left(\Sigma_b^{-1}\Sigma_b\right) + \mathrm{Tr}\left(\Sigma_b^{-1}\Sigma_b\right) - 2d\right] - 0\right) = \sum_{b=1}^{B} n_b \exp\left(-\frac{1}{4}\left(d + d - 2d\right)\right) = \sum_{b=1}^{B} n_b = 1. \tag{10}$$
This indicates that the SKL-distance-based similarity measure of spatiograms ensures that the object most similar to the target is the target itself.
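Building on skl_gaussians from the previous sketch, the bin-wise similarity (8) and (9) can be computed as below. In the particle filtering framework, each particle's candidate region would be scored this way and its weight set from the similarity, for example $w \propto \exp(-\lambda(1-\rho))$; the weighting function and the constant $\lambda$ are our own illustrative choices, as the paper does not spell them out.

```python
import numpy as np

def spatiogram_similarity(h1, h2) -> float:
    """SKL-based similarity rho(h, h') of two spatiograms, per (8)-(9)."""
    n1, mu1, cov1 = h1
    n2, mu2, cov2 = h2
    rho = 0.0
    for b in range(len(n1)):
        if n1[b] == 0.0 or n2[b] == 0.0:
            continue  # empty intervals contribute nothing
        skl = skl_gaussians(mu1[b], cov1[b], mu2[b], cov2[b])
        rho += np.sqrt(n1[b] * n2[b]) * np.exp(-skl)
    return rho  # equals 1 for identical spatiograms, see (10)

# Illustrative particle weighting (lambda = 10.0 is our own tuning choice):
# w_i = np.exp(-10.0 * (1.0 - spatiogram_similarity(target, candidate_i)))
```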
4. Image-Based Feedback and Dynamic
Compensation Eye Plant Control
4.1. Visual Feedback Scheme. When the target is identified in the image, a visual feedback tracking control strategy is used to control the bionic eye plant mechanism to minimize a tracking error function, which is also called eye-in-hand visual servoing [32, 33]. Since the relative distance between the SPM and the moving target is large, if the error function is defined in a 3D reference coordinate frame, coarse estimation of the relative pose between the SPM and the moving target may cause the moving target to fall out of the visual field while the SPM servomechanism is adjusting, and it also affects the accuracy of the pose reached after convergence. In our project, to make tracking control more robust and stable, we define a tracking error function in the visual sensor frame, which is given by [32, 33]

$$\mathbf{e}(t) = \mathbf{s}(t) - \mathbf{s}^*, \tag{11}$$

where $\mathbf{s}(t)$ and $\mathbf{s}^*$ are the measured and desired locations of the centroid of the tracked moving target with respect to the image plane, respectively. In our work, we set $\mathbf{s}^* = [0, 0]^T$, a constant, which is the center of the captured image.
4.2. Camera Calibration of the Human-Eye-Inspired PTZ Platform. Based on the PTZ visual system, the coordinate system is established as shown in Figure 3. Assume the motion of the object is unknown; how do we control the motion of the PTZ platform so that the projection of the moving object is fixed at the center of the image plane, with full consideration of the dynamic effects of the "eyeball"? To make the $z$-axis of the camera coordinate system pass through the target by adjusting the posture of the camera, we have to compensate the offset angle between the camera and the target. We employ a pinhole camera model to obtain an accurate camera projection.
Following the general pinhole camera model, the intrinsic parameter model equation of the camera is given by

$$z_c\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix}, \tag{12}$$

where $(u, v)$ denotes the image coordinates of the target in the image coordinate system, $(u_0, v_0)$ are the coordinates of the principal point, and $(x_c, y_c, z_c)$ are the target coordinates in the camera coordinate system. $f_x$ is the scale factor in the $u$-coordinate direction, and $f_y$ is the scale factor in the $v$-coordinate direction.
In order to keep the target tracked at the center of the field, we need to make the target lie on the optical axis. The location of a target lying on the optical axis is represented by $(0, 0, z_T)$, where $z_T$ is the distance between the camera and the target. The orientation is

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix}\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{bmatrix}\begin{bmatrix} 0 \\ 0 \\ z_T \end{bmatrix}. \tag{13}$$
Finally, we can deduce the angle offset between the target and the camera's line of sight:

$$\alpha = \arctan\frac{u - u_0}{f_x}, \qquad \beta = \arctan\frac{f_x\left(v - v_0\right)}{f_y\sqrt{f_x^2 + \left(u - u_0\right)^2}}. \tag{14}$$
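A small sketch tying (11), (12), and (14) together: given the target pixel $\mathbf{s}(t) = (u, v)$ and the intrinsic parameters, it returns the pan and tilt offsets used as orientation set points. Function and variable names are our own.

```python
import numpy as np

def gaze_offsets(u, v, fx, fy, u0, v0):
    """Pan/tilt offsets (alpha, beta) toward pixel (u, v), per (14).

    Taking s* as the image center (here the principal point), the image
    error (11) is e = (u - u0, v - v0); (14) converts it to angles.
    """
    alpha = np.arctan2(u - u0, fx)
    beta = np.arctan2(fx * (v - v0), fy * np.sqrt(fx**2 + (u - u0) ** 2))
    return alpha, beta

# Example with made-up intrinsics for a VGA image:
# alpha, beta = gaze_offsets(400, 180, fx=520.0, fy=520.0, u0=320.0, v0=240.0)
```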
In our implementation, the camera center is located at the center of the "eyeball" so that the angle between the image planes at two different positions remains consistent with the rotation of the "eyeball." The 3-DOF SPM satisfies the principles of eyeball movement; a camera can be mounted in the SPM and actively oriented (horizontally, vertically, and torsionally) about its $x$-axis, $y$-axis, and $z$-axis, respectively. Considering the minimal translation during eye movement, we implement our eye plant system with no translation for simplicity, so our visual tracking strategy is applicable to all SPMs with no translation.
The visual tracking section above explains how to determine the position of the moving object; this relative position determines our visual tracking strategy. Eye rotation about the vertical axis is controlled by the lateral and medial rectus muscles, which results in eye movements to the left or right. Rotation about the transverse axis is controlled by the superior and inferior rectus muscles, which elevates and depresses the eye. Finally, rotations about the anteroposterior axis result in clockwise and counterclockwise torsional eye motion. See Figures 3(a) and 3(b). Our model receives the visually guided signal to control the eye plant; see (11).
Meanwhile, the robot motion information is fed forward into the control loop. Our whole bioinspired tracking system is illustrated in Figure 4. It is known that the VOR is basically driven by the signals from the vestibular apparatus in the inner ear. The semicircular canals (SCs) detect head rotation and drive the rotational VOR; on the other hand, the otoliths detect head translation and drive the translational VOR. Anatomists and physiologists tend to treat the VOR as a simple neural system mediated by a three-neuron arc and displaying a distinct function. Starting in the vestibular system, the SCs are activated by head rotation and send their impulses via the vestibular nerve through brainstem neurons, ending in the oculomotor plant. Here, we use an IMU to acquire the pose change of the eye from the robot.
When the robot works in a bumpy environment, rigid bumps and pulse jitter cause significant high-frequency turbulence and lower-frequency posture changes. Therefore, the motion information of the robot is acquired and fed forward into the controller to compensate for the external disturbance. In [34], an active compensation model of visual error is proposed according to the principle of the VOR, given in (15). Here, we use our proposed bioinspired controller to compensate for the motion disturbance caused by bumpy jitter. Hence,
$$\theta_e(t) = -\theta_h(t)\,\frac{g_v n^2}{\left(g_v + 1\right)\left(n + 1\right)} + e(t)\left(g_r + g_p + w e^{-ls}\right)\frac{qn}{n+1}, \tag{15}$$
where $e(t) = -(\theta_h(t) + \theta_e(t))$ is the slide error of the retina, $\theta_h(t)$ denotes the rotation angle of the head, and $\theta_e(t)$ is the rotation angle of the eyeball. $g_v$, $g_r$, and $g_p$ represent the gains of the velocity signal of head rotation, the velocity signal of retinal slide, and the spikes of nerve fibers caused by the displacement of the retina, respectively. $w$ is the compensation weight of the oculomotor plant caused by the error signal of the retina. In our system, $g_v$, $g_r$, and $g_p$ are set equal and $w = 2.5$. Combining position compensation with speed compensation of the eyeball, our system builds a smooth tracking system.
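For illustration only, the sketch below is a heavily simplified discrete-time stand-in for the compensator (15): the IMU-measured head rate is fed forward with opposite sign, as in the VOR, and the retinal slip is corrected with a proportional-derivative term. The class name and all gain values are our own placeholders, not the paper's model constants.

```python
class VorTrackingController:
    """Heavily simplified discrete-time stand-in for (15).

    The IMU-measured head rate is fed forward with opposite sign (VOR),
    and the retinal slip is corrected by a PD term. Gains and structure
    are illustrative placeholders, not the paper's model constants.
    """

    def __init__(self, k_ff=1.0, k_p=0.02, k_d=0.001, dt=0.01):
        self.k_ff, self.k_p, self.k_d, self.dt = k_ff, k_p, k_d, dt
        self.prev_slip = 0.0

    def step(self, head_rate, retinal_slip):
        """Return the eye velocity command for one control cycle."""
        slip_rate = (retinal_slip - self.prev_slip) / self.dt
        self.prev_slip = retinal_slip
        # feedforward cancels head motion; feedback recenters the target
        return -self.k_ff * head_rate + self.k_p * retinal_slip + self.k_d * slip_rate
```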
5. Experiments and Results
To prove that the proposed image-based feedback tracking system, based on our developed eye-in-hand prototype, is able to orient a camera with the required orientation changes, and especially to verify its dynamic disturbance resistance capability and SPM-based structural dexterity, closed-loop control experiments were performed. We designed an experimental platform based on a tracked robot, as shown in Figure 5. A variety of obstacles are placed on the tracked robot's path to simulate a real harsh environment. We introduced the joint-space control architecture used in [35]. In the chosen control approach, the desired camera orientation is transformed into linear actuator set points using the inverse kinematics. Thus, only a brief overview of the architecture and exemplary control results is presented here.

Figure 4: Tracking control scheme of the bioinspired robotic eyes. A visual feedback loop (visual tracking by the SKL-based particle filter, with $\mathbf{s}^* = (0, 0)$ and error $\mathbf{e}(t) = \mathbf{s}(t) - \mathbf{s}^*$) drives the pan-tilt-zoom command generator and the motor system through the kinematics $\dot{\boldsymbol{\theta}} = J\boldsymbol{\omega}$, while the robot motion (vestibular organ, IMU) is fed forward to reject external disturbance. The camera corresponds to the human eyeball; the IMU corresponds to the semicircular canals.

Figure 5: Experimental platform based on a tracked mobile robot, carrying the IMU, the camera, and the eye plant.
To measure the angular velocities of the "eyeball" about the three axes, we employ the DM-GX-TM attitude sensor. The device offers a range of output data, from fully calibrated inertial measurements (acceleration, angular rate, and deltaAngle and deltaVelocity vectors) to computed orientation estimates, including pitch, roll, and heading (yaw) or the rotation matrix. All quantities are fully temperature compensated and mathematically aligned to an orthogonal coordinate system.
In addition, the image information is gained using a high-speed camera (Guppy F-C), connected to an IEEE 1394 card installed in a PC with an Intel Core CPU, which acquires the video signal. The camera is an ultracompact, inexpensive VGA machine vision camera with a CCD sensor (Sony ICX). At full resolution, it runs at its full frame rate. We employ a pinhole camera model to obtain an accurate camera projection. Camera calibration was repeated ten times to seek an approximate camera calibration matrix $K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$. The camera is calibrated using a chessboard [36]. Following the general pinhole camera model, the parameters contained in $K$ are called the internal camera parameters, or the internal orientation, of the camera. See [37] for more details.
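A minimal sketch of such a chessboard calibration with OpenCV's implementation of Zhang's method [36]; the pattern size and square edge length are illustrative assumptions.

```python
import cv2
import numpy as np

def calibrate_from_chessboard(images, pattern=(9, 6), square=0.025):
    """Estimate K = [[fx, 0, u0], [0, fy, v0], [0, 0, 1]] via OpenCV.

    images  -- grayscale views of a chessboard; pattern is the inner-corner
    grid and square the edge length in meters (both illustrative defaults).
    """
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square
    obj_pts, img_pts = [], []
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    for img in images:
        found, corners = cv2.findChessboardCorners(img, pattern)
        if found:
            corners = cv2.cornerSubPix(img, corners, (11, 11), (-1, -1), criteria)
            obj_pts.append(objp)
            img_pts.append(corners)
    # Zhang's method recovers the intrinsics (and distortion) from the views.
    _, K, dist, _, _ = cv2.calibrateCamera(
        obj_pts, img_pts, images[0].shape[::-1], None, None)
    return K, dist
```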
Figure 4 shows our tracking control scheme. We implemented smooth target tracking (smooth pursuit) to keep the target located in the field of view, continuously adjusted by visual feedback about the target's image. Images are captured from the camera (retinal image), and the IMU measures the robot body's movement to compensate for dynamic disturbance. Supposing that we do not know the motion of the tracked object, how do we control the motion of the "eyeball" to ensure that the moving object stays fixed at the centroid of the image plane?
In the process of tracking a moving target, the tracking algorithm should be robust to appearance variations introduced by occlusion, illumination changes, and pose variations. In our laboratory environment, the proposed algorithm can relocate the target when the object's appearance changes due to illumination, scale, and pose variations. Once the moving target is located, the "eyeball" should keep the image stable in the field of view (center of the image); that is, the target position fluctuates around zero. See Figures 6 and 7. Figure 6 gives some snapshots of the tracking results and demonstrates that the moving target stays located in the field of view.
Figure 6: Moving object tracked in the laboratory (frames 270, 310, 350, 390, 430, 470, 510, 550, and 590).
Figure 7: Pixel difference of the tracking error in the $x$ and $y$ directions (cumulative proportions 35.32%, 87.56%, and 98.37%).
Meanwhile, extensive experiments were conducted to evaluate the bumpy-resist capability. Figure 7 illustrates the pixel differences in the $x$ and $y$ directions. Small eyeball errors accompanying large postural changes are good evidence of bumpy-resist capability and VOR function.
Figures 7 and 8 show the performance of the proposed tracking system on the tracked robot running across the bumpy environment. The statistics show that the great majority of the tracking errors (98.37%, including the $x$ and $y$ direction differences) fall within the range shown in Figure 7. The statistics of the $x$ and $y$ direction pixel differences are demonstrated in Figure 8. In our test, as the tracked robot platform travels across rough ground full of obstacles, rigid bumps and pulse jitter cause significant high-frequency turbulence and lower-frequency oscillatory posture changes, which makes the tracking error slightly larger than the data recorded in the literature. But our experiments were established under relatively harsh environmental conditions, and the achieved performance is acceptable. The tracking error still stays in a controllable range, as in the situations above. Apparently, this indicates that the system is robust.

Figure 8: Statistics of the $x$ and $y$ direction differences (median, 25%-75% quartiles, and range within 1.5 IQR).
Figure 9: Experimental results of the robot's and the eye plant's poses (pitch and roll amplitudes in degrees over 120 s).
Figure 10: Statistics of the robot's and the eye plant's poses (pitch and roll; median, 25%-75% quartiles, and range within 1.5 IQR).
In our actual setup, we install three IMUs on the tracked robot and the eye plants to measure the pose changes. We recorded the angle variations to validate the system's bumpy-resist capability: the eyeball moves in the opposite direction, according to the position compensation and velocity compensation, when the tracked robot's pose changes. In other words, the robot pose variation information is fed forward into the controller to form a head-eye coordination system. Figures 9 and 10 show the experimental results of the tracked robot's and the eye plant's pose changes on the tracked robot in the bumpy environment. In addition, large tracking errors happen when the robot encounters instantaneous postural changes. Nonetheless, the quick return of the eyeball to lower errors verifies the good robustness of the bionic visual tracking system and the high dexterity of the SPM-based bionic eye. Obviously, these variations reflect the good stability of the tracking system.
6. Conclusion
To accurately replicate the human vision system, we presented a 3-DOF "eyeball" rotating about the horizontal, vertical, and torsional axes according to the mechanics of eye movements. On this basis, an image-based visual feedback tracking system is presented to minimize a tracking error function, capable of tracking a moving target. More specifically, the proposed real-time moving target tracking algorithm utilizes spatial histograms and the symmetric Kullback-Leibler metric integrated into a particle filtering framework to achieve automatic moving target identification and gaze stabilization. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller bioinspired by the VOR mechanism. The experimental results demonstrate that our algorithm is effective and robust in dealing with moving object tracking and can always keep the target at the center of the camera to avoid tracking failure. Furthermore, as the tracked robot platform travels across rough ground full of obstacles, rigid bumps and pulse jitter cause significant high-frequency turbulence and lower-frequency oscillatory posture changes. The tracking error still stays in a controllable range, which indicates that the system has bumpy-resist capability.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was partly supported by the National Natural Science Foundation of China, the Nature Science Foundation of Shanghai, and the Open Research Project of the State Key Laboratory of Industrial Control Technology, Zhejiang University, China.
References
[1] B. Jung and G. S. Sukhatme, "Real-time motion tracking from a mobile robot," International Journal of Social Robotics.
[2] M. Hwangbo, J.-S. Kim, and T. Kanade, "Gyro-aided feature tracking for a moving camera: fusion, auto-calibration and GPU implementation," The International Journal of Robotics Research.
[3] M. Hwangbo, J.-S. Kim, and T. Kanade, "Inertial-aided KLT feature tracking for a moving camera," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '09), St. Louis, MO, USA, October 2009.
[4] J. Park, W. Hwang, H. Kwon, K. Kim, and D.-I. D. Cho, "A novel line of sight control system for a robot vision tracking system, using vision feedback and motion-disturbance feedforward compensation," Robotica.
[5] Z. Jia, A. Balasuriya, and S. Challa, "Sensor fusion-based visual target tracking for autonomous vehicles with the out-of-sequence measurements solution," Robotics and Autonomous Systems.
[6] J. D. Hol, T. B. Schön, H. Luinge, P. J. Slycke, and F. Gustafsson, "Robust real-time tracking by fusing measurements from inertial and vision sensors," Journal of Real-Time Image Processing.
[7] A. Lenz, T. Balakrishnan, A. G. Pipe, and C. Melhuish, "An adaptive gaze stabilization controller inspired by the vestibulo-ocular reflex," Bioinspiration & Biomimetics.
[8] T. Shibata and S. Schaal, "Biomimetic gaze stabilization based on feedback-error-learning with nonparametric regression networks," Neural Networks.
[9] H. Xu, Y. Xu, H. Fu, Y. Xu, X. Z. Gao, and K. Alipour, "Coordinated movement of biomimetic dual PTZ visual system and wheeled mobile robot," Industrial Robot.
[10] O. Avni, F. Borrelli, G. Katzir, E. Rivlin, and H. Rotstein, "Scanning and tracking with independent cameras—a biologically motivated approach based on model predictive control," Autonomous Robots.
[11] J. Law, P. Shaw, and M. Lee, "A biologically constrained architecture for developmental learning of eye-head gaze control on a humanoid robot," Autonomous Robots.
[12] S. Xie, J. Luo, Z. Gong, W. Ding, H. Zou, and X. Fu, "Biomimetic control of pan-tilt-zoom camera for visual tracking based-on an autonomous helicopter," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '07), San Diego, Calif, USA, November 2007.
[13] L. Vannucci, N. Cauli, E. Falotico, A. Bernardino, and C. Laschi, "Adaptive visual pursuit involving eye-head coordination and prediction of the target motion," in Proceedings of the 14th IEEE-RAS International Conference on Humanoid Robots (Humanoids '14), IEEE, Madrid, Spain, November 2014.
[14] E. Falotico, D. Zambrano, G. G. Muscolo, L. Marazzato, P. Dario, and C. Laschi, "Implementation of a bio-inspired visual tracking model on the iCub robot," in Proceedings of the 19th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN '10), IEEE, Viareggio, Italy, September 2010.
[15] L. Jamone, M. Fumagalli, G. Metta, L. Natale, F. Nori, and G. Sandini, "Machine-learning based control of a human-like tendon-driven neck," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '10), Anchorage, Alaska, USA, May 2010.
[16] F. Nori, L. Jamone, G. Sandini, and G. Metta, "Accurate control of a human-like tendon-driven neck," in Proceedings of the 7th IEEE-RAS International Conference on Humanoid Robots, IEEE, Pittsburgh, Pa, USA, November-December 2007.
[17] N. G. Tsagarakis, G. Metta, G. Sandini et al., "iCub: the design and realization of an open humanoid platform for cognitive and neuroscience research," Advanced Robotics.
[18] J. Leitner, S. Harding, M. Frank, A. Förster, and J. Schmidhuber, "An integrated, modular framework for computer vision and cognitive robotics research (icVision)," in Biologically Inspired Cognitive Architectures, Advances in Intelligent Systems and Computing, Springer.
[19] X.-Y. Wang, Y. Zhang, X.-J. Fu, and G.-S. Xiang, "Design and kinematic analysis of a novel humanoid robot eye using pneumatic artificial muscles," Journal of Bionic Engineering.
[20] Y.-C. Lee, C.-C. Lan, C.-Y. Chu, C.-M. Lai, and Y.-J. Chen, "A pan-tilt orienting mechanism with parallel axes of flexural actuation," IEEE/ASME Transactions on Mechatronics.
[21] C.-C. Lan, Y.-C. Lee, J.-F. Jiang, Y.-J. Chen, and H.-Y. Wei, "Design of a compact camera-orienting mechanism with flexural pan and tilt axes," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '11), San Francisco, Calif, USA, September 2011.
[22] G. Cannata and M. Maggiali, "Models for the design of bioinspired robot eyes," IEEE Transactions on Robotics.
[23] J. Gu, M. Meng, A. Cook, and M. G. Faulkner, "A study on natural movement of artificial eye implant," Robotics and Autonomous Systems.
[24] F. Colonnier, A. Manecy, R. Juston et al., "A small-scale hyperacute compound eye featuring active eye tremor: application to visual stabilization, target tracking, and short-range odometry," Bioinspiration & Biomimetics.
[25] T. Villgrattner, E. Schneider, P. Andersch, and H. Ulbrich, "Compact high dynamic 3 DoF camera orientation system: development and control," Journal of System Design and Dynamics.
[26] T. Villgrattner and H. Ulbrich, "Optimization and dynamic simulation of a parallel three degree-of-freedom camera orientation system," in Proceedings of the 23rd IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '10), Taipei, Taiwan, October 2010.
[27] C. M. Gosselin, E. St. Pierre, and M. Gagné, "On the development of the agile eye," IEEE Robotics & Automation Magazine.
[28] Y.-B. Bang, J. K. Paik, B.-H. Shin, and C. Lee, "A three-degree-of-freedom anthropomorphic oculomotor simulator," International Journal of Control, Automation and Systems.
[29] S. Refaat, J. M. Hervé, S. Nahavandi, and H. Trinh, "Two-mode overconstrained three-DOFs rotational-translational linear-motor-based parallel-kinematics mechanism for machine tool applications," Robotica.
[30] C. Li, S. Xie, H. Li, D. Wang, and J. Luo, "Design of bionic eye based on spherical parallel mechanism with optimized parameters," Robot.
[31] S. T. Birchfield and S. Rangarajan, "Spatiograms versus histograms for region-based tracking," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), IEEE, June 2005.
[32] F. Chaumette and S. Hutchinson, "Visual servo control. I. Basic approaches," IEEE Robotics and Automation Magazine.
[33] F. Chaumette and S. Hutchinson, "Visual servo control. II. Advanced approaches [Tutorial]," IEEE Robotics and Automation Magazine.
[34] H. Li, J. Luo, C. Li, L. Li, and S. Xie, "Active compensation method of robot visual error based on vestibulo-ocular reflex," Jiqiren/Robot.
[35] C. Li, S. Xie, H. Li, J. Miao, Y. Xu, and J. Luo, "System design and study on bionic eye of spherical parallel mechanism based on attitude closed-loop control," Jiqiren/Robot.
[36] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[37] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK.