Research Article
Symmetric Kullback-Leibler Metric Based Tracking Behaviors for Bioinspired Robotic Eyes
Hengli Liu, Jun Luo, Peng Wu, Shaorong Xie, and Hengyu Li
School of Mechatronic Engineering and Automation, Shanghai University, Shanghai 200072, China
Correspondence should be addressed to Hengyu Li; lihengyu@shu.edu.cn
Received  July ; Revised  October ; Accepted  October 
Academic Editor: Cecilia Laschi
Hindawi Publishing Corporation, Applied Bionics and Biomechanics, Volume 2015, Article ID 714572, 11 pages; http://dx.doi.org/10.1155/2015/714572
Copyright © 2015 Hengli Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
A symmetric Kullback-Leibler (SKL) metric based tracking system, capable of tracking moving targets, is presented for a bionic spherical parallel mechanism to minimize a tracking error function and to simulate the smooth pursuit of human eyes. More specifically, we propose a real-time moving target tracking algorithm which utilizes spatial histograms together with the symmetric Kullback-Leibler metric. In the proposed algorithm, the key spatial histograms are extracted and taken into a particle filtering framework. Once the target is identified, an image-based control scheme is implemented to drive the bionic spherical parallel mechanism such that the identified target is tracked at the center of the captured images. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller inspired by the Vestibuloocular Reflex (VOR) mechanism. The proposed tracking system is designed to make the robot track dynamic objects while traveling through traversable terrains, especially bumpy environments. Experimental results under violent attitude variation in such bumpy environments demonstrate the effectiveness and robustness of our bioinspired tracking system built on a bionic spherical parallel mechanism inspired by head-eye coordination.
1. Introduction
Robot vision systems are crucial for mobile robots to recognize targets and acquire surrounding information. Target tracking, target recognition, surrounding perception, robotic localization, and attitude estimation are among the most popular topics in robotics. In particular, the target tracking function has emerged as a significant aspect of Human Robot Interaction (HRI), camera Motion-Disturbance Compensation (MDC), and tracking stabilization.
Robot motion information is commonly used to stabilize the camera and to compensate for small rotations or movements of the camera. Such systems use inertial sensors and visual cues to compute the motion information of the camera. Jung and Sukhatme [] developed a Kanade-Lucas-Tomasi (KLT) based motion tracking system for a moving target using a single camera on a mobile robot. Hwangbo et al. [, ] also developed a gyro-aided KLT feature tracking method that remains robust under fast camera-ego rotation. Park et al. [] proposed an Extended Kalman Filter (EKF) based motion data fusion scheme for visual object tracking by autonomous vehicles. Jia et al. [] also proposed a scheme that joins visual features and the vehicle's inertial measurements for visual object identification and tracking. Hol et al. [] used a multirate EKF, fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and vision, to estimate and predict the position and orientation (pose) of a camera for robust real-time tracking.
Recently, biomimetic systems have been extensively investigated by adopting the movement mechanics of the human eye. The development of the eyeball's neurophysiology provides a large amount of data and a theoretical foundation for building up a controlling model of eye movement. Among the several types of eye movements, smooth tracking and gaze stabilization play a fundamental role. Lenz et al. [] developed an adaptive gaze stabilization controller inspired by the Vestibuloocular Reflex (VOR). It integrated inertial and visual information to drive the eyes in the direction opposite to head movement and thereby stabilized the image on the retina under dynamic changes. Shibata's biological oculomotor
systems [] used human-eyes VOR and Optokinetic Reex
Hindawi Publishing Corporation
Applied Bionics and Biomechanics
Volume 2015, Article ID 714572, 11 pages
http://dx.doi.org/10.1155/2015/714572
Applied Bionics and Biomechanics
(OKR) to improve the gaze stabilization of vision system. A
chameleon-inspired binocular "negative correlation" visual system (CIBNCVS) with a neck [] was designed to achieve swift and accurate positioning and tracking. Avni et al. [] also presented a biologically motivated approach to tracking with independent cameras inspired by the chameleon-like visual system. Law et al. [] described a biologically constrained architecture for developmental learning of eye-head gaze control on an iCub robot. Xie et al. [] proposed a biomimetic control strategy for an on-board pan-tilt-zoom camera to stabilize visual tracking from a helicopter, based on the physiological neural path of eye movement control. Vannucci et al. [] established an adaptive model for robotic control able to perform visual pursuit with prediction of the target motion. Falotico et al. [] employed a "catch-up" saccade model to fixate the object of interest in the case of moving targets in order to obtain a human-like tracking system. Compared with classical control methods, a bionic controller makes the robot adapt easily to traversable terrains and track moving targets stably. Inspired by this excellent work, we tackle the turbulence problem of tracking when robots travel through bumpy terrains, that is, bumpy-resist capability.
Furthermore, with the development of the anatomy of the human eye, the movement mechanics of the human eye have aroused much interest in bionic engineering. The humanoid robot James [, ] was equipped with two artificial eyes, which can pan and tilt independently. The iCub [, ] also has two artificial eyes, offering viewing and tracking motions. Wang et al. [] devised a novel humanoid robot eye, which is driven by six pneumatic artificial muscles (PAMs). Bioinspired actuators and mechanisms have been proposed to pan and tilt a camera with characteristics comparable to those of a human eye [, ]. A tendon-driven robot eye [] was presented, utilizing a mechanical base reproducing the geometry of the eye and of its actuation system behind the implementation of Listing's law. Gu et al. [] presented an artificial eye implant with shape memory alloys (SMAs) driven by a small servomotor. A miniature artificial compound eye called the curved artificial compound eye (CurvACE) [] was endowed with micromovements similar to those occurring in the fly's compound eye.
Many bionic eyes have been presented as mentioned above. However, a spherical parallel mechanism (SPM) has a compact structure, excellent dynamic performance, and high accuracy; in addition, a 3-DOF SPM is in line with the structural design of a bionic eye. For this reason, 3-DOF SPMs attract a considerable amount of interest, and a large number of 3-DOF SPM bionic eyes have been proposed. An artificial eye [, ] for humanoid robots has been devised to be small in size and weight as well as to imitate the highly dynamic movements of the human eye. The "Agile Eye" [] is a high-performance parallel mechanism which has the capability of orienting a camera-mounted end effector within a workspace larger than that of the human eye and with velocities and accelerations larger than those of the human eye. Bang et al. [] designed a 3-DOF anthropomorphic oculomotor system to match the performance capabilities of human-like eyes. Our mechanism platform is inspired by these excellent works and plays a vital role in tracking dynamic objects.
Tracking a dynamic object while a robot performs its normal motion is common in applications. To keep tracking moving objects smoothly, in this paper we develop a bioinspired tracking system intended for use when the robot works in a bumpy environment or under dynamic disturbance. With active robot vision, an image-based feedback tracking system is presented for our bionic SPM to minimize a tracking error function, capable of tracking a moving target while the robot moves across bumpy terrain. More specifically, we propose a real-time moving target tracking algorithm which utilizes spatial histograms and the symmetric Kullback-Leibler (SKL) metric, integrated in a particle filtering framework, to achieve automatic moving target tracking and gaze stabilization. In the proposed algorithm, the key spatial histograms are extracted and taken into the particle filtering framework. An image-based feedback control scheme is implemented to drive the bionic SPM such that the identified target is tracked at the center of the captured images. Meanwhile, the robot motion information is fed forward to develop an adaptive smooth tracking controller bioinspired by the VOR mechanism. To evaluate this capability, we test our vision stability system under the condition of violent attitude variation when the robot works in a bumpy environment.
From a robotics point of view, our system is biologically inspired. While smooth tracking is employed to create a consistent perception of the surrounding world, the interaction with the environment is also used to adjust the control model involved in smooth tracking generation. Action and perception are tightly coupled in a bidirectional way: perception triggers an action, and the output of the action changes the perception. Meanwhile, the robot motion information is fed forward, inspired by the VOR mechanism, to stabilize smooth tracking.
The paper is organized as follows. Section 2 introduces bionic issues and the design of our bionic SPM. Section 3 proposes visual tracking based on symmetric Kullback-Leibler metric spatiograms. Our bionic eye plant control system is described in Section 4. Experimental results are shown in Section 5. Section 6 presents our conclusion.
2. Design of Human-Eye-Inspired
PTZ Platform
2.1. Human-Eye's Movement Mechanism. Each eye is controlled by three complementary pairs of extraocular muscles, as shown in Figure 1(a). The movement of each eye involves rotating the globe of the eye in the socket of the skull. Because of the minimal translation during its movement, the eye can be regarded as a spherical joint with an orientation defined by three axes of rotation (horizontal, vertical, and torsional). In our implementation and development of a simulator, we model eye movement with no translation for simplicity.
Figure 1: Development of the bionic eye plant. (a) Muscles of the eye: six muscles, arranged in three pairs, control the movements of the eye, shown here in a cutaway view of the eye in its socket or orbit. (b) Structure of our SPM prototype.
The medial rectus turns the eye inward and the lateral rectus turns it outward; therefore, they form a pair that controls the horizontal position of the eye. In contrast to the pair of the medial rectus and lateral rectus, the actions of the other two pairs of muscles are more complex. When the eye is centered in the orbit, the primary effect of the superior and inferior recti is to rotate the eye up or down. However, when the eye is deviated horizontally in the orbit, these muscles also contribute to torsion, the rotation of the eye around the line of sight that determines the orientation of images on the retina.
The primary effect of the superior and inferior obliques is to turn the eye downward and upward when the eye does not deviate from the horizontal position, as do the superior and inferior recti. In addition, these muscles also determine the vertical orientation of the eye.
Smooth pursuit eye movements slowly rotate the eyes to compensate for any motion of the visual target and thus act to minimize the blurring of the target's retinal image that would otherwise occur. We implement smooth target tracking, continuously adjusted by visual feedback about the target's (retinal) image.
The kinematic characteristics of an SPM and the mechanics of eye movements are very similar []. Both have a 3-DOF spherical movement whose rotation center coincides with the center of the sphere. An SPM also has a compact structure, excellent dynamic performance, and high accuracy, so a 3-DOF SPM is in line with the structural design of a bionic eye replicating eye movement.
The eyeball is seen as a sphere with a rotation center when it rotates. Inspired by the mechanics of eye movements and active robotic vision, we present a new bionic eye prototype based on an SPM, which is made up of an eye-in-hand system as shown in Figure 1(b).
Because the eye is free to rotate in three dimensions, eyeballs can keep retinal images stable in the fovea when they track a moving target. In our work, we propose two main structural requirements inspired by the human eye: (1) the camera center must be located at the center of the "eyeball" to ensure that the angle between image planes at two different positions stays identical with the rotation of the "eyeball"; (2) in the process of eye movement, mechanical components other than the "eyeball" should not exceed the plane of the center of the sphere, as far as possible, to ensure that when the "eyeball" moves they do not block the sight of the camera and do not interfere with the robot face.
2.2. Oculomotor Plant Compensation of VOR. In the human-eye VOR, a signal from the vestibular system related to head velocity, which is encoded by the semicircular ducts, is used to drive the eyes in the direction opposite to the head movement. The VOR operates in feedforward mode and as such requires calibration to ensure accurate nulling of head movement. The simplicity of this "three-neuron arc," together with the relatively straightforward mechanics of the eye plant, has long made the VOR an attractive model for experimental and computational neuroscientists seeking to understand cerebellar function. To abolish image motion across the retina, the vestibular signal must be processed by neural circuitry which compensates for the mechanical properties of the oculomotor plant. The VOR is therefore a particular example of motor plant compensation. Horizontal and vertical, angular and linear head movements drive the appropriate combinations of the six extraocular muscles in three dimensions.
2.3. Kinematics of the Human-Eye-Inspired PTZ Platform. The spherical parallel mechanism consists of an upper layer and a base, connected by three pairs of identical kinematic subchains, as shown in Figure 2. In each chain, there is one fixed revolute joint and two free revolute joints connecting the proximal link to the distal link and the distal link to the upper layer, respectively. The axes of all revolute joints intersect at a common point, which is referred to as the rotational center. The plane passing through this point and parallel to the base is called the sphere center plane, also seen as the Listing's plane of the eyeball. $\alpha_1$, $\alpha_2$, $\beta_1$, $\beta_2$, and $\gamma$ are the parameters of this mechanism, where $\alpha_1$ and $\alpha_2$ are the structural angles of the lower link and upper link, $\beta_1$ and $\beta_2$ are the half-cone angles of the upper platform and the base, and $\gamma$ is the structural torsion angle of the initial state of the upper platform and the base, namely, the initial torsion angle.
Figure 2 demonstrates the kinematics of our SPM platform, and the kinematic equation of the SPM is given by []
$$\dot{\boldsymbol{\theta}} = J\left(\alpha_1, \alpha_2, \beta_1, \beta_2, \gamma, \theta_x, \theta_y, \theta_z\right)\boldsymbol{\omega}, \tag{1}$$
Figure 2: Kinematic sketch of a spherical parallel manipulator.
where $\dot{\boldsymbol{\theta}} = (\dot{\theta}_1, \dot{\theta}_2, \dot{\theta}_3)^T$ is the angular velocity vector input of the motors, $\boldsymbol{\omega} = (\omega_x, \omega_y, \omega_z)^T$ is the angular velocity vector output of the upper platform, and $J$ is the Jacobian matrix, which is determined by the mechanical parameters ($\alpha_1$, $\alpha_2$, $\beta_1$, $\beta_2$, and $\gamma$) and the eyeball posture ($\theta_x$, $\theta_y$, and $\theta_z$). $\alpha_1$ and $\alpha_2$ are the structural angles of the lower link and upper link, respectively; $\beta_1$ and $\beta_2$ are the half-cone angles of the base and upper platform. The proposed PTZ platform has kinematics similar to the human eye, as shown in Figure 3.
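As a minimal numerical sketch of the velocity mapping above ($\dot{\boldsymbol{\theta}} = J\boldsymbol{\omega}$), the control-side computation is a single matrix-vector product. The Jacobian entries below are placeholder values for illustration only; the real $J$ must be computed from the mechanism parameters and the current eyeball posture:

```python
import numpy as np

# Placeholder Jacobian: in the real mechanism, J depends on
# (alpha1, alpha2, beta1, beta2, gamma) and (theta_x, theta_y, theta_z).
J = np.array([[ 1.2, 0.1, 0.0],
              [-0.1, 1.1, 0.2],
              [ 0.0, 0.2, 1.3]])

# Desired angular velocity of the "eyeball" (upper platform), rad/s.
omega = np.array([0.05, -0.02, 0.01])

# Required motor angular velocity inputs.
theta_dot = J @ omega
```

In a real controller this product would be reevaluated at every servo cycle, because $J$ changes with the eyeball posture.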
3. SKL-Based Particle Filter Visual Tracking
3.1. Spatial Histograms: Spatiogram. The color histogram is one of the most common target models; it is simply a statistic of the proportions of different colors in the entire picture, without concern for the spatial location of each color. It is therefore insensitive to rotation and suitable for modeling nonrigid or deformation-prone target objects. However, targets based on this model are vulnerable to backgrounds with a similar color distribution or to other interference, which can cause target tracking failure. In this paper, we improve the particle filter algorithm based on a new target model, the spatiogram [], which adds pixel coordinate information to the traditional color histogram and uses the SKL metric. The second-order spatiogram can be described as follows:
$$h(b) = \left\{\, n_b,\ \boldsymbol{\mu}_b,\ \Sigma_b \,\right\}, \quad b = 1, \dots, B, \tag{2}$$
where $B$ is the total number of intervals (bins) and $n_b$, $\boldsymbol{\mu}_b$, and $\Sigma_b$ are the probability, coordinate mean, and coordinate covariance matrix of the $b$th interval, respectively. They can be calculated as follows:
$$n_b = \frac{1}{N}\sum_{j=1}^{N}\delta_{jb}, \qquad \boldsymbol{\mu}_b = \frac{1}{N n_b}\sum_{j=1}^{N}\mathbf{x}_j\,\delta_{jb}, \qquad \Sigma_b = \frac{1}{N n_b - 1}\sum_{j=1}^{N}\left(\mathbf{x}_j-\boldsymbol{\mu}_b\right)\left(\mathbf{x}_j-\boldsymbol{\mu}_b\right)^{T}\delta_{jb}. \tag{3}$$
$N$ is the total number of pixels within the target area, $\mathbf{x}_j = [x_j, y_j]^T$ is the coordinate position of the $j$th pixel, and $\delta_{jb} = 1$ denotes that the $j$th pixel is quantized to the $b$th interval, while $\delta_{jb} = 0$ indicates that the $j$th pixel is quantized to another interval.
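The spatiogram definition above can be sketched directly in code. This is a minimal version assuming a single-channel patch and uniform intensity binning; the function and variable names are ours, not the paper's:

```python
import numpy as np

def spatiogram(patch, bins=8):
    """Second-order spatiogram of a grayscale patch (values 0-255).

    For each intensity interval b, returns the probability n_b, the mean
    pixel coordinate mu_b, and the coordinate covariance Sigma_b.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    # Quantize each pixel into one of `bins` intensity intervals (delta_jb).
    idx = np.minimum(patch.ravel().astype(int) * bins // 256, bins - 1)
    N = coords.shape[0]
    n = np.zeros(bins)
    mu = np.zeros((bins, 2))
    cov = np.zeros((bins, 2, 2))
    for b in range(bins):
        pts = coords[idx == b]
        cnt = len(pts)
        n[b] = cnt / N
        if cnt > 0:
            mu[b] = pts.mean(axis=0)
        if cnt > 1:
            d = pts - mu[b]
            cov[b] = d.T @ d / (cnt - 1)  # 1/(N*n_b - 1) normalization
    return n, mu, cov
```

In the particle filtering framework, a spatiogram computed this way for each particle's candidate region is compared against the target model using the similarity of Section 3.2.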
3.2. SKL-Based Particle Filter. In order to apply the spatiogram to target tracking, we need a method to measure the similarity of the spatial histograms of the target and the candidate targets. We select an SKL-based similarity coefficient to measure the similarity between the target spatiogram $h(b) = \{n_b, \boldsymbol{\mu}_b, \Sigma_b\}$ and the candidate target spatiogram $h'(b) = \{n'_b, \boldsymbol{\mu}'_b, \Sigma'_b\}$.
Given a spatiogram $h(b) = \{n_b, \boldsymbol{\mu}_b, \Sigma_b\}$, we use a Gaussian distribution to describe the spatial distribution of the pixels in each interval. The distribution of the $b$th interval can be described as
$$N_b(\mathbf{x}) = \frac{1}{2\pi\left|\Sigma_b\right|^{1/2}}\exp\left(-\frac{1}{2}\left(\mathbf{x}-\boldsymbol{\mu}_b\right)^{T}\Sigma_b^{-1}\left(\mathbf{x}-\boldsymbol{\mu}_b\right)\right), \tag{4}$$
where $\boldsymbol{\mu}_b$ is the mean of the coordinates of the pixels in the $b$th interval and $\Sigma_b$ is the covariance matrix of those coordinates.
The KL distance between the Gaussian distribution $N_b(\mathbf{x})$ and the Gaussian distribution $N'_b(\mathbf{x})$ has the closed-form solution
$$\mathrm{KL}\left(N_b \,\middle\|\, N'_b\right) = \frac{1}{2}\left[\log\frac{\left|\Sigma'_b\right|}{\left|\Sigma_b\right|} + \mathrm{Tr}\left(\left(\Sigma'_b\right)^{-1}\Sigma_b\right) - d + \left(\boldsymbol{\mu}_b-\boldsymbol{\mu}'_b\right)^{T}\left(\Sigma'_b\right)^{-1}\left(\boldsymbol{\mu}_b-\boldsymbol{\mu}'_b\right)\right], \tag{5}$$
where $d$ is the spatial dimension (for the spatiogram, $d = 2$).
Similarly, we can get the KL distance between the Gaussian distribution $N'_b(\mathbf{x})$ and the Gaussian distribution $N_b(\mathbf{x})$:
$$\mathrm{KL}\left(N'_b \,\middle\|\, N_b\right) = \frac{1}{2}\left[\log\frac{\left|\Sigma_b\right|}{\left|\Sigma'_b\right|} + \mathrm{Tr}\left(\Sigma_b^{-1}\Sigma'_b\right) - d + \left(\boldsymbol{\mu}'_b-\boldsymbol{\mu}_b\right)^{T}\Sigma_b^{-1}\left(\boldsymbol{\mu}'_b-\boldsymbol{\mu}_b\right)\right]. \tag{6}$$
The SKL distance of the two Gaussian distributions $N_b(\mathbf{x})$ and $N'_b(\mathbf{x})$ is
$$\mathrm{SKL}\left(N_b, N'_b\right) = \frac{1}{2}\left[\mathrm{KL}\left(N_b \,\middle\|\, N'_b\right) + \mathrm{KL}\left(N'_b \,\middle\|\, N_b\right)\right],$$
Figure 3: Analogue of camera rotation and eye movement (pan about $x$: $\alpha$; tilt about $y$: $\beta$; roll about $z$: $\gamma$). Our SPM prototype has kinematics similar to the human eye.
which expands, since the log terms cancel in the symmetric sum, to
$$\mathrm{SKL}\left(N_b, N'_b\right) = \frac{1}{4}\left[\mathrm{Tr}\left(\left(\Sigma'_b\right)^{-1}\Sigma_b\right) + \mathrm{Tr}\left(\Sigma_b^{-1}\Sigma'_b\right) - 2d\right] + \frac{1}{4}\left(\boldsymbol{\mu}_b-\boldsymbol{\mu}'_b\right)^{T}\left(\Sigma_b^{-1}+\left(\Sigma'_b\right)^{-1}\right)\left(\boldsymbol{\mu}_b-\boldsymbol{\mu}'_b\right). \tag{7}$$
Generally, the range of the similarity is $[0,1]$, and the similarity $\psi_b$ of each pair of intervals on the spatiogram can be described as
$$\psi_b = \exp\left(-\mathrm{SKL}\left(N_b, N'_b\right)\right). \tag{8}$$
Thus, the similarity of the spatiograms based on the SKL distance can be calculated as
$$\rho\left(h, h'\right) = \sum_{b=1}^{B}\sqrt{n_b n'_b}\,\psi_b = \sum_{b=1}^{B}\sqrt{n_b n'_b}\exp\left(-\mathrm{SKL}\left(N_b, N'_b\right)\right). \tag{9}$$
According to (), we can get
,󸀠=𝐵
𝑏=1𝑏𝑏
exp −1
4Tr Σ−1
𝑏Σ󸀠
𝑏+Tr Σ−1
𝑏Σ󸀠
𝑏−2+0
=𝐵
𝑏=1𝑏exp −1
4(+−2)=𝐵
𝑏=1𝑏=1.
()
This indicates that the SKL-distance-based spatiogram similarity measure ensures that the object is most similar to itself.
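The SKL distance and similarity derived above can be sketched compactly in code (a minimal version assuming well-conditioned covariance matrices; the function names are ours):

```python
import numpy as np

def kl_gauss(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence between two d-dimensional Gaussians."""
    d = len(mu0)
    inv1 = np.linalg.inv(cov1)
    dmu = mu0 - mu1
    return 0.5 * (np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
                  + np.trace(inv1 @ cov0) - d + dmu @ inv1 @ dmu)

def skl(mu0, cov0, mu1, cov1):
    """Symmetric KL distance: mean of the two one-sided KL divergences."""
    return 0.5 * (kl_gauss(mu0, cov0, mu1, cov1) + kl_gauss(mu1, cov1, mu0, cov0))

def similarity(h0, h1):
    """Spatiogram similarity rho(h, h'); each h is a (n, mu, cov) triple."""
    (n0, mu0, cov0), (n1, mu1, cov1) = h0, h1
    rho = 0.0
    for b in range(len(n0)):
        if n0[b] > 0 and n1[b] > 0:  # skip empty bins
            rho += np.sqrt(n0[b] * n1[b]) * np.exp(-skl(mu0[b], cov0[b],
                                                        mu1[b], cov1[b]))
    return rho
```

In the particle filtering framework, each particle's weight can be taken proportional to this similarity; the self-similarity property above guarantees that a perfect match scores exactly 1.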
4. Image-Based Feedback and Dynamic
Compensation Eye Plant Control
4.1. Visual Feedback Scheme. When the target is identified in the image, a visual feedback tracking control strategy is used to control the bionic eye plant mechanism to minimize a tracking error function; this is also called eye-in-hand visual servoing [, ]. Since the relative distance between the SPM and the moving target is large, if the error function is defined in a 3D reference coordinate frame, a coarse estimate of the relative pose between the SPM and the moving target may cause the moving target to fall out of the visual field while the SPM servo mechanism is adjusting, and it also affects the accuracy of the pose reached after convergence. In our project, to make tracking control more robust and stable, we define a tracking error function in the visual sensor frame, which is given by [, ]
$$\mathbf{e}(t) = \mathbf{s}(t) - \mathbf{s}^{*}, \tag{11}$$
where $\mathbf{s}(t)$ and $\mathbf{s}^{*}$ are the measured and desired locations of the centroid of the tracked moving target with respect to the image plane, respectively. In our work, we set $\mathbf{s}^{*} = [0, 0]^T$, a constant, which is the centroid of the captured image.
4.2. Camera Calibration of the Human-Eye-Inspired PTZ Platform. Based on the PTZ visual system, the coordinate system is established as shown in Figure 3. Assuming the motion of the object is unknown, how do we control the motion of the PTZ platform so that the projection of the moving object is fixed at the center of the image plane, with full consideration of the dynamic effects of the "eyeball"? To make the $z$-axis of the camera coordinate system point at the target by adjusting the posture of the camera, we have to compensate the offset angle between the camera and the target. We employ a pinhole camera model to obtain an accurate camera projection.
Following the general pinhole camera model, the intrinsic parameter model equation of the camera is given by
$$z_c\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}, \tag{12}$$
where (,V)denotes the image coordinate of the target in
the image coordinate system. (0,V0)are the coordinates of
the principal point. (𝑐,𝑐,𝑐)is the target coordinate in the
camera coordinate system. 𝑥is the scale factor in the -
coordinate direction, and 𝑦is the scale factor in the -
coordinate direction.
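The projection model above can be sketched in a few lines. The intrinsic values here are made-up numbers for illustration, not our calibrated parameters:

```python
import numpy as np

# Hypothetical intrinsic parameters (pixels).
fx, fy, u0, v0 = 800.0, 800.0, 320.0, 240.0

# 3x4 intrinsic projection matrix of the pinhole model.
K = np.array([[fx, 0.0, u0, 0.0],
              [0.0, fy, v0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])

def project(p_cam):
    """Project a camera-frame point (x_c, y_c, z_c) to pixel coordinates (u, v)."""
    uvw = K @ np.append(p_cam, 1.0)  # homogeneous image coordinates
    return uvw[:2] / uvw[2]          # divide by depth z_c

u, v = project(np.array([0.1, -0.05, 2.0]))
```

A point on the optical axis, $(0, 0, z_T)$, projects exactly to the principal point $(u_0, v_0)$, which is why centering the target in the image is equivalent to placing it on the optical axis.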
In order to keep the target tracked at the center of the field, we need to make the target lie on the optical axis. The location of a target on the optical axis is represented by $(0, 0, z_T)$, where $z_T$ is the distance between the camera and the target. The orientation is
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix}\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{bmatrix}\begin{bmatrix} 0 \\ 0 \\ z_T \end{bmatrix}. \tag{13}$$
Finally, we can deduce the angle offset between the target and the camera's line of sight:
$$\alpha = \arctan\frac{u - u_0}{f_x}, \qquad \beta = \arctan\frac{\left(f_x/f_y\right)\left(v - v_0\right)}{\sqrt{f_x^{2} + \left(u - u_0\right)^{2}}}. \tag{14}$$
In our implementation, the camera center is located at the center of the "eyeball" so that the angle between image planes at two different positions stays identical with the rotation of the "eyeball." The 3-DOF SPM satisfies the principles of eyeball movement; a camera can be mounted in the SPM and actively oriented (horizontally, vertically, and torsionally) around its $x$-axis, $y$-axis, and $z$-axis, respectively. Considering the minimal translation during eye movement, we implement our eye plant system with no translation for simplicity, so our visual tracking strategy is applicable to all SPMs with no translation.
The visual tracking section described how to determine the position of the moving object; this relative position determines our visual tracking strategy. Eye rotation about the vertical "$x$-axis" is controlled by the lateral and medial rectus muscles, which moves the eye left or right. Rotation about the transverse "$y$-axis" is controlled by the superior and inferior rectus muscles, which elevates and depresses the eye. Finally, rotations about the anteroposterior "$z$-axis" result in counterclockwise as well as upward and downward eye motion. See Figures 3(a) and 3(b). Our model receives the visually guided signal to control the eye plant; see (11).
Meanwhile, the robot motion information is fed forward into the control loop. Our whole bioinspired tracking system is illustrated in Figure 4. It is known that the VOR is basically driven by the signals from the vestibular apparatus in the inner ear. The semicircular canals (SCs) detect head rotation and drive the rotational VOR; the otoliths, on the other hand, detect head translation and drive the translational VOR. Anatomists and physiologists tend to treat the VOR as a simple neural system mediated by a three-neuron arc and displaying a distinct function. Starting in the vestibular system, the SCs are activated by head rotation and send their impulses via the vestibular nerve through brainstem neurons, ending in the oculomotor plant. Here, we use an IMU to acquire the pose change of the eye from the robot.
When the robot works in a bumpy environment, rigid bumps and pulse jitter cause significant turbulence with high frequency and posture changes with lower frequency. Therefore, the motion information of the robot is acquired and fed forward into the controller to compensate the external disturbance. In [], an active compensation model of visual error is proposed according to the principle of the VOR, as in (15). Here, we use our proposed bioinspired controller to compensate the motion disturbance caused by bumpy jitter. Hence,
()=()−V𝑛2
V+1𝑛+1
+()++−𝑙𝑠
𝑞𝑛
𝑛+1,()
where () = −(()+())is slide error of retina, ()
denotes the rotation angle of head, and () means the
rotation angle of eyeball. ,,andrepresent the gains
of the velocity signal of head rotation, the velocity signal
ofretinaslide,andthespikeofnerveberscausedbythe
displayment of retina, respectively. is the compensation
weight value of occulus caused by error signal of retina. In
our system, ,,andare equal to  and =2.5.Combining
position compensation with speed compensation of eyeball,
our system is used to build a smooth tracking system.
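The combined idea of VOR feedforward plus visual feedback can be illustrated with a much-simplified discrete-time loop. This is not the paper's transfer function; the gain `kp`, the time step, and the disturbance profile are assumptions chosen only to show the interplay of the two signals:

```python
def run(head_rates, dt=0.02, kp=2.0):
    """Simulate gaze holding on a fixed target under head disturbance.

    H: head angle, E: eye angle; retinal slide error e = -(H + E), as in
    the text. Each step counter-rotates the eye against the measured head
    rate (feedforward) and bleeds off the residual error (visual feedback).
    """
    H, E = 0.0, 0.0
    errors = []
    for h_rate in head_rates:
        H += h_rate * dt                 # head moves (disturbance)
        e = -(H + E)                     # retinal slide error
        E += (-h_rate + kp * e) * dt     # VOR feedforward + feedback step
        errors.append(abs(e))
    return errors

errors = run([0.5] * 200)                # constant head rotation of 0.5 rad/s
```

With perfect feedforward, the residual error decays geometrically by the factor (1 − kp·dt) per step, so the gaze stays locked despite the sustained head motion; visual feedback alone would leave a lag proportional to the head velocity.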
5. Experiments and Results
To prove that the proposed image-based feedback tracking system based on our developed eye-in-hand prototype is able to orient a camera with the required orientation changes, and especially to show its dynamic disturbance resistance capability and SPM-based structural dexterity, closed-loop control experiments were performed. We designed an experimental platform based on a tracked robot, as shown in Figure 5. A variety of obstacles were placed on the tracked robot's path to simulate a real harsh environment. We introduced the joint space control architecture used in []. In the chosen control approach, the desired camera orientation is transformed to actuator set points using the inverse kinematics. Thus, only a brief overview of the architecture and exemplary control results are presented here.
Figure 4: Tracking control scheme. Camera means human's eyeball; IMU means canals.
Figure 5: Experimental platform based on the Tracked Mobile Robot.
To measure the angular velocities of the "eyeball" about its three axes, we employ the 3DM-GX attitude sensor. The device offers a range of output data quantities, from fully calibrated inertial measurements (acceleration, angular rate, and deltaAngle and deltaVelocity vectors) to computed orientation estimates, including pitch, roll, and heading (yaw) or a rotation matrix. All quantities are fully temperature compensated and are mathematically aligned to an orthogonal coordinate system.
In addition, the image information is gained by using a high-speed camera (Guppy F-C), which is connected to an IEEE 1394 card installed in a PC with an Intel Core CPU that acquires the video signal. The camera is an ultracompact, inexpensive VGA machine vision camera with a CCD sensor (Sony ICX) that sustains high frame rates at full resolution. We employ a pinhole camera model to obtain an accurate camera projection. Camera calibration was repeated ten times to seek an approximate camera calibration matrix
$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}.$$
The camera is calibrated using a chessboard []. The parameters contained in $K$ are called the internal camera parameters, or the internal orientation of the camera. See [] for more details.
Figure 4 shows our tracking control scheme. We implemented smooth target tracking (smooth pursuit) to keep the target located in the field of view, continuously adjusted by visual feedback about the target's image. The image is captured from the camera (retinal image), and the IMU measures the robot body's movement to compensate the dynamic disturbance.
Supposing that we do not know the motion of the tracked object, how do we control the motion of the "eyeball" to ensure that the moving object is fixed at the centroid of the image plane?
In the process of tracking moving target, the tracking
algorithmshouldberobusttoappearancevariationsintro-
duced by occlusion, illumination changes, and pose varia-
tions. In our library environment, the proposed algorithm
can relocate the target when object appearance changes
due to illumination, scale, and pose variations. Once the
moving target is located, the “eyeball” should keep images
stable in the eld of view (center of image). at is, target
position uctuates at zero. See Figures  and . Figure  gives
some snapshots of tracking results and demonstrates that
Applied Bionics and Biomechanics
270 310 350
390 430 470
590550510
F : Moving object tracked in the laboratory.
35.32%
87.56%
98.37%
025
25
10 15 x
y
15
5
5
10
20
30
20 35
35
30
Pixels dierence of tracking error
F : Pixel dierence of and directions.
the moving target is located in the field of view. Meanwhile,
extensive experiments are conducted to verify the bumpy-
resist capability. Figure  illustrates the pixel difference in the
x and y directions. Smaller eyeball errors accompanying larger
postural changes are good evidence of bumpy-resist
capability and VOR function.
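The symmetric Kullback-Leibler metric that the proposed tracker uses to compare target and candidate histograms can be sketched as below. This is a minimal version over plain normalized histograms; the paper integrates the metric with spatial histograms inside a particle filtering framework:

```python
import math

def sym_kl(p, q, eps=1e-12):
    """Symmetric Kullback-Leibler divergence between two normalized histograms."""
    total = 0.0
    for pi, qi in zip(p, q):
        pi, qi = pi + eps, qi + eps             # guard against log(0)
        total += (pi - qi) * math.log(pi / qi)  # equals KL(p||q) + KL(q||p)
    return total

target    = [0.5, 0.3, 0.2]
candidate = [0.4, 0.4, 0.2]
d = sym_kl(target, candidate)  # small positive distance
```

A candidate region whose histogram yields a smaller symmetric divergence is a better match; in a particle filter this score would be converted into a particle weight.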
Figures  and  show the performance of the proposed
tracking system on the tracked robot running across bumpy
environment. e result of statistics shows that .% of
50
40
30
20
10
0
−10
−20
−30
x-dierence y-dierence
Range
Range within 1.5 IQR
Median line
25%∼75%
25%∼75%
F : Statistics of and direction dierence.
tracking errors, including and direction dierence, have
fallen into the range of < pixels, as shown in Figure . e
statistics of and direction pixel dierence are demon-
strated in Figure . In our test, as the tracked robot platform
travels through the rough ground full of obstacles, rigid
bumpsandpulsejittercausetheoccurrenceofsignicant
turbulence with high frequency and the oscillatory posture
changes with lower frequency, which makes tracking eect
slightly larger than the data recorded in the literature, such as
[, ]. But our experiments are established under relatively
harsh environmental conditions, and the eect achieved is
objective. Tracking eects still stay in a controllable range like
the above situation. Apparently, this indicates that the system
has robustness.
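The box-plot statistics reported here (median, 25%-75% quartiles, range within 1.5 IQR) and the fraction of errors below a pixel threshold can be computed from raw error samples as in this sketch, shown on made-up data rather than the paper's measurements:

```python
import statistics

def error_stats(errors, threshold):
    """Box-plot summary of pixel errors plus the fraction below a threshold."""
    q1, median, q3 = statistics.quantiles(errors, n=4)  # quartile cut points
    iqr = q3 - q1
    whiskers = (q1 - 1.5 * iqr, q3 + 1.5 * iqr)         # "range within 1.5 IQR"
    frac = sum(1 for e in errors if abs(e) < threshold) / len(errors)
    return median, (q1, q3), whiskers, frac

# Made-up pixel-difference samples, for illustration only.
samples = [-3, -1, 0, 0, 1, 1, 2, 2, 3, 5]
median, quartiles, whiskers, frac = error_stats(samples, threshold=5)
# frac == 0.9: 90% of these samples fall within 5 pixels
```

Samples outside the whisker range would be plotted as outliers, which is how the instantaneous bumps show up in the box plots.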
Figure : Experimental results of the robot's and eye plant's pose (pitch and roll amplitude, in degrees, over 0-120 s).

Figure : Statistics of the robot's and eye plant's pose (box plots of pitch and roll for the robot and the eye plant: median, 25%-75% quartiles, range within 1.5 IQR).
In our actual setup, we install three IMUs on the
tracked robot and the eye plants to measure pose changes. We
recorded the angle variations to validate the system's bumpy-
resist capability: the eyeball moves in the opposite
direction, according to position compensation and velocity
compensation, when the tracked robot's pose changes. In
other words, the robot pose variation is fed for-
ward into the controller to form a head-eye coordination system.
Figures  and  show the experimental results of the tracked
robot's and eye plants' pose changes on the tracked robot in the
bumpy environment. In addition, large tracking errors
occur when the robot encounters instantaneous postural
changes. Nonetheless, the eyeball's quick return to lower errors
verifies the good robustness of the bionic visual tracking system
and the high dexterity of the SPM-based bionic eye. Obviously,
these variations reflect the good stability of the tracking system.
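The head-eye coordination described above, with IMU-measured body rotation fed forward so the eye counter-rotates and visual feedback trimming the residual error, can be sketched as follows. The gains are assumed values and this is an illustration of the idea, not the paper's adaptive VOR controller:

```python
# Illustrative VOR-style feedforward plus visual feedback (assumed gains).
VOR_GAIN = 1.0     # an ideal VOR counter-rotates the eye one-to-one with the head
VISUAL_GAIN = 0.5  # assumed gain on the residual retinal-slip error

def eye_velocity_command(head_rate, retinal_slip):
    """Combine IMU feedforward with visual feedback.

    head_rate    -- body/head angular rate measured by the IMU (rad/s)
    retinal_slip -- residual target velocity on the image plane (rad/s equivalent)
    """
    feedforward = -VOR_GAIN * head_rate      # cancel body motion immediately
    feedback = -VISUAL_GAIN * retinal_slip   # trim the remaining visual error
    return feedforward + feedback

cmd = eye_velocity_command(head_rate=0.3, retinal_slip=0.0)  # pure VOR: -0.3
```

The feedforward path reacts without waiting for the camera, which is why large postural changes produce only brief tracking errors before the visual loop pulls the target back to the image center.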
6. Conclusion

To accurately replicate the human vision system, we pre-
sented a 3-DOF "eyeball" that rotates about the horizontal,
vertical, and torsional axes according to the mechanics
of eye movements. Thus, an image-based visual feedback
tracking system is presented to minimize a tracking error
function, capable of tracking moving targets. More specifically,
the proposed real-time moving target tracking algorithm
utilizes spatial histograms and a symmetric Kullback-Leibler
metric integrated into a particle filtering framework to achieve
automatic moving target identification and gaze stabilization.
Meanwhile, the robot motion information is fed forward
to develop an adaptive smooth tracking controller inspired
by the VOR mechanism. The experimental results
demonstrate that our algorithm is effective and robust in
dealing with moving object tracking and can always keep the
target at the center of the camera image to avoid tracking failure.
Furthermore, as the tracked robot platform travels across
rough ground full of obstacles, rigid bumps and pulse
jitter cause significant high-frequency turbulence and
lower-frequency oscillatory posture changes. The tracking
error still stays within a controllable range,
which indicates that the system has bumpy-resist capability.
Conflict of Interests
The authors declare that there is no conflict of interests
regarding the publication of this paper.
Acknowledgments
This work was partly supported by the National Natural Sci-
ence Foundation of China (nos.  and ), the
Nature Science Foundation of Shanghai (no. ZR),
and the Open Research Project of the State Key Laboratory
of Industrial Control Technology, Zhejiang University, China
(no. ICT).
References
[] B. Jung and G. S. Sukhatme, "Real-time motion tracking from a mobile robot," International Journal of Social Robotics, vol. , no. , pp. –, .
[] M. Hwangbo, J.-S. Kim, and T. Kanade, "Gyro-aided feature tracking for a moving camera: fusion, auto-calibration and GPU implementation," The International Journal of Robotics Research, vol. , no. , pp. –, .
[] M. Hwangbo, J.-S. Kim, and T. Kanade, "Inertial-aided KLT feature tracking for a moving camera," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '09), pp. –, St. Louis, MO, USA, October .
[] J. Park, W. Hwang, H. Kwon, K. Kim, and D.-I. D. Cho, "A novel line of sight control system for a robot vision tracking system, using vision feedback and motion-disturbance feedforward compensation," Robotica, vol. , no. , pp. –, .
[] Z. Jia, A. Balasuriya, and S. Challa, "Sensor fusion-based visual target tracking for autonomous vehicles with the out-of-sequence measurements solution," Robotics and Autonomous Systems, vol. , no. , pp. –, .
[] J. D. Hol, T. B. Schön, H. Luinge, P. J. Slycke, and F. Gustafsson, "Robust real-time tracking by fusing measurements from inertial and vision sensors," Journal of Real-Time Image Processing, vol. , no. -, pp. –, .
[] A. Lenz, T. Balakrishnan, A. G. Pipe, and C. Melhuish, "An adaptive gaze stabilization controller inspired by the vestibulo-ocular reflex," Bioinspiration and Biomimetics, vol. , no. , Article ID , .
[] T. Shibata and S. Schaal, "Biomimetic gaze stabilization based on feedback-error-learning with nonparametric regression networks," Neural Networks, vol. , no. , pp. –, .
[] H. Xu, Y. Xu, H. Fu, Y. Xu, X. Z. Gao, and K. Alipour, "Coordinated movement of biomimetic dual PTZ visual system and wheeled mobile robot," Industrial Robot, vol. , no. , pp. –, .
[] O. Avni, F. Borrelli, G. Katzir, E. Rivlin, and H. Rotstein, "Scanning and tracking with independent cameras: a biologically motivated approach based on model predictive control," Autonomous Robots, vol. , no. , pp. –, .
[] J. Law, P. Shaw, and M. Lee, "A biologically constrained architecture for developmental learning of eye-head gaze control on a humanoid robot," Autonomous Robots, vol. , no. , pp. –, .
[] S. Xie, J. Luo, Z. Gong, W. Ding, H. Zou, and X. Fu, "Biomimetic control of pan-tilt-zoom camera for visual tracking based on an autonomous helicopter," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '07), pp. –, San Diego, Calif, USA, November .
[] L. Vannucci, N. Cauli, E. Falotico, A. Bernardino, and C. Laschi, "Adaptive visual pursuit involving eye-head coordination and prediction of the target motion," in Proceedings of the 14th IEEE-RAS International Conference on Humanoid Robots (Humanoids '14), pp. –, IEEE, Madrid, Spain, November .
[] E. Falotico, D. Zambrano, G. G. Muscolo, L. Marazzato, P. Dario, and C. Laschi, "Implementation of a bio-inspired visual tracking model on the iCub robot," in Proceedings of the 19th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN '10), pp. –, IEEE, Viareggio, Italy, September .
[] L. Jamone, M. Fumagalli, G. Metta, L. Natale, F. Nori, and G. Sandini, "Machine-learning based control of a human-like tendon-driven neck," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '10), pp. –, Anchorage, Alaska, USA, May .
[] F. Nori, L. Jamone, G. Sandini, and G. Metta, "Accurate control of a human-like tendon-driven neck," in Proceedings of the 7th IEEE-RAS International Conference on Humanoid Robots, pp. –, IEEE, Pittsburgh, Pa, USA, November-December .
[] N. G. Tsagarakis, G. Metta, G. Sandini et al., "iCub: the design and realization of an open humanoid platform for cognitive and neuroscience research," Advanced Robotics, vol. , no. , pp. –, .
[] J. Leitner, S. Harding, M. Frank, A. Förster, and J. Schmidhuber, "An integrated, modular framework for computer vision and cognitive robotics research (icVision)," in Biologically Inspired Cognitive Architectures, vol.  of Advances in Intelligent Systems and Computing, pp. –, Springer, .
[] X.-Y. Wang, Y. Zhang, X.-J. Fu, and G.-S. Xiang, "Design and kinematic analysis of a novel humanoid robot eye using pneumatic artificial muscles," Journal of Bionic Engineering, vol. , no. , pp. –, .
[] Y.-C. Lee, C.-C. Lan, C.-Y. Chu, C.-M. Lai, and Y.-J. Chen, "A pan-tilt orienting mechanism with parallel axes of flexural actuation," IEEE/ASME Transactions on Mechatronics, vol. , no. , pp. –, .
[] C.-C. Lan, Y.-C. Lee, J.-F. Jiang, Y.-J. Chen, and H.-Y. Wei, "Design of a compact camera-orienting mechanism with flexural pan and tilt axes," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '11), pp. –, San Francisco, Calif, USA, September .
[] G. Cannata and M. Maggiali, "Models for the design of bioinspired robot eyes," IEEE Transactions on Robotics, vol. , no. , pp. –, .
[] J. Gu, M. Meng, A. Cook, and M. G. Faulkner, "A study on natural movement of artificial eye implant," Robotics and Autonomous Systems, vol. , no. , pp. –, .
[] F. Colonnier, A. Manecy, R. Juston et al., "A small-scale hyperacute compound eye featuring active eye tremor: application to visual stabilization, target tracking, and short-range odometry," Bioinspiration & Biomimetics, vol. , no. , Article ID , .
[] T. Villgrattner, E. Schneider, P. Andersch, and H. Ulbrich, "Compact high dynamic  DoF camera orientation system: development and control," Journal of System Design and Dynamics, vol. , no. , pp. –, .
[] T. Villgrattner and H. Ulbrich, "Optimization and dynamic simulation of a parallel three degree-of-freedom camera orientation system," in Proceedings of the 23rd IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '10), pp. –, Taipei, Taiwan, October .
[] C. M. Gosselin, E. St. Pierre, and M. Gagné, "On the development of the agile eye," IEEE Robotics & Automation Magazine, vol. , no. , pp. –, .
[] Y.-B. Bang, J. K. Paik, B.-H. Shin, and C. Lee, "A three-degree-of-freedom anthropomorphic oculomotor simulator," International Journal of Control, Automation and Systems, vol. , no. , pp. –, .
[] S. Refaat, J. M. Hervé, S. Nahavandi, and H. Trinh, "Two-mode overconstrained three-DOFs rotational-translational linear-motor-based parallel-kinematics mechanism for machine tool applications," Robotica, vol. , no. , pp. –, .
[] C. Li, S. Xie, H. Li, D. Wang, and J. Luo, "Design of bionic eye based on spherical parallel mechanism with optimized parameters," Robot, vol. , article , .
[] S. T. Birchfield and S. Rangarajan, "Spatiograms versus histograms for region-based tracking," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), vol. , pp. –, IEEE, June .
[] F. Chaumette and S. Hutchinson, "Visual servo control. I. Basic approaches," IEEE Robotics and Automation Magazine, vol. , no. , pp. –, .
[] F. Chaumette and S. Hutchinson, "Visual servo control. II. Advanced approaches [Tutorial]," IEEE Robotics and Automation Magazine, vol. , no. , pp. –, .
[] H. Li, J. Luo, C. Li, L. Li, and S. Xie, "Active compensation method of robot visual error based on vestibulo-ocular reflex," Jiqiren/Robot, vol. , no. , pp. –, .
[] C. Li, S. Xie, H. Li, J. Miao, Y. Xu, and J. Luo, "System design and study on bionic eye of spherical parallel mechanism based on attitude closed-loop control," Jiqiren/Robot, vol. , pp. –, .
[] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. , no. , pp. –, .
[] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, .