Hindawi Publishing Corporation
Advances in Human-Computer Interaction
Volume 2009, Article ID 901707, 6 pages
CyARM: Haptic Sensing Device for Spatial Localization on
the Basis of Exploration by Arms
Junichi Akita,1 Takanori Komatsu,2 Kiyohide Ito,3 Tetsuo Ono,4 and Makoto Okamoto3
1Department of Info. Sys. Eng., Kanazawa University, Kakuma, Kanazawa, Ishikawa 920-1192, Japan
2International Young Researcher Empowerment Center, Shinshu University, 3-15-1 Tokida, Ueda, Nagano 386-8567, Japan
3Department of Media Architecture, Future University-Hakodate, 116-2 Kamedanakano, Hakodate, Hokkaido 041-8655, Japan
4Graduate School of Information Science and Technology, Hokkaido University, Kita 14, Nishi 9, Kita-ku, Sapporo,
Hokkaido 060-0814, Japan
Correspondence should be addressed to Junichi Akita, firstname.lastname@example.org
Received 10 June 2009; Accepted 25 October 2009
Recommended by Sunil K. Agrawal
We introduce a new type of perception aid device based on the user’s exploration action, named CyARM (an acronym of
“Cyber Arm”). The user holds this device in his/her hand, and the extension of the arm is controlled by tension in a wire
attached to his/her body according to the distance to the object. This user interface has unique characteristics that give users
the illusion of an imaginary arm that extends to existing objects. The implementations of CyARM and our two experiments to
investigate its efficiency and effectiveness are described. The results confirm that CyARM can be
used to recognize the presence of an object in front of the user and to measure the relative distance to the object.
Copyright © 2009 Junichi Akita et al. This is an open access article distributed under the Creative Commons Attribution License,
which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1. Introduction
We as humans can avoid obstacles while walking or can stop
along our way when blooming flowers catch our attention.
Our brain and sensory organs bear a large part of
responsibility for controlling such spontaneous behaviors [1]. Here,
the question arises; “Are we able to perceive the surrounding
environment as it really is?” For example, insects perceive
their environment with compound eyes, while bats do so
with an ultrasonic wave transmitter and receptor. Therefore,
one can say that all animals have developed sensory organs
and methods of environmental perception that are uniquely
suitable for their style of living. Therefore, one can say that
they perceive the same environment differently.
The sensory organs of humans can be roughly divided
into the following two groups [2]:
(i) peripheral receptors that detect faraway objects, for
example, eyes, ears, or nose;
(ii) tactile receptors that detect nearby objects
through sensations received through the skin,
membranes, or muscles, that is, the tactile senses.
However, these categories are not so strict; for example,
although radiant heat is not generally perceived via a
tactile receptor, this receptor can interpret that stimulus in
some cases. Therefore, possibilities exist that a person can
perceive a stimulus by using a variety of receptors; in other
words, some sensory organs can compensate for the others. For
example, visually impaired people develop their auditory
functions to perceive space in their environment using the
relationship between direct and reflected sound waves. We
believe that creating nonsensory organs and/or using our
physical actions as a new kind of sensory organ is also possible.
In our study, we focused on physical action as a way
of compensating the sensory organs, and here we describe
a new interactive device, called CyARM (abbreviation of
“Cyber Arm”) that is designed to provide users with a
unique and intuitive interface for perceiving their living
space. In this paper, Section 2 describes our concept of
CyARM, and Section 3 describes the actual implementation
of CyARM. Sections 4 and 5 describe the experiments to
evaluate the abilities of CyARM, and finally Section 6
presents the discussion and conclusions.
2. Concept of CyARM
2.1. Conventional Visual Aid Devices. Currently, numerous
visual aid devices have been developed, especially those
for the visually impaired that replace the white cane.
These devices can be divided into two groups with respect
to the modality of their user interfaces. The first group
uses auditory signals. Specifically, information about the
surrounding environment is gathered by ultrasonic sensors
and transformed into audible sounds that users can hear
(e.g., high-pitched sounds mean the object is close to them).
For example, Sonicguide [3] and TriSensor (KASPA) are
devices that generate audible sounds depending on
the distance to the object, measured by an ultrasonic distance sensor.
The second group exploits tactile modality. Information
obtained by distance sensors is conveyed to the user through
tactile stimuli such as vibrations. Some of these methods
have already been applied to commercial products.
Up to now, these devices have had some critical
problems. For example, the devices in the first group force
users to create and use a kind of mental mapping between
distance and sound pitch, and this can cause cognitive or
mental distress for those users. Moreover, these audible
sounds sometimes mask external sounds that carry quite
important information for blind persons to comprehend
their current situation, so blind persons reported that
these devices cannot be used in crowded environments.
The devices in the second group also force the user to
create a kind of mental mapping between distance and
frequency of vibration, and again users can have difficulty
doing so. Moreover, the continuous sensing of these vibrations
decreases users’ sensitivity to the vibrations.
2.2. Concept. A great deal of research in sensor technology
has been undertaken in recent years. If this sophisticated
technology were successfully coupled with new interface
methods, it could be used to create sensing devices that
would aid users in perceiving their surroundings. Since
humans cannot intuitively recognize artificial signals (e.g.,
the meanings of vibrations mentioned in the previous
section), proposing an intuitive method of transforming
such signals into sensory information is essential to make
these signals easy for users to understand.
We have designed a new sensory aid device named
CyARM (abbreviation of “Cyber Arm”) that allows users
to perceive distance and other spatial information without
interference with natural sound sources and without
cognitive or mental loads [4–7]. Suppose that you tried walking
with your eyes closed; you would attempt to investigate
your environment by extending your arms in front of you.
This kind of behavior is similar to the functions of insects’
antennae or the white cane of the visually impaired. Similar
to the use of those antennae or canes, when the extended
arm touched some objects, one would bend one’s arm at the
elbow and stop exploring. On the other hand, if no objects
are found in front of one, the arm naturally extends, as is
illustrated in Figure 1. Here, the physical motions of bending
Figure 1: Metaphor of physical arm motion. The position of the
arm in the user’s exploration (extending) action corresponds to the
distance to the object.
Figure 2: Concept of CyARM
and extending the arm can be considered as a kind of sensory
organ.
CyARM was developed by focusing on this intuitive
exploratory behavior.
Specifically, we assumed that users would grasp CyARM
in their hand and investigate a desired direction by holding
this device as shown in Figure 2. The CyARM is connected to
the user’s body by a wire and measures the distance to the object
with ultrasonic waves. The tension of the wire is controlled
according to the measured distance. If the object
is a short distance away, the CyARM pulls the wire tightly
so that users feel stronger tension and their arms are forced
to bend. This indicates that the object can be reached by
just extending the arm. On the other hand, if the object is
far away, CyARM stops after giving just enough slack to the
wire so that users can extend their arm and feel almost no
tension; this means that the object cannot be reached. In this
way, users can search for objects in any direction by holding
this device. There are many studies on force representation
devices using wire tension in the area of virtual reality (VR),
such as the SPIDER system [8–10]. These devices aim to
represent the real force according to the information from
the controller, while the operation of CyARM is directly
based on the user’s exploration action.
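The correspondence sketched in Figure 1 amounts to a linear mapping from measured distance to the wire length released to the user: a near object gets little slack (the arm is forced to bend), while a far object gets enough slack for the arm to extend. A minimal sketch, assuming a hypothetical linear coefficient of the kind the prototypes use (names and values here are illustrative, not from the device firmware):

```python
def wire_slack(distance_m: float, coeff: float = 0.17) -> float:
    """Wire length (in meters) released to the user for a measured
    distance. The linear coefficient here is an illustrative
    assumption; the prototypes use a configurable ratio of this kind."""
    return coeff * distance_m

# A near object gets little slack, forcing the arm to bend;
# a far object gets enough slack for the arm to extend freely.
near_slack = wire_slack(0.3)  # object 0.3 m away -> about 0.051 m of wire
far_slack = wire_slack(3.0)   # object 3.0 m away -> about 0.51 m of wire
```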
Figure 3: Mechanical architecture of first prototype of CyARM (the wire runs to the user’s body; the ultrasonic sensor operates at f = 38 kHz).
Figure 4: First developed prototype of CyARM.
3.1. First CyARM Prototype. The mechanical architecture
of the first CyARM prototype is illustrated in Figure 3.
Ultrasonic sensors measure the distance to an object
(measurement range is from 0.3 to 3.0 m), and the geared motor
rewinds the wire to the appropriate position as determined
from the measured distance. When users attempt to extend
their arms, the device detects a slight displacement of the reel
caused by the wire tension, and regulates rewinding of the wire.
Figure 4 is a photograph of the first CyARM prototype
we developed. Ultrasonic sensors were placed on the front
of its body to facilitate easy aiming for users. A hook was
attached near the wire release site allowing the device to
be attached to the users’ body (e.g., on a belt loop). This
CyARM prototype weighs 500 grams and its dimensions are
15 cm in height, 10 cm in width, and 10 cm in depth. The distance
to the object is measured every 50 ms; in other words, the
distance measurement and the wire control are both performed in the same period. Since the actions of
users’ arms are quite slow in relation to this measurement
cycle time, the wire can be stably controlled. The maximum
speed of the wire retraction is approximately 1.0 m/s. The
Figure 5: Mechanical architecture of the second prototype of CyARM.
Figure 6: Second developed prototype of CyARM.
coefficient of the wire length against the measured distance
to the object is currently configured as 0.17 (wire travel of 50 mm
against a distance of 300 mm).
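Putting these numbers together, one 50 ms control step can be sketched as follows. The clamping of the per-cycle wire movement by the 1.0 m/s maximum retraction speed is an assumption about how such a loop would be rate-limited, not the published CyARM controller:

```python
COEFF = 0.17      # wire length per unit distance (first prototype)
CYCLE_S = 0.05    # the distance is measured every 50 ms
MAX_SPEED = 1.0   # maximum wire retraction speed, m/s

def next_wire_length(current_m: float, distance_m: float) -> float:
    """One 50 ms control step: move the wire length toward the target
    derived from the measured distance, limited by the motor speed.
    This rate-limited loop is a sketch, not the actual controller."""
    target_m = COEFF * distance_m
    max_step = MAX_SPEED * CYCLE_S  # at most 0.05 m of travel per cycle
    step = max(-max_step, min(max_step, target_m - current_m))
    return current_m + step
```

Because the users’ arm motions are slow relative to the 50 ms cycle, a loop of this form settles on the target wire length within a few cycles.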
3.2. Second CyARM Prototype. The first CyARM prototype
was too heavy to be used continuously for long periods.
Therefore, we developed a second CyARM prototype to
reduce the weight of the parts users carry in their
hands. Figure 5 shows the architecture of the second CyARM
prototype; in order to reduce the weight of the component users hold
in their hands, that component contains only the distance
sensor. The other parts, for example, the reel, motor, controllers,
and the battery, are placed in a second component that
users carry on their bodies. These two components are
connected by wires that transmit the measured distance signal and supply power to
the sensor circuitry.
Figure 6 is a photograph of the second CyARM
prototype, whose functions are identical to those of the first prototype. We
used a brushless motor to reduce free torque; this effectively
reduced the force required when the users extend the wire by
extending their arms.
The coefficient of the wire length against the measured
distance to the object is currently configured as 0.27 (wire travel
of 27 mm against a distance of 100 mm). This coefficient is
Figure 7: Practical use of CyARM.
expected to depend greatly on a user’s body size, and should
be configured when it is first used according to that user’s
body size and the purpose for using the device.
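Since the coefficient is simply the ratio of wire travel to measured distance, the per-user configuration suggested above can be sketched as follows (a hypothetical helper, not part of the device software):

```python
def fit_coefficient(wire_travel_mm: float, distance_mm: float) -> float:
    """Derive the wire-length coefficient from a user-specific
    reference pair (names are illustrative). For the second
    prototype: 27 mm of wire against 100 mm of distance."""
    return wire_travel_mm / distance_mm

coeff = fit_coefficient(27, 100)  # 0.27, the second-prototype setting
```

A larger coefficient pays out more wire per unit of distance, which suits users with a longer arm reach.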
In the next sections, we describe two experiments to
investigate the efficiency and effectiveness of the second
prototype of the CyARM. The first experiment was to
confirm its accuracy in detecting the presence of an object,
and the second was to confirm the accuracy of perceiving the
distance to an object by using CyARM.
4. Experiment 1: Target Presence Detection
4.1. Overview. Four sighted persons with blindfolds and
one visually impaired person who was completely blind
participated in this experiment. At first, an experimenter
gave the participants brief instructions about the CyARM,
for example, the concept of this device and how to use
this device. After being given the instructions, they were asked to
practice with CyARM for a few minutes. All participants
were then asked to stand and hold the CyARM while wearing
headphones playing white noise to prevent them from hearing
environmental sounds; the sighted participants were blindfolded during the experiment.
The task of these participants was to determine whether
a static object was present in front of them by using CyARM.
A white board, one meter wide and two meters high, was
randomly placed as the static object at a distance of about
two meters in front of the participants. Participants were
asked to report whether they felt the presence or the absence
of the object. Each participant experienced 20 trials; in 10
trials out of 20 the object was present, while in the other 10
trials it was not. Each trial took less than 10 seconds,
so each participant took about 40 minutes for this
experiment, including receiving the instructions from the
experimenter and practicing with the CyARM. The sequence of
the presence or absence of the object was randomized.
4.2. Results. The experimental results are summarized in
Table 1. The mean percentage among the five participants
Table 1: Results of the experiment for recognizing the existence of
objects (with standard deviations in parentheses).
                        Reported as existing    Reported as not existing
Object exists
Object does not exist
of successfully detecting the object’s presence or absence
was 90.0% for the object’s presence, while the percentage
for object absence was 96.0%. The results of a chi-square
test showed that the participants succeeded in detecting the
presence or absence of the object (χ² = 61.9, P < .001).
The percentage of successful detection remained almost constant
across each participant’s trials, after the practice
session prior to the experiment.
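A chi-square statistic of this kind can be computed from a 2 × 2 contingency table of observed trial counts with the standard Pearson formula. A minimal stdlib sketch (the aggregated counts behind the reported χ² = 61.9 are not reproduced here, so no specific table is assumed):

```python
def chi2_statistic(table):
    """Pearson chi-square statistic (no continuity correction) for a
    2x2 contingency table of observed counts, e.g.
    [[hits, misses], [false_alarms, correct_rejections]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    chi2 = 0.0
    for i, observed_row in enumerate(table):
        for j, observed in enumerate(observed_row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2
```

Comparing the statistic against the chi-square distribution with one degree of freedom then yields the significance level.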
Although we could not conduct a statistical analysis
between the sighted and blind participants due to the
limited number of participants, there seemed to be
no difference between them. The result then
suggests that CyARM is an efficient device for detecting the
presence of objects near users.
5. Experiment 2: Perceiving the Distance
5.1. Overview. Next, we conducted an experiment to
evaluate the efficacy of the CyARM for perceiving the distance to
an object. The participants (six of them women) had not
participated in Experiment 1. As in Experiment
1, after being given brief instructions and practicing using
CyARM, all participants were asked to be blindfolded, and
to stand and hold the CyARM while wearing headphones
playing white noise. The task of these participants was to
report the perceived distance to the object by using CyARM.
The object was the same as the one used in Experiment 1.
At first, we conducted a training phase in which
the participants experienced the CyARM’s actual behaviors
when the object was placed at either 50 cm or 150 cm in front
of the participants. The distance from the participants to the
object was measured as the distance from the participants’
toes to the object. After this training phase, the actual
experimental procedure was started; the object was placed in
front of the participants at random distances from 50 cm to
150 cm in steps of 10 cm, and the participants were asked
to report the perceived distance. The participants were also
informed that the object was placed in 10 cm steps from
50 cm to 150 cm. Each participant experienced two sets
of trials; the orders of the presented distances were counterbalanced among participants.
5.2. Results. Figure 8 shows the relation of the actual distance
and the averaged reported distance in this experiment,
where the crosses in Figure 8 correspond to the pairs of the
presented distances and the averaged reported distances by
the participants.
Figure 8: Actual distance (cm) versus reported distance (cm); the identity line (reported distance) = (actual distance) is shown together with the fitted line (reported distance) = 0.84 × (actual distance) + 4.5.
The regression line of the participants’ reported distances
was (reported distance) = 0.84 × (actual distance) + 4.5,
with a product-moment correlation coefficient between
the actual and reported distances of r = 0.873. Given
this significantly high correlation, it can be said that the
participants could reasonably recognize the distance to the
object. However, these results simultaneously showed that
smaller errors occurred at short distances and
larger errors at long distances.
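A regression line of this form can be recovered from paired measurements with ordinary least squares. A pure-Python sketch (the function name and the exactly-on-line sample data are illustrative, not the experimental measurements):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of ys = slope * xs + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Sanity check on points lying exactly on the reported line
# (reported distance) = 0.84 * (actual distance) + 4.5:
actual = [50, 60, 70, 80, 90, 100, 110, 120, 130, 140, 150]
reported = [0.84 * x + 4.5 for x in actual]
slope, intercept = fit_line(actual, reported)  # approximately (0.84, 4.5)
```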
As a reason for this slight tendency of perceiving a
smaller distance than the actual distance, we assumed that
the posture of the arm holding the CyARM caused this
phenomenon. For example, when the participants perceived
the object far from them, they had to extend
their arms. On the other hand, when they perceived the
object near to them, they had to bend their arms at the elbow.
This means that the CyARM was far from their body in the
former case, and near in the latter one. As already
mentioned above, the distance from the participants
to the object was measured from the participants’ toes.
Therefore, when the participants perceived the object at a
far distance, the reported distance should be shorter than
the actual one because the CyARM was away from their
body, and when they perceived the object at a near distance,
the reported one should be accurate because the CyARM
was very close to their body. Considering these participants’
postures, this tendency can be explained.
6. Discussion and Conclusions
In this paper, we focused on physical actions for
compensating the sensory organs, and we described a new
interactive device named CyARM.
CyARM provides users a unique and intuitive interface for
comprehending the space surrounding them by using body
actions, specifically that of extending or stretching their
arms. We described two implementations of CyARM: the
first prototype and the improved second prototype. We also
described the results of two experiments evaluating CyARM.
One evaluated using it to detect the existence of an object,
and the other evaluated using it to perceive the distance to
the object. From the results of the above two experiments, we
confirmed the following.
(i) The participants holding the CyARM could detect
whether the object was present or absent in front of
them.
(ii) The participants holding the CyARM could perceive
the distance from themselves to the object.
These results successfully showed that CyARM has the
ability to detect the presence of an object and to perceive the distance
to the object. It can thus be said that CyARM provides users
a kind of imaginary arm that extends to objects at a long
distance. Moreover, participants with just a few minutes
of training with the CyARM could perceive the object’s
presence and the distance to the object, which is
one of the strong advantages of this device.
As one of our follow-up studies, we are now planning
to conduct an experiment to investigate whether the user
could detect the shape of an object by means of the CyARM.
Because users can freely move and shake the CyARM,
they can recognize the different distances to the object
according to the positions or directions of the holding arm.
Therefore, it is assumed that the pattern of different distances
would lead users to perceive the shape of the object. We expect that
the CyARM might enable the user to identify the shape by
moving the device as if tracing the surface of the object.
Further detailed evaluation, especially in practical usage
situations, will be conducted and reported in future work.
We are also planning to make practical improvements in
CyARM, such as improving its accuracy and range of
distance measurement, developing a more compact body,
providing longer battery life, and enhancing its ease of use.
We are also planning experiments on recognizing the
shapes of objects using CyARM, and the results will be
reported in our future work.
[1] T. Yoshimoto, Here Is the Finger?, Hokkaido University Press,
Sapporo, Japan, 1979.
[2] E. T. Hall, The Hidden Dimension, Anchor Books, New York,
NY, USA, 1976.
[3] C. Carter and K. A. Ferrell, The Implementation of Sonicguide
with Visually Impaired Infants and School Children, Sensory
Aids Corporation, Bensenville, Ill, USA, 1980, D. W. Camp-
[4] J. Akita, T. Takagi, M. Okamoto, et al., “CyARM: environment
sensing device using non-visual modality,” in Proceedings of
the International Conference on Technology (CSUN ’04), Los
Angeles, Calif, USA, March 2004.
[5] M. Okamoto, J. Akita, K. Ito, T. Ono, and T. Takagi, “CyARM:
interactive device for environment recognition using a non-
visual modality,” in Proceedings of the International Conference
on Computers Helping People with Special Needs (ICCHP ’04),
pp. 462–467, Paris, France, July 2004.
[6] K. Ito, M. Okamoto, J. Akita, T. Ono, I. Gyobu, and T. Takagi,
“CyARM: an alternative aid device for blind persons,” in
Proceedings of the Conference on Human Factors in Computing
Systems (CHI ’05), p. 36, Portland, Ore, USA, April 2005.
[7] J. Akita, K. Ito, T. Komatsu, et al., “CyARM: direct perception
device by dynamic touch,” in Proceedings of the 13th
International Conference on Perception and Action, pp. 87–90,
Monterey, Calif, USA, July 2005.
[8] T. Iwamoto, M. Tatezono, T. Hoshi, and H. Shinoda,
“Airborne ultrasound tactile display,” in Proceedings of the 35th
International Conference on Computer Graphics and Interactive
Techniques (SIGGRAPH ’08), pp. 11–15, Los Angeles, Calif,
USA, August 2008.
[9] Y. Hirata and M. Sato, “3-dimensional interface device for
virtual work space,” in Proceedings of the IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS ’92), vol. 2,
pp. 889–896, Raleigh, NC, USA, July 1992.
[10] A. Yamamoto, T. Ishii, and T. Higuchi, “Electrostatic tactile
display for presenting surface roughness sensation,” in
Proceedings of the IEEE International Conference on Industrial
Technology (ICIT ’03), vol. 2, pp. 680–684, Maribor, Slovenia,
2003.