Playing on AREEF - Evaluation of an Underwater
Augmented Reality Game for Kids
Leif Oppermann, Lisa Blum,
Marius Shekow
Fraunhofer FIT
Schloss Birlinghoven, 53754 Sankt Augustin, Germany
leif.oppermann@fit.fraunhofer.de
ABSTRACT
This paper reports on a study of AREEF, a multi-player
Underwater Augmented Reality (UWAR) experience for
swimming pools. Using off-the-shelf components combined
with a custom-made waterproof case and an innovative
game concept, AREEF puts computer game technology to
use for recreational and educational purposes in and under
water. After an experience overview, we present evidence
gained from a user-centred design process, including a pilot
study with 3 kids and a final evaluation with 36 kids. Our
discussion covers technical findings regarding marker
placement, tracking, and device handling, as well as design
related issues like virtual object placement and the need for
extremely obvious user interaction and feedback when
staging a mobile underwater experience.
Author Keywords
Mobile; underwater; augmented reality; virtual
environments; exertion; games
ACM Classification Keywords
H.5.2 Information Interfaces and Presentation: User Interfaces - User-Centered Design; Contextual software domains: Virtual worlds software - Interactive games
INTRODUCTION
Underwater worlds with their colourful fishes and corals have fascinated people throughout the ages, and interest in activities like diving and snorkelling is still increasing. Nevertheless, traditional diving and snorkelling activities can be dangerous and are not affordable for everyone. Risk-free and comfortable ways to admire underwater flora and fauna are offered by big aquariums, often equipped with huge underwater tunnels and viewing opportunities, as well as by virtual aquariums that imitate the underwater world in digital form. However, both alternatives reduce the user's perception mostly to the visual sense, neglecting locomotion and the simple experience of immersing oneself in the medium of water.
Over the past two decades, the civil use of Augmented
Reality (AR) has been studied for and applied to a
multitude of application-scenarios under dry conditions [1]
[2] [3]. This naturally also included games, ranging from
stationary webcam-based games like Wonderbook [4] or
Tankwar [5] (with TV and HMD views respectively), to
more complex setups that moved away from the desk or
TV, and at the same time also combined several sensing
technologies for playing in prepared indoor [6] as well as in
unprepared outdoor settings [7]. Furthermore, there is a lot
of literature on professional and military work underwater,
e.g. for submarines and other underwater vehicles, as well
as for occupational, military, and even astronaut divers [8].
While this predates work in our field by a few decades and
is generally interesting, it has little effect on designing
digital mobile experiences for recreational purposes.
In the medium of water, many technologies like Bluetooth or Wi-Fi fail. Developing an Augmented Reality system with entertaining applications for use in a swimming pool therefore requires a careful analysis of the chosen hardware with respect to the specific characteristics of water. The system has to be waterproof and rugged enough to withstand the water pressure. Many different approaches have been developed in recent years, but not all tracking technologies used in Augmented Reality systems on shore are suitable for underwater use. It is also necessary to explore adequate interaction techniques for underwater use.
In our project, we emphasized this through a holistic
experience design process that involved end-users, i.e.
children, as well as industry experts in the form of swimming pool professionals and their technology providers. We wanted to
know how to build comprehensive and untethered
underwater interaction during locomotion using off-the-
shelf devices. With little prior work available in this domain
at all, we opted for building a game for children. Games
provide a social frame and are very useful for studying interaction with new technology, especially when they are multi-player games, as their competitive character is known to
limit awkwardness [9]. Our system was previously
described in [10] and we are now presenting its evaluation.
RELATED WORK
AR has been applied to and studied in a number of
educational scenarios, typically for dedicated museum
installations, e.g. designed for visiting school classes, or
learners with special needs [11], and more recently also as a “to-go” kit in a suitcase that can be brought into class.
The first wearable underwater computer, the so-called
WetPC from 1993, was built to increase the efficiency of
the collection of biological data on the seafloor. The system
consisted of a monochrome display mounted in front of a
diving mask and a computer worn in a waterproof case on
the diver's back. The KordPad, an input device with five
keys, was used to control the system. While the handling was good, the information shown by the WetPC did not depend on the user's position or orientation; no Augmented Reality technologies were used.
The first digital underwater device for entertainment use was the Tryton, a.k.a. Dolphyn [12], developed by the French company VirtualDive. The Tryton is based on a height-adjustable monitor that is mounted inside a swimming pool and connected to a computer outside the pool. The interactive, audio-visual content of the Tryton is controlled with two joysticks attached to the sides of the monitor. The Tryton combines a haptic experience with digital information, but the lack of user mobility restricts its application spectrum.
The first approaches to underwater Augmented Reality concentrated on technologies and mechanisms for improving the navigation of underwater remotely operated vehicles (ROVs) [13]. Due to low illumination and small particles in the water, also known as marine snow, visibility conditions underwater can be poor. Fusiello et al. [14] therefore used
visual as well as acoustical sensors to detect objects like
tubes or simple structures in the water. They compared the
data with information from a database of synthetic 3D
objects to determine position and orientation. Virtual 3D
models were then blended over the real camera images to
support the operator of an ROV.
While experience with Augmented Reality underwater is still relatively limited, aspects like tracking position and orientation have been investigated in connection with underwater vehicles and robots. Most of these approaches are intended for use in marine environments and are therefore often focused on approximate, long-range tracking. Nevertheless, there have also been tracking attempts in structured environments like swimming pools: Simoncelli et al. used two sonars to calculate the distance to the walls and positioned a cleaning robot in a swimming pool [15]. Eskinja et al. [16] also used acoustical tracking by sonar to position an underwater vehicle in a round pool. They got good results as long as the vehicle was not moving. Both approaches suffer from signal reflections off the walls. Besides these acoustical approaches, optical tracking was proposed by Carreras et al. [17] to determine the position, orientation, and velocity of an autonomous underwater vehicle. They equipped their test swimming pool with artificial markers consisting of grey and black dots on a white background, which were then recognized by a camera on the underwater vehicle. They achieved good results in a clean pool with 1.3 m water depth and indirect sunlight.
Back on the application layer, Blum et al. untethered the entertainment potential of water-based Augmented Reality from the wires that were still present in the Tryton. Their seminal work made it possible to see virtual corals and fish even when physically underwater and in motion, using a mobile prototype that applied Augmented Reality techniques to enhance a regular swimming pool with virtual objects for playful interaction [18]. Other work in this domain includes the artistically inspired Gravity Well, which highlights the unique characteristics of bodily interaction in the water for digital play [19], and a recent overview paper about designing digital play experiences with water contact [20].
EXPERIENCE OVERVIEW
Although the aforementioned prototype by Blum worked
fine, it was far from being usable in everyday scenarios, as
it was quite bulky and fragile with its backpack (hosting the
computer) and mask components, which were connected by a hose. It thus generally required very careful handling. Moreover, the mask had an expensive military-grade see-through display attached to it, and everything had to be waterproofed in many different places.
Figure 1. The base station at the pool side and several
markers spread in the pool which show different content
on the tablet when the children approach.
The AREEF system presented in this paper builds on the initial application idea, but instead uses comparatively cheap mass-market tablets. It relies on the magic-lens metaphor [21], meaning that the children playing the game
use the display as a kind of window into an augmented
world that lets them see virtual fishes and corals while they
hold the tablets like little swim-boards. For playing
AREEF, each child is equipped with a tablet that is hosted
in a custom-made waterproof case. In addition to these personal devices, a base station is placed at the edge of the swimming pool basin, consisting of a marker and an (also waterproof) tablet computer attached to a larger display.
Several visual markers, on which the actual virtual island
sceneries are augmented by means of Vuforia feature
tracking, are sunk in the water and distributed in the pool in
a triangular setup, keeping a few meters distance between
them. The base station marker is augmented by a virtual
character that provides guidance for the overall game and
leads the children through the game scenario, instructing
them what to do. Figure 1 provides an overview of the
system setup.
The multi-player game design is based on a “find and
deliver” mechanism. This means that a task which involves
finding a specific item is presented to the children on the
screen of the base station. This simple mechanism allows
for questions like finding a specific fish that is shown to the
child as well as questions that require more knowledge
about the underwater scenery. In order to score, the children spread out from the base station and search for the item they have been asked to find; once found, they bring it back to the base station, where the find is automatically recognized and points are scored for a correct delivery. Game-state
synchronization across devices is done via Wi-Fi. The use
of the base station comes with the advantage of a better Wi-
Fi connectivity when children are above the water surface.
While playing underwater, a Wi-Fi connection is not available, as the radio waves are attenuated too much for consumer devices to maintain the connection. However,
when children come back to the surface their tablets
automatically reconnect to the base station which is also out
of the water. Subsequently, the AREEF system
synchronizes the solitary game-states from the tablets with
the base station, making the children's scores visible on a
leader board shown on the large screen and giving them
new missions. The AREEF application embraces the
“wireless network transmission disconnection”-effect
without breaking the game-flow for the children by
weaving it into the game-design and providing continued
solitary fun for them when they are in network
disconnection mode underwater. Thus, it effectively hides a
wireless network disconnection seam in a seamful design in
accordance with Chalmers and Galani [22].
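The reconnection step can be illustrated with a minimal sketch. The message format, function and field names below are assumptions for illustration, not the actual AREEF protocol; the idea is simply that game events buffered while the tablet was offline underwater are flushed to the base station once the Wi-Fi connection is re-established.

```python
import json
import socket

def sync_with_base_station(server_ip: str, port: int, player: str,
                           pending_events: list) -> dict:
    """Send events buffered while the tablet was underwater and receive the
    updated leader board and the next mission from the base station."""
    payload = json.dumps({"player": player, "events": pending_events}).encode()
    with socket.create_connection((server_ip, port), timeout=2.0) as sock:
        sock.sendall(payload + b"\n")
        reply = sock.makefile().readline()   # single-line JSON reply
    pending_events.clear()                   # events are now persisted on the base station
    return json.loads(reply)                 # e.g. {"leaderboard": [...], "mission": ...}
```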
Figure 2. Digital game content on physical marker,
showing one of the islands (left); actual setup during the study with three markers in the water (right).
Selecting an item works by approaching the virtual scenery
and positioning the item of interest in the centre of the
screen, under a cross-hair. This then yields a graphical
progress bar that requires the child to keep the item focused
for some time to really select it. For animated items like fish this adds an extra challenge, as some might quickly swim away, making it difficult to keep them in focus. When a
fish is caught, the child swims or walks back to the base
station and points the device at the base station marker. This
indicates to the application that the child has successfully
delivered an item. In case this was the correct item, the
child will receive points and the leader board gets updated
accordingly. This whole “find and deliver” process was
repeated in various mini-games until a child completed their
game or the time ran out. The usual game time is about 10
minutes. Figure 3 shows two children in action.
Figure 3. Two children playing AREEF during the study.
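The dwell-based selection described above can be sketched as a small timer that resets whenever the item under the cross-hair changes; the two-second threshold is an assumption, as the paper only states that the item has to stay in focus "for some time".

```python
class DwellSelector:
    """Tracks how long the same item stays under the cross-hair and reports
    a selection once the progress bar would be full."""

    def __init__(self, dwell_seconds: float = 2.0):  # assumed threshold
        self.dwell = dwell_seconds
        self.focused = None
        self.elapsed = 0.0

    def update(self, item_under_crosshair, dt: float):
        """Call once per frame; returns (selected_item_or_None, progress 0..1)."""
        if item_under_crosshair != self.focused:
            # focus lost or switched to another item: reset the progress bar
            self.focused = item_under_crosshair
            self.elapsed = 0.0
            return None, 0.0
        if self.focused is None:
            return None, 0.0
        self.elapsed += dt
        progress = min(self.elapsed / self.dwell, 1.0)
        if progress >= 1.0:
            selected, self.focused, self.elapsed = self.focused, None, 0.0
            return selected, 1.0
        return None, progress
```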
Hardware and location setup
The AREEF system consists of a base station with tablet
and marker, as well as three island markers in the
swimming pool. The base station marker is positioned
upright with the tablet next to it, running the server software
and showing the scoreboard. Due to safety regulations it is
not possible to set up a separate, large screen (danger of
electrocution). For the Wi-Fi connection we use a small
battery-powered Wi-Fi router inside an Aquapac waterproof
plastic case, also located at the base station.
The AREEF application runs on Google Nexus 10 tablets
with Android 4.3. No off-the-shelf waterproof tablets were available at the time, so the devices are enclosed in a waterproof case (see Fig. 3) designed by one of our
partners. During several design iterations, the material
changed from metal to mostly plastic, opening mechanisms
were refined, a conductive layer was added to support
underwater touch (not used in the evaluated system), and
different charging mechanisms were evaluated. Together,
device and case weigh about 2 kg.
The underwater markers for the pool need to be waterproof
and heavy enough to sink to the ground. Thus, the marker
images are printed on Alu-Dibond panels in a waterproof and non-
reflective way. As the edges are quite sharp, they are
wrapped with plastic edge protection. We chose a size of
105x70 cm for the underwater markers and 60x40 cm for
the base station marker. The base station marker is smaller because the children could otherwise solve the tasks from far away, which is not intended.
The water depth in non-swimmer pools usually varies from
90 cm to 135 cm. Using the same markers for all depths can
therefore result in not being able to see the entire virtual island in very shallow water, while in deep water it might be difficult to get close enough to see the virtual scenery in
enough detail. For our main user tests we had a water depth
of up to 120 cm. For this depth the constructed markers
worked fine. Nevertheless, it would be nice to have a set of
differently sized markers to adequately fit each water depth.
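A rough geometric estimate illustrates why a single marker size cannot fit all depths: the apparent angular width of a flat marker grows quickly as the viewing distance shrinks, so to keep the island comparably framed the marker width would have to scale roughly linearly with the water depth. The numbers below are illustrative assumptions, not measurements from the study.

```python
import math

def apparent_width_deg(marker_width_m: float, viewing_distance_m: float) -> float:
    """Angular width under which a flat marker appears from a given distance."""
    return math.degrees(2 * math.atan(marker_width_m / (2 * viewing_distance_m)))

# The 1.05 m wide island marker viewed from just below the surface:
print(apparent_width_deg(1.05, 1.10))  # ~51 degrees at ~1.10 m distance (deeper water)
print(apparent_width_deg(1.05, 0.80))  # ~67 degrees at ~0.80 m (shallow water; marker may overflow the view)
```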
Beyond this finding, it was good to have flexibly-
positionable markers. For groups with small children we
moved the markers to more shallow water, so that they were
able to safely stand at all game locations. Another obstacle that we were not aware of before the tests was water park installations such as water fountains and whirlpool nozzles that produce a current that can easily move the markers or disturb the children, making it more difficult for them to stay in one place. Here, it was also very useful to be able to
slightly adjust the positions within the pool.
Software setup
The virtual scenery consists of four computer-generated
islands. One of them is positioned at the base station and
the others are located underwater. The AREEF game uses
the Unity3d game engine for rendering and the Qualcomm
Augmented Reality SDK “Vuforia” in order to calculate the
position and orientation of the device. The client application
used by the children offers a single-player mode for
demonstration purposes and a multi-player mode that was
used during the evaluation.
In multi-player mode, the child has to be connected to the base station device that holds the high-score lists. Therefore,
the game app contains an option menu where the IP address
of the server and the port can be set. This was always set up
before starting the game, so that the children only had to
enter their name and were able to start with the game in
multi-player mode. Furthermore, the start of each game session creates a log file in a time-stamped key-value format that allows us to analyse the speed of the children and other aspects in more detail. The game is available in German, English, and Korean and automatically switches languages when the system language of the Android tablet is changed.
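A minimal sketch of such a time-stamped key-value logger is shown below; the event and field names are illustrative assumptions and not the exact AREEF log schema.

```python
import time

def log_event(logfile, event: str, **fields):
    """Append one time-stamped key-value line to an open log file, e.g.
    '12:03:41 event=fish_caught fish=blue marker=island_2'."""
    stamp = time.strftime("%H:%M:%S", time.localtime())
    line = f"{stamp} event={event}"
    extras = " ".join(f"{key}={value}" for key, value in fields.items())
    if extras:
        line += " " + extras
    logfile.write(line + "\n")

# Usage (hypothetical file name):
# with open("session_child.log", "a") as f:
#     log_event(f, "catch_started", fish="silver", marker="island_1")
```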
Game Challenges
The game challenges are all introduced by marine
characters that talk to the child and give instructions. The
main character is a Dugong, which first asks the child to find the Orca (see Fig. 4, left). When the child finds the Orca and points at it with the cross-hair in the centre of the screen, the Orca starts explaining that three fish got lost on other islands. These fish can be caught by keeping them under the cross-hair for a specific time. The elapsed time is symbolized by a bar that fills up over time. When the bar is full, the fish is caught and appears in a fishing net in the lower right corner. After catching all three fish, the child is asked to find the Lobster character.
The Lobster introduces the “cleaning trash” challenge. The
task is to help tidy up the neighbouring island. When the child reaches the island, trash items such as an old television, cans, and an old car tire fall down onto the island and pollute it. At the same time, a brush appears in the centre of the screen. When the child gets close enough to the island to sweep the trash items away (implemented by an invisible collision plane placed above the island), the brush starts moving with a slight sweeping animation. The child can perceive a collision with a trash item both visually and haptically, as a vibration is triggered when the brush hits a trash item. Underwater, this makes it more evident to the child whether a trash item has been swept. The lower right corner of the
screen features a trash counter indicating how many trash
items have been cleaned already.
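The sweeping interaction can be sketched as a simple collider check against the invisible plane; the names, plane height and brush radius below are assumptions for illustration, not the values used in the actual Unity implementation.

```python
import math

PLANE_HEIGHT_M = 0.5   # assumed height of the invisible collision plane above the marker
BRUSH_RADIUS_M = 0.15  # assumed radius of the brush collider

def sweep_update(brush_xy, trash_items, vibrate, swept_count):
    """brush_xy is the point (in marker coordinates) where the screen-centre ray
    meets the collision plane, or None when the tablet is held below the plane
    (the brush is then 'under the island' and cannot sweep anything)."""
    if brush_xy is None:
        return swept_count
    for item in trash_items:
        if not item["swept"] and math.dist(brush_xy, item["xy"]) < BRUSH_RADIUS_M:
            item["swept"] = True     # trash item pushed off the island
            vibrate(100)             # 100 ms haptic pulse signals the hit
            swept_count += 1
    return swept_count
```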
After successfully tidying up the island, the child has to find the Turtle character to receive a treasure chest as a reward for the work done. In order to open up the treasure chest, the
child has to throw wooden trunks at it. This is complicated
by the periodic movement of the chest to the left and right
side. The child can follow that movement by balancing the
tablet device accordingly, using the car steering wheel
metaphor. In addition, the child has to pay attention to the progress bar at the bottom, which indicates when the next trunk is thrown (see Fig. 4, right). After opening the treasure chest,
the child has to return to the Dugong and point at it. This
marks the end of the game and the final screen with the
expired game time is shown. The actions of the child are
also reflected on the scoreboard.
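The steering-wheel control for aiming the trunks can be sketched as a simple mapping from device roll to a horizontal offset; the tilt range and hit tolerance below are illustrative assumptions.

```python
MAX_ROLL_DEG = 30.0  # assumed roll range mapped onto the full aiming width

def aim_offset(roll_deg: float, half_width: float) -> float:
    """Map device roll (car steering wheel metaphor) onto a horizontal offset
    within [-half_width, +half_width]."""
    clamped = max(-MAX_ROLL_DEG, min(MAX_ROLL_DEG, roll_deg))
    return (clamped / MAX_ROLL_DEG) * half_width

def trunk_hits_chest(aim_x: float, chest_x: float, tolerance: float = 0.2) -> bool:
    """A trunk thrown when the progress bar fills hits if the aim is close enough."""
    return abs(aim_x - chest_x) <= tolerance
```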
Figure 4. AREEF tablet running the application
underwater (left), aiming at the treasure chest (right).
USER EVALUATION
In order to test and fine-tune our game, as well as the
overall feasibility of our study plan, we conducted a pilot
study with three kids in Darmstadt, Germany on 30th
September 2013. While this led only to minor refinements,
like making the “cleaning trash” challenge easier by
reducing the number of trash items and raising the height of
the invisible collision plane, it helped us to establish a routine for the evaluation procedure before the main evaluation run.
The main user evaluation then took place in the Oktopus
swimming pool in Siegburg, Germany from 22nd to 31st October 2013. On four evaluation days, 36 children between 7 and 12 years of age tested the AREEF system and expressed their opinions about it. On each evaluation day, we performed three test sessions with three children each. Data collection was done via participant observation, event logging on the devices, and questionnaires filled in directly after the session (see Fig. 5). The logs and observations allow us to identify the potential bias of children giving overly positive feedback about a new toy and thus improve the quality of the results.
Figure 5. Mixed data collection approach
A session took about one hour and started with a welcome and a brief explanation of the AREEF system. Afterwards, each child got a device and was able to play the game for about 10-30 minutes, depending on the speed of the child. During gameplay, two researchers observed the children: one from outside the basin and one from inside (see Fig. 6 centre, right). Both observers had waterproof pocket notebooks to note interesting findings and observations. One additional researcher was positioned in the pool to answer questions and explain the game. Furthermore, the children were partially filmed and photographed by another researcher for documentation purposes (Fig. 6 left) when the parents had given their consent.
Figure 6. Participant observation, taking notes and video.
After all children had finished the game, we guided them through a questionnaire in an interview fashion. The questionnaire contains open-answer questions and questions with two different Likert scales. For simplicity, we added smileys (green happy smiley, grey neutral smiley, and red unhappy smiley) to make the scales clearer for kids. The
interviewing researcher also helped the children to
understand the questions. At the end of the testing session
the children got a small present as a reward.
Overview of the participants
The AREEF user evaluation was performed with 36
children (20 male, 16 female) between 7 and 12 years old (avg. 10), who were recruited via public bulletins, a press release, and an internal newsletter to Fraunhofer colleagues.
The ground of the Oktopus non-swimmer pool was sloped,
its depth varying from knee-deep water up to 1.20 m at its
deepest. The average body height of the children was
1.45 m with the smallest child being 1.22 m and the tallest
being 1.62 m. For groups with smaller children we moved
the markers into more shallow water. The children were all
able to swim and most of them had already several
swimming badges (e.g. “sea horse”, “bronze”, “silver”,
“gold”) which are typical to express swimming and diving
skills in Germany. The children's previous experience with smartphones and tablet devices varied greatly. Some already had their own smartphone, while others had no experience at all. Nevertheless, most of the children
(29) had not seen anything like the AREEF system before.
General impression
The overall idea of having virtual fish in a swimming pool was rated very positively (avg. 4.83, sd. 0.37 on a Likert scale with 1 = “bad idea” and 5 = “great idea”). The AREEF system implementation was also perceived very positively (avg. 4.64, sd. 0.54 on the same Likert scale).
In the next questions, the children were asked which specific aspects they liked or disliked about the AREEF system and what ideas they had for future versions. In particular, 11
children were fascinated by the Augmented Reality
allowing them to see virtual scenery which is not there in
reality. Seven children stated that they liked the virtual
scenery and the virtual fishes / Orca respectively and partly
highlighted the animation. Six children enjoyed the “catching fish” challenge a lot. For three children
“everything” was great. Further answers that were given
more than once were the idea in general (3 mentions),
finding animals (2), the exciting game (2), realism in terms
of believable animation (2), and the open treasure chest
challenge (2).
Regarding disliked aspects, four children mentioned problems with the visibility of the virtual scenery when looking from far away, very close, or from a very lateral angle. Four children had difficulties with the “catching fish” challenge, often because they overlooked GUI elements or had problems holding the device still for a longer time. Four children were annoyed by having to surface again to breathe and would have liked to use a snorkel. For the evaluation we did not provide snorkels for sanitary reasons. The “cleaning trash” mission was described as too difficult by three children, as here the trash got stuck in the marine landscape of the island and was more difficult to sweep away. Another problem that was mentioned by three children was caused by a water fountain that the pool attendant did not want to stop for our evaluation. The water fountain pushed the markers along the ground to other positions and made it more difficult for lightweight children to hold their position and focus on an item in the pool. Further issues that at
least two children mentioned were that the tablet was too
heavy (3), holding a stable position (2), and not enough
virtual scenery / no augmentation between markers (2).
For a future version, eight children proposed more virtual
scenery and four would even like to have the virtual scenery
everywhere, such that the whole ground of the pool would
be entirely covered by virtual content. Eight children would
like to solve more tasks or have more levels and five asked
for a longer playtime. Four children would prefer a smaller
and more light-weight device and two children mentioned
that it would be nice to have two handles on the case. The
According to four children of our study, the breathing issue should be solved with snorkels. Six children also had ideas for other tasks, e.g. fighting sharks or hiding from obstacles. According to three children, the AREEF system could also be opened up to other themes such as space or Harry Potter. Other suggestions that were mentioned by at least two children were improved tracking from near and far distances (2), shipwrecks (3), and more fish to catch (2).
The difficulty of the overall game was perceived as
appropriate by the majority (23) of children (avg. -0.17, sd.
0.65 on a Likert scale from -2 = “too easy” to 2 = “too
difficult”). Overall, it seems that the game had a good level
of difficulty for the chosen age group.
Challenge-specific problems
Challenge                  Difficulty (avg / sd)
Finding next character     -0.22 / 0.75
Catching fish               0.08 / 0.79
Cleaning trash              0.39 / 0.68
Open treasure chest        -0.14 / 1.08
Table 1. Average rating / standard deviation of the perceived difficulty of each challenge (Likert scale from -2 = “too easy” to 2 = “too difficult”). The clarity of the instructions was rated on a separate Likert scale (1 = “difficult”, 5 = “easy” to understand); those values are reported in the text.
Finding the next character
Several marine characters lead the children through the AREEF game and present new instructions. In general, the instructions for finding them seemed easy to understand (4.81 of 5, see Table 1). Nevertheless, we noticed a problem with the main character of the game, the Dugong, which is a kind of sea cow. This animal is not well known in Germany and most of the children in our evaluation did not know it. Therefore, they were unsure where to go when the instruction was to “Find the Dugong”. While many children learnt about a new animal, it might have been better to choose a more familiar marine animal. The difficulty of finding the characters was judged as adequate (0) by 23 children (-0.22 on average, see Table 1). In the observation we noticed that many children had problems finding the Orca, which was the first animal after the introductory Dugong character. The Orca swims very close to the water surface (i.e., at a large distance from the marker plane) and is therefore only visible when looking at the marker from the side. Often the children looked at the marker from above or from too close a distance, unable to see the Orca. To prevent frustration, the researchers helped with hints. Sometimes the children also
asked their friends for advice.
The “catching fish” challenge instructions were rated only fair (3) by most (17) children (avg. 3.78), indicating that more detailed instructions are needed. During the observation we noticed that several children overlooked the cross-hair in the centre of the screen. In a future version this GUI element should be more eye-catching. Another problem was that most children intuitively tried to touch the device to catch the fish, as they are familiar with touch input from "normal" smartphone use. Even though they were told at the beginning that they would not be able to solve the tasks by touching, they mostly tried it nonetheless. One possible explanation is that the progress bar that appears while a fish is focused was not obvious enough. The difficulty of the challenge was rated as adequate by the majority (16) of children. The main difficulties in this scenario were a) finding the correct fish, as some looked very similar, and b) keeping the fish in focus for a longer time. Focusing was especially difficult for lighter children, as they could not hold their position easily. In some evaluation sessions the
water fountain also made it more difficult for them to
complete this challenge.
The instructions for the “Cleaning trash” challenge were
rated least understandable of all challenges. That problem
was also explicitly mentioned by four children during the
interviews. We frequently observed that children held the
device very close to the marker, below the collision plane,
causing the brush to disappear below the island. This
problem was also verbally expressed in the interviews by
two children. Future versions of the game should also
present the success for each cleaned item more visibly, as
the evaluated version merely shows a counter with items
left to clean in the lower right corner. The challenge’s
difficulty was rated highest among all challenges. As
already stated, the main problem of the challenge was to
keep the correct distance to the marker, in order to be able
to sweep the trash from the island. If the child swims too
close to it, the brush is hidden below the island and does not
sweep away the trash items. Children sometimes said “I
think I lost the brush?!” or “The brush is under the island!”,
indicating the problem with the distance. Two children also
reported this problem in the interview session afterwards.
Another problem was that the trash sometimes got stuck in the surrounding environment, so the children had to sweep from a different direction, as also stated in the interview by two children. As most of them were not familiar with Augmented Reality in general, they did not always think of swimming around the islands to try from another direction.
The “open treasure chest” challenge’s instructions were rated better than those of the other two challenges, indicating a rather good understanding (see Table 1). In the observations, however, we noticed that it was not immediately clear to all children what to do. They tried different things such as touching or rotating the device. The difficulty of the challenge was rated adequate (0) by the majority (17) of children and was perceived as lower than the difficulty of the other tasks, yet with the largest spread in the data (avg. -0.14, sd. 1.08, the highest of all challenges). Occasionally the children were lucky, holding the tablet correctly right away and hitting the treasure chest with the wooden trunk on the very first attempt. Two children reported in the interview that they were surprised by the treasure chest opening unexpectedly. This could have led to the opinion that the challenge was too easy.
Favourite challenge
The favourite quest of the vast majority (26) was the
“finding fish” challenge. In the observation we noticed that
finding and catching a fish often led to feelings of success.
This was often indicated by children happily stating “Oh, I
got one”. Especially when the fish was a bit hidden or when
the catch was impeded by the water fountain the children
got a stronger feeling of achievement when they finally
caught it. 12 children stated that they enjoyed searching and
six liked exploring the virtual scenery. Six children liked
this challenge because it was demanding for them. Five
children also stated that they liked having a repeating mini-task. Two children enjoyed the catching process particularly
because of the aiming mechanics. The log file analysis
confirmed that these two were able to catch fish in one or
two attempts, whereas other children required up to 70.
Challenge-independent problems
A major challenge-independent problem reported by four
children was the water fountain. It caused markers and the
children to drift away. It was only enabled during some test
sessions and could not be turned off because this would
have affected regular guests located in the adjacent parts of
the pool. The pool area was not big enough to place the
markers completely out of the water fountain's range. In a
future setup concept this should be taken into account.
Additionally, two children reported general tracking
problems, having experienced frequently (re-)appearing
islands. Two children stated that some items were too well hidden and another two claimed that the play area was too small and that they were disturbed by other children.
In the observations we noticed that the end of the game was
too unspectacular (“Have I finished now?”). The final
screen showed fireworks, a jumping Dugong, and the elapsed game time. In a future version there could be further texts and animations indicating that the children have completed the game, and some more visual effects to make the end sequence of the game more obvious and rewarding.
Technical and physical aspects
The technical aspects of the AREEF system were also
judged as rather positive by the children of our study. The
weight of the case (2 kg) was seen as appropriate by 16 children, while 16 found it slightly too heavy. Two children found it far too heavy, while two other children found it too lightweight. One child said this because he wanted the device to help keep him underwater. In the water the device is not actually heavy, but it does not float on the water surface. Therefore, it is not possible for the children to, e.g., adjust their diving masks with both hands, as one hand has to hold the device. A future version of the case could add thin plastic plates with handles at the sides, which would only marginally increase the case's weight but improve its floatability and handling. While 17 children rated the handling as good, two children explicitly asked for two grips to better hold the device.
The children also liked other technical aspects. 29 children stated that the content was clearly visible on the tablet; four found it at least okay. No child complained about the wire grid between the display and the foil of the waterproof case. Regarding the distance to the virtual items, 32 children were able to get close enough to see the fish in full detail. Some children stated that they sometimes even got too close. This could be improved by shrinking the size of the waterproof markers. Ideally there should be a set of differently sized markers to fit different water depths. While the markers used were fine in the deeper water, we sometimes had to move them to more shallow water for smaller children. This resulted in a shorter viewing distance and therefore larger-appearing virtual scenery.
Good tracking is the most important prerequisite for good Augmented Reality experiences. By asking quantitative questions in the questionnaire, we determined that sudden jumps of the virtual scenery occurred often (1), sometimes
(4) and never (31). A disappearing virtual island was rated
as a “seldom” problem by 19 children and “sometimes”
occurred in the game experiences of 5 children. In general
the tracking seemed to have worked rather well, also
confirmed by most of the log files indicating few erratic
tracking losses. Tracking problems caused by occlusion
happened seldom for 14 and sometimes for 7 children,
leaving 15 children unaffected. From the observation we
had the impression that real disturbances happened seldom.
Device log analysis
To improve the results of the user evaluation, the game was
extended with extensive game-event logging functionality.
A textual log file was created each time a child registered
with the base station. It contained the child's name, the date and time, and a large list of time-stamped events describing activities such as finding or losing markers, finding a character (Dugong, Orca, etc.), catching or losing a fish, and many more task-specific events.
The logs provided helpful evidence for statements made by
the children in the interviews, by looking for correlations.
We found that those two children who stated that catching
fish was fun because of aiming were those children who
had the lowest number of unsuccessful trials for catching
fish in their group. The logs indicate that these two were good aimers, which is consistent with their statement that catching fish was fun for them.
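Statements like this can be checked against the logs with a few lines of parsing; the event names below match the earlier logging sketch and are assumptions, not the original key-value vocabulary.

```python
def catch_stats(log_lines):
    """Count catch attempts vs. successful catches in one child's log file."""
    attempts = sum(1 for line in log_lines if "event=catch_started" in line)
    caught = sum(1 for line in log_lines if "event=fish_caught" in line)
    return attempts, caught

# Usage (hypothetical file name):
# with open("session_child.log") as f:
#     attempts, caught = catch_stats(f)
```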
Figure 7. Legend of observation log visualizations.
Interpreting the log files proved to be a valuable tool for
analysing game sessions. Each evaluation session was done
with three children who all completed the tasks at their own
pace, each of them having different kinds of difficulties.
Visualizations of two evaluation sessions are discussed
below. Figure 7 shows a legend that applies to both
visualizations. Apart from the different events it explains
the colour of missions (challenges) and markers, as well as
the markers used in each mission. For example, in mission two, only markers 1 and 2 needed to be visited, as marker 3 was not helpful in completing the mission.
Evaluation session Paul 1, Paul 2 & Kenan
A visualization of the session (see Fig. 8 and 9) shows the three children, who entered their names as Paul 1, Paul 2 and Kenan. For each child there is a group of bars or icons, and at the bottom of the figure the elapsed game time is plotted. Whenever a mission is finished (small yellow bar within the mission bars), the children have to see the Dugong first before the next mission begins and the bar changes its colour. The marker bars visualize the times during which a marker was continuously tracked.
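A visualization of this kind can be reproduced with standard plotting tools; the sketch below draws one horizontal bar per continuous tracking interval, similar in spirit to Figures 8 and 9, with an invented input structure as an assumption.

```python
import matplotlib.pyplot as plt

def plot_tracking_bars(intervals_per_marker):
    """intervals_per_marker maps a marker name to a list of (start_s, end_s)
    tuples describing phases of continuous tracking."""
    fig, ax = plt.subplots()
    for row, (marker, intervals) in enumerate(intervals_per_marker.items()):
        ax.broken_barh([(start, end - start) for start, end in intervals],
                       (row - 0.4, 0.8))
        ax.text(-10, row, marker, ha="right", va="center")
    ax.set_xlabel("elapsed game time (s)")
    ax.set_yticks([])
    return fig

# Example with made-up intervals:
# plot_tracking_bars({"island 1": [(30, 75), (90, 140)], "island 2": [(160, 200)]})
```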
In Figure 8 we see that Paul 2 finished all tasks extremely quickly, actually being the fastest child in the whole evaluation. Although it took him longer than Paul 1 to complete the first mission, he managed to complete the second, trash-cleaning mission in a remarkably short time of about 2 minutes, leading to a total time of ~8:45 for all missions. The second mission, cleaning trash, generally requires the child to stay underwater, due to the short distance required between tablet and marker. The marker tracking visualization shows that Paul 2 only had to surface for air once to complete this task, cleaning 2-3 trash items per dive.
In contrast, Paul 1 finished the catching fish mission very
quickly but then cleaning the trash proved to be very
difficult for him. The marker tracking was lost often and he
even tried to find trash on other islands where there was
none. In the follow-up interview he explained that the instructions did not clearly indicate that the trash had to be pushed off the island and that it was very difficult for him to find the right distance between tablet and marker.
Analysing marker tracking gives interesting insights, not
just regarding tracking stability. Very long phases of
continuous marker tracking (almost 2 minutes long for
Paul 1 at ~13:30) indicate not only stable marker tracking but also that the child could not have been diving all the time, as he could not possibly hold his breath for 2 minutes.
Figure 8. Evaluation session of Paul 1, Paul 2 and Kenan.
Very narrow white-space gaps between solid bars identify short losses of tracking, e.g. due to interference from
other children or a bad viewing angle. Wider gaps identify
transit times, either between islands, or for repositioning
(looking at the island from another angle), or the time the
child asked others for help. The tracking also reveals that all
children understood that the treasure chest task does not
require tracking. Between finding and opening the treasure
chest no marker tracking is shown.
We also see that all three children caught the fish in the
same order. Interestingly, the silver fish was always caught
last, even though it was on the same island as the blue fish
which was always caught first. This might lead one to believe that the children had simply overlooked the silver fish, but there is contradicting evidence. Every child unsuccessfully tried to catch the silver fish while staying on the first island, i.e., before catching the second, orange fish. It was apparently so difficult to catch that they temporarily abandoned it in order to continue looking for fish on other islands.
A different group of children had considerably more
difficulties in the catching fish mission. Figure 9 shows this
session with children Fabian, Markus and Melanie. Markus
attempted to catch fish 73 times before catching his first,
blue fish. He actually spotted and attempted to catch all three fish, temporarily abandoning one fish to look for the others.
Fabian caught the last fish fastest, but did not start the
second mission until about 90 seconds later, looking at
other islands to catch more fish. He confirmed this
behaviour in the interview where he stated that he
considered the virtual worlds to be well done and that he
enjoyed exploring them. He also managed to continuously
track the corresponding island marker throughout the
complete trash cleaning session and explored the island for
another minute even after the last trash item was cleaned.
Apart from the visual analysis, the log files also allowed various features to be computed in Excel. Using colour coding, it was easy to identify the performance of the children relative to each other, which helped in identifying potentially interesting aspects in the data that warrant a further analysis of the qualitative material, e.g. the video and the questionnaire answers. While the data is not exhaustive enough for statistical significance testing, and we do not believe that such testing would add much to our case, we frequently used the large Excel table for guidance during the analysis of our user tests.
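A colour-coded feature table of this kind can equally be built outside of Excel; the sketch below uses pandas with placeholder child names and invented example values purely for illustration.

```python
import pandas as pd

features = pd.DataFrame({
    "child": ["child_A", "child_B", "child_C"],   # placeholder names
    "total_time_s": [520, 780, 910],              # invented example values
    "catch_attempts": [8, 21, 70],
}).set_index("child")

# Colour-code each column so outliers (fast finishers, many failed attempts) stand out.
styled = features.style.background_gradient(axis=0)
styled.to_html("session_features.html")           # open in a browser to inspect
```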
DISCUSSION
Early on in the project, we conducted three expert
interviews with swimming pool experts and industry
professionals as part of our requirement analysis. They
helped us to define the target age group and highlighted safety and robustness issues, e.g. the device must withstand heavy play and not break a tile when dropped, the design shall not encourage running around, and the water must not be polluted under any circumstances. Furthermore, chlorinated water is very aggressive, so the device has to be protected.
While all of this originally fed into our requirement analysis
a priori, we can now confirm it from practical experience.
Our AREEF user study consisted of 36 children in the main study and another three in the pilot. The main evaluation was carried out smoothly and without problems, although one tablet computer suffered water damage due to human error in the very first play session. The number of participating children is comparatively large and shows that good effort was made to collect a comprehensive amount of data, comprising qualitative and quantitative data from the questionnaire, qualitative data from the observations, and quantitative data from the device log files.
Figure 9. Evaluation session of Fabian, Markus & Melanie.
On the technical side, AR tracking worked well, even though Vuforia is not explicitly designed for underwater use, where the camera image is affected by effects such as a blue tint and distortion. Loss of tracking was primarily caused by extreme viewing angles, a too large distance, or occlusion by other children. Using even
larger markers could improve performance, given that the
actually augmented area, the island in this case, keeps its
size, i.e. is smaller than the whole marker. The physical
diameter of an island of 60-70 cm turned out to be a good
fit for this evaluation.
Having relocatable markers turned out to be beneficial, as
the setup needs to be adjustable for the individual heights of
the children and to deal with the current caused by
fountains or whirlpool nozzles, which cause markers (and
nearby children) to drift. To get reliable marker tracking
and a comfortable experience for children, future versions
of our system could feature a set of all markers with
different sizes each. This way, small markers can be placed
in shallow water for small children, while large markers are
placed in deeper water for larger children. In the latter case
the physical distance between tablet and marker is larger,
requiring a large marker to allow for better tracking
performance and visibility.
Regarding case and device handling, we learnt that tablets need a case that allows them to float on the water surface while still making it possible to dive with them. Otherwise, children cannot quickly put the device aside, e.g. to fix their diving masks. This could be achieved by adding thin plastic plates with handles at the sides of the case, which would only marginally increase the case's weight but improve its floatability and handling. Finally, the case-device combination should be childproof, making it difficult if not impossible for the children to break the device accidentally. At the same time, staff
members need to be able to charge the device easily, for
which wireless charging (although slower) provides the
easiest solution in an environment where open cables with
electric current are forbidden. After the evaluation, our researchers tested an off-the-shelf waterproof tablet (Sony Xperia Tablet Z). While the price point is clearly an
advantage, handling the thin device underwater is difficult,
children might accidentally open the caps, and touch did not
work underwater. Even worse, once surfacing with the
tablet, random touch events would be registered by the
device, probably due to water drops trickling down the
touch screen’s surface. Embedding an already waterproof
device in a safe case thus appears to be a viable solution for
future versions.
One interesting anecdote came from a father whose son usually panicked when his head went underwater, as he was then not able to control his breathing properly. While his son was playing, the father observed him and told us that his son seemed to have suddenly forgotten his water anxiety because he wanted to play the game so badly. The father was very pleased that his son was now holding his breath nicely while underwater without any further problems. We did not plan the project for this, but the observation poses an interesting question for future research in this domain: can AR experiences in the water, much like VR on dry land [23], be used to overcome anxiety?
Feedback from children regarding the idea and the
hardware & software implementation was overwhelmingly
positive. The overall game difficulty as well as the
difficulty of the specific challenges was, on average,
considered to be just right. Yet, it was possible to identify
issues both in interaction design and in the way instructions
were presented on the screen.
In the first challenge, “catching fish”, the children have to
find and catch three fish, depicted in a 2-d image,
distributed on the virtual islands. Catching is done by
continuously aiming at them for a specific time with an on-
screen cross-hair. The children rated this kind of find-and-catch challenge as their favourite, as the instructions were clear
and the game showed immediate visual feedback after each
catch. Assigning items a static position in the scenery
(instead of having them move) turned out to be a good
decision, as many children needed several attempts to catch
them, showing that pointing at items underwater is much
more challenging than in a dry scenario.
In the second challenge, “cleaning trash”, the idea is to use the tablet as a collider (represented by a brush) that moves virtual objects in the scenery. We learnt that the collision
plane needs to have a sufficiently large distance to the
marker (which is on ground level of the basin) for the
interaction to work, and that children manage this process
best while standing, which complicates finding sweet spot
values, as now the child’s individual height and the depth of
the water play a significant role.
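A back-of-the-envelope calculation illustrates the trade-off; all numbers below are assumptions for illustration, not measurements from the study.

```python
water_depth_m = 1.20   # pool depth at the marker
tablet_depth_m = 0.30  # assumed depth at which a standing child holds the tablet
tablet_above_marker_m = water_depth_m - tablet_depth_m  # 0.90 m

# Placing the invisible collision plane roughly halfway between marker and tablet
# keeps the brush usable across a range of child heights and holding depths.
collision_plane_m = tablet_above_marker_m / 2            # 0.45 m above the marker
print(f"collision plane at {collision_plane_m:.2f} m above the marker")
```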
We found that for both challenges, it’s vital to provide
extremely obvious feedback for each completed and
ongoing action, because the focus of the children is divided
in an underwater setting, compared to normal tablet use.
The lack of feedback caused interesting issues. For example, in the catching fish challenge, children attempted to use touch gestures while aiming, indicating that our use of a filling progress bar as feedback for the catching process was not sufficiently obvious (and could, e.g., be supported by a vibration). In the challenge where children open a treasure chest, we also learnt that switching from an AR game to a non-AR game is understood by the vast majority of children without further instructions, whereas, for the cleaning trash challenge, the lack of instructions explaining how to clean trash could explain why that challenge was rated as the least clear to understand by the children.
Aside from challenge-specific issues, the children made many interesting suggestions, such as more virtual scenery or more play-time in upcoming versions, and the majority expressed strong interest in commercialization, since they would like to be able to rent the AREEF system in the future.
ACKNOWLEDGMENTS
We would like to acknowledge the Korea Institute for
Advancement of Technology (KIAT) for funding AREEF
as their first project with a European lead organization
under their International Collaborative R&D Program.
Thanks to our project partners from EUMTECH, UD4M
and TRUBICON, and to the helping hands of our
colleagues and students at the Mixed and Augmented
Reality Solutions research group at Fraunhofer FIT.
Moreover, we are greatly indebted to the swimming pools,
organisations and individuals that supported our outings:
Oktopus in Siegburg, Germany (hosting the main study),
Bessunger Bad in Darmstadt, Germany (hosting the pilot
study), Vidamar Resort in Funchal, Portugal (hosting
during ACE conference demos), Victoria Leisure Centre in
Nottingham, UK (hosting during GameCity 9).
REFERENCES
[1] R. Azuma, ‘A Survey of Augmented Reality’,
Presence Teleoperators Virtual Environ., vol. 6, no. 4, pp. 355-385, 1997.
[2] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S.
Julier, and B. MacIntyre, ‘Recent Advances in
Augmented Reality’, IEEE Comput. Graph. Appl.,
vol. 21, no. 6, pp. 34-47, 2001.
[3] M. Billinghurst, A. Clark, and G. Lee, A Survey of
Augmented Reality. now publishers Inc, 2015.
[4] Wonderbook™: Book of Spells, game for PS3™.
[Online]. Available: http://us.playstation.com/games/
wonderbook-book-of-spells-ps3.html.
[Accessed: 25-May-2016].
[5] T. Nilsen and J. Looser, ‘Tankwar - Tabletop war
gaming in augmented reality’, presented at the
Second International Workshop on Gaming
Applications in Pervasive Computing Environments
(PerGames), Munich, 2005.
[6] H. Tamura, ‘Real-time interaction in mixed reality
space: Entertaining real and virtual worlds’,
presented at the Imagina, 2000.
[7] L. Blum, R. Wetzel, R. McCall, L. Oppermann, and
W. Broll, ‘The final TimeWarp: Using Form and
Content to Support Player Experience and Presence
when Designing Location-Aware Mobile Augmented
Reality Games’, in Designing Interactive Systems,
Newcastle, 2012.
[8] O. F. Trout, H. L. Loats, and G. S. Mattingly, ‘A
Water-Immersion Technique for the Study of
Mobility of a Pressure-Suited Subject Under
Balanced-Gravity Conditions’, Jan. 1966.
[9] M. Montola, ‘A Ludological View on the Pervasive
Mixed-reality Game Research Paradigm’, Pers.
Ubiquitous Comput., vol. 15, no. 1, pp. 3-12, 2011.
[10] L. Oppermann, L. Blum, J.-Y. Lee, and J.-H. Seo,
‘AREEF Multi-player Underwater Augmented
Reality experience’, in Games Innovation Conference
(IGIC), 2013 IEEE International, 2013, pp. 199-202.
[11] T. N. Arvanitis, A. Petrou, J. F. Knight, S. Savas, S.
Sotiriou, M. Gargalakos, and E. Gialouri, ‘Human
factors and qualitative pedagogical evaluation of a
mobile augmented reality system for science
education used by learners with physical disabilities’,
Pers. Ubiquitous Comput., vol. 13, no. 3, Mar. 2009.
[12] A. Bellarbi, C. Domingues, S. Otmane, S.
Benbelkacem, and A. Dinis, ‘Augmented reality for
underwater activities with the use of the DOLPHYN’,
in 2013 10th IEEE International Conference on
Networking, Sensing and Control (ICNSC), 2013.
[13] U. F. von Lukas, J. Quarles, P. Kaklis, and T.
Dolereit, ‘Underwater Mixed Environments’, in
Virtual Realities, G. Brunnett, S. Coquillart, R. van
Liere, G. Welch, and L. Váša, Eds. Springer
International Publishing, 2015, pp. 56-76.
[14] A. Fusiello, E. Trucco, T. Tommasini, and V. Roberto, ‘Improving feature tracking with robust statistics’, Pattern Anal. Appl., vol. 2, no. 4, 1999.
[15] M. Simoncelli, G. Zunino, H. I. Christensen, and K.
Lange, ‘Autonomous Pool Cleaning: Self
Localization and Autonomous Navigation for
Cleaning’, Auton. Robots, vol. 9, no. 3, pp. 261-270,
Dec. 2000.
[16] Z. Eskinja, Z. Fabekovic, and Z. Vukic, ‘Localization
of autonomous underwater vehicles by sonar image
processing’, in ELMAR 2007, 2007, pp. 103-106.
[17] M. Carreras, P. Ridao, R. Garcia, and T. Nicosevici,
‘Vision-based localization of an underwater robot in
a structured environment’, in IEEE International
Conference on Robotics and Automation, 2003.
Proceedings. ICRA ’03, 2003, vol. 1, pp. 971-976.
[18] L. Blum, W. Broll, and S. Müller, ‘Augmented reality
under water’, in SIGGRAPH ’09: Posters, New York,
NY, USA, 2009, pp. 97:1-97:1.
[19] S. J. Pell and F. Mueller, ‘Gravity Well: Underwater
Play’, in CHI ’13 Extended Abstracts on Human
Factors in Computing Systems, New York, NY,
USA, 2013, pp. 3115-3118.
[20] W. L. Raffe, M. Tamassia, F. Zambetta, X. Li, S. J.
Pell, and F. ‘Floyd’ Mueller, ‘Player-Computer
Interaction Features for Designing Digital Play
Experiences Across Six Degrees of Water Contact’,
presented at CHI Play, New York, NY, USA, 2015.
[21] J. Viega, M. J. Conway, G. Williams, and R. Pausch,
‘3D Magic Lenses’, in Proceedings of the 9th Annual
ACM Symposium on User Interface Software and
Technology, New York, NY, USA, 1996, pp. 51-58.
[22] M. Chalmers and A. Galani, ‘Seamful Interweaving:
Heterogeneity in the Theory and Design of
Interactive Systems’, in Designing Interactive
Systems, Cambridge, Massachusetts, 2004.
[23] K. Meyerbröker and P. M. G. Emmelkamp, ‘Virtual
reality exposure therapy in anxiety disorders: a
systematic review of process-and-outcome studies’,
Depress. Anxiety, vol. 27, no. 10, pp. 933-944, Oct.
2010.