Proceedings of the Institute of Acoustics
Vol. 29. Pt.3 2007
ACOUSTIC TOUCH SCREEN FOR DOLPHINS
FIRST APPLICATION OF ELVIS - AN ECHO-LOCATION VISUALIZATION AND
INTERFACE SYSTEM
J. Starkhammar Dept. of Electrical Measurements, LTH, Lund University, Lund, Sweden
M. Amundin Kolmårdens Djurpark, Kolmården and Linköping University, Linköping, Sweden
H. Olsén Linköping University, Linköping, Sweden
M. Ahlmqvist Dept. of Electrical Measurements, LTH, Lund University, Lund, Sweden
K. Lindström Dept. of Electrical Measurements, Lund University, Lund, Sweden
H. W. Persson Dept. of Electrical Measurements, LTH, Lund University, Lund, Sweden
1 INTRODUCTION
Dolphin sonar has been studied extensively over several decades, and many of its basic
characteristics are well known (Au, 1993) [1]. Most of these studies have used an experimental
setup in which the dolphin was trained to station voluntarily in a fixed position, so that its
directional sonar beam could be recorded with stationary hydrophones. Although this allows very
exact measurements, it has most likely prevented the full dynamic potential of the dolphin's sonar
from being revealed. In addition, the dolphin's response to scientific questions, e.g. in target
detection threshold or discrimination trials, has mostly been a "go/no go" response or the press of
a yes/no paddle. This traditional methodology gives only a coarse indication of choice; it is
difficult to refine and becomes impractical in a multi-choice paradigm.
In cognitive studies with primates, e.g. the chimpanzee, a computerized symbol interface based on
a finger-operated touch screen has been used successfully (Rumbaugh et al., 1975) [3]. A similar
approach has been used with birds, such as chickens and pigeons, which indicate their choices with
the beak (Cheng & Spetch, 1995) [4]. To our knowledge, however, a conventional computer touch
screen has not been attempted with dolphins, mainly because the electro-magnetic grid over the
touch screen would be short-circuited by the salt water. Delfour (2007) [2] used a system that was
functionally a touch screen. It was based on infrared light beams projected through an underwater
viewing panel and guided by mirrors to create a grid in front of the panel, and in front of the TV
screen placed on the dry side of the panel. The dolphin indicated its choice by breaking the light
beams with its rostrum. This system was used in a study testing the dolphin's capacity for
self-recognition.
Although dolphins are known to use their rostrum to touch and manipulate objects, we were
interested in exploiting and studying their main sensory system, the sonar. A new tool for studying
dolphin sonar, psychophysics and cognitive skills was therefore based on the dolphin's sonar beam.
The new system, called the EchoLocation Visualization and Interface System (ELVIS), was developed
at Lund University in cooperation with Kolmården Wild Animal Park [5]. It has since been further
developed and is presently being tested in a dolphin food preference study at the Kolmården
Dolphinarium.
The system can function as an acoustic "touch screen" for the dolphins, i.e. a dolphin can indicate
a choice by aiming its sonar beam axis at designated areas on the screen. Hence, for the first time,
dolphins are given the opportunity to operate a computer program using their sonar beam, much as we
use a mouse cursor. This may be a much more intuitive response mode for dolphins than the
traditional go/no-go or paddle-press responses, and comparable to using the rostrum with the
Delfour touch screen [2]. The system is highly adaptable to a variety of scientific questions,
since the core of the interface is software based.
One primary reason for carrying out this particular study was to evaluate whether the system can be
used in future food preference investigations at the Kolmården Dolphinarium. The aims of this study
were to develop a training procedure for introducing the "acoustic touch screen" concept to the
dolphins, to investigate whether the interactive features of the system are comprehensible to the
dolphins, and to test the software in order to develop it further and optimize it for this purpose.
2 METHOD
2.1 System configuration
ELVIS is based on a matrix of 16 hydrophones, as seen in Figure 1. The distance between the
hydrophones is 300 mm. The hydrophones are attached to a semi-transparent screen lowered into the
water of the pool, in front of an underwater acrylic panel. The hydrophones pick up the dolphin's
sonar signals aimed at the screen. The signals are transferred via cables to an amplifier and signal
conditioning unit, and from there via the parallel port to a computer, where the signal analysis is
performed by custom-designed LabVIEW software. This software constitutes the core of the interactive
features of the system. Among many features, it makes it possible to trace the sonar beam axis in
real time by generating a round coloured spot on the computer screen, corresponding to the point of
maximum intensity in the sound beam. The sound intensity can be coded in colour and/or light
intensity. To improve the rather coarse resolution given by the 16 hydrophones, the exact location
of the maximum sound intensity point is derived through interpolation between the hydrophones in
the matrix.
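The paper does not specify the interpolation scheme used to refine the beam-axis estimate. The
sketch below illustrates one plausible approach, an intensity-weighted centroid over the 4 x 4
hydrophone grid with 300 mm spacing; the weighting scheme, function names and example values are
illustrative assumptions, not the authors' documented method.

```python
import numpy as np

# Illustrative sketch only: estimate the sonar beam-axis position on the screen
# from a 4 x 4 hydrophone matrix with 300 mm spacing. The paper states only that
# the maximum-intensity point is interpolated between hydrophones; an
# intensity-weighted centroid is one simple possibility and is assumed here.

SPACING_M = 0.3                                   # hydrophone spacing (300 mm)
GRID_X, GRID_Y = np.meshgrid(np.arange(4) * SPACING_M,
                             np.arange(4) * SPACING_M)

def beam_axis_estimate(levels_db: np.ndarray) -> tuple[float, float]:
    """Return an (x, y) estimate of the beam axis on the screen.

    levels_db: 4 x 4 array of received sound pressure levels (dB) for one click.
    """
    weights = 10.0 ** (levels_db / 10.0)          # linear intensities
    weights /= weights.sum()
    return float((weights * GRID_X).sum()), float((weights * GRID_Y).sum())

# Example: a click centred between the two middle hydrophones of the top row.
levels = np.full((4, 4), 120.0)
levels[0, 1] = levels[0, 2] = 150.0
print(beam_axis_estimate(levels))
```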
The resulting computer screen image is continuously projected back onto the hydrophone matrix
screen using a standard PC projector, giving the dolphin immediate visual feedback on its sonar
output. In the food preference study presented here, the theoretical centre of the beam was traced
by displaying a dark red spot on the screen. This colour was chosen to make the spot inconspicuous
or even invisible to the dolphin (Madsen and Herman, 1980) [6], while still making it possible for
the experimenter to follow the scanning of the sonar beam over the screen.
Figure 1. The basic configuration of ELVIS (Echolocation Visualization and Interface
System). The screen in this particular mode plots the sound pressure level distribution in
the dolphin sonar beam.
The software used in this food preference study designates "buttons", i.e. active areas on the
screen, indicated by visual symbols (see Figure 2). These symbols represented different fish
species. According to Yaman et al. [7], such symbols should be easily discriminated visually by the
dolphins. The symbols to be displayed on the screen were selected by the system operator prior to
each trial.
The symbols were randomly placed over the entire screen, but always over a hydrophone. This was
done to make sure that each symbol could be activated at any time, even if the dolphin was very
close to the screen and the sonar beam might otherwise fall entirely between the hydrophones. The
size of these active areas, as well as the trigger level, can easily be altered so that, as the
dolphin's skill in handling the task improves, more accurate aiming and higher sound pressure
levels in the sonar beam can be required.
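As a rough illustration of the adjustable "button" parameters described above (active-area size and
trigger level), the following sketch shows what such a hit test could look like. The data structure,
field names and units are assumptions for illustration, not the actual LabVIEW implementation.

```python
from dataclasses import dataclass

@dataclass
class SymbolButton:
    """Hypothetical representation of one on-screen symbol and its active area."""
    name: str                       # e.g. "capelin", "mackerel", "squid", "error"
    centre: tuple[float, float]     # screen position (m), placed over a hydrophone
    active_radius_m: float          # adjustable size of the active area
    trigger_level_db: float         # adjustable click-level threshold

def is_hit(button: SymbolButton,
           beam_xy: tuple[float, float],
           click_level_db: float) -> bool:
    """A hit requires the beam axis inside the active area and a loud enough click."""
    dx = beam_xy[0] - button.centre[0]
    dy = beam_xy[1] - button.centre[1]
    within_area = (dx * dx + dy * dy) ** 0.5 <= button.active_radius_m
    return within_area and click_level_db >= button.trigger_level_db
```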
2.2 Procedure
When the dolphin aims its sonar beam axis at a symbol and generates clicks above a set threshold
level, the symbol flashes to indicate a "hit" and a bridging stimulus (a 400 ms, 10 kHz sine tone)
is played through two speakers placed close to the underwater acrylic panel. In this study, each of
three such symbols represented a different fish species (mackerel, capelin and squid; Figure 2).
When the dolphin "clicked" on one of them, it was rewarded with the fish species that symbol
represented. To help evaluate whether the choices were deliberate or random, a fourth symbol was
introduced. This was essentially a "wrong" button, and "clicking" on it resulted in the trial being
aborted without any reward being given.
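The bridging stimulus is specified in the text only as a 400 ms, 10 kHz sine tone. The sketch below
generates such a tone; the sample rate, amplitude and onset/offset ramps are illustrative choices
not taken from the paper.

```python
import numpy as np
from scipy.io import wavfile

FS = 96_000              # sample rate (Hz), assumed; must exceed 2 x 10 kHz
DURATION_S = 0.4         # 400 ms, as stated in the text
FREQ_HZ = 10_000.0       # 10 kHz, as stated in the text

t = np.arange(int(FS * DURATION_S)) / FS
tone = 0.8 * np.sin(2.0 * np.pi * FREQ_HZ * t)

# Short raised-cosine ramps (assumed, 5 ms) to avoid transients at onset/offset.
ramp = int(0.005 * FS)
window = np.ones_like(tone)
window[:ramp] = 0.5 * (1.0 - np.cos(np.pi * np.arange(ramp) / ramp))
window[-ramp:] = window[:ramp][::-1]
tone *= window

wavfile.write("bridge_tone.wav", FS, (tone * 32767).astype(np.int16))
```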
Three female bottlenose dolphins (Tursiops truncatus) were used in this study:
Vicky, born in 1973, arrived at Kolmården in 1979.
Ariel, born on October 6, 1996, at Kolmården. Mother: Vicky.
Luna, born on January 10, 2001, at Kolmården. Mother: Vicky.
Figure 2. Each symbol represents a fish species, except for the unfilled circle, which
represents "false". Only symbols, and no text, are displayed on the screen.
Standard operant conditioning procedures were used for training the dolphins to perform the task.
Approximately the same amount of fish of each species (approx. 50 g) was given for each correct
choice.
3 RESULTS AND DISCUSSION
3.1 Training Procedure
The training procedure, and the level of understanding of the task, could be divided into six main
steps. Each training session was videotaped (the screen and the behaviour of the dolphin from
above) and selected parameters were logged by the software in spreadsheet files. These files
contain data such as the timestamp of each choice, the active dolphin, the projected symbols (their
positions were monitored by filming the screen), the chosen symbol, the time of echolocation at a
symbol required to trigger the reward signal, the intensity level at triggering, the size of the
active area around the symbols, and the trigger threshold.
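A per-choice log of this kind could be written as one spreadsheet row per choice. The hedged sketch
below paraphrases the parameters listed above as column names; the actual file layout of the
LabVIEW software is not described in the paper, and all example values are placeholders.

```python
import csv
from datetime import datetime

# Column names paraphrase the logged parameters listed in the text; they are
# assumptions, not the authors' actual spreadsheet format.
FIELDS = [
    "timestamp", "dolphin", "displayed_symbols", "chosen_symbol",
    "required_echolocation_time_s", "click_level_at_trigger_db",
    "active_area_radius_m", "trigger_threshold_db",
]

with open("trial_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow({                               # placeholder example values
        "timestamp": datetime.now().isoformat(timespec="seconds"),
        "dolphin": "Luna",
        "displayed_symbols": "capelin;error",
        "chosen_symbol": "capelin",
        "required_echolocation_time_s": 0.5,
        "click_level_at_trigger_db": 148.0,
        "active_area_radius_m": 0.15,
        "trigger_threshold_db": 140.0,
    })
```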
1. At the very first training session only one symbol was projected onto the screen. In this
initial phase of the training only capelin was used (see Figure 2). To encourage the dolphins
to echolocate towards the symbol, a metal object was held in front of it. The trigger level was
low, and a relatively large area around the symbol was included in the active area, to make it
easier for the dolphin to obtain a trigger/correct choice.
2. The metal object was phased out and the dolphins were required to aim their sonar beam
axis at the symbol, which appeared in the same place on the screen each time. The active area
around the symbol was decreased and the trigger level was increased.
3. The position of the symbol was randomly altered for each trial. The dolphin was required to
click deliberately on the symbol, wherever it was positioned.
4. A second symbol, representing mackerel, was introduced and appeared at a random
position together with symbol number one, forcing the dolphins to make a choice in order to
receive the reward. To avoid random triggering during the approach, the dolphin was required to
line up in front of the screen before the symbols were displayed.
5. Symbol number three, associated with squid, was introduced, requiring the dolphin to select
between all three symbols at the same time.
6. Symbol number four, the "error" symbol, was introduced when the dolphin was performing
the task with confidence and clicking at symbols without hesitation. Only this symbol and
one of the food symbols were displayed at a time. Clicking on the "error" symbol resulted in
the trial being aborted, i.e. the screen went black and no reward signal was played. The
dolphin had to return to the trainer, received no fish, and was given the cue to make a new
choice. This symbol was used as a control, aimed at indicating whether the dolphin was capable of
making deliberate choices between the symbols.
The training sessions lasted 5-15 minutes per dolphin, comprising 5-40 fish rewards, and were
carried out two to three times a week. All three dolphins were very motivated to perform the task,
so the number of repetitions in each training session was limited only by the amount of fish
available for these sessions. The total daily ration had to be shared with other daily training
tasks, e.g. the public display programmes and husbandry training.
3.2 Individual differences
Luna was the first to understand that the light symbol indicated the active area on the screen. She
reached step 6 after 16 training sessions within a total of 223 trials. The development of the other
two subjects stagnated at step 5, and their behaviour indicated that they had not even reached step
3. They apparently associated the rewards with aiming their sonar beam axis towards a transducer,
but failed to connect this to the symbols. Hence, in some trials they systematically tested the
individual transducers until they by chance echolocated at the one covered by a symbol. This may
indicate that they were locked into the acoustic domain, and that the echo characteristics of the
transducers, rather than the visual symbols, were prioritized when associating the stimuli with the
task. It may also indicate that the dolphins have a "blind spot" along the sonar beam axis, making
it impossible for them to see a symbol while hitting it with the beam axis. It should nevertheless
have been possible for them to select a symbol, since the lit symbols were often displayed outside
the long axis of the snout. Their systematic testing of hydrophones appeared to occur more
frequently when the symbols were displayed at the top of the screen (near the water surface). This
also indicates that the performance of these two dolphins was influenced by their visual problems.
One reason for Luna's better performance might be that she lined up 0.5-1 m from the screen,
whereas the other two held their rostrum only 0.1-0.2 m from it. This probably gave her a better
overview of the entire screen, making it easier for her to see the symbols at any position.
Luna also used a different "clicking" technique from the other two. She did not use her sonar at all
until she had lined up in front of the screen and located the symbol of her choice. Then she
generated a precise click train directly at it. Vicky and Ariel both echolocated during approach and
then almost constantly after lining up in front of the screen, even when no symbols were shown.
When the symbols were displayed, they continued to echolocate while turning their sonar beam
towards the hydrophone of their choice.
The preliminary analysis indicates that the dolphins did not make any deliberate choice between the
three fish species. This may be because all fish were equally appreciated (they were all part of
their normal fish diet and were used indiscriminately during normal training), or it may simply
reflect that the dolphins failed to discriminate between the symbols. Luna's behaviour during the
trials with the "error button" suggests that the latter may be true. She "clicked" equally often on
the "error button", and displayed frustration and confusion when the screen was blanked and no
bridging tone was played. One conclusion that can be drawn from this is that the dolphins should
have been taught the concept of discrimination before being offered the option to select.
3.3 Evaluation of software
The software worked as intended and turned out to be very robust. Most of the interactive features
were automated. However, since it proved crucial not to display the symbols until the dolphin was
lined up in front of the screen, this had to be handled by starting the software manually at
precisely the correct moment. After a correct response, the programme had to be stopped manually.
Thereafter the symbols had to be cleared from the screen by running a short sequence of the
programme with the setting "no symbols displayed". These non-automated steps will be eliminated in
future versions of the software.
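The manual steps described above (start the trial once the dolphin has lined up, stop after a
correct response, then clear the symbols) amount to a simple trial sequence. The sketch below
paraphrases that sequence as a small state machine; it is our illustration of how the workflow
could be automated, not the authors' implementation.

```python
from enum import Enum, auto

class TrialState(Enum):
    WAIT_FOR_LINE_UP = auto()     # symbols not yet displayed
    SYMBOLS_DISPLAYED = auto()    # waiting for a hit on a symbol
    TRIAL_FINISHED = auto()       # screen blanked, dolphin returns to trainer

def step(state: TrialState, dolphin_lined_up: bool, hit_detected: bool) -> TrialState:
    """Advance the trial sequence; in the study these transitions were operated manually."""
    if state is TrialState.WAIT_FOR_LINE_UP and dolphin_lined_up:
        return TrialState.SYMBOLS_DISPLAYED        # display the selected symbols
    if state is TrialState.SYMBOLS_DISPLAYED and hit_detected:
        return TrialState.TRIAL_FINISHED           # play bridge tone, then blank the screen
    return state
```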
The bridging stimulus was played via the speakers placed close to the underwater acrylic panel and
via the PA system. It was feared that this might make it difficult for the dolphins to understand
for whom the signal was intended, since they would normally expect the whistle to come from the
trainer who gave the task cue. However, this turned out not to be a problem. All three dolphins
reacted immediately to the stimulus, and never hesitated to swim back to the trainer to collect the
fish reward.
Since the core of the interface system is software based, it is easy to alter, for example, the
properties of the visual feedback, in order to optimize performance and find a solution that makes
the system as intuitive as possible for the dolphins.
4 CONCLUSIONS
The interface function of the system worked as intended. The dolphins quickly understood the
requirement of echolocating towards the screen. They were highly motivated to perform the task and
had no problem understanding that the bridging stimulus, played by the software, was associated
with their performance at the screen.
One major advantage of the system set-up is the possibility of adjusting the visual feedback via
the software. The initial training of the dolphins showed that the concept of an acoustic touch
screen was comprehensible to all of them. However, individual differences in learning to use the
visual symbols indicate that improvements in the visual feedback are warranted. This was
demonstrated by the behaviour of two of the dolphins, which apparently searched for the
discriminative stimulus in the acoustic properties of the hydrophones instead of associating it
with the visual symbols. This problem needs to be understood and solved for the system to reach its
full potential as an intuitive response tool.
The dolphins did not show a preference for any of the three fish species. This may be a true
non-preference, but it may also indicate a failure of the dolphins to discriminate between the
symbols. Whether this was due to a visual problem or a result of the training regime remains to be
resolved.
From all the results discussed, the major conclusion must be that this acoustic "touch screen" has
the potential to be a valuable research tool for future psychophysical and/or cognitive studies of
dolphins, e.g. for the more elaborate food preference investigations that will be carried out at
the Kolmården Dolphinarium.
REFERENCES
1. Au, W. W. L., The Sonar of Dolphins. Springer-Verlag, New York, 277 pp. (1993)
2. Delfour, F., Marine mammals in front of the mirror – body experiences to self-recognition: a
cognitive ethological methodology combined with phenomenological questioning. Aquatic
Mammals 32(4), 517-527 (2007)
3. Rumbaugh, D. M., Gill, T. V., von Glasersfeld, E., Warner, H., and Pisani, P., Conversations
with a chimpanzee in a computer-controlled environment. Biological Psychiatry 10(6), 627-641
(1975)
4. Cheng, K. and Spetch, M. L., Stimulus control in the use of landmarks by pigeons in a
touch-screen task. Journal of the Experimental Analysis of Behavior 63, 187-201 (1995)
5. Nilsson, M., Lindström, K., Amundin, M., Persson, H. W., Echolocation and Visualization
Interface System. Clinical Physiology and Functional Imaging 24(3), 169-178 (2004)
6. Madsen, C. J. and Herman, L. M., Social and ecological correlates of cetacean vision and
visual appearance. In: Cetacean Behavior: Mechanisms and Functions, L. M. Herman (ed.),
John Wiley & Sons, New York (1980)
7. Yaman, S., von Fersen, L., Dehnhardt, G., Güntürkün, O., Visual lateralization in the
bottlenose dolphin (Tursiops truncatus): evidence for a population asymmetry.
Behavioural Brain Research 143, 109-114 (2003)