Article

Acoustic Flow-Based Control of a Mobile Platform Using a 3D Sonar Sensor


Abstract

Moving a sensor through its environment creates signature time variations of the sensor's readings, often referred to as flow cues. We analyze the acoustic flow field generated by a sonar sensor, capable of imaging the full frontal hemisphere, mounted on a mobile platform. We show how the cues derived from this acoustic flow field can be used directly in a layered control strategy which enables a robotic platform to perform a set of motion primitives such as obstacle avoidance, corridor following, and negotiating corners and T-junctions. The programmable nature of the spatial sampling pattern of the sonar allows efficient support of the varying information requirements of the different motion primitives. The proposed control strategy is first validated in a simulated environment and subsequently transferred to a real mobile robot. We present simulated and experimental results on the controller's performance while executing the different motion primitives. The results further show that the proposed control strategy can easily integrate minimal steering commands given by a user (electric wheelchair application) or by a high-level navigation module (autonomous SLAM applications).
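To make the layered idea concrete, the following minimal sketch shows how a subsumption-style arbiter could select among motion primitives; the behavior set, activation thresholds, and gains are illustrative assumptions, not the authors' implementation.

```python
# Minimal subsumption-style arbitration sketch (illustrative, not the paper's code).
# Each behavior inspects the latest 2D sonar point set (x forward, y left) and
# either returns a (linear, angular) velocity command or None when inactive.

import numpy as np

def collision_avoidance(points):
    """Highest priority: stop and turn away if any reflector is very close."""
    if len(points) == 0:
        return None
    r = np.linalg.norm(points, axis=1)
    if r.min() < 0.4:                                  # assumed safety radius [m]
        nearest = points[np.argmin(r)]
        turn = -np.sign(nearest[1]) if nearest[1] else 1.0
        return 0.0, turn                               # rotate away from the echo
    return None

def corridor_following(points):
    """Steer toward the midline using the lateral balance of echoes."""
    left, right = points[points[:, 1] > 0], points[points[:, 1] < 0]
    if len(left) and len(right):
        imbalance = left[:, 1].min() + right[:, 1].max()  # signed midline offset
        return 0.5, 0.8 * imbalance                    # assumed gains
    return None

def cruise(points):
    """Lowest priority: drive straight ahead."""
    return 0.5, 0.0

BEHAVIORS = [collision_avoidance, corridor_following, cruise]  # priority order

def control_step(points):
    for behavior in BEHAVIORS:
        cmd = behavior(points)
        if cmd is not None:        # the first active layer subsumes the rest
            return cmd

print(control_step(np.array([[1.5, 0.6], [1.4, -0.7], [2.0, 0.0]])))
```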


... It can cope with simultaneously arriving echoes and transform the recorded microphone signals containing the reflections into full 3D spatial images or point clouds of the environment. Such biologically inspired [8] sensors have been developed by Jan Steckel et al. [6,9] and have been implemented for several use cases, such as autonomous local navigation [10] and SLAM [11,12]. ...
... Research on insects that use optical flow cues [13,14] and bats that use acoustic flow cues [15][16][17][18] shows that they use these cues to extract the motion of the agent through the environment, also called the ego-motion, and the spatial 3D structure of that environment through its motion. Jan Steckel et al. [10] used acoustic flow cues, found in the images of a sonar sensor, in a 2D navigation controller for a mobile platform with motion behaviors such as obstacle avoidance, collision avoidance, and corridor following in indoor environments. However, its theoretical foundation had the constraint that only a single sensor could be used, which had to be placed at the center of rotation of the mobile platform. ...
... In this work, we will exploit this gain in spatial resolution and FOV by using a multi-sonar setup in real time. The theoretical foundation created in [10] will be expanded to remove the constraint of using a single sonar sensor and to allow a modular design of the mobile platform and placement of sensors. The navigation controller is expanded to give the operator the opportunity to create adaptive zones around the mobile platform in which certain motion behaviors should be executed. ...
Article
Full-text available
Navigation in varied and dynamic indoor environments remains a complex task for autonomous mobile platforms. Especially when conditions worsen, typical sensor modalities may fail to operate optimally and subsequently provide inapt input for safe navigation control. In this study, we present an approach for navigating a dynamic indoor environment with a mobile platform carrying one or several sonar sensors, using a layered control system. These sensors can operate in conditions such as rain, fog, dust, or dirt. The different control layers, such as collision avoidance and corridor following behavior, are activated based on acoustic flow cues in the fusion of the sonar images. The novelty of this work is allowing these sensors to be freely positioned on the mobile platform and providing the framework for designing the optimal navigational outcome based on a zoning system around the mobile platform. Presented in this paper is the acoustic flow model used, as well as the design of the layered controller. Next to validation in simulation, an implementation is presented and validated in a real office environment using a real mobile platform with one, two, or three sonar sensors in real time with 2D navigation. Multiple sensor layouts were validated in both the simulation and the real experiments to demonstrate that the modular approach for the controller and sensor fusion works optimally. The results of this work show stable and safe navigation of indoor environments with dynamic objects.
... However, not the typical sonar sensor found on a car used as a parking-sensor, but an advanced sonar sensor with a wide field of view (FOV), ability to cope with simultaneously arriving echoes and capable of extracting full 3D-spatial information from the reflections in the environments. In recent years, such biologically inspired [1] sensors have been developed by Jan Steckel et al. [2], [3] and have been used for applications with semi-autonomous agents to Simultaneously Localise and Map (SLAM) [4], [5] the environment and navigate it locally [6]. However, a common issue still seen with these advanced sonar sensors, is the limited spatial resolution one sonar sensor can deliver of the full spatial environment around for example a mobile platform. ...
... Furthermore, we will look into two particular cases where we construct the 2D velocity field specifically for linear movement only and for rotational movement only. The previous acoustic flow model on which this paper is based is described in [6] for a single forward-facing sonar sensor. In this section we will focus on the changes made to support a Multi-Sonar configuration where a sensor can be placed anywhere on a mobile platform, with the only restriction that they all lie in the same horizontal plane. ...
... These input motion commands are subsequently modulated by the control behaviour laws. The previous control system is described in detail in [6] and has been altered in this paper to be adaptive and stable for Multi-Sonar configurations. ...
Conference Paper
Navigating spatially varied and dynamic environments is one of the key tasks for autonomous agents. In this paper we present a novel method of navigating a mobile platform with one or multiple 3D-sonar sensors. Moving a mobile platform, and subsequently any 3D-sonar sensor on it, will create signature variations over time of the echoed reflections in the sensor readings. An approach is presented to create a predictive model of these signature variations for any motion type. Furthermore, the model is adaptive and works for any position and orientation of one or multiple sonar sensors on a mobile platform. We propose to use this adaptive model and fuse all sensory readings to create a layered control system allowing a mobile platform to perform a set of primitive motions such as collision avoidance, obstacle avoidance, wall following and corridor following behaviours to navigate an environment with dynamically moving objects within it. This paper describes the underlying theoretical base of the entire navigation model and validates it in a simulated environment, with results that show the system is stable and delivers the expected behaviour for several tested spatial configurations of one or multiple sonar sensors that can complete an autonomous navigation task.
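The step that frees the sensor from the center of rotation is standard rigid-body kinematics. The sketch below illustrates it under assumed notation (planar platform, sensor offset (px, py) and yaw a); it is not the paper's code.

```python
# Rigid-body sketch of per-sensor flow prediction for an off-center sonar
# (assumption-level illustration): a sensor mounted at offset (px, py) with
# yaw `a` on a platform moving with twist (v, omega) has its own linear
# velocity, from which its local acoustic flow follows.

import numpy as np

def sensor_twist(v, omega, px, py, a):
    """v, omega : platform forward speed [m/s] and yaw rate [rad/s]
    px, py   : sensor position in the platform frame [m]
    a        : sensor yaw relative to the platform [rad]"""
    # Velocity of the mounting point: v_p = v_platform + omega x p
    vx = v - omega * py
    vy = omega * px
    # Rotate the velocity into the sensor frame.
    c, s = np.cos(a), np.sin(a)
    v_sensor = np.array([c * vx + s * vy, -s * vx + c * vy])
    return v_sensor, omega      # the angular rate is shared by the rigid body

# A reflector at range r and bearing theta in the sensor frame then has the
# radial flow  dr/dt = -(v_sensor[0]*cos(theta) + v_sensor[1]*sin(theta)).
v_s, w_s = sensor_twist(v=0.5, omega=0.3, px=0.2, py=-0.1, a=np.pi / 4)
```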
... This complementary behaviour of sensors is exactly what is necessary to develop robust multimodal sensing systems. While the eRTIS sonar sensor can be used to obtain robust navigation behaviour and motion primitives [16], the use of the eRTIS in multimodal sensor systems is still to be explored. To facilitate this incorporation, we experiment with the conversion of 3D sonar measurements into LiDAR point cloud data. ...
... This generation is done by moving a simulated robot through an environment and calculating, for each time step, the corresponding sonar and LiDAR measurements. The sonar sensor is simulated as described in [16]; for the LiDAR simulation we use a ray-tracing algorithm. We simulate the essential properties of both sensors as close to reality as possible, without making the computation too complex. ...
... the algorithm [16]. Figure 2 shows an example simulated environment and the corresponding simulated measurements. ...
Article
Full-text available
Sensors using ultrasonic sound have proven to provide accurate 3D perception in difficult environments where other modalities fail. Several industrial sectors need accurate and reliable sensing in these harsh conditions. The conventional LiDAR/camera approach in many state-of-the-art autonomous navigation methods is limited to environments with optimal sensing conditions for visual modalities. The use of other sensing modalities can thus improve reliability and usability and increase the application potential of autonomous agents. Ultrasonic measurements provide, compared to LiDAR, a much sparser representation of the environment, making a direct replacement of the LiDAR sensor difficult. In this work, we propose a method to predict LiDAR point cloud data from an in-air acoustic sonar sensor using a convolutional stacked autoencoder. This provides a robotic system with high-resolution measurements and allows for easier integration into existing systems to safely navigate environments where visual modalities become unreliable and less accurate. A video of our predictions is available at https://youtu.be/jlx1S-tslmo.
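As a rough illustration of the paper's idea (not its exact architecture), a convolutional stacked autoencoder mapping a sonar image to a LiDAR-like scan could look as follows; the input size, bottleneck width, and 360-beam output are assumptions.

```python
# Sketch of a convolutional stacked autoencoder for sonar-to-LiDAR prediction
# (all shapes and layer sizes are assumptions for illustration).

import torch
import torch.nn as nn

class Sonar2LidarAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                    # 1x64x64 sonar image
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # -> 16x32x32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # -> 32x16x16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 256), nn.ReLU(),     # compact bottleneck
        )
        self.decoder = nn.Sequential(
            nn.Linear(256, 512), nn.ReLU(),
            nn.Linear(512, 360),        # 360 LiDAR ranges (1-degree spacing)
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Sonar2LidarAE()
sonar = torch.rand(8, 1, 64, 64)        # batch of simulated sonar images
pred_ranges = model(sonar)              # (8, 360) predicted scan
loss = nn.functional.mse_loss(pred_ranges, torch.rand(8, 360))
```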
... In this paper we present two architectures that build upon our previous developments, in which we transfer the computational signal processing tasks from a dedicated computing device to an embedded system. Without the need for an external computing platform, these are very compact yet powerful systems capable of imaging the complete frontal hemisphere and performing 3D localization, suitable for applications like SLAM [10] or complex control tasks such as navigation in corridor-like environments [11]. The rest of this paper is structured as follows: in the next section (II) the sonar architecture will be discussed to give an overview of how the sensor systems acquire, process and evaluate environmental data. ...
... The digital interface of these microphones allows us to connect them directly to the GPIO pins of a platform capable of acquiring the data at these speeds. In this case, 32 microphones are placed in an array: one type has a rectangular boundary similar to the array used in [8] [10] [11], while the second type makes use of a new method where the outline is an ellipsoidal shape and the microphones are irregularly placed using Poisson disc sampling [18]. Their corresponding Point Spread Function (PSF) is calculated using a time-domain model of the array sensor, which is shown in Fig. 3. ...
... In this way we were able to create a small yet very powerful and flexible experimental platform which enables us to mimic the echolocating [5], [6] properties of different species of bats. The biomimetic approach in combination with ultrasonic pulse-echo sensing has proven to be a valid option for both navigation and Simultaneous Localization And Mapping [7]-[9]. To demonstrate our flexible low-cost sonar sensor platform we have looked into the behavior of the common big-eared bat (Micronycteris microtis) [10], an example of which can be seen in the right panel of Figure 1. ...
... In order to do this, a behavior-based control architecture, or subsumption control architecture, was designed. This layered control system distinguishes itself by coupling these layers to a specific set of actions and priorities [9], [12]. This also means that only one layer, and hence one behavior, will take control of the robot's actions at any single measurement. ...
... Work has been done on recreating the bat's sensory functionality in hardware by means of artificial pinnae and microphone arrays [7]. Sonar has also been used to perform simultaneous localization and mapping (SLAM) [8], as well as robotic guidance for autonomous navigation [9]. ...
... Our control scheme makes a significant contribution to this topic, because it is able to smoothly avoid obstacles in large-scale, real-world environments using a biomimetic radar sensor. For the implementation of the control scheme we base our approach on the subsumption architecture, which was developed in the late eighties as a reaction to the classical robotic control design prevalent at the time [15] and is still relevant in the present day [9], [42]. In classic control, a system would sequentially sense the environment, build or update its internal model of the external world, plan actions to perform based on the input, and finally act out the planned operations before repeating this cycle. ...
Article
Full-text available
This article introduces the application of principles from biological echolocation to radar sensing. A novel biomimetic radar sensor is presented whose features are based on the relevant morphology of a bat. Signal properties accessible to bats as well as biologically feasible processing techniques are discussed, and we show how they translate to the domain of pulse-echo radar. We demonstrate that by applying these techniques, our radar system can achieve 3D localization of reflectors, and by combining that localization with a custom subsumption control architecture, we have realized a robotic platform capable of autonomous navigation through an unknown environment. Finally, we employ a specialized simultaneous localization and mapping (SLAM) algorithm during autonomous navigation and show that it is capable of building an accurate topological map of the environment using radar as the only source of exteroceptive information.
... Several engineering initiatives have made use of sensory-guided navigation to control autonomous vehicles (Conte and Doherty, 2008; Smith et al., 2013; Steckel and Peremans, 2017; Strydom et al., 2014) or create devices to help visually impaired individuals move safely within their environment (Filipe et al., 2012; Katzschmann et al., 2018; Lee and Medioni, 2011). While some of these systems use patterns of light, such as optic flow, to process information from the environment (Conte and Doherty, 2008; Strydom et al., 2014), recent work in sonar-based navigation has incorporated acoustic flow cues to automatically steer unmanned vehicles through complex corridors (Peremans and Steckel, 2014; Smith et al., 2014; Steckel and Peremans, 2017; Vanderelst et al., 2016). Most of the acoustic-based navigation devices have been tested in environments that contain large objects or flat surfaces, and it would be interesting to test the behavior of these systems in environments that create echo flow patterns similar to those presented here. ...
Article
To navigate in the natural environment, animals must adapt their locomotion in response to environmental stimuli. The echolocating bat relies on auditory processing of echo returns to represent its surroundings. Recent studies have shown that echo flow patterns influence bat navigation, but the acoustic basis for flight path selection remains unknown. To investigate this problem, we released bats in a flight corridor with walls constructed of adjacent individual wooden poles, which returned cascades of echoes to the flying bat. We manipulated the spacing and echo strength of the poles comprising each corridor side, and predicted that bats would adapt their flight paths to deviate toward the corridor side returning weaker echo cascades. Our results show that the bat's trajectory through the corridor was not affected by the intensity of echo cascades. Instead, bats deviated toward the corridor wall with more sparsely spaced, highly reflective poles, suggesting that pole spacing, rather than echo intensity, influenced bat flight path selection. This result motivated investigation of the neural processing of echo cascades. We measured local evoked auditory responses in the bat inferior colliculus to echo playback recordings from corridor walls constructed of sparsely and densely spaced poles. We predicted that evoked neural responses would be discretely modulated by temporally distinct echoes recorded from the sparsely spaced pole corridor wall, but not by echoes from the more densely spaced corridor wall. The data confirm this prediction and suggest that the bat's temporal resolution of echo cascades may drive its flight behavior in the corridor.
... In recent years, our research group has developed state-of-the-art 3D sonar sensors [1]-[3] which use a low-cost MEMS microphone array for real-time acoustic imaging in air. Using this sensor, various robotic applications have been developed [2], including obstacle avoidance, corridor following, and SLAM [1]. The developed sensor is capable of localizing an arbitrary number of reflectors, and generates 2D (range versus azimuth) or 3D (range versus azimuth and elevation) acoustic images of the environment by emitting a broadband, spatially omnidirectional acoustic emission. ...
... The position of the pan/tilt unit is adapted to track the ball. The final demonstration is a video of robot navigation using only the sonar sensor, similar to the video which accompanies our paper on acoustic flow-based corridor following [2]. Non-functional sonar prototypes will also be available at the demonstration booth to clarify the internal workings of the sensor. ...
... An example of a combination with capacitive sensors to overcome detection limitations at short distances is provided in [101]. The group of Prof. Steckel at the University of Antwerp has a strong focus on 3D A-US for robotic applications, e.g., [102], [103]. The group has achieved remarkable results in areas such as SLAM for the navigation of mobile platforms. ...
Article
Proximity perception is a technology that has the potential to play an essential role in the future of robotics. It can fulfill the promise of safe, robust, and autonomous systems in industry and everyday life, alongside humans, as well as in remote locations in space and underwater. In this survey article, we cover the developments of this field from the early days up to the present, with a focus on human-centered robotics. In this domain, proximity sensors are typically deployed in two scenarios: first, on the exterior of manipulator arms to support safety and interaction functionality, and second, on the inside of grippers or hands to support grasping and exploration. Based on this observation, we propose a categorization to organize the use cases of proximity sensors in human-centered robotics. We then present the sensing technologies and different measuring principles that have been developed over the years, providing a summary in the form of a table. Next, we review the literature regarding the applications that have been proposed. Finally, we give an overview of the most important trends that will shape the future of this domain.
... Experimental results show that sufficient information can be acquired from this system to construct a map of an unmodified office environment. [22] shows that the cues extracted from the acoustic flow provided by a 3-D sonar sensor can be directly used in the motion control of a mobile robot. All of these systems utilize multiple emitters and receivers to extract 2-D or 3-D position information, from which we can infer that spatial and temporal ultrasonic data are indispensable for acquiring a rich description of the environment. ...
Conference Paper
Recent works have proved that combining spatial and temporal visual cues can significantly improve the performance of various vision-based robotic systems. However, for the ultrasonic sensors used in most robotic tasks (e.g. collision avoidance, localization and navigation), there is a lack of benchmark ultrasonic datasets that consist of spatial and temporal data to verify the usability of spatial and temporal ultrasonic cues. In this paper, we are the first to propose a Spatio-Temporal Ultrasonic Dataset (STUD), which aims to develop the ability of ultrasonic sensors by mining spatial and temporal information from multiple ultrasonic measurements. In particular, we first propose a novel Spatio-Temporal (ST) ultrasonic data gathering scheme, in which an innovative data instance is designed. In addition, part of the data in the STUD is collected in a robot simulator, in which a well-designed corridor map is utilized to increase the data diversity. Then a selection algorithm is proposed to find a proper length of data sequences to obtain the best description of the navigation environments. Finally, we present an end-to-end learning benchmark model that learns driving policies by extracting spatial and temporal ultrasonic cues from the STUD. With the help of our STUD and this benchmark model, more powerful deep neural networks can be trained for addressing the tasks of indoor navigation or motion planning of mobile robots, which is unachievable using the existing ultrasonic datasets. Comparison experiments verified the effectiveness of spatial and temporal ultrasonic cues for robot driving policy learning.
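The core data structure, a spatio-temporal instance, amounts to a sliding window over consecutive scans. A minimal sketch, with the window length and 16-beam layout assumed for illustration:

```python
# Sketch of building spatio-temporal ultrasonic instances (window length T and
# the 16-beam layout are assumptions): stacking the last T range scans gives a
# policy network both the spatial layout and its evolution over time.

import numpy as np

def make_st_instances(scans, T=8):
    """scans: (n_steps, n_beams) ultrasonic ranges.
    Returns (n_steps - T + 1, T, n_beams) sliding windows, each one instance."""
    return np.stack([scans[i:i + T] for i in range(len(scans) - T + 1)])

scans = np.random.rand(100, 16)        # 100 time steps, 16 ultrasonic beams
instances = make_st_instances(scans)   # (93, 8, 16) -> feed to the policy net
print(instances.shape)
```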
... The proposed sensor solution is not only intended for creating maps of the environment, but can also contribute to solving sensing problems in harsh conditions where optical techniques such as lidar or 3D cameras fail. We plan to continue and extend our previous work on autonomous navigation using 3D sonar sensors as the main sensing modality [18], implementing autonomous navigation strategies in harsh, outdoor conditions. ...
... To aid underwater robot navigation, sonar landmarks (12), information encoding sonar markers (13,14), and even commercially available sonar targets exist (15). Surprisingly, airborne sonar systems have not used synthetic markers so far, even though nearly every autonomous vehicle is equipped with sonar sensors and several studies showed that such sensors are capable of 3-dimensional (3D) localization and classification of objects in complex environments (16)(17)(18)(19)(20)(21). The reason reflector designs used for underwater sonar cannot be used in air is that they are usually multilayered, which means that they have layers of different materials with different acoustic impedances creating a certain recognizable reflection pattern. ...
Article
Sonar sensors are universally applied in autonomous vehicles such as robots and driverless cars as they are inexpensive, energy-efficient, and provide accurate range measurements; however, they have some limitations. Their measurements can lead to ambiguous estimates, and echo clutter can hamper target detection. In nature, echolocating bats experience similar problems when searching for food, especially if their food source is close to vegetation, as is the case for gleaning bats and nectar-feeding bats. However, nature has come up with solutions to overcome clutter problems and acoustically guide bats. Several bat-pollinated plants have evolved specially shaped floral parts that act as sonar reflectors, making the plants acoustically conspicuous. Here we show that artificial sonar beacons inspired by floral shapes streamline the navigation efficacy of sonar-guided robot systems. We developed floral-inspired reflector forms and demonstrate their functionality in two proof-of-principle experiments. First, we show that the reflectors are easily recognized among dense clutter, and second, we show that it is possible to discern different reflector shapes and use this identification to guide a robot through an unfamiliar environment. Bioinspired sonar reflectors could have a wide range of applications that could significantly advance sonar-guided systems.
... Sensors' global grid coordinates aid in guiding the mobile robot with high accuracy and robustness [20]. ...
Article
Full-text available
In recently published research in the object localization field, 3D object localization takes the largest share due to its importance in our daily life. 3D object localization has many applications, such as collision avoidance, robot guidance and vision, and object surface topography modeling. This research study presents a novel localization algorithm and system design using a low-resolution 2D ultrasonic sensor array for 3D real-time object localization. A novel localization algorithm is developed and applied to the acquired data using the three sensors having the minimum calculated distances at each acquired sample. The algorithm was tested on objects at different locations in 3D space and validated with an acceptable level of precision and accuracy. The Polytope Faces Pursuit (PFP) algorithm was used for finding an approximate sparse solution for the object location from the three measured minimum distances. The proposed system successfully localizes the object at different positions with average errors of ±1.4 mm, ±1.8 mm, and ±3.7 mm in the x-, y-, and z-directions, respectively, which are considered low error rates.
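The paper recovers the location with the Polytope Faces Pursuit algorithm; as a simpler stand-in that shows the same three-distances-to-3D-position step, the sketch below uses linearized trilateration for a planar sensor array (the sensor positions and the z ≥ 0 half-space are assumptions).

```python
# Trilateration sketch from three minimum distances (plain least-squares
# stand-in for the paper's sparse PFP solver; sensor layout is hypothetical).

import numpy as np

def trilaterate(sensors, d):
    """sensors: (3, 3) positions of a planar array at z = 0 [m]; d: (3,) distances."""
    p0, p1, p2 = sensors
    # Subtracting the first sphere equation from the others linearizes in (x, y):
    # 2 (pi - p0) . x = |pi|^2 - |p0|^2 - di^2 + d0^2
    A = 2 * np.array([(p1 - p0)[:2], (p2 - p0)[:2]])
    b = np.array([
        p1 @ p1 - p0 @ p0 - d[1]**2 + d[0]**2,
        p2 @ p2 - p0 @ p0 - d[2]**2 + d[0]**2,
    ])
    xy = np.linalg.solve(A, b)
    # Recover height from the first sphere, taking the z >= 0 half-space.
    z2 = d[0]**2 - np.sum((xy - p0[:2])**2)
    return np.array([xy[0], xy[1], np.sqrt(max(z2, 0.0))])

sensors = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
target = np.array([0.05, 0.05, 0.3])
d = np.linalg.norm(sensors - target, axis=1)
print(trilaterate(sensors, d))          # recovers [0.05, 0.05, 0.3]
```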
... As an example application in robotics, a Simultaneous Localisation and Mapping (SLAM) solution was developed using 3D sonar for estimating the ego-motion of a mobile platform [16]. Furthermore, in 2017, a control strategy was developed to navigate a mobile platform through an environment using the acoustic flow field generated by the sonar sensor [17]. As these works show, sonar is an increasingly capable sensing modality. ...
Conference Paper
Full-text available
For autonomous navigation and robotic applications, sensing the environment correctly is crucial. Many sensing modalities exist for this purpose. In recent years, one such modality that has come into use is in-air imaging sonar. It is ideal in complex environments with rough conditions such as dust or fog. However, as with most sensing modalities, to sense the full environment around the mobile platform, multiple such sensors are needed to capture the complete 360-degree range. Currently, the processing algorithms used to create these data are insufficient to do so for multiple sensors at a reasonably fast update rate. Furthermore, a flexible and robust framework is needed to easily integrate multiple imaging sonar sensors into any setup and serve multiple application types for the data. In this paper we present a sensor network framework designed for this novel sensing modality. Furthermore, an implementation of the processing algorithm on a Graphics Processing Unit is proposed to potentially decrease the computing time and allow for real-time processing of one or more imaging sonar sensors at a sufficiently high update rate.
... The arbitrator then selects the active behavior with the highest priority in the system and collects its intended velocity vectors, which are passed on to the robot. This architectural layout is inspired by previous work of Steckel and Peremans regarding robot control using acoustic flow [24]. It should be noted that similarities restrict themselves solely to the layout, as the previously mentioned work uses a sonar imaging sensor which enables explicit 3D localization of reflectors, while the current work uses a biomimetic radar sensor which implicitly localizes reflectors through the use of temporal and spectral cues. ...
Article
Full-text available
This paper presents a novel biomimetic radar sensor for autonomous navigation. To accomplish this, we have drawn inspiration from the sensory mechanisms present in an echolocating mammal, the common big-eared bat (Micronycteris microtis). We demonstrate the correspondence in the hardware, system model, and signal processing. To validate the performance of the sensor, we have developed a complementary control system based on the subsumption architecture, which allows the system to autonomously navigate unknown environments. This architecture consists of separate behaviors with different levels of complexity, which are combined to produce the overall functionality of the system. We describe each behavior separately and examine their performance in real-world navigation experiments. For this purpose, the system is placed in two distinct office environments with the goal of achieving smooth and stable trajectories. Here, we can observe noticeable improvements when employing high-level behaviors. Furthermore, we utilize the data collected during the navigation experiments to perform simultaneous localization and mapping, using an algorithm developed in our earlier work. These results show a substantial improvement over odometry. We attribute this to the fact that the system traverses stable and repetitive paths, which facilitates place recognition.
... To demonstrate the applicability of the proposed system architecture for acoustic imaging, we constructed a small microphone array consisting of six Knowles SPH0641LM4H-1 microphones. Using this small array it is possible to determine whether the microphones are suitable for our final application, a 32-microphone array capable of 3D imaging [1], [6]. The signal used for the proposed application is a broadband frequency-modulated sweep that goes from 80 kHz to 20 kHz in three milliseconds, shown in Figure 2. ...
Conference Paper
In recent years, our research group has developed state of the art 3D sonar sensors which use a low-cost MEMS microphone array for real-time acoustic imaging in air. Using this sensor, various robotic applications have been developed, including obstacle avoidance, corridor following and SLAM. The developed sensor is capable of localizing an arbitrary number of reflectors, and generates 2D (range versus azimuth) or 3D (range versus azimuth and elevation) acoustic images of the environment by emitting a broadband, spatially omnidirectional acoustic emission. This emission is reflected back by the environment to the microphone array. Using array beamforming algorithms an acoustic image of the environment is created, which subsequently can be either visualized or used in various control algorithms.
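For intuition on the beamforming step, a basic time-domain delay-and-sum sketch follows; it steers a generic array in azimuth only, with the sample rate and far-field assumption chosen for illustration (the actual sensor uses broadband 3D beamforming on a 32-microphone array).

```python
# Time-domain delay-and-sum beamforming sketch (illustrative parameters).

import numpy as np

C = 343.0            # speed of sound [m/s]
FS = 450_000         # sample rate [Hz] (assumed)

def delay_and_sum(signals, mic_xy, azimuths):
    """signals: (n_mics, n_samples) recordings; mic_xy: (n_mics, 2) positions [m].
    Returns the beamformed energy for each steering azimuth [rad]."""
    n_mics, n_samples = signals.shape
    energies = []
    for az in azimuths:
        direction = np.array([np.cos(az), np.sin(az)])
        # Far-field time delays per microphone, converted to whole samples.
        delays = (mic_xy @ direction) / C
        shifts = np.round((delays - delays.min()) * FS).astype(int)
        summed = np.zeros(n_samples)
        for m in range(n_mics):
            # Align each channel toward the steering direction, then sum.
            summed[: n_samples - shifts[m]] += signals[m, shifts[m]:]
        energies.append(np.sum(summed**2))
    return np.array(energies)

mic_xy = np.random.uniform(-0.04, 0.04, size=(6, 2))   # small random array
signals = np.random.randn(6, 2048)
print(delay_and_sum(signals, mic_xy, np.linspace(-np.pi / 2, np.pi / 2, 19)))
```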
Article
Full-text available
Animals rely on sensory feedback from their environment to guide locomotion. For instance, visually guided animals use patterns of optic flow to control their velocity and to estimate their distance to objects (e.g. Srinivasan et al. 1991, 1996). In this study, we investigated how acoustic information guides locomotion of animals that use hearing as a primary sensory modality to orient and navigate in the dark, where visual information is unavailable. We studied flight and echolocation behaviors of big brown bats as they flew under infrared illumination through a corridor with walls constructed from a series of individual vertical wooden poles. The spacing between poles on opposite walls of the corridor was experimentally manipulated to create dense/sparse and balanced/imbalanced spatial structure. The bats’ flight trajectories and echolocation signals were recorded with high-speed infrared motion-capture cameras and ultrasound microphones, respectively. As bats flew through the corridor, successive biosonar emissions returned cascades of echoes from the walls of the corridor. The bats flew through the center of the corridor when the pole spacing on opposite walls was balanced and closer to the side with wider pole spacing when opposite walls had an imbalanced density. Moreover, bats produced shorter duration echolocation calls when they flew through corridors with smaller spacing between poles, suggesting that clutter density influences features of the bat’s sonar signals. Flight speed and echolocation call rate did not, however, vary with dense and sparse spacing between the poles forming the corridor walls. Overall, these data demonstrate that bats adapt their flight and echolocation behavior dynamically when flying through acoustically complex environments.
Article
Full-text available
In this paper we explore the use of 3D acoustic flow fields to steer robot motion. We derive the 3D velocity fields set up by linear and rotational robot motions and explain how they can be sampled directly by a sonar array-sensor that we developed recently. The resulting acoustic flow field patterns are then shown to contain all the information necessary for controlling obstacle avoidance and corridor following behavior. Experimental data collected by the sonar system mounted on a wheelchair that was driven in an office environment are presented to validate the theoretically predicted acoustic flow patterns.
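The essence of such flow fields can be summarized in two equations; the notation below is assumed for illustration and considers the planar case with the sensor at the center of rotation.

```latex
% Flow of a stationary reflector at range R and bearing \theta, for a sensor
% translating with speed v and rotating with rate \omega (sensor at the
% center of rotation; notation assumed for illustration):
\begin{align}
  \dot{R}      &= -v\cos\theta, \\
  \dot{\theta} &= \frac{v}{R}\sin\theta \;-\; \omega .
\end{align}
% Pure translation (\omega = 0): reflectors straight ahead (\theta = 0)
% approach fastest with stationary bearings, while lateral reflectors sweep
% outward. Pure rotation (v = 0): all ranges are constant and every bearing
% drifts at the common rate -\omega.
```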
Article
Full-text available
There is currently considerable interest in the consequences of loss in one sensory modality on the remaining senses. Much of this work has focused on the development of enhanced auditory abilities among blind individuals, who are often able to use sound to navigate through space. It has now been established that many blind individuals produce sound emissions and use the returning echoes to provide them with information about objects in their surroundings, in a similar manner to bats navigating in the dark. In this review, we summarize current knowledge regarding human echolocation. Some blind individuals develop remarkable echolocation abilities, and are able to assess the position, size, distance, shape, and material of objects using reflected sound waves. After training, normally sighted people are also able to use echolocation to perceive objects, and can develop abilities comparable to, but typically somewhat poorer than, those of blind people. The underlying cues and mechanisms, operable range, spatial acuity and neurological underpinnings of echolocation are described. Echolocation can result in functional real life benefits. It is possible that these benefits can be optimized via suitable training, especially among those with recently acquired blindness, but this requires further study. Areas for further research are identified.
Article
Full-text available
We propose to combine a biomimetic navigation model which solves a simultaneous localization and mapping task with a biomimetic sonar mounted on a mobile robot to address two related questions. First, can robotic sonar sensing lead to intelligent interactions with complex environments? Second, can we model sonar based spatial orientation and the construction of spatial maps by bats? To address these questions we adapt the mapping module of RatSLAM, a previously published navigation system based on computational models of the rodent hippocampus. We analyze the performance of the proposed robotic implementation operating in the real world. We conclude that the biomimetic navigation model operating on the information from the biomimetic sonar allows an autonomous agent to map unmodified (office) environments efficiently and consistently. Furthermore, these results also show that successful navigation does not require the readings of the biomimetic sonar to be interpreted in terms of individual objects/landmarks in the environment. We argue that the system has applications in robotics as well as in the field of biology as a simple, first order, model for sonar based spatial orientation and map building.
Article
Full-text available
Array beamforming techniques allow for the generation of 3-D spatial filters which can be used to localize objects in a large field of view (FOV) without the need for mechanical scanning. By combining broadband beamforming with a sparse, random array of receivers, we have constructed a low-cost, yet powerful, in-air sonar system, which is suited for a wide range of robotic applications. Experimental results in unmodified office environments show the performance of the sonar sensor. In particular, we document the sensor's capacity to produce 3-D location measurements in the presence of multiple highly overlapping echoes. We show how this capability makes possible the combination of a wide FOV with accurate 3-D localization, allowing the sensor to operate under real-time constraints in realistic environments. To demonstrate the use of this sensor, we describe an odometry application that estimates egomotion of a mobile robot using acoustic flow.
Article
Full-text available
Gleaning insectivorous bats that forage by using echolocation within dense forest vegetation face the sensorial challenge of acoustic masking effects. Active perception of silent and motionless prey in acoustically cluttered environments by echolocation alone has thus been regarded as impossible. The gleaning insectivorous bat Micronycteris microtis, however, forages in dense understory vegetation and preys on insects, including dragonflies, which rest silent and motionless on vegetation. From behavioural experiments, we show that M. microtis uses echolocation as the sole sensory modality for successful prey perception within a complex acoustic environment. All individuals performed a stereotypical three-dimensional hovering flight in front of prey items, while continuously emitting short, multi-harmonic, broadband echolocation calls. We observed a high precision in target localization, which suggests that M. microtis perceives a detailed acoustic image of the prey based on shape, surface structure and material. Our experiments provide, to our knowledge, the first evidence that a gleaning bat uses echolocation alone for successful detection, classification and precise localization of silent and motionless prey in acoustic clutter. Overall, we conclude that the three-dimensional hovering flight of M. microtis, in combination with a frequent emission of short, high-frequency echolocation calls, is the key to active prey perception in acoustically highly cluttered environments.
Conference Paper
Full-text available
As an echolocating sensor moves through an environment the pattern of echoes reflected by objects to that sensor changes continuously, creating acoustic flow. Acoustic flow has been observed in both bats and humans. In this paper, we develop a theory of acoustic flow, and discuss measuring it with a Continuous Transmission Frequency Modulated (CTFM) ultrasonic sensor.
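A worked example of the kind of signature such a theory predicts, under assumed notation: a sensor passing a stationary reflector at constant speed sees a hyperbolic range profile whose rate of change a CTFM sensor observes as a frequency offset.

```latex
% Range signature of a single stationary reflector during constant-velocity
% motion (assumed notation): speed v, lateral miss distance d, closest
% approach at time t_0.
\begin{align}
  R(t)       &= \sqrt{d^{2} + v^{2}\,(t - t_{0})^{2}}, \\
  \dot{R}(t) &= \frac{v^{2}\,(t - t_{0})}{R(t)} \;\in\; (-v,\; v).
\end{align}
% The echo delay traces a hyperbola over time. In a CTFM sensor the
% demodulated frequency is proportional to range, so each reflector
% contributes a characteristic drifting tone, i.e. acoustic flow appears
% directly in the demodulated spectrum.
```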
Article
Full-text available
Limitations both for the further development as well as for the actual technical application of autonomous robots arise from the lack of a unifying theoretical language. We propose three concepts for such a language: (1) Behaviors are represented by variables, specific constant values of which correspond to task demands; (2) Behaviors are generated as attractors of dynamical systems; (3) Neural field dynamics lift these dynamic principles to the representation of information. We show how these concepts can be used to design autonomous robots. Because behaviors are generated from attractor states of dynamical systems, design of a robot architecture addresses control-theoretic stability. Moreover, flexibility of the robot arises from bifurcations in the behavioral dynamics. Therefore techniques from the qualitative theory of dynamical systems can be used to design and tune autonomous robot architectures. We demonstrate these ideas in two implementations. In one case, visual sensory information is integrated to achieve target acquisition and obstacle avoidance in an autonomous vehicle minimizing the known problem of spurious states. In a second implementation of the same behavior, a neural dynamic field endows the system with a form of obstacle memory. A critical discussion of the approach highlights strengths and weaknesses and compares to other efforts in this direction.
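A minimal sketch of the attractor-dynamics idea, with the repeller shape and all gains as illustrative assumptions: the heading variable relaxes toward the target direction while obstacle bearings act as repellers.

```python
# Attractor-dynamics heading control sketch (gains, ranges, and the repeller
# shape are assumptions): the heading phi evolves under one attractor at the
# target bearing and one repeller per obstacle; the robot integrates dphi/dt.

import numpy as np

def heading_rate(phi, psi_target, obstacles, k_tar=1.0, k_obs=2.0, sigma=0.4):
    """phi: current heading; psi_target: target bearing;
    obstacles: list of (bearing, distance) tuples [rad, m]."""
    # Attractor: -k sin(phi - psi) has a stable fixed point at phi = psi.
    dphi = -k_tar * np.sin(phi - psi_target)
    for psi_obs, dist in obstacles:
        # Repeller: pushes phi away from the obstacle bearing, with strength
        # decaying with angular offset and with obstacle distance.
        delta = phi - psi_obs
        dphi += k_obs * delta * np.exp(-delta**2 / (2 * sigma**2)) * np.exp(-dist)
    return dphi

# Forward-Euler rollout: the behavior settles on an attractor that detours
# around the obstacle while converging toward the target direction.
phi = 0.0
for _ in range(200):
    phi += 0.05 * heading_rate(phi, psi_target=0.8, obstacles=[(0.4, 1.0)])
print(phi)
```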
Article
Full-text available
A computer model is described that combines concepts from the fields of acoustics, linear system theory, and digital signal processing to simulate an acoustic sensor navigation system using time-of-flight ranging. By separating the transmitter/receiver into separate components and assuming mirror-like reflectors, closed-form solutions for the reflections from corners, edges, and walls are determined as a function of transducer size, location, and orientation. A floor plan consisting of corners, walls, and edges is efficiently encoded to indicate which of these elements contribute to a particular pulse-echo response. Sonar maps produced by transducers having different resonant frequencies and transmitted pulse waveforms can then be simulated efficiently. Examples of simulated sonar maps of two floor plans illustrate the performance of the model. Actual sonar maps are presented to verify the simulation results.
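The mirror-reflector assumption makes these closed-form solutions very cheap. Below is a geometric sketch (transducer directivity and waveforms omitted, infinite wall assumed) of the time-of-flight computation for a wall and a corner.

```python
# Mirror-model pulse-echo sketch in the spirit of the simulator described
# above (geometry only; directivity, waveforms, and finite walls are omitted,
# all assumptions for illustration).

import numpy as np

C = 343.0  # speed of sound [m/s]

def wall_echo_delay(tx, wall_point, wall_normal):
    """Two-way time of flight for a specular wall echo back to a colocated
    transmitter/receiver at `tx`. The wall acts as a mirror, so the echo
    travels out and back along the wall normal."""
    n = wall_normal / np.linalg.norm(wall_normal)
    dist = np.dot(tx - wall_point, n)          # signed distance to the wall
    return abs(2 * dist) / C

def corner_echo_delay(tx, corner):
    """A corner (or edge) scatters from a point, so the delay is simply the
    round-trip range to that point."""
    return 2 * np.linalg.norm(tx - corner) / C

tx = np.array([1.0, 2.0])
print(wall_echo_delay(tx, wall_point=np.array([0.0, 0.0]),
                      wall_normal=np.array([0.0, 1.0])))    # wall along x-axis
print(corner_echo_delay(tx, corner=np.array([3.0, 0.0])))
```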
Article
Full-text available
Signal design in cf-bats is hypothesized to be commensurate with the evaluation of time-variant echo parameters, imposed by changes in the sound channel occurring as the bat flies by a target. Two such parameters, the proportional changes in Doppler frequency and sound pressure amplitude, are surveyed, employing a simplified acoustic model in order to assess their fitness for target localization given a translational movement within a plane. This is accomplished by considering the properties of the scalar fields given by the value of these putative sensory variables as a function of position in a plane. The considered criteria are: existence and extent of ambiguity areas (i.e., multiple solutions for target position), magnitude of the variables (relevant with regard to perceptual thresholds), as well as magnitude and orthogonality of the gradients (relevant to localization accuracy). It is concluded that these properties render the considered variables compatible with gross judgements of target position. This may be sufficient for behavioral contexts like obstacle avoidance, where adoption of suitable safety margins could compensate for the variance and bias associated with estimates of target location.
Article
Full-text available
When insects are flying forward, the image of the ground sweeps backward across their ventral viewfield and forms an "optic flow," which depends on both the groundspeed and the groundheight. To explain how these animals manage to avoid the ground by using this visual motion cue, we suggest that insect navigation hinges on a visual-feedback loop we have called the optic-flow regulator, which controls the vertical lift. To test this idea, we built a micro-helicopter equipped with an optic-flow regulator and a bio-inspired optic-flow sensor. This fly-by-sight micro-robot can perform exacting tasks such as take-off, level flight, and landing. Our control scheme accounts for many hitherto unexplained findings published during the last 70 years on insects' visually guided performances; for example, it accounts for the fact that honeybees descend in a headwind, land with a constant slope, and drown when travelling over mirror-smooth water. Our control scheme explains how insects manage to fly safely without any of the instruments used onboard aircraft to measure the groundheight, groundspeed, and descent speed. An optic-flow regulator is quite simple in terms of its neural implementation and just as appropriate for insects as it would be for aircraft.
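A minimal rendering of the optic-flow-regulator idea with assumed gains: the ventral flow of a forward-flying agent is omega = v/h, and regulating omega by adjusting vertical speed couples height to groundspeed, which also yields the constant-slope landing described above.

```python
# Optic-flow regulator sketch (assumed gains and setpoint, not the authors'
# controller): hold the ventral optic flow omega = v / h at a setpoint by
# commanding vertical speed.

def lift_command(v_ground, h_ground, omega_set=2.0, k=0.5):
    """Vertical-speed command that drives the ventral flow v/h to omega_set.

    v_ground: groundspeed [m/s]; h_ground: height above ground [m]."""
    omega = v_ground / h_ground
    # Flow too high means the ground is too close for the current speed:
    # climb. Flow too low: descend. Landing falls out naturally, since
    # holding omega constant while v -> 0 forces h -> 0.
    return k * (omega - omega_set)

print(lift_command(v_ground=3.0, h_ground=1.0))   # flow too high -> climb
```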
Article
Full-text available
In this paper a sensor fusion scheme, called triangulation-based fusion (TBF) of sonar data, is presented. This algorithm delivers stable natural point landmarks, which appear in practically all indoor environments, i.e., vertical edges like door posts, table legs, and so forth. The landmark precision is in most cases within centimeters. The TBF algorithm is implemented as a voting scheme, which groups sonar measurements that are likely to have hit the same object in the environment. The algorithm has low complexity and is sufficiently fast for most mobile robot applications. As a case study, we apply the TBF algorithm to robot pose tracking. The pose tracker is implemented as a classic extended Kalman filter, which uses odometry readings for the prediction step and TBF data for measurement updates. The TBF data is matched to pre-recorded reference maps of landmarks in order to measure the robot pose. In corridors, complementary TBF measurements from the walls are used to improve the orientation and position estimate. Experiments demonstrate that the pose tracker is robust enough for handling kilometer distances in a large-scale indoor environment containing a sufficiently dense landmark set.
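The geometric core of TBF, shown here as a sketch with assumed inputs rather than the paper's voting implementation, is the intersection of two range circles measured from different robot poses.

```python
# Triangulation step at the heart of TBF-style fusion (geometric sketch with
# assumed inputs): a vertical edge observed from two sensor poses lies on the
# intersection of the two range circles.

import numpy as np

def circle_intersection(c1, r1, c2, r2):
    """Intersection points of two circles (centers c1, c2; radii r1, r2)."""
    d = np.linalg.norm(c2 - c1)
    if d > r1 + r2 or d < abs(r1 - r2) or d == 0:
        return []                               # no consistent landmark
    a = (r1**2 - r2**2 + d**2) / (2 * d)        # distance along the baseline
    h = np.sqrt(max(r1**2 - a**2, 0.0))         # offset from the baseline
    mid = c1 + a * (c2 - c1) / d
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
    return [mid + h * perp, mid - h * perp]     # beam cones disambiguate these

# Ranges to a door post at (2, 1), measured from two robot positions:
print(circle_intersection(np.array([0.0, 0.0]), np.hypot(2, 1),
                          np.array([1.0, 0.0]), np.hypot(1, 1)))
```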
Article
Flying animals need to react fast to rapid changes in their environment. Visually guided animals use optic flow, generated by their movement through structured environments. Nocturnal bats cannot make use of optic flow, but rely mostly on echolocation. Here we show that bats exploit echo-acoustic flow to negotiate flight through narrow passages. Specifically, bats' flight between lateral structures is significantly affected by the echo-acoustic salience of those structures, independent of their physical distance. This is true although echolocation, unlike vision, provides explicit distance cues. Moreover, the bats reduced the echolocation sound levels in stronger flow, likely to compensate for the increased summary target strength of the lateral reflectors. However, bats did not reduce flight velocity under stronger echo-acoustic flow. Our results demonstrate that sensory flow is a ubiquitous principle for flight guidance, independent of the fundamentally different peripheral representation of flow across the senses of vision and echolocation.
Book
The sonar of dolphins has undergone evolutionary refinement for millions of years and has evolved to be the premier sonar system for short-range applications. It far surpasses the capability of technological sonar; indeed, the only sonar system the US Navy has to detect buried mines is a dolphin system. Echolocation experiments with captive animals have revealed many of the basic parameters of the dolphin sonar. Features such as signal characteristics, transmission and reception beam patterns, hearing and internal filtering properties will be discussed. Sonar detection range and discrimination capabilities will also be included. Recent measurements of echolocation signals used by wild dolphins have expanded our understanding of their sonar system and its utilization in the field. A capability to perform time-varying gain has recently been uncovered which is very different from that of a technological sonar. A model of killer whale foraging on Chinook salmon will be examined in order to gain an understanding of the effectiveness of the sonar system in nature. The model will examine foraging in both quiet and noisy environments and will show that the echo levels are more than sufficient for prey detection at relatively long ranges.
Article
The application of optical flow techniques to artificial image sequences for computing the relative motion of obstacles is studied as a data processing algorithm for ultrasonic sensors. Such sensors are widely used in robot systems to calculate the distance to obstacles. However, owing to the angular uncertainty of the sensor characteristics, limitations exist on pinpointing an object's location, making it difficult to obtain accurate velocity information. Collision avoidance in dynamic environments requires information on relative motion, such as distance and velocity. In this paper, a technique for image processing on a range of ultrasonic sensor data is suggested in order to obtain the relative motion of front objects. The proposed technique is tested using a two-wheel differential drive robot in problem cases related to indoor navigation. Digital filtering is also conducted to smooth the fluctuations, after which the relative velocity is calculated using the optical flow technique. Experimental results demonstrate the possible application of the proposed algorithm for robot navigation in dynamic environments. This technique will increase the intelligence
Article
"This unusual book bridges the fields of biology, physics, and psychology in its discussion of acoustic orientation in the animal world and its significance for man. Here is the up-to-date information on bats, their natural history, biological nature, and flying skill, along with the adventures of their observers, presented as matter of import to man today because the brain of a bat 'is the end result of eons of evolutionary refinement for the process of echolocation' while we who also try to fly and to see in the dark have the brains 'of large ground apes with stereoscopic vision and limbs designed for walking and climbing.' " 467-item bibliography. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
We present a novel solution for topological exploration in corridor environments using cheap and error-prone sonar sensors. Topological exploration requires the detection of significant locations and safe motion planning. To detect nodes (i.e., significant places) robustly, we propose a new measure, the eigenvalue ratio (EVR), which converts geometrical shapes in the environment into quantitative values using principal component analysis. For planning the safe motion of a robot, we propose the circle following (CF) method, which abstracts the geometry of the environment while taking the characteristics of the sonar sensors into consideration. Integrating the EVR with the CF method yields a topological exploration strategy based on sonar sensors. The practicality of this approach is demonstrated by simulations and real experiments in corridor environments. Keywords: exploration, sonar sensors, topology
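The EVR idea lends itself to a compact sketch, under the assumption that the measure is (close to) the ratio of the principal-component eigenvalues of a 2D scan; the paper's exact definition may differ. A corridor yields a strongly anisotropic point cloud and hence a large ratio, while an open junction yields a ratio near one.

```python
import numpy as np

def eigenvalue_ratio(points):
    """Eigenvalue ratio of a 2D point set via PCA: the ratio of the
    larger to the smaller eigenvalue of the scan's covariance matrix."""
    pts = np.asarray(points, dtype=float)
    cov = np.cov(pts.T)                          # 2x2 covariance of the scan
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return eigvals[0] / max(eigvals[1], 1e-9)    # guard against division by zero

# Corridor-like scan: points spread along x, tight in y.
corridor = np.column_stack([np.linspace(0, 5, 50),
                            0.1 * np.random.randn(50)])
print(eigenvalue_ratio(corridor))                # large ratio -> corridor
```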
Conference Paper
This paper describes a new data association technique for the interpretation of wide-beam sonar measurements. The goal is to group sets of returns that originate from the same surface of an object. We consider the case of a moving observer that obtains range and bearing measurements of curved and/or faceted objects with a binaural sonar operating in the specular wavelength regime. Rather than projecting sensor data into a Cartesian space before processing, we operate on a "raw" representation in terms of range, bearing, amplitude, and time. A binary geometric constraint is applied to pairs of consecutive measurements to rule out impossible assignments. A measurement flow model is proposed for validating triples of measurements. The performance of these techniques is illustrated via a set of experiments using a wide-beam 500 kHz binaural sonar system, demonstrating effective perceptual grouping and mapping with sonar echoes originating from a set of objects, despite the presence of navigation error.
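A minimal sketch of the gating idea, assuming the binary constraint is essentially a triangle-inequality test on consecutive ranges; the paper's constraint also involves bearing, amplitude, and time.

```python
def compatible(r1, r2, sensor_displacement):
    """Binary gating test for associating two consecutive sonar returns
    with the same static surface: by the triangle inequality, the range
    change cannot exceed the distance the sensor moved between pings."""
    return abs(r1 - r2) <= sensor_displacement

print(compatible(2.40, 2.35, sensor_displacement=0.10))  # True: keep pair
print(compatible(2.40, 1.90, sensor_displacement=0.10))  # False: rule out
```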
Article
A new architecture for controlling mobile robots is described. Layers of control system are built to let the robot operate at increasing levels of competence. Layers are made up of asynchronous modules that communicate over low-bandwidth channels. Each module is an instance of a fairly simple computational machine. Higher-level layers can subsume the roles of lower levels by suppressing their outputs. However, lower levels continue to function as higher levels are added. The result is a robust and flexible robot control system. The system has been used to control a mobile robot wandering around unconstrained laboratory areas and computer machine rooms. Eventually it is intended to control a robot that wanders the office areas of our laboratory, building maps of its surroundings using an onboard arm to perform simple tasks.
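A minimal sketch of the suppression mechanism in such a layered architecture, using a hypothetical two-layer system in which a higher avoidance layer subsumes a lower wandering layer. The layer names, sensor keys, and command format are assumptions.

```python
def wander(sensors):
    """Lowest layer of competence: always proposes a gentle forward motion."""
    return {"v": 0.3, "omega": 0.0}

def avoid(sensors):
    """Higher layer: produces an output only when an obstacle is near,
    thereby suppressing (subsuming) the layer below."""
    if sensors["front_range"] < 0.5:
        return {"v": 0.0, "omega": 0.8}   # stop and turn away
    return None                            # no output: defer to lower layer

# Layers ordered from highest to lowest; the first layer that produces
# an output suppresses all layers below it, yet lower layers keep running.
LAYERS = [avoid, wander]

def control(sensors):
    for layer in LAYERS:
        command = layer(sensors)
        if command is not None:
            return command

print(control({"front_range": 2.0}))  # wander drives the robot
print(control({"front_range": 0.3}))  # avoid subsumes wander
```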
Article
The design of an air sonar device with a new form of binaural display, which aids the blind in perceiving their environment, is described. Some limitations in our knowledge of human perception, and the influence these have on a specification for the device, are discussed. Inherent limitations of the binaural aid, both in terms of technology development and performance, are also explained. The paper describes what is expected of the man-machine control system in a mobility setting and discusses the technique of evaluating a man-machine system so as to assess the machine's performance.
Article
In this paper, a mobile robot control law for corridor navigation and wall following, based on sonar and odometric sensor information, is proposed. The control law allows for stable navigation while avoiding actuator saturation. The posture of the robot travelling through the corridor is estimated using odometric and sonar sensing. The control system is theoretically proven to be asymptotically stable. Obstacle avoidance capability is added to the control system as a perturbation signal. A state-variable estimation structure that fuses the sonar and odometric information is proposed. Experimental results are presented to show the performance of the proposed control system.
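As a hedged illustration of this kind of corridor-centering law (not the paper's controller or its stability proof), a proportional law on the estimated lateral offset and heading error, clipped to respect actuator limits, might look as follows. The gains and the saturation bound are assumptions.

```python
import numpy as np

def corridor_control(y, theta, k_y=1.0, k_theta=2.0, omega_max=1.0):
    """Steering rate for corridor centering: y is the lateral offset from
    the corridor centerline (m), theta the heading error (rad), both
    assumed to come from fused sonar/odometry estimates."""
    omega = -k_y * y - k_theta * theta
    # Clip to respect actuator limits, echoing the abstract's emphasis
    # on avoiding actuator saturation.
    return float(np.clip(omega, -omega_max, omega_max))

print(corridor_control(y=0.4, theta=0.1))   # steer back toward the center
```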
Article
Field research on echolocation behavior in bats has emphasized studies of food acquisition, and the adaptive value of sonar signal design has been considered largely in the context of foraging. However, echolocation tasks related to spatial orientation also differ among bats and are relevant to understanding signal structure. Here, we argue that the evolution of echolocation in bats is characterized by two key innovations: first, the evolution of echolocation for spatial orientation and, second, a later transition to prey acquisition. This conceptual framework calls for a new view of field data from bats orienting and foraging in different types of habitats. According to the ecological constraints under which foraging bats operate, four distinct functional groups or guilds can be defined. Within each group, signal design and echolocation behavior are rather similar.
Article
There is an alternative route to Artificial Intelligence that diverges from the directions pursued under that banner for the last thirty some years. The traditional approach has emphasized the abstract manipulation of symbols, whose grounding in physical reality has rarely been achieved. We explore a research methodology which emphasizes ongoing physical interaction with the environment as the primary source of constraint on the design of intelligent systems. We show how this methodology has recently had significant successes on a par with the most successful classical efforts. We outline plausible future work along these lines which can lead to vastly more ambitious systems.
Article
So-called CF–FM bats are highly mobile creatures that emit long calls in which much of the energy is concentrated in a single frequency. These bats face sensor interpretation problems very similar to those of mobile robots equipped with ultrasonic sensors navigating in cluttered environments. This paper presents biologically inspired engineering on the use of narrowband sonar in mobile robotics. It replicates, using robotics as a modelling medium, the methods CF–FM bats use to exploit Doppler shifts (a rich source of information not used by commercial robotic ultrasonic range sensors) in different tasks. The experimental platform for the work is RoBat, a 6-DOF biomimetic sonar head mounted on a commercial 3-DOF mobile platform. The platform is provided with signal processing capabilities inspired by the bat's auditory system. The CF–FM bat raises or lowers the carrier frequency of its own calls to compensate for the Doppler shift produced when the bat, the reflector, or both are moving. This echolocating behaviour, called Doppler-shift compensation, is successfully implemented in RoBat. Inspired by this behaviour, a convoy navigation controller following a set of simple Doppler-dependent rules is successfully devised. The performance of the controller is satisfactory despite the low Doppler-shift resolution caused by the lower velocity of the robot compared to real bats. Finally, Müller's hypothesis on the use of acoustic flow by CF–FM bats for obstacle avoidance is also implemented in RoBat, resulting in a crude estimate of the target's passing distance at small bearing angles that improves as the angle increases; this nevertheless suffices for avoiding the two reflectors in the experiment.
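Doppler-shift compensation itself reduces to a simple first-order rule: lower the emitted carrier by the expected two-way shift so that echoes return near a reference frequency. The sketch below assumes this textbook approximation; RoBat's actual signal chain is described in the paper.

```python
def doppler_compensated_carrier(f_rest, v_radial, c=343.0):
    """Carrier frequency (Hz) to emit so that echoes return near the
    rest frequency f_rest, mimicking Doppler-shift compensation.

    For a sensor closing on a reflector at radial speed v_radial (m/s),
    the two-way Doppler shift is approximately 2 * v_radial / c * f,
    so the emitter lowers its call by that amount.
    """
    shift = 2.0 * v_radial / c * f_rest
    return f_rest - shift

# Approaching a wall at 1 m/s with an 80 kHz call:
print(doppler_compensated_carrier(80_000.0, v_radial=1.0))  # ~79.53 kHz
```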
Article
Behavior coordination is a notorious problem in mobile robotics. Behaviors are either in competition or collaborating to achieve the goals of a system, which leads to requirements for arbitration and/or fusion of control signals. In most systems the arbitration is specified in terms of 'events' that denote positions or sensory input. The detection of these events allows discrete switching between groups of behaviors. In contrast, the fusion of behaviors is often achieved using potential fields, fuzzy rules or superposition. In most cases, the underlying theoretical foundation is rather weak and the behavior switching results in discrete changes in the overall system dynamics. In this paper, we present a scheme for behavior coordination that is grounded in the dynamical systems approach. The methodology provides a solid theoretical basis for analysis and design of individual behaviors and their coordination. This coordination framework is demonstrated in the context of a domestic robot for fetch-and-carry-type tasks. It is shown here that behavior coordination can be analyzed as an integral part of the design to facilitate smooth transition and fusion between behaviors.
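A minimal sketch of the dynamical-systems style of coordination, assuming heading dynamics composed of sinusoidal attractor contributions and distance-weighted repeller contributions; the gains and range functions here are illustrative, not the paper's.

```python
import numpy as np

def heading_dynamics(phi, targets, obstacles):
    """Rate of change of heading phi (rad) as a superposition of
    attractor and repeller contributions; behaviors are fused by simply
    summing their contributions to one differential equation."""
    dphi = 0.0
    for psi in targets:                      # attractors pull phi toward psi
        dphi += -1.0 * np.sin(phi - psi)
    for psi, dist in obstacles:              # repellers push phi away,
        strength = np.exp(-dist)             # weighted by proximity
        dphi += strength * np.sin(phi - psi)
    return dphi

# Target straight ahead, obstacle 30 degrees to the left at 1 m:
print(heading_dynamics(phi=0.0, targets=[0.0],
                       obstacles=[(np.radians(30), 1.0)]))  # negative: turn right
```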
Article
In this paper, wide-field integration methods, which are inspired by the spatial decompositions of wide-field patterns of optic flow in the insect visuomotor system, are explored as an efficient means to extract visual cues for guidance and navigation. A control-theoretic framework is developed and used to quantitatively link weighting functions to behaviorally-relevant interpretations such as relative orientation, position, and speed in a corridor environment. It is shown through analysis and demonstration on a ground vehicle that the proposed sensorimotor architecture gives rise to navigational heuristics, namely, centering and speed regulation, which are observed in natural systems.
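The core of wide-field integration can be sketched as projecting an azimuthal flow pattern onto low-order harmonic weighting functions; in a corridor, the sine-weighted integral responds to left-right flow asymmetry, a lateral-position cue. The weightings below are simplified assumptions; the paper derives the behaviorally relevant ones.

```python
import numpy as np

def wfi_outputs(flow, theta):
    """Project an azimuthal optic-flow magnitude pattern flow(theta)
    onto sine and cosine weighting functions (discrete inner products)."""
    dtheta = theta[1] - theta[0]
    lateral_cue = np.sum(flow * np.sin(theta)) * dtheta  # left-right asymmetry
    heading_cue = np.sum(flow * np.cos(theta)) * dtheta  # fore-aft asymmetry
    return lateral_cue, heading_cue

theta = np.linspace(-np.pi, np.pi, 360, endpoint=False)
# Stronger flow on the left than the right: vehicle is off-center.
flow = 1.0 + 0.3 * np.sin(theta)
print(wfi_outputs(flow, theta))   # nonzero lateral cue, near-zero heading cue
```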
Article
This paper presents an advanced bio-inspired binaural sonar sensor capable of localizing reflectors in 3D space with a single reading. The technique makes use of broadband spectral cues in the received echoes only. Two artificial pinnae act as complex direction-dependent spectral filters on the echoes returning from the ensonified reflector. The "active head-related transfer function" (AHRTF) is introduced to describe this spectral filtering as a function of the reflector angle, taking into account the transmitter radiation pattern, both pinnae and the particular sonar head geometry. 3D localization is performed by selecting the azimuth-elevation pair with the highest a posteriori probability, given the binaural target echo spectrum. Experimental 3D localization results of a ball reflector show that the AHRTF carries sufficient information to discriminate between different reflector locations under realistic noise conditions. In addition, experiments with more complex reflectors illustrate that the AHRTF dominates the echo spectrum, allowing 3D localization in the presence of spectrum distortions caused by unknown reflector filtering. These experiments show that a fairly simple sonar device can extract more spatial information about realistic objects in its direct surroundings than is conventionally believed.
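Under an i.i.d. Gaussian noise assumption and a flat prior, the maximum a posteriori direction reduces to the AHRTF template with the smallest squared spectral distance to the measured echo spectrum. The sketch below uses that simplification with hypothetical data structures; the paper's probabilistic model is richer.

```python
import numpy as np

def localize(echo_spectrum, ahrtf_templates):
    """Return the (azimuth, elevation) whose stored binaural AHRTF
    template best explains the measured echo spectrum, i.e. the
    least-squared-error template under the stated assumptions."""
    best, best_err = None, np.inf
    for (az, el), template in ahrtf_templates.items():
        err = np.sum((echo_spectrum - template) ** 2)
        if err < best_err:
            best, best_err = (az, el), err
    return best

# Toy template library over a coarse direction grid (spectra in dB).
rng = np.random.default_rng(0)
templates = {(az, el): rng.normal(size=16)
             for az in (-30, 0, 30) for el in (-15, 0, 15)}
measured = templates[(30, 0)] + 0.1 * rng.normal(size=16)
print(localize(measured, templates))        # -> (30, 0)
```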
Article
Flying insects display remarkable agility, despite their diminutive eyes and brains. This review describes our growing understanding of how these creatures use visual information to stabilize flight, avoid collisions with objects, regulate flight speed, detect and intercept other flying insects such as mates or prey, navigate to a distant food source, and orchestrate flawless landings. It also outlines the ways in which these insights are now being used to develop novel, biologically inspired strategies for the guidance of autonomous, airborne vehicles.
Article
Echolocating bats constantly emit ultrasonic pulses and analyze the returning echoes to detect, localize, and classify objects in their surroundings. Echo classification is essential for bats' everyday life; for instance, it enables bats to use acoustic landmarks for navigation and to distinguish food sources from other objects. Most research on echo-based object classification in echolocating bats has been done in the context of simple artificial objects. These objects might represent prey, flowers, or fruit and are characterized by simple echoes with a single up to several reflectors. Bats, however, must also be able to use echoes that return from complex structures such as plants or other types of background. Such echoes are characterized by superpositions of many reflections that can only be described using a stochastic statistical approach. Scientists have only lately started to address the issue of complex echo classification by echolocating bats. Some behavioral evidence showing that bats can classify complex echoes has been accumulated, and several hypotheses have been suggested as to how they do so. Here, we present a first review of these data. We raise some hypotheses regarding possible interpretations of the data and point out necessary future directions that should be pursued.
Article
New approaches in the research fields of ultrasonic sensing, environment mapping and self-localization, as well as fault detection, diagnosis, and recovery for autonomous mobile systems are presented. A concept of high-resolution ultrasonic sensing by a multi-aural sensor configuration is proposed, which incorporates cross echoes between neighbour sensors as well as multiple echoes per sensor. In order to benefit from the increased sensor information, algorithms for adequate sensor data processing and sensor data fusion have been developed. In this context, it is described how local environment models can be created at different robot locations by extracting geometric primitives from laser range finder data and from ultrasonic sensor data. Additionally, the application of an extended Kalman filter to mobile robot self-localization based on previously modeled and newly extracted geometric primitives is explained. Furthermore, it is demonstrated how local environment models can be merged for building a global environment map. As a supplement for monitoring the state of environmental sensors, a fault detection model has been developed, which consists of sub-models for data from laser range finders and ultrasonic sensors.
Article
This paper presents a novel ultrasonic sensing system for autonomous mobile systems. We describe how wide-angled ultrasonic transducers can be used to obtain substantial information from the environment. This can be achieved by exploiting the overlapping of detection cones from neighbor sensors and by receiving cross echoes between them. The ultrasonic sensing system also allows the detection of multiple echoes from different echo paths for each sensor. In this way, a significantly higher number of echoes can be obtained in comparison to conventional ultrasonic sensing systems for mobile robots. In order to benefit from the increased sensor information, algorithms for adequate data post-processing are required. In this context, we describe how an environment model can be created from ultrasonic sensor data.
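One way to see the value of cross echoes: a direct echo at sensor A places the reflector on a circle around A, while a cross echo (transmit at A, receive at B) places it on an ellipse with foci A and B, so the distance to B follows by subtraction and the problem reduces to intersecting two circles. This is a geometric sketch under idealized point-reflector assumptions, not the paper's processing pipeline.

```python
import numpy as np

def reflector_from_cross_echo(A, B, r_direct, path_cross):
    """Locate a reflecting point from one direct echo (one-way range
    r_direct around A) and one cross echo (one-way path sum path_cross
    for transmit-at-A, receive-at-B): since |P-A| + |P-B| = path_cross,
    we know |P-B| = path_cross - r_direct, a two-circle intersection."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    rA, rB = r_direct, path_cross - r_direct
    d = np.linalg.norm(B - A)
    if not (abs(rA - rB) <= d <= rA + rB):
        return None                           # circles do not intersect
    a = (rA**2 - rB**2 + d**2) / (2 * d)      # distance from A along AB
    h = np.sqrt(max(rA**2 - a**2, 0.0))       # offset perpendicular to AB
    u = (B - A) / d
    n = np.array([-u[1], u[0]])
    return A + a * u + h * n                  # one of the two mirror solutions

# Two sensors 0.4 m apart observing a reflector at (0.2, 1.0).
A, B = np.array([0.0, 0.0]), np.array([0.4, 0.0])
P = np.array([0.2, 1.0])
r_direct = np.linalg.norm(P - A)                  # from A's own echo
path_cross = r_direct + np.linalg.norm(P - B)     # transmit A, receive B
print(reflector_from_cross_echo(A, B, r_direct, path_cross))  # ~ [0.2 1.0]
```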
Article
We introduce an ultrasonic sensor system that measures artificial potential fields (APFs) directly. The APF is derived from the traveling times of the transmitted pulses. Advantages of the sensor are that it needs only three transducers, that its design is simple, and that it measures a quantity that can be used directly for simple navigation tasks such as collision avoidance.
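A hedged sketch of the direct-measurement idea: convert times of flight to repulsive potentials and steer away from the side with the higher potential. The 1/d field, the gains, and the three-transducer layout here are illustrative assumptions.

```python
def apf_steering(tof_left, tof_center, tof_right, c=343.0, k=0.5):
    """Turn pulse traveling times (two-way, in seconds) into a steering
    command: nearer obstacles contribute larger repulsive potentials."""
    # One-way distances from two-way times of flight.
    d = [c * t / 2.0 for t in (tof_left, tof_center, tof_right)]
    potentials = [1.0 / max(di, 1e-3) for di in d]   # repulsive 1/d field
    # Steer away from the side with the higher potential.
    omega = k * (potentials[2] - potentials[0])
    speed = max(0.0, 0.5 - 0.1 * potentials[1])      # slow down when blocked
    return speed, omega

print(apf_steering(0.006, 0.012, 0.018))  # obstacle nearer on the left: turn right
```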
Article
An active sonar is described that adaptively changes its location and configuration in response to the echoes it observes in order to locate an object, position it at a known location, and identify it using features extracted from the echoes. The sonar consists of a center transmitter flanked by two receivers that can rotate, and it is positioned at the end of a robot arm with five-degree-of-freedom mobility. The sonar operates in air using Polaroid transducers that are resonant at 60 kHz with a nominal wavelength of 6 mm. The emitted pulse has a short duration with a useful bandwidth extending from 20 to 130 kHz. Using binaural information, the transmitter rotates to position an echo-producing object on its axis, maximizing the acoustic intensity incident on the nearest echo-producing feature. The receivers rotate to maximize the echo amplitude and bandwidth. These optimizations are useful for differentiating objects. The system recognizes a collection of ball bearings, machine washers, and rubber O-rings of different sizes ranging from 0.45 to 2.54 cm, some differing by less than 1 mm in diameter. Learning is accomplished by extracting vectors of 32 echo envelope values acquired during a scan in elevation and forming a database. Recognition is accomplished by comparing a single observed echo vector with the database to find the least-squared-error match. A bent-wire paper clip illustrates the recognition of an asymmetric, pose-dependent object.
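The recognition step amounts to a nearest-neighbor search under squared error. A minimal sketch with a hypothetical database layout and labels:

```python
import numpy as np

def recognize(echo_vector, database):
    """Match one observed 32-value echo-envelope vector against a learned
    database and return the label of the least-squared-error entry."""
    labels, vectors = zip(*database)
    errors = [np.sum((echo_vector - v) ** 2) for v in vectors]
    return labels[int(np.argmin(errors))]

rng = np.random.default_rng(1)
db = [("washer_10mm", rng.normal(size=32)),
      ("ball_4.5mm", rng.normal(size=32)),
      ("o_ring_25mm", rng.normal(size=32))]
observed = db[1][1] + 0.05 * rng.normal(size=32)
print(recognize(observed, db))              # -> "ball_4.5mm"
```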
Frontiers in sensing: from biology to engineering
  • H Peremans
  • F De Mey
  • F Schillebeeckx
H. Peremans, F. de Mey, and F. Schillebeeckx, "Frontiers in sensing: from biology to engineering," 2012. [Online]. Available: https://scholar.google.be/scholar?q=related:s7-regHJIKEJ:scholar.google.com/&hl=en&as_sdt=0,5#0
Trajectory sonar perception
  • R J Rikoski
  • J J Leonard