Robust sonar feature detection for the SLAM of mobile robot

Conference Paper · September 2005
DOI: 10.1109/IROS.2005.1545284 · Source: IEEE Xplore
Conference: 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005)
Abstract
Sonar sensors are an attractive tool for the SLAM of mobile robots because of their low cost. These cheap sensors give relatively accurate range readings once angular uncertainty and specular reflections are disregarded. However, these defects make feature detection difficult for most of the SLAM process. This paper proposes a robust sonar feature detection algorithm that provides detection methods for both point features and line features. The point feature detection method is based on the TBF (triangulation-based fusion) scheme of Wijk and Christensen (2000). Moreover, three additional processes improve the performance of feature detection: 1) stable intersections; 2) an efficient sliding-window update; and 3) removal of false point features on walls. The line feature detection method is based on a basic property of adjacent sonar sensors: along a line feature, three adjacent sonar sensors give similar range readings. Using this sensor property, we propose a novel line feature detection algorithm that is simple and obtains the feature using only current sensor data. The proposed feature detection algorithm is a good solution for the SLAM of mobile robots because it gives accurate feature information for both point and line features even in the presence of sensor errors. Furthermore, a sufficient number of features are available to correct the mobile robot's pose. Experimental results of EKF-based SLAM demonstrate the performance of the proposed feature detection algorithm in a home-like environment.
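The line feature test described in the abstract (three adjacent sonar sensors returning similar ranges along a wall) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name, the 5 cm range tolerance, and the endpoint-based orientation estimate are all assumptions.

```python
import math

def detect_line_segment(angles, ranges, range_tol=0.05):
    """Check whether three adjacent sonar sensors see the same wall.

    angles -- beam directions (rad) of three adjacent sensors in the robot frame
    ranges -- the corresponding range readings (m)

    Along a line feature, adjacent sensors return similar ranges, so a
    small spread between the three readings suggests a wall segment.
    Returns (points, orientation) on success, or None.
    """
    if max(ranges) - min(ranges) > range_tol:
        return None  # readings disagree: no line feature here
    # Project each reading to a 2-D point on the beam axis and estimate
    # the line orientation from the two outer points.
    pts = [(r * math.cos(a), r * math.sin(a)) for a, r in zip(angles, ranges)]
    (x0, y0), (x2, y2) = pts[0], pts[-1]
    theta = math.atan2(y2 - y0, x2 - x0)
    return pts, theta
```

Using only the current readings of a sensor triplet, as the abstract emphasizes, keeps the test cheap enough to run at every scan.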
    • The sensor used is the HC-SR04 by Elecfreaks; its transmitter and receiver are built in together, so its accuracy is greater and can reach 3 mm [5]. Some researchers combine two sensor devices for mapping and detection, such as a vision/laser sensor together with a sonar sensor (SLAM Methods [2]).
    File · Data · Jul 2016 · Advanced Robotics
    • The solution to the SLAM problem allows deployment of robots in many applications such as search and rescue operations, underwater surveillance and gas distribution mapping. Numerous SLAM techniques have been developed by previous researchers utilizing different devices, including sonar sensors [2,3], cameras [4-6] and laser scanners [7-11]. However, the methods are often device-specific due to the different sensing modalities and capabilities of each device.
    ABSTRACT: This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open-source SLAM algorithms. The experiments were conducted in two different environments: a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real-time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of a Windows-based SLAM algorithm with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of the Kinect's depth sensor often cause the map to be inaccurate, especially in featureless areas; therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks.
    Full-text · Article · Dec 2014
    • Particularly, localization and driving control of AGV are important elements of autonomous techniques [5] [6] [7]. Among the techniques relevant to AGVs, positioning is the most important because all autonomous techniques are based on AGV position information [8] [9] [10] [11] [12] [13]. Generally, an AGV positioning system uses a global positioning sensor in conjunction with local positioning sensors.
    ABSTRACT: We designed and implemented a fork-type automatic guided vehicle (AGV) with a laser guidance system. Most previous AGVs have used two types of guidance systems: magnet-gyro and wire guidance. However, these guidance systems have high costs, are difficult to maintain with changes in the operating environment, and can drive only a pre-determined path with installed sensors. A laser guidance system was developed for addressing these issues, but limitations including slow response time and low accuracy remain. We present a laser guidance system and control system for AGVs with laser navigation. For analyzing the performance of the proposed system, we designed and built a fork-type AGV, and performed repetitions of our experiments under the same working conditions. The results show an average positioning error of 51.76 mm between the simulated driving path and the driving path of the actual fork-type AGV. Consequently, we verified that the proposed method is effective and suitable for use in actual AGVs.
    Full-text · Article · Dec 2013
    • Inaccuracies present in the localization system can be corrected by using an accelerometer. Maps can be refined by improving feature detection of SONAR transducers [9], [10]. Another method of enhancing results is the fusion of SONAR with vision [11], or with laser range finders [12], [13].
    Article · Jan 2013 · Advanced Robotics
    • Sonar transmits sound waves stepwise and receives them after they encounter obstacles; we call each sound-wave beam one ping, and every ping can be divided into several bins. The relative position of a bin reflects the distance between the sonar and the obstacle: the larger the intensity of a bin, the more obvious the feature, and vice versa [31]. As Figure 5 shows, the sonar scans stepwise in a given sector, and finds one obstacle in the k-th bin of the i-th and (i+1)-th pings, respectively.
    ABSTRACT: This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix in an extended information filter (EIF) can be pruned to achieve an efficient approach, the sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. The mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to the vehicle motion. In order to verify the feasibility of the navigation methods proposed for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves the accuracy of the navigation compared with the conventional method; moreover, the algorithm has a low computational cost when compared with EKF-SLAM.
    Full-text · Article · Dec 2011
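The ping/bin geometry quoted above (each ping is a beam, each bin a range cell along it) can be made concrete with a small sketch. Everything here is a hypothetical layout, not the cited paper's code: the function name, the half-bin range offset, and the (x, y, heading) pose convention are all assumptions.

```python
import math

def bin_to_point(ping_index, bin_index, start_angle, step_angle,
                 bin_length, sonar_pose=(0.0, 0.0, 0.0)):
    """Map the k-th bin of the i-th ping to a 2-D point.

    Each ping is a beam fired at angle start_angle + ping_index * step_angle
    relative to the sonar heading; the k-th bin covers range
    [k, k + 1] * bin_length, so its centre lies at (k + 0.5) * bin_length
    along the beam. sonar_pose is (x, y, heading) of the sonar head.
    """
    sx, sy, sh = sonar_pose
    beam_angle = sh + start_angle + ping_index * step_angle
    rng = (bin_index + 0.5) * bin_length
    return (sx + rng * math.cos(beam_angle),
            sy + rng * math.sin(beam_angle))
```

With such a mapping, a high-intensity bin shared by the i-th and (i+1)-th pings, as in the quoted Figure 5, projects to two nearby points on the same obstacle.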
    • To cluster sonar measurements in the feature window, we first set the window reference position x_t^WR, which is the most recent robot position in the window. All sonar readings in the window are transformed to positions in the robot coordinate frame at x_t^WR, and the readings are then grouped into several clusters through the sonar measurement selection method [18] and the distance between features (Fig. 1b). For the sonar measurement clustering in the feature window:
    ABSTRACT: This paper addresses a solution of simultaneous localization and mapping (SLAM) for sonar readings based on a neuro-evolutionary optimization algorithm. In the past two decades, numerous studies have attempted to solve the SLAM problem using laser scanners and vision sensors. However, relatively little research has been carried out on a sonar-based SLAM algorithm, because the bearing accuracy and resolution of sonars are not enough to find consistent features for SLAM. The proposed algorithm in this paper solves the sonar-based SLAM as a global optimization problem using the cost function that represents the quality of a robot's trajectory in the world coordinate frame. In our algorithm, a neural network helps to estimate the robot's pose error accurately using sonar inputs at each position and the pose difference between two consecutive robot poses, and evolutionary programming is used to find the most suitable neural network. By way of learning and evolution, our algorithm does not need a prior assumption on the motion and sensor models, and therefore shows a robust performance regardless of the actual noise type. Our neural network-based SLAM algorithm is applied to a robot that has sonar sensors. The various experimental results demonstrate that the neural network-based SLAM guarantees a consistent environmental map under sonar readings that in general are known to have poor bearing accuracy and resolution.
    Full-text · Article · May 2010
Article
December 2005 · Proceedings of SPIE - The International Society for Optical Engineering · Impact Factor: 0.20
Sonar sensors are an attractive tool for the SLAM of mobile robots because of their low cost. These cheap sensors give relatively accurate range readings once angular uncertainty and specular reflections are disregarded. However, these defects make feature detection difficult for most of the SLAM process. This paper proposes a robust sonar feature detection algorithm. This algorithm gives...
Conference Paper
October 2006
To increase the intelligence of a mobile robot, various sensors need to be fused effectively to cope with uncertainty induced by both the environment and the sensors. Combining sonar and vision sensors offers numerous advantages of economical efficiency and complementary cooperation. In particular, it can remedy the false data association and divergence problems of sonar sensors, and overcome low frequency...
Conference Paper
November 2006
To implement an autonomous mobile robot, both SLAM and task-based navigation algorithms should be performed successfully. In particular, the performance of the estimation while the mobile robot performs task-based navigation should be guaranteed. For this purpose, we integrate a SLAM method and a navigation algorithm for a practical autonomous mobile robot. The SLAM method combines sonar sensors...
Conference Paper
October 2006
Reliable data association is crucial to localization and map building for mobile robot applications. For that reason, many mobile robots tend to choose vision-based SLAM solutions. In this paper, a SLAM scheme based on visual object recognition, not just scene matching, in a home environment is proposed without using artificial landmarks. For the object-based SLAM, the following algorithms are...