Figure 2
Source publication
Light Detection And Ranging (lidar) sensors are key to autonomous driving, but their data is severely impacted by weather events (rain, fog, snow). To increase the safety and availability of self-driving vehicles, an analysis of the phenomena at stake and of their consequences is necessary. This paper presents experiments performed in a climatic chamber...
Context in source publication
Context 1
... most automotive lidars, the time-of-flight (TOF) principle is then applied to estimate the distance between the sensor and the objects from the elapsed time between the emission of the photons and their backscattering on objects. The detected ranges and orientations of the laser beams (elevation and azimuth) result in 3D coordinates of impacts in the sensor reference frame, and the combination of all 3D impacts gathered during a scanning period produces a 3D pointcloud (Figure 2). In this work, we only consider digitized echoes, as the lidars currently available for autonomous driving (COTS systems) do not provide access to the FWM signal and directly output pointclouds. ...
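A minimal sketch of this range-and-angles-to-points computation, assuming per-beam round-trip times and beam angles are available (variable names and the frame convention are illustrative, not from the source):

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def tof_to_range(t_round_trip_s):
    """Range from round-trip time: light travels to the target and back."""
    return 0.5 * C * t_round_trip_s

def spherical_to_cartesian(r, azimuth_rad, elevation_rad):
    """Project (range, azimuth, elevation) beams to 3D points
    in the sensor frame (x forward, y left, z up; assumed convention)."""
    x = r * np.cos(elevation_rad) * np.cos(azimuth_rad)
    y = r * np.cos(elevation_rad) * np.sin(azimuth_rad)
    z = r * np.sin(elevation_rad)
    return np.stack([x, y, z], axis=-1)

# One scan period: stacking all echoes yields the 3D pointcloud.
t = np.array([2.0e-7, 6.67e-7])   # round-trip times (s)
az = np.radians([0.0, 45.0])      # azimuth of each beam
el = np.radians([1.0, -2.0])      # elevation of each beam
cloud = spherical_to_cartesian(tof_to_range(t), az, el)
print(cloud)  # ~30 m and ~100 m impacts as (x, y, z)
```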
Citations
... While research on general LiDAR pointcloud anomalies is limited, the topic of LiDAR performance under adverse weather conditions has been studied extensively [5]–[12]. Many of these studies focus on the performance degradation of LiDAR in rain and fog and have developed various quantification methods for aspects such as signal attenuation, visibility range, point density, and target reflectance. ...
LiDAR sensors play an important role in the perception stack of modern autonomous driving systems. Adverse weather conditions such as rain, fog, and dust, as well as occasional LiDAR hardware faults, may cause the LiDAR to produce pointclouds with abnormal patterns such as scattered noise points and uncommon intensity values. In this paper, we propose a novel approach to detect whether a LiDAR is generating an anomalous pointcloud by analyzing the pointcloud characteristics. Specifically, we develop a pointcloud quality metric based on the spatial and intensity distribution of the LiDAR points to characterize the noise level of the pointcloud, which relies on pure mathematical analysis and does not require any labeling or training as learning-based methods do. The method is therefore scalable and can be quickly deployed, either online to improve autonomy safety by monitoring anomalies in the LiDAR data, or offline to perform in-depth studies of LiDAR behavior over large amounts of data. The proposed approach is studied with extensive real public-road data collected by LiDARs with different scanning mechanisms and laser spectrums, and is shown to effectively handle various known and unknown sources of pointcloud anomaly.
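The excerpt does not give the metric itself; the following is a minimal sketch of the general idea only, assuming a k-nearest-neighbor distance term for spatial scatter and a standard-deviation term for intensity spread (both choices are illustrative, not the paper's):

```python
import numpy as np
from scipy.spatial import cKDTree

def noise_score(points, intensities, k=5):
    """Illustrative (not the paper's) quality metric: scattered noise
    points tend to have large k-nearest-neighbor distances, and noisy
    returns widen the intensity distribution."""
    tree = cKDTree(points)
    # distances to the k nearest neighbors (column 0 is the point itself)
    dists, _ = tree.query(points, k=k + 1)
    spatial_term = dists[:, 1:].mean()       # mean kNN distance
    intensity_term = np.std(intensities)     # spread of return intensities
    return spatial_term, intensity_term

# Usage: compare a clear-weather scan against a rainy one; both terms
# are expected to grow as scattered points and odd intensities appear.
pts = np.random.rand(1000, 3) * 50.0
inten = np.random.rand(1000)
print(noise_score(pts, inten))
```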
... Internal and structural factors such as sensor technology, model, and mounting position play a role in the degree of deterioration [6]. Additionally, adverse weather affects the intensity values, the number of points, and other point cloud characteristics (see Figure I) [7], [8]. In general, when the emitted light encounters particles in the air due to dust or adverse weather, it is backscattered or deflected. ...
... While there is a large body of research analyzing the performance degradation of LiDAR sensors under adverse weather conditions [7], [5], [6], [9], [10], [12], [13], [14], a comprehensive summary of algorithmic coping strategies for improved perception is missing. Furthermore, surveys on autonomous driving under adverse weather conditions that address weather-induced sensor deterioration [15], [16], [17] do not pinpoint the weather-related problems unique to the LiDAR sensor. ...
... Depending on the technology, characteristics, and configuration, different LiDAR models are more or less influenced by weather conditions [43], [15], [7], [8]. Due to eye-safety restrictions and the suppression of ambient light, two operating wavelengths for LiDAR sensors have prevailed: 905 nm and 1550 nm, with 905 nm used by the majority of available sensors. ...
Autonomous vehicles rely on a variety of sensors to gather information about their surroundings. The vehicle's behavior is planned based on the perceived environment, making the reliability of perception crucial for safety reasons. The active LiDAR sensor is able to create an accurate 3D representation of a scene, making it a valuable addition to environment perception for autonomous vehicles. Due to light scattering and occlusion, the LiDAR's performance changes under adverse weather conditions such as fog, snow, or rain. This limitation has recently fostered a large body of research on approaches to alleviate the decrease in perception performance. In this survey, we gathered, analyzed, and discussed different aspects of dealing with adverse weather conditions in LiDAR-based environment perception. We address topics such as the availability of appropriate data, raw point cloud processing and denoising, robust perception algorithms, and sensor fusion to mitigate weather-induced shortcomings. We furthermore identify the most pressing gaps in the current literature and pinpoint promising research directions.
... Although LiDAR is more robust than vision sensors to changing environmental conditions, a disadvantage is that it is affected by bad weather. Studies have demonstrated that LiDAR performance degrades in bad weather [12,13,18–22]. This performance degradation has been attributed to interference by moisture in the air [20,23]. ...
... Studies have demonstrated that LiDAR performance degrades in bad weather [12,13,18–22]. This performance degradation has been attributed to interference by moisture in the air [20,23]. LiDAR performance degradation under bad weather has been demonstrated through simulation [23] or at limited test sites, such as a climate chamber [20]. ...
... This performance degradation has been attributed to interference by moisture in the air [20,23]. LiDAR performance degradation under bad weather has been demonstrated through simulation [23] or at limited test sites, such as a climate chamber [20]. However, few studies have verified the effect of rain and fog on LiDAR detection performance at the same time and place on an actual road. ...
Light detection and ranging (LiDAR) is widely used in autonomous vehicles to obtain precise 3D information about surrounding road environments. However, under bad weather conditions, such as rain, snow, and fog, LiDAR detection performance is reduced. This effect has hardly been verified in actual road environments. In this study, tests were conducted with different precipitation levels (10, 20, 30, and 40 mm/h) and fog visibilities (50, 100, and 150 m) on actual roads. Square test objects (60 × 60 cm²) made of retroreflective film, aluminum, steel, black sheet, and plastic, commonly used in Korean road traffic signs, were investigated. The number of point clouds (NPC) and the intensity (reflection value of points) were selected as LiDAR performance indicators. These indicators decreased with deteriorating weather, in the order of light rain (10–20 mm/h), weak fog (<150 m), intense rain (30–40 mm/h), and thick fog (≤50 m). Retroreflective film preserved at least 74% of its clear-weather NPC under intense rain (30–40 mm/h) and thick fog (<50 m). Aluminum and steel were not observed at distances of 20–30 m under these conditions. ANOVA and post hoc tests suggested that these performance reductions were statistically significant. Such empirical tests should help clarify LiDAR performance degradation.
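How indicators of this kind might be computed from a scan is sketched below; the target-cropping approach and all names are assumptions for illustration, not the study's published procedure:

```python
import numpy as np

def npc_and_intensity(points, intensities, target_box):
    """Count returns (NPC) and average intensity on a known target,
    the two indicators used to track degradation across weather levels.
    `target_box` = (min_xyz, max_xyz) bounding the 60 x 60 cm plate."""
    lo, hi = target_box
    mask = np.all((points >= lo) & (points <= hi), axis=1)
    npc = int(mask.sum())
    mean_intensity = float(intensities[mask].mean()) if npc else 0.0
    return npc, mean_intensity

# Hypothetical usage: evaluate one scan per weather condition, then
# compare against clear weather, e.g. retention = npc_rain / npc_clear
# (>= 0.74 for retroreflective film in the study above).
```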
... Without focusing on differences in sensor design, Heinzler et al. [8] investigated the performance of LiDARs from Velodyne and Valeo under lab-induced fog and rain conditions and emphasised the increasing reduction in measurement range and the number of false positive detections close to the sensor. Similarly, Montalban et al. [11] tested LiDARs from Velodyne, Ouster, Livox, Cepton, and AEye under artificial rain and fog conditions. In doing so, the authors evaluated the relative backscattered energy and the number of points both on a defined hard target and in the optical channel between sensor and target (noise), under varying precipitation rates and fog densities. ...
... Nevertheless, the majority of works in this field lack validation with respect to real weather conditions, in some cases showing strong deviations from the expected measurement behavior. Hence, the comparability between artificially generated weather conditions and real outdoor conditions is limited, as specifically emphasised by Montalban et al. [11] and Rasshofer et al. [16]. This strongly underlines the importance of sensor testing and model validation under realistic environmental conditions. ...
... Although the extinction efficiency itself is, for typical LiDAR wavelengths and in the case of rain, not sensitive to the particle size, it is multiplied by the squared particle diameter as well as by the drop size distribution (DSD) to finally determine the extinction coefficient. Consequently, small deviations from natural precipitation characteristics can cause unexpected signal extinction, as observed in earlier works (see section II, [11], [16]). ...
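For reference, the relationship described in this excerpt corresponds to the standard form of the extinction coefficient, in which the extinction efficiency is weighted by the geometric cross-section (hence the squared diameter) and the drop size distribution, then integrated over all drop diameters:

```latex
% Extinction coefficient as reconstructed from the description above:
% Q_ext(D) is the extinction efficiency, pi D^2 / 4 the geometric
% cross-section, and N(D) the drop size distribution.
\[
  \alpha_{\mathrm{ext}} \;=\; \int_{0}^{\infty}
    Q_{\mathrm{ext}}(D)\,\frac{\pi D^{2}}{4}\,N(D)\,\mathrm{d}D
\]
```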
There is a strong demand for high-fidelity sensor models capable of simulating realistic automotive sensor perception for Radar, LiDAR, and camera sensors in real time, in order to virtually validate advanced driving assistance functions such as lane change assist (LCA), automated emergency braking (AEB), or even path planning. For central data fusion, the sensor models need to deliver realistic, artificial sensor raw data. Optical sensors in particular are heavily influenced by precipitation, fog, and sun irradiance. However, most LiDAR models lack the capability of replicating the impact of specific weather characteristics. Furthermore, in contrast to the numerous publicly available LiDAR datasets, there is a strong lack of datasets annotated with quantitative weather data, such as the precipitation rate or meteorological visibility, needed to develop and validate such models. Hence, within this work, an automated infrastructure is set up to measure time-correlated LiDAR and weather data in order to develop and calibrate weather models. The effect of varying precipitation rates on an automotive Flash LiDAR system is demonstrated based on in-field measurements, and a corresponding modeling methodology is developed. Based on the in-field measurement data, raw-data LiDAR models can be developed which augment both virtual LiDAR data obtained from raytracing-capable driving simulation suites and real data recorded under ideal weather conditions.
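As a hedged sketch of the kind of weather augmentation described above (not the authors' calibrated model), a Beer-Lambert two-way attenuation can be applied to clear-weather returns, dropping echoes that fall below a detection threshold; `alpha` and `p_min` are illustrative parameters:

```python
import numpy as np

def attenuate_returns(ranges, powers, alpha, p_min):
    """Sketch of a rain/fog augmentation: received power decays as
    exp(-2 * alpha * R) over the two-way path (Beer-Lambert), and
    echoes falling below the detection threshold p_min are dropped."""
    attenuated = powers * np.exp(-2.0 * alpha * ranges)
    keep = attenuated >= p_min
    return ranges[keep], attenuated[keep]

# alpha: extinction coefficient (1/m) derived from the precipitation
# rate; calibrating it against measured weather data is the hard part.
r = np.array([10.0, 50.0, 120.0])
p = np.array([1.0, 1.0, 1.0])
print(attenuate_returns(r, p, alpha=5e-3, p_min=0.4))  # 120 m echo lost
```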
... Concerning degraded meteorological conditions, only a few studies exist, based on lidar sensors [13–16], visible cameras [17], or the fusion of data [18–20]. ...
Object detection is recognized as one of the most critical research areas for the perception of self-driving cars. Current vision systems combine visible imaging, LIDAR, and/or RADAR technology, allowing perception of the vehicle's surroundings. However, harsh weather conditions degrade the performance of these systems. Under these circumstances, thermal imaging becomes the complementary solution, not only because it makes it possible to detect and recognize the environment in the most extreme conditions, but also because thermal images are compatible with detection and recognition algorithms, such as those based on artificial neural networks. In this paper, an analysis of the resilience of thermal sensors in very unfavorable fog conditions is presented. The goal was to study the operational limits, i.e., the severely degraded fog situation beyond which a thermal camera becomes unreliable. For the analysis, the mean pixel intensity and the contrast were used as indicators. Results showed that the angle of view (AOV) of a thermal camera is a determining parameter for object detection in foggy conditions. Additionally, results show that cameras with AOVs of 18° and 30° are suitable for object detection, even under thick fog conditions (from 13 m meteorological optical range). These results were extended using object detection software, with which it is shown that, for pedestrians, a detection rate ≥90% was achieved using the images from the 18° and 30° cameras.
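The two indicators named in this abstract, mean pixel intensity and contrast, can be computed as follows; the RMS definition of contrast is an assumption here, since the excerpt does not specify which one the authors used:

```python
import numpy as np

def fog_indicators(gray_image):
    """Mean pixel intensity and RMS contrast of a single frame."""
    img = gray_image.astype(np.float64)
    mean_intensity = img.mean()
    rms_contrast = img.std()  # one standard definition of contrast
    return mean_intensity, rms_contrast

# As fog thickens, contrast is expected to collapse toward zero while
# the mean intensity drifts toward the fog's uniform radiometric level.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
print(fog_indicators(frame))
```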
... Recent studies have focused on the analysis of a specific noise factor on a specific sensor type, e.g. rain on LiDAR [4], [5], interference on RADAR [6], [7], or fog on camera [8]. However, an in-depth investigation of noise factors is required. ...
Assisted and automated driving functions are increasingly deployed to improve safety and efficiency and to enhance the driver experience. However, key technical challenges still need to be overcome, such as the degradation of perception sensor data due to noise factors. The quality of the data generated by sensors directly impacts the planning and control of the vehicle, and thus its safety. This work builds on a recently proposed framework for analysing noise factors on automotive LiDAR sensors and deploys it to camera sensors, focusing on the specific disturbed sensor outputs via a detailed analysis and classification of camera-specific automotive noise sources (30 noise factors are identified and classified in this work). Moreover, the noise factor analysis identified two omnipresent and independent noise factors (i.e. obstruction and windshield distortion). These noise factors were modelled to generate noisy camera data, and their impact on the perception step, based on deep neural networks, was evaluated with the noise factors applied both independently and simultaneously. It is demonstrated that the performance degradation from the combination of noise factors is not simply the accumulation of each single factor's degradation, which raises the importance of analysing multiple noise factors simultaneously. Thus, the framework can support and enhance the use of simulation for the development and testing of automated vehicles through careful consideration of the noise factors affecting camera data.
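To illustrate the point about combined noise factors, the sketch below composes two stand-in perturbations; both models are hypothetical placeholders, not the paper's. Evaluating a detector on the composed image is what reveals the non-additive degradation the abstract describes:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def obstruction(img, rng, frac=0.1):
    """Hypothetical obstruction model: occlude a random image patch."""
    h, w = img.shape[:2]
    ph, pw = int(h * frac), int(w * frac)
    y, x = rng.integers(0, h - ph), rng.integers(0, w - pw)
    out = img.copy()
    out[y:y + ph, x:x + pw] = 0
    return out

def windshield_distortion(img, sigma=1.5):
    """Hypothetical distortion model: a blur standing in for
    refraction through an imperfect windshield."""
    return gaussian_filter(img, sigma=sigma)

# Joint application: a detector's accuracy on obstruction(distortion(img))
# can differ from the sum of each factor's standalone effect.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(480, 640)).astype(np.float64)
noisy = obstruction(windshield_distortion(img), rng)
```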