Conference Paper

Optimizing Autonomous Vehicle Sensor Setups: A Framework for Coverage Analysis

Article
Full-text available
The analysis of data from automotive sensors is a crucial part of the development, maintenance, and improvement of the automotive industry as a whole. Examining data from various sensors shows how effective collection and analysis can yield important insights, which is essential for improving sensor and system functionalities. With the advancement of machine learning techniques, there has been great success in the detection of patterns and the prediction of anomalies. This paper explores the analysis of automotive sensor data: its importance, a brief overview of the methodology, key insights and implications, and recommendations for future research aimed at overcoming existing limitations.
Conference Paper
Full-text available
Autonomous shuttle vehicles offer a promising avenue for deploying fully autonomous vehicles on the road and integrating them into real-world traffic. Since traditional vehicle manufacturers have hardly offered this vehicle concept to date, the market is being shaped by many new companies with specially developed vehicle concepts. These vehicles carry numerous sensors to perceive the environment for fully autonomous operation. This paper investigates autonomous shuttles with respect to their sensor sets, focusing specifically on sensor positioning and body integration. It is shown that despite serving the same use case, the vehicles differ in sensor positioning and use different sensor integration strategies. Further, the integration strategies are clustered and analyzed with respect to their impact on the vehicle design. Based on this, sensor positioning and integration requirements for the concept phase are derived.
Article
Full-text available
Despite the progress in driving automation, the market introduction of higher-level automation has not yet been achieved. One of the main reasons for this is the effort of safety validation to prove functional safety to the customer. Virtual testing may help overcome this challenge, but the modelling of machine perception and proving its validity have not been solved completely. The present research focuses on a novel modelling approach for automotive radar sensors. Due to the complex high-frequency physics of radars, sensor models for vehicle development are challenging. The presented approach is semi-physical and based on experiments: a selected commercial automotive radar was applied in on-road tests where the ground truth was recorded with a precise measurement system installed in the ego and target vehicles. On the one hand, observed high-frequency phenomena were reproduced in the model using physically based equations such as antenna characteristics and the radar equation. On the other hand, high-frequency effects were statistically modelled using adequate error models derived from the measurements. The model was evaluated with performance metrics developed in previous works and compared to a commercial radar sensor model. Results show that, while keeping the real-time performance necessary for X-in-the-loop applications, the model achieves remarkable fidelity as assessed by probability density functions of the radar point clouds and the Jensen–Shannon divergence. The model delivers radar cross-section values for the radar point clouds that correlate well with measurements, comparable with the Euro NCAP Global Vehicle Target Validation process, and it outperforms a comparable commercial sensor model.
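The physically based part of such a model rests on the classical free-space radar equation. As a minimal illustrative sketch (not the paper's implementation; all numeric values below are assumed, typical orders of magnitude for a 77 GHz automotive radar), received power falls off with the fourth power of range:

```python
import numpy as np

def received_power(p_t, gain, wavelength, rcs, r):
    """Free-space radar equation: P_r = P_t * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4)."""
    return p_t * gain**2 * wavelength**2 * rcs / ((4 * np.pi) ** 3 * r**4)

# Illustrative values (assumptions, not from the paper)
p_t = 0.01                # transmit power in W (10 dBm)
gain = 10 ** (25 / 10)    # 25 dBi antenna gain, converted to linear scale
wavelength = 3e8 / 77e9   # ~3.9 mm at 77 GHz
rcs = 10.0                # radar cross-section of a car in m^2 (typical order)
for r in (10.0, 50.0, 100.0):
    print(f"R = {r:5.1f} m -> P_r = {received_power(p_t, gain, wavelength, rcs, r):.3e} W")
```

The R^-4 dependence is why doubling the range cuts the received power by a factor of 16, which a statistical error model then perturbs with measured noise characteristics.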
Conference Paper
Full-text available
The automotive design process prevalent in industry, which dictates transportation design education, is optimized to facilitate the frequent aesthetic renewal of personally owned vehicles for car-oriented cities. With its origins in the late 1920s, this hyper-specialized design process has barely changed from its original form. In this paper, we provide a brief account of the automotive design process from its origins (analogue) to the present day (digital technologies), followed by a new paradigm instigated by immersive technologies. A passenger drone project is used as an example to describe the possibilities of immersive technologies in this radically innovated process. Enhancements from 2D and 3D to immersive and interactive 4D enable a lean, yet contextualized process to design radically innovative vehicles.
Article
Full-text available
With the significant advancement of sensor and communication technology and the reliable application of obstacle detection techniques and algorithms, automated driving is becoming a pivotal technology that can revolutionize the future of transportation and mobility. Sensors are fundamental to the perception of vehicle surroundings in an automated driving system, and the use and performance of multiple integrated sensors can directly determine the safety and feasibility of automated driving vehicles. Sensor calibration is the foundation block of any autonomous system and its constituent sensors and must be performed correctly before sensor fusion and obstacle detection processes may be implemented. This paper evaluates the capabilities and the technical performance of sensors which are commonly employed in autonomous vehicles, primarily focusing on a large selection of vision cameras, LiDAR sensors, and radar sensors and the various conditions in which such sensors may operate in practice. We present an overview of the three primary categories of sensor calibration and review existing open-source calibration packages for multi-sensor calibration and their compatibility with numerous commercial sensors. We also summarize the three main approaches to sensor fusion and review current state-of-the-art multi-sensor fusion techniques and algorithms for object detection in autonomous driving applications. The current paper, therefore, provides an end-to-end review of the hardware and software methods required for sensor fusion object detection. We conclude by highlighting some of the challenges in the sensor fusion field and propose possible future research directions for automated driving systems.
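Extrinsic calibration, one of the calibration categories the review covers, ultimately yields a rigid-body transform that maps points from one sensor frame into a common vehicle frame before fusion. A minimal sketch of applying such a transform (the mounting values are assumptions for illustration, not from the paper):

```python
import numpy as np

def transform_points(points, rotation, translation):
    """Map Nx3 points from the sensor frame into the vehicle frame:
    p_vehicle = R @ p_sensor + t."""
    return points @ rotation.T + translation

# Illustrative extrinsics: LiDAR yawed 90 degrees, mounted 1.5 m above the reference frame
yaw = np.pi / 2
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
t = np.array([0.0, 0.0, 1.5])

pts = np.array([[1.0, 0.0, 0.0]])   # a point 1 m along the LiDAR's x-axis
print(transform_points(pts, R, t))  # same point expressed in the vehicle frame
```

Only after all sensors' outputs are expressed in one such common frame do sensor fusion and obstacle detection become meaningful.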
Article
Full-text available
Array programming provides a powerful, compact and expressive syntax for accessing, manipulating and operating on data in vectors, matrices and higher-dimensional arrays. NumPy is the primary array programming library for the Python language. It has an essential role in research analysis pipelines in fields as diverse as physics, chemistry, astronomy, geoscience, biology, psychology, materials science, engineering, finance and economics. For example, in astronomy, NumPy was an important part of the software stack used in the discovery of gravitational waves and in the first imaging of a black hole. Here we review how a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring and analysing scientific data. NumPy is the foundation upon which the scientific Python ecosystem is constructed. It is so pervasive that several projects, targeting audiences with specialized needs, have developed their own NumPy-like interfaces and array objects. Owing to its central position in the ecosystem, NumPy increasingly acts as an interoperability layer between such array computation libraries and, together with its application programming interface (API), provides a flexible framework to support the next decade of scientific and industrial analysis.
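The array-programming paradigm the paper describes can be seen in two of its core concepts, vectorization and broadcasting, in a short sketch:

```python
import numpy as np

# Vectorization: one expression replaces an explicit loop, computing the
# distance of each 3D point to the origin.
points = np.array([[3.0, 4.0, 0.0],
                   [1.0, 2.0, 2.0]])
dists = np.linalg.norm(points, axis=1)   # -> [5. 3.]

# Broadcasting: subtract the per-column mean from every row without a loop.
centered = points - points.mean(axis=0)
print(dists)
print(centered)
```

Both idioms push the looping into compiled code, which is what makes NumPy a practical foundation for the analysis pipelines mentioned above.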
Article
Full-text available
Designing an acquisition system for 2D or 3D information based on the integration of data provided by different sensors is a task that requires a labor-intensive initial design phase. Indeed, the definition of the architecture of such acquisition systems needs to start from the identification of the position and orientation of the sensors observing the scene. Their placement is carefully studied to enhance the efficacy of the system, which often coincides with the need to maximize the surfaces observed by the sensors or some other metric. An automatic optimization procedure based on the Particle Swarm Optimization (PSO) algorithm is proposed to seek the most convenient setting of multiple optical sensors observing a 3D scene. The procedure has been developed to provide a fast and efficient tool for 2D and 3D data acquisition. Three different objective functions of general validity, to be used in future applications, are proposed and described in the text. Various filters are introduced to reduce the computational times of the whole procedure. The method is capable of handling occlusions from undesired obstacles in the scene. Finally, the entire method is discussed with reference to two case studies: (1) the development of a body scanner for the arm-wrist-hand region and (2) the acquisition of an indoor environment.
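The core idea of using PSO for sensor placement can be sketched in a toy 2D form. This is not the paper's procedure or objective function; it is a minimal, self-contained example (all parameters assumed) that places a single range-limited sensor to maximize the fraction of scene points it observes:

```python
import numpy as np

rng = np.random.default_rng(0)
targets = rng.uniform(0, 10, size=(200, 2))   # scene points to observe

def coverage(pos, radius=3.0):
    """Objective: fraction of scene points within sensing range of `pos`."""
    return np.mean(np.linalg.norm(targets - pos, axis=1) <= radius)

# Minimal PSO: velocity update with inertia, cognitive and social terms
n, iters = 30, 60
x = rng.uniform(0, 10, size=(n, 2))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([coverage(p) for p in x])
gbest = pbest[pbest_f.argmax()].copy()
for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, 0, 10)                 # keep particles in the scene
    f = np.array([coverage(p) for p in x])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()
print("best position:", gbest, "coverage:", coverage(gbest))
```

The paper's actual objectives operate on observed 3D surfaces with occlusion handling; the swarm mechanics, however, follow this same update scheme.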
Article
This paper reviews the state of the art of Light Detection and Ranging (LiDAR) sensors for automotive applications, and particularly for automated vehicles, focusing on recent advances in the field of integrated LiDAR and one of its key components: the Optical Phased Array (OPA). LiDAR is still a sensor that divides the automotive community, with several automotive companies investing in it and some stating that LiDAR is a 'useless appendix'. However, currently no single sensor technology is able to robustly and completely support automated navigation. Therefore, LiDAR, with its capability to map the vehicle surroundings in three dimensions (3D), is a strong candidate to support Automated Vehicles (AVs). This manuscript highlights current AV sensor challenges and analyses the strengths and weaknesses of the perception sensors currently deployed. It then discusses the main LiDAR technologies emerging in automotive, focusing on integrated LiDAR, the challenges associated with light beam steering on a chip, and the use of Optical Phased Arrays, and finally discusses the factors currently hindering the adoption of silicon photonics OPAs and their future research directions.
Poster
With an increasing degree of automation, vehicles require more and more perception sensors to observe their surrounding environment. Car manufacturers are facing the challenge of defining a suitable sensor setup that covers all requirements. Besides the sensors' performance and field-of-view coverage, other factors like setup costs, vehicle integration and design aspects need to be taken into account. Additionally, a redundant sensor arrangement and the sensors' sensitivity to environmental influences are of crucial importance for safety. It is not feasible to explore every possible sensor combination in test drives. This paper presents a new simulation-based evaluation methodology, which allows the configuration of arbitrary sensor setups and enables virtual test drives within specific scenarios to evaluate the environmental perception in early development phases with metrics and key performance indicators. This evaluation suite is an important tool for researchers and developers to analyze setup correlations and to define optimal setup solutions.
Conference Paper
This paper proposes a methodology for determining strategically bundled, relevant test scenarios for the simulation-based evaluation of sensor constellations. This is achieved by (a) identifying important use cases for autonomous driving operation, (b) converting these use cases into Regions of Interest (ROIs) around the vehicle, (c) defining a critical index (CI) for each of these regions, (d) deriving crucial scenarios, and (e) categorising them into scenario families. The derived test scenarios help to optimise the field of view of the sensor constellation for the most important regions around the ego vehicle. The novelty lies in the method's independence from traditional ways of deriving test scenarios and its capability of providing targeted feedback to improve the sensor constellation at the identified pain points. The test scenario families can reduce the development time of highly automated vehicles by enabling virtual testing of the sensor constellation's performance in the vehicle concept phase.
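One way to picture how ROIs and a critical index combine into a single figure of merit is a weighted coverage score. The ROI names, weights, and coverage values below are invented for illustration; the paper's actual CI definition and scoring are not reproduced here:

```python
# Illustrative sketch: weight each Region of Interest (ROI) around the ego
# vehicle by a critical index and score a sensor constellation by its
# CI-weighted field-of-view coverage. All values are assumptions.
rois = {                      # ROI -> (critical index, fraction covered by the setup)
    "front_near":  (1.0, 0.95),
    "front_far":   (0.8, 0.70),
    "side_left":   (0.6, 0.85),
    "side_right":  (0.6, 0.80),
    "rear":        (0.4, 0.60),
}

# Normalized weighted sum: high-CI regions dominate the score, so gaps there
# are the "pain points" that targeted feedback would flag first.
score = sum(ci * cov for ci, cov in rois.values()) / sum(ci for ci, _ in rois.values())
print(f"weighted coverage score: {score:.3f}")
```

Under such a scheme, improving coverage in a high-CI region raises the score far more than the same improvement in a low-CI one, which is the intuition behind optimising the field of view for the most important regions.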
Conference Paper
Automotive sensor systems play an essential role for highly and fully automated driving by enabling environmental perception. Besides improving the sensors' performance, the data quality and the information processing, the arrangement of the vehicular surround sensors is crucial. The conceptual design of robust and reliable sensor systems is a complex task since they comprise a high number of different sensors, each with diverse, adjustable parameters. At this early vehicle concept phase, it is not possible to examine the performance of all system variations in field tests. Thus, we established a simulation-based evaluation methodology to investigate the interaction of multiple sensors as a system and the effects of particular sensor arrangements on object detection and the vehicle's surround-view coverage. This paper highlights the considerations while designing sensor systems and analyzes the impact of different sensor positions, especially by systematically altering the mounting height.
Article
In emerging autonomous vehicles, perception of the environment around the vehicle depends not only on the quality and choice of sensor type, but more importantly also on the instrumented location and orientation of each of the sensors. This article explores the synthesis of heterogeneous sensor configurations towards achieving vehicle autonomy goals. We propose a novel optimization framework called VESPA that explores the design space of sensor placement locations and orientations to find the optimal sensor configuration for a vehicle. We demonstrate how our framework can obtain optimal sensor configurations for heterogeneous sensors deployed across two contemporary real vehicles.
Conference Paper
Car manufacturers are facing the challenge of defining suitable sensor setups that cover all requirements for the particular SAE level of automated driving. Besides the sensors' performance and surround-view coverage, other factors like vehicle integration, costs and design aspects need to be taken into account. Additionally, a redundant sensor arrangement and the sensors' sensitivity to environmental influences are of crucial importance for safety. By increasing the degree of automation, vehicles require more external sensors to observe their surrounding environment sufficiently, which raises the variety of setup configurations and the difficulty to identify the optimal one. Concerning the vehicle development process, concepts for sensor setups need to be defined at a very early stage. In this concept stage, it is not feasible to explore every possible sensor arrangement with test drives or to simulate the setup performance with tools used for vehicle validation. Thus, we propose a new simulation-based evaluation method, which allows the configuration of arbitrary sensor setups and enables virtual test drives within specific scenarios to evaluate the setup performance in this early development phase with metrics and key performance indicators. Two different setups are analyzed to demonstrate the results of this evaluation method.
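The surround-view coverage and redundancy that such evaluations quantify can be sketched geometrically in 2D. This is a toy example with assumed sensor parameters, not the paper's method: two forward-facing sensors are modelled as circular sectors, and coverage and redundant (doubly observed) fractions are measured on a grid around the ego vehicle:

```python
import numpy as np

def covered(grid, pos, heading, fov, range_max):
    """Boolean mask of grid points inside a 2D sector field of view."""
    d = grid - pos
    dist = np.linalg.norm(d, axis=1)
    ang = np.arctan2(d[:, 1], d[:, 0])
    dang = np.abs((ang - heading + np.pi) % (2 * np.pi) - np.pi)  # wrapped angle offset
    return (dist <= range_max) & (dang <= fov / 2)

# 1 m grid around the ego vehicle; two illustrative front sensors (values assumed):
# a wide, short-range camera and a narrow, long-range radar.
xs, ys = np.meshgrid(np.arange(-30, 31), np.arange(-30, 31))
grid = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)

front_cam = covered(grid, np.array([2.0, 0.0]), 0.0, np.deg2rad(120), 40.0)
front_radar = covered(grid, np.array([2.0, 0.0]), 0.0, np.deg2rad(20), 60.0)

total = front_cam | front_radar      # observed by at least one sensor
redundant = front_cam & front_radar  # observed by both (safety-relevant overlap)
print(f"covered: {total.mean():.1%}, redundant: {redundant.mean():.1%}")
```

Metrics of this kind, computed over many candidate setups and scenarios, are what make a simulation-based comparison of sensor arrangements tractable in the concept phase.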
Article
Conventional radar modeling methods for automotive applications are either function-based or physics-based. The former approach is mainly abstracted as a solution of the intersection between geometric representations of the radar beam and targets, while the latter takes the radar detection mechanism into consideration by means of ray tracing. Although each has its unique advantages, both are often either too unrealistic or too time-consuming to meet actual simulation requirements. This paper presents a combined geometric and physical modeling method for millimeter-wave radar systems with the Frequency Modulated Continuous Wave (FMCW) modulation format in a 3D simulation environment. With the geometric approach, a link between the virtual radar and the 3D environment is established. With the physical approach, the ideal target detection and measurement are contaminated with noise and clutter, aiming to produce signals as close to real ones as possible. The proposed modeling method not only makes it feasible, safe, and convenient to develop and test radar-based Advanced Driver Assistance Systems (ADAS) under various simulated scenarios and environments efficiently and at lower cost, it also achieves a good balance between model fidelity and computational efficiency. The proposed method further enables real-time simulation on a Hardware-In-the-Loop (HIL) platform. Extensive simulation, such as Adaptive Cruise Control (ACC), has been conducted on the Matlab/Simulink platform with a Simulink model automatically generated by PanoSim, and the results demonstrate that the proposed radar modeling method is valid and effective.
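At the heart of the FMCW modulation format referenced above is a simple relation: the range to a target is proportional to the beat frequency between the transmitted and received chirps, R = c * f_b * T_c / (2 * B). A minimal sketch (chirp parameters below are assumed typical values, not taken from the paper):

```python
C = 3e8  # speed of light, m/s

def beat_to_range(f_beat, chirp_duration, bandwidth):
    """FMCW range from beat frequency: R = c * f_b * T_c / (2 * B)."""
    return C * f_beat * chirp_duration / (2 * bandwidth)

# Illustrative 77 GHz chirp: 4 GHz sweep over 40 microseconds (values assumed)
T_c, B = 40e-6, 4e9
for f_b in (1e6, 5e6, 10e6):
    print(f"f_beat = {f_b/1e6:4.1f} MHz -> R = {beat_to_range(f_b, T_c, B):6.2f} m")
```

A physics-informed radar model perturbs exactly such ideal measurements with noise and clutter so that the simulated signals approach real ones.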
Article
This paper presents a modelling and simulation method for sensor-guided autonomous driving. A generic model of range sensing and object detection in 3D space is discussed first, representing their high-level functions. The low-level physical characteristics of range sensing and object detection are then investigated based on Frequency Modulated Continuous Wave (FMCW) radar, which is gaining wide popularity in automotive applications. These enable the modelling and simulation of vehicle interactions with one another in traffic and with the surrounding environment. A closed-loop adaptive cruise control is then used as an example to demonstrate limited autonomous driving with the proposed model and method, which has been shown to be valid, effective, and numerically efficient.
Perception Entropy: A Metric for Multiple Sensors Configuration Evaluation and Design
T. Ma, Z. Liu, and Y. Li, "Perception Entropy: A Metric for Multiple Sensors Configuration Evaluation and Design," Apr. 2021.
EDGAR: An Autonomous Driving Research Platform – From Feature Development to Real-World Application
P. Karle, T. Betz, M. Bosk, F. Fent, N. Gehrke, M. Geisslinger, L. Gressenbuch, P. Hafemann, S. Huber, M. Hübner, S. Huch, G. Kaljavesi, T. Kerbl, D. Kulmer, T. Mascetta, S. Maierhofer, F. Pfab, F. Rezabek, E. Rivera, S. Sagmeister, L. Seidlitz, F. Sauerbeck, I. Tahiraj, R. Trauth, N. Uhlemann, G. Würsching, B. Zarrouki, M. Althoff, J. Betz, K. Bengler, G. Carle, F. Diermeyer, J. Ott, and M. Lienkamp, "EDGAR: An Autonomous Driving Research Platform – From Feature Development to Real-World Application." [Online]. Available: http://arxiv.org/abs/2309.15492