Multi-Sensor Fusion for Environmental Perception: Calibration and Understanding using LiDAR, Radar, and Cameras
Keywords:
Sensor fusion; Calibration; Environmental understanding; LiDAR; Radar; Cameras; Machine learning; Automotive perception
Abstract
Automotive perception, which uses sensor data to understand both the external driving environment and the internal state of the vehicle cabin and its occupants, is crucial for achieving high levels of safety and autonomy in driving. This paper focuses on sensor-based perception, specifically the use of LiDAR, radar, cameras, and other sensors for sensor fusion, calibration, and environmental understanding. It provides an overview of the different sensor modalities and their associated data processing techniques. Critical aspects are analyzed, including architectures for single- and multi-sensor data processing, processing algorithms, the role of machine learning in perception, validation methodologies for perception systems, and safety considerations. The technical challenges of each aspect are discussed, with an emphasis on machine-learning-based approaches given their potential to enhance perception. Finally, future research directions toward broader deployment of automotive perception are proposed.