A Distributed Time-of-Flight Sensor System for Autonomous Vehicles: Architecture, Sensor Fusion, and Spiking Neural Network Perception

Bibliographic Details
Published in: Electronics, vol. 14, no. 7 (2025), p. 1375
First author: Lielamurs, Edgars
Other authors: Ibrahim, Sayed; Cvetkovs, Andrejs; Novickis, Rihards; Zencovs, Anatolijs; Celitans, Maksis; Bizuns, Andis; Dimitrakopoulos, George; Koszescha, Jochen; Ozols, Kaspars
Publisher: MDPI AG
Description
Abstract: Mechanically scanning LiDAR imaging sensors are widely used in applications ranging from basic safety assistance to high-level automated driving, offering excellent spatial resolution and full surround-view coverage in most scenarios. However, their complex optomechanical structure introduces limitations, namely restricted mounting options and blind zones, especially on elongated vehicles. To mitigate these challenges, we propose a distributed Time-of-Flight (ToF) sensor system with a flexible hardware–software architecture designed for multi-sensor synchronous triggering and fusion. We formalize the sensor triggering, interference mitigation, data aggregation, and fusion procedures, and highlight the challenges of achieving accurate global registration with current state-of-the-art methods. The resulting surround-view visual information is then applied to Spiking Neural Network (SNN)-based object detection and probabilistic occupancy grid mapping (OGM) for enhanced environmental awareness. The proposed system is demonstrated on a test vehicle, covering blind zones in the 0.5–6 m range with a scalable and reconfigurable sensor mounting setup. Using seven ToF sensors, we achieve a 10 Hz synchronized frame rate, with 360° point cloud registration and fusion latency below 40 ms. We collected real-world driving data to evaluate the system, achieving 65% mean Average Precision (mAP) in object detection with our SNN. Overall, this work presents a replacement for, or addition to, LiDAR in future high-level automation tasks, offering improved coverage and system integration.
ISSN: 2079-9292
DOI: 10.3390/electronics14071375
Source: Advanced Technologies & Aerospace Database
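Note: The abstract mentions probabilistic occupancy grid mapping (OGM) as the step that fuses the registered ToF point clouds into an environmental model. The sketch below is a minimal, generic illustration of that standard technique, not the authors' implementation; the grid extent, cell resolution, and log-odds increments are assumed values chosen only for the example.

# Minimal log-odds occupancy grid update (generic illustration, assumed parameters).
import numpy as np

RES = 0.1                     # cell size in metres (assumed)
GRID = 120                    # 12 m x 12 m grid centred on the vehicle (assumed)
L_OCC, L_FREE = 0.85, -0.4    # log-odds increments for hit / pass-through (assumed)

log_odds = np.zeros((GRID, GRID))   # 0.0 corresponds to p = 0.5 (unknown)

def to_cell(x, y):
    """Map a point in vehicle coordinates (metres) to grid indices."""
    return int(np.floor(x / RES)) + GRID // 2, int(np.floor(y / RES)) + GRID // 2

def update(points_xy):
    """Fuse one registered ToF point cloud slice (N x 2 array) into the grid."""
    for x, y in points_xy:
        i, j = to_cell(x, y)
        if 0 <= i < GRID and 0 <= j < GRID:
            log_odds[i, j] += L_OCC   # cell containing a return: more likely occupied
    # A full sensor model would also trace each ray and apply L_FREE to the
    # traversed cells; that step is omitted here for brevity.

def occupancy_prob():
    """Convert log-odds back to occupancy probabilities in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + np.exp(log_odds))

Working in log-odds makes fusing repeated, possibly conflicting measurements from several sensors a simple additive update per cell, which is why this formulation is common for multi-sensor grids of the kind described in the abstract.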