An Efficient and Accurate UAV State Estimation Method with Multi-LiDAR–IMU–Camera Fusion

| Published in: | Drones vol. 9, no. 12 (2025), pp. 823-847 |
|---|---|
| Main author: | |
| Other authors: | |
| Published: | MDPI AG |
| Subjects: | |
| Online access: | Citation/Abstract; Full Text + Graphics; Full Text - PDF |

Abstract:

What are the main findings?

- The proposed DLIC method reformulates the complex, coupled UAV state estimation problem in multi-LiDAR–IMU–camera systems as an efficient distributed subsystem optimization framework. The designed feedback mechanism effectively constrains and optimizes the UAV state using the estimated subsystem states.
- Extensive experiments demonstrate that DLIC achieves superior accuracy and efficiency on a resource-constrained embedded UAV platform equipped with only an 8-core CPU. It operates in real time while maintaining low memory usage.

What are the implications of the main finding?

- This work demonstrates that the challenging, coupled UAV state estimation problem in multi-LiDAR–IMU–camera systems can be effectively addressed through distributed optimization techniques, paving the way for scalable and efficient estimation frameworks.
- The proposed DLIC method offers a promising solution for real-time state estimation in resource-limited UAVs with multi-sensor configurations.

State estimation plays a vital role in UAV navigation and control. With the continuous decrease in sensor cost and size, UAVs equipped with multiple LiDARs, Inertial Measurement Units (IMUs), and cameras have attracted increasing attention. Such systems can acquire rich environmental and motion information from multiple perspectives, thereby enabling more precise navigation and mapping in complex environments. However, efficiently utilizing multi-sensor data for state estimation remains challenging, since there is a complex coupling relationship between the IMU biases and the UAV state. To address these challenges, this paper proposes an efficient and accurate UAV state estimation method tailored for multi-LiDAR–IMU–camera systems. Specifically, we first construct an efficient distributed state estimation model. It decomposes the multi-LiDAR–IMU–camera system into a series of single LiDAR–IMU–camera subsystems, reformulating the complex coupling problem as an efficient distributed state estimation problem. Then, we derive an accurate feedback function to constrain and optimize the UAV state using the estimated subsystem states, thus enhancing overall estimation accuracy. Based on this model, we design an efficient distributed state estimation algorithm with multi-LiDAR–IMU–camera fusion, termed DLIC. DLIC achieves robust multi-sensor data fusion via shared feature maps, effectively improving both estimation robustness and accuracy. In addition, we design an accelerated image-to-point cloud registration module (A-I2P) to provide reliable visual measurements, further boosting state estimation efficiency. Extensive experiments are conducted on 18 real-world indoor and outdoor scenarios from the public NTU VIRAL dataset. The results demonstrate that DLIC consistently outperforms existing multi-sensor methods across key evaluation metrics, including RMSE, MAE, SD, and SSE. More importantly, our method runs in real time on a resource-constrained embedded device equipped with only an 8-core CPU, while maintaining low memory consumption.
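
The abstract's central idea is to split the tightly coupled multi-LiDAR–IMU–camera problem into per-sensor-triplet subsystems whose local estimates are fed back into a single UAV state. The sketch below illustrates that decomposition only: the `SubsystemEstimator` class, the six-dimensional position/velocity state, and the covariance-weighted `fuse_with_feedback` rule are assumptions made for illustration, not the authors' DLIC formulation or its feedback function.

```python
import numpy as np


class SubsystemEstimator:
    """One hypothetical LiDAR-IMU-camera subsystem producing a local state estimate."""

    def __init__(self, name: str):
        self.name = name
        # Illustrative state: position and velocity [x, y, z, vx, vy, vz].
        self.state = np.zeros(6)
        self.covariance = np.eye(6)

    def update(self, lidar_scan, imu_batch, image):
        # Placeholder for the subsystem's own LiDAR/IMU/camera fusion
        # (e.g. a local filter or optimization); returns its current estimate.
        return self.state, self.covariance


def fuse_with_feedback(estimates):
    """Covariance-weighted (information-form) fusion of subsystem estimates.

    Stands in for the feedback step described in the abstract; the actual
    constraint used by DLIC may differ.
    """
    info_sum = np.zeros((6, 6))
    info_state = np.zeros(6)
    for state, cov in estimates:
        info = np.linalg.inv(cov)
        info_sum += info
        info_state += info @ state
    fused_cov = np.linalg.inv(info_sum)
    fused_state = fused_cov @ info_state
    return fused_state, fused_cov


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    subsystems = [SubsystemEstimator(f"lidar_imu_cam_{i}") for i in range(3)]
    # Pretend each subsystem produced a slightly different local estimate.
    estimates = [(rng.normal(scale=0.01, size=6), np.eye(6) * (0.1 + 0.05 * i))
                 for i, _ in enumerate(subsystems)]
    fused_state, fused_cov = fuse_with_feedback(estimates)
    # Feedback: broadcast the fused UAV state back so each subsystem
    # re-centers on a consistent state before its next measurement update.
    for sub in subsystems:
        sub.state = fused_state.copy()
```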

| ISSN: | 2504-446X |
|---|---|
| DOI: | 10.3390/drones9120823 |
| Source: | Advanced Technologies & Aerospace Database |
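
For reference, the evaluation metrics named in the abstract (RMSE, MAE, SD, SSE) follow standard definitions over per-pose trajectory errors. The helper below is a minimal sketch of those definitions only; the function name and the assumption of pre-aligned (N, 3) position arrays are illustrative, and the paper's exact trajectory-alignment protocol is not reproduced here.

```python
import numpy as np


def trajectory_error_metrics(estimated: np.ndarray, ground_truth: np.ndarray) -> dict:
    """Compute RMSE, MAE, SD, and SSE over per-pose position errors.

    Both inputs are (N, 3) arrays of already-aligned positions.
    """
    errors = np.linalg.norm(estimated - ground_truth, axis=1)  # per-pose error norm
    return {
        "RMSE": float(np.sqrt(np.mean(errors ** 2))),  # root-mean-square error
        "MAE": float(np.mean(np.abs(errors))),          # mean absolute error
        "SD": float(np.std(errors)),                    # standard deviation of errors
        "SSE": float(np.sum(errors ** 2)),              # sum of squared errors
    }
```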