Bio-inspired Vision Mapping and Localization Method Based on Reprojection Error Optimization and Asynchronous Kalman Fusion

Saved in:
Bibliographic details
Published in: Chinese Journal of Mechanical Engineering = Ji xie gong cheng xue bao vol. 38, no. 1 (Dec 2025), p. 163
Main author: Zhang, Shijie
Other authors: Tang, Tao, Hou, Taogang, Huang, Yuxuan, Pei, Xuan, Wang, Tianmiao
Published:
Springer Nature B.V.
Subjects:
Online access: Citation/Abstract
Full Text
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3241748185
003 UK-CbPIL
022 |a 1000-9345 
022 |a 2192-8258 
024 7 |a 10.1186/s10033-025-01342-3  |2 doi 
035 |a 3241748185 
045 2 |b d20251201  |b d20251231 
100 1 |a Zhang, Shijie  |u Beijing Jiaotong University, School of Automation and Intelligence, Beijing, China (GRID:grid.181531.f) (ISNI:0000 0004 1789 9622) 
245 1 |a Bio-inspired Vision Mapping and Localization Method Based on Reprojection Error Optimization and Asynchronous Kalman Fusion 
260 |b Springer Nature B.V.  |c Dec 2025 
513 |a Journal Article 
520 3 |a Bio-inspired visual systems have garnered significant attention in robotics owing to their energy efficiency, rapid dynamic response, and environmental adaptability. Among these, event cameras—bio-inspired sensors that asynchronously report pixel-level brightness changes called 'events'—stand out because of their ability to capture dynamic changes with minimal energy consumption, making them suitable for challenging conditions such as low light or high-speed motion. However, current mapping and localization methods for event cameras depend primarily on point and line features, which struggle in sparse or low-feature environments and are unsuitable for static or slow-motion scenarios. We addressed these challenges by proposing a bio-inspired vision mapping and localization method using active LED markers (ALMs) combined with reprojection error optimization and asynchronous Kalman fusion. Our approach replaces traditional features with ALMs, thereby enabling accurate tracking under dynamic and low-feature conditions. Global mapping accuracy improved significantly by minimizing the reprojection error: corner errors were reduced from 16.8 cm to 3.1 cm after 400 iterations. The asynchronous Kalman fusion of multiple camera pose estimations from ALMs ensures precise localization with high temporal efficiency. The method achieved a mean translation error of 0.078 m and a rotational error of 5.411° in dynamic-motion evaluations. In addition, it supported an output rate of 4.5 kHz while maintaining high localization accuracy in UAV spiral flight experiments. These results demonstrate the potential of the proposed approach for real-time robot localization in challenging environments. 
653 |a Robotics 
653 |a Localization method 
653 |a Simultaneous localization and mapping 
653 |a Cameras 
653 |a Accuracy 
653 |a Vision 
653 |a Dynamic response 
653 |a Optimization 
653 |a Sensors 
653 |a Mapping 
653 |a Methods 
653 |a Errors 
653 |a Pose estimation 
653 |a Localization 
653 |a Real time 
653 |a Energy consumption 
653 |a Efficiency 
700 1 |a Tang, Tao  |u Beijing Jiaotong University, School of Automation and Intelligence, Beijing, China (GRID:grid.181531.f) (ISNI:0000 0004 1789 9622) 
700 1 |a Hou, Taogang  |u Beijing Jiaotong University, School of Automation and Intelligence, Beijing, China (GRID:grid.181531.f) (ISNI:0000 0004 1789 9622) 
700 1 |a Huang, Yuxuan  |u Beijing Jiaotong University, School of Automation and Intelligence, Beijing, China (GRID:grid.181531.f) (ISNI:0000 0004 1789 9622) 
700 1 |a Pei, Xuan  |u Beijing Jiaotong University, School of Automation and Intelligence, Beijing, China (GRID:grid.181531.f) (ISNI:0000 0004 1789 9622) 
700 1 |a Wang, Tianmiao  |u Beihang University, School of Mechanical Engineering and Automation, Beijing, China (GRID:grid.64939.31) (ISNI:0000 0000 9999 1211) 
773 0 |t Chinese Journal of Mechanical Engineering = Ji xie gong cheng xue bao  |g vol. 38, no. 1 (Dec 2025), p. 163 
786 0 |d ProQuest  |t Engineering Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3241748185/abstract/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full Text  |u https://www.proquest.com/docview/3241748185/fulltext/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3241748185/fulltextPDF/embedded/L8HZQI7Z43R0LA5T?source=fedsrch
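
The asynchronous Kalman fusion described in the abstract can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the authors' implementation: the class name, the scalar constant-velocity state model, and the noise parameters are all hypothetical. The idea shown is the one the abstract names: measurements from multiple cameras arrive at irregular timestamps, and the filter predicts the state forward to each measurement's time before applying the correction, so no synchronized frame rate is required.

```python
import numpy as np

# Minimal sketch (not the paper's code): an asynchronous Kalman filter
# fusing 1-D position measurements that arrive from multiple cameras at
# irregular times. State vector: [position, velocity].
class AsyncKalman1D:
    def __init__(self, q=1e-3, p0=1.0):
        self.x = np.zeros(2)        # state estimate [pos, vel]
        self.P = np.eye(2) * p0     # state covariance
        self.q = q                  # process-noise intensity (assumed value)
        self.t = None               # timestamp of the last fused measurement

    def update(self, t, z, r):
        """Fuse one position measurement z (variance r) taken at time t."""
        if self.t is None:          # first measurement initializes the state
            self.x[0], self.t = z, t
            return self.x[0]
        dt = t - self.t
        F = np.array([[1.0, dt], [0.0, 1.0]])         # constant-velocity model
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])      # discretized process noise
        # Predict to the measurement's timestamp, then correct with it.
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        H = np.array([[1.0, 0.0]])                    # we observe position only
        S = H @ self.P @ H.T + r                      # innovation covariance
        K = (self.P @ H.T) / S                        # Kalman gain
        self.x = self.x + (K * (z - H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P
        self.t = t
        return self.x[0]

# Measurements from two cameras, interleaved at irregular intervals.
kf = AsyncKalman1D()
kf.update(0.00, 1.00, r=0.01)   # camera A
kf.update(0.01, 1.02, r=0.02)   # camera B, 10 ms later
est = kf.update(0.03, 1.05, r=0.01)   # camera A again, 20 ms later
```

Because each `update` carries its own timestamp, the output rate is bounded only by how fast measurements arrive, which is consistent with the high output rates (kHz-scale) the abstract reports for event-camera input.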