Orthographic Video Map Generation Considering 3D GIS View Matching

Bibliographic Details
Published in: ISPRS International Journal of Geo-Information, vol. 14, no. 10 (2025), pp. 398-420
Main Author: Zhang, Xingguo
Other Authors: Meng, Xiangfei; Zhang, Li; Ling, Xianguo; Yang, Sen
Published: MDPI AG
Description
Abstract: Converting tower-mounted videos from a perspective to an orthographic view facilitates their integration with maps and remote sensing images and provides a clearer, near-real-time data source for Earth observation. This paper addresses the low geometric accuracy of orthographic video generation by proposing a method that incorporates 3D GIS view matching. First, a geometric alignment model between video frames and 3D GIS views is established through camera parameter mapping. Then, feature point detection and matching algorithms associate image coordinates with the corresponding 3D spatial coordinates. Finally, an orthographic video map is generated from the colored point cloud. The results show that (1) for tower-based video, a 3D GIS constructed from publicly available DEMs and high-resolution remote sensing imagery can meet the spatialization needs of large-scale tower-mounted video data; (2) the deep-learning-based feature point matching algorithm accurately matches video frames to 3D GIS views; and (3) compared with traditional approaches such as the camera parameter method, the orthographic video map generated by this method offers higher geometric mapping accuracy and a better visualization effect. In the mountainous area, the RMSE of the control points is reduced from 137.70 m to 7.72 m; in the flat area, it is reduced from 13.52 m to 8.10 m. The proposed method can provide a near-real-time orthographic video map for smart cities, natural resource monitoring, emergency rescue, and other fields.
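To make the pipeline in the abstract concrete, the Python sketch below shows one plausible shape of the matching, back-projection, and accuracy-evaluation steps. It is a minimal illustration, not the authors' implementation: SIFT with a Lowe ratio test stands in for the paper's deep-learning matcher, and the depth map, intrinsics `K`, and `cam_to_world` pose are assumed to be supplied by the 3D GIS renderer.

```python
import numpy as np
import cv2

def match_frame_to_gis_view(frame_gray, view_gray):
    """Match feature points between a video frame and a rendered 3D GIS view.
    SIFT + ratio test is a classical stand-in for the paper's deep-learning
    matcher; the data flow (frame pixels <-> view pixels) is the same."""
    sift = cv2.SIFT_create()
    kp_f, des_f = sift.detectAndCompute(frame_gray, None)
    kp_v, des_v = sift.detectAndCompute(view_gray, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_f, des_v, k=2)
    good = [m for m, n in pairs if m.distance < 0.75 * n.distance]
    pts_frame = np.float32([kp_f[m.queryIdx].pt for m in good])
    pts_view = np.float32([kp_v[m.trainIdx].pt for m in good])
    return pts_frame, pts_view

def lift_view_pixels_to_world(pts_view, depth, K, cam_to_world):
    """Back-project matched GIS-view pixels to 3D world coordinates using the
    renderer's depth buffer (assumed available). This is the step that
    associates frame image coordinates with 3D spatial coordinates."""
    u = pts_view[:, 0].astype(int)
    v = pts_view[:, 1].astype(int)
    z = depth[v, u]                                        # metric depth per pixel
    uv1 = np.hstack([pts_view, np.ones((len(pts_view), 1))])
    cam_pts = (np.linalg.inv(K) @ uv1.T).T * z[:, None]    # camera-space XYZ
    xyz1 = np.hstack([cam_pts, np.ones((len(cam_pts), 1))])
    return (cam_to_world @ xyz1.T).T[:, :3]                # world-space XYZ

def control_point_rmse(mapped_xy, truth_xy):
    """RMSE of control points in map coordinates (metres), the metric behind
    the reported reductions (137.70 m -> 7.72 m; 13.52 m -> 8.10 m)."""
    d = np.linalg.norm(np.asarray(mapped_xy) - np.asarray(truth_xy), axis=1)
    return float(np.sqrt(np.mean(d ** 2)))
```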
ISSN: 2220-9964
DOI: 10.3390/ijgi14100398
Source: Advanced Technologies & Aerospace Database