R2GDN: RepGhost based residual dense network for image super-resolution

Detailed Bibliography
Published in: PLoS One vol. 20, no. 12 (Dec 2025), p. e0338432
Main author: Li, Tianyu
Other authors: Jin, Xiaoshi; Liu, Qiang; Liu, Xi; Yuan, Zehang; Liang, Tianyang; Jia, Lou; Rao, Yangfan
Published: Public Library of Science
Online access: Citation/Abstract
Full Text
Full Text - PDF
Description
Abstract: This study introduces a novel lightweight image super-resolution reconstruction network aimed at mitigating the challenges associated with computational complexity and memory consumption in existing super-resolution reconstruction networks. The proposed network optimizes its architecture through feature reuse and structural reparameterization, rendering it more suitable for deployment in edge computing environments. Specifically, we have developed a new lightweight reparameterization layer that derives redundant features from intrinsic features using low-cost operations and integrates them with reparameterization techniques to enhance efficient feature utilization. Furthermore, an efficient deep feature extraction module named RGAB has been designed, which retains dense connections, local feature integration, and local residual learning mechanisms while incorporating addition operations for feature integration. The resultant network, termed R2GDN, exhibits a significant reduction in model parameters and improved inference speed. Compared to performance-oriented super-resolution algorithms, our model reduces the number of parameters by approximately 95% and enhances inference speed by 86.8% on the edge device. When benchmarked against lightweight super-resolution algorithms, our model maintains a lower parameter count and achieves a 0.74% improvement in the structural similarity index (SSIM) on the BSD100 dataset for 4× super-resolution reconstruction. Experimental results demonstrate that R2GDN effectively balances network performance and complexity.
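The structural reparameterization mentioned in the abstract refers to a family of techniques in which a multi-branch block used during training is algebraically folded into a single convolution for inference, cutting parameters and latency without changing the computed function. The abstract does not give R2GDN's exact layer construction, so the following is only a minimal single-channel sketch of the general idea (RepVGG-style fusion of a 3×3 branch, a 1×1 branch, and an identity branch); all names and shapes here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def conv2d(x, k):
    # Single-channel 'same' cross-correlation (no stride, zero padding).
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))   # toy feature map
k3 = rng.standard_normal((3, 3))  # training-time 3x3 branch
k1 = rng.standard_normal((1, 1))  # training-time 1x1 branch

# Training-time block: three parallel branches summed.
y_train = conv2d(x, k3) + conv2d(x, k1) + x

# Reparameterize for inference: a 1x1 kernel and the identity are both
# 3x3 kernels that are zero everywhere except the center, so they can be
# folded into the 3x3 kernel's center tap.
k_fused = k3.copy()
k_fused[1, 1] += k1[0, 0] + 1.0   # identity branch = center weight 1

# Inference-time block: one plain 3x3 convolution, same output.
y_infer = conv2d(x, k_fused)
assert np.allclose(y_train, y_infer)
```

The fused model stores and executes only `k_fused`, which is the source of the parameter and inference-speed savings the abstract reports; R2GDN combines this kind of fusion with ghost-style low-cost feature generation, whose details are in the paper itself.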
ISSN:1932-6203
DOI:10.1371/journal.pone.0338432
Source: Health & Medical Collection