Enhancing Document Forgery Detection with Edge-Focused Deep Learning

Bibliographic Details
Published in: Symmetry vol. 17, no. 8 (2025), p. 1208-1226
Main Author: Yong-Yeol, Bae
Other Authors: Dae-Jea, Cho; Ki-Hyun, Jung
Published: MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3244064305
003 UK-CbPIL
022 |a 2073-8994 
024 7 |a 10.3390/sym17081208  |2 doi 
035 |a 3244064305 
045 2 |b d20250101  |b d20251231 
084 |a 231635  |2 nlm 
100 1 |a Yong-Yeol, Bae 
245 1 |a Enhancing Document Forgery Detection with Edge-Focused Deep Learning 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a Detecting manipulated document images is essential for verifying the authenticity of official records and preventing document forgery. However, forgery artifacts are often subtle and localized in fine-grained regions, such as text boundaries or character outlines, where visual symmetry and structural regularity are typically expected. These manipulations can disrupt the inherent symmetry of document layouts, making the detection of such inconsistencies crucial for forgery identification. Conventional CNN-based models face limitations in capturing such edge-level asymmetric features, as edge-related information tends to weaken through repeated convolution and pooling operations. To address this issue, this study proposes an edge-focused method composed of two components: the Edge Attention (EA) layer and the Edge Concatenation (EC) layer. The EA layer dynamically identifies channels that are highly responsive to edge features in the input feature map and applies learnable weights to emphasize them, strengthening the representation of structurally significant boundary information. Subsequently, the EC layer extracts edge maps from the input image using the Sobel filter and concatenates them with the original feature maps along the channel dimension, allowing the model to explicitly incorporate edge information. To evaluate the effectiveness and compatibility of the proposed method, it was initially applied to a simple CNN architecture to isolate its impact. Subsequently, it was integrated into various widely used models, including DenseNet121, ResNet50, Vision Transformer (ViT), and a CAE-SVM-based document forgery detection model. Experiments were conducted on the DocTamper, Receipt, and MIDV-2020 datasets to assess classification accuracy and F1-score using both original and forged text images. 
Across all model architectures and datasets, the proposed EA–EC method consistently improved model performance, particularly by increasing sensitivity to asymmetric manipulations around text boundaries. These results demonstrate that the proposed edge-focused approach is not only effective but also highly adaptable, serving as a lightweight and modular extension that can be easily incorporated into existing deep learning-based document forgery detection frameworks. By reinforcing attention to structural inconsistencies often missed by standard convolutional networks, the proposed method provides a practical solution for enhancing the robustness and generalizability of forgery detection systems. 
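The two components described in the abstract can be illustrated with a minimal, framework-free sketch. This is not the authors' implementation: the Sobel-based edge map and channel-wise concatenation of the EC layer follow the abstract directly, while the sigmoid gating formula used for the EA layer is an assumption (the abstract only says learnable weights emphasize edge-responsive channels). In the paper these operations would act on learned CNN feature maps inside a deep network, not on raw nested lists.

```python
import math

# Sobel kernels for horizontal (GX) and vertical (GY) gradients
GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edge_map(channel):
    """Gradient-magnitude map of a 2-D array (list of lists),
    zero-padded at the borders."""
    h, w = len(channel), len(channel[0])
    def px(y, x):
        return channel[y][x] if 0 <= y < h and 0 <= x < w else 0.0
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            gx = sum(GX[dy][dx] * px(y + dy - 1, x + dx - 1)
                     for dy in range(3) for dx in range(3))
            gy = sum(GY[dy][dx] * px(y + dy - 1, x + dx - 1)
                     for dy in range(3) for dx in range(3))
            out[y][x] = math.hypot(gx, gy)
    return out

def edge_concat(feature_maps, image):
    """EC-layer sketch: append the Sobel edge map of the input image
    as one extra channel (channel-wise concatenation)."""
    return feature_maps + [sobel_edge_map(image)]

def edge_attention(feature_maps, weights):
    """EA-layer sketch (assumed gating mechanism): scale each channel
    by a sigmoid of a learnable weight times the channel's mean Sobel
    response, so edge-responsive channels are emphasized."""
    out = []
    for ch, w in zip(feature_maps, weights):
        edges = sobel_edge_map(ch)
        resp = sum(map(sum, edges)) / (len(ch) * len(ch[0]))  # mean edge response
        gate = 1.0 / (1.0 + math.exp(-w * resp))              # sigmoid in (0, 1)
        out.append([[v * gate for v in row] for row in ch])
    return out

# A 4x4 image with a sharp vertical edge, the kind of text-boundary
# transition the method targets
img = [[0.0, 0.0, 10.0, 10.0] for _ in range(4)]
edges = sobel_edge_map(img)
```

Running this on the toy image gives a strong response (40.0) on the columns flanking the step edge and zero in the flat interior, which is the boundary signal the EC layer injects back into the network.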
653 |a Asymmetry 
653 |a Forgery 
653 |a Image manipulation 
653 |a Datasets 
653 |a Deep learning 
653 |a Forensic sciences 
653 |a Image filters 
653 |a Artificial neural networks 
653 |a Boundaries 
653 |a Neural networks 
653 |a Documents 
653 |a Effectiveness 
653 |a Feature maps 
653 |a Methods 
653 |a Machine learning 
653 |a Symmetry 
700 1 |a Dae-Jea, Cho 
700 1 |a Ki-Hyun, Jung 
773 0 |t Symmetry  |g vol. 17, no. 8 (2025), p. 1208-1226 
786 0 |d ProQuest  |t Engineering Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3244064305/abstract/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3244064305/fulltextwithgraphics/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3244064305/fulltextPDF/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch