Computer Vision-Based Deep Learning Modeling for Salmon Part Segmentation and Defect Identification

Bibliographic Details
Published in: Foods vol. 14, no. 20 (2025), p. 3529-3547
Main Author: Zhang, Chunxu
Other Authors: Zhao, Yuanshan; Yang, Wude; Gao, Liuqian; Zhang, Wenyu; Liu, Yang; Zhang, Xu; Wang, Huihui
Published: MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3265899703
003 UK-CbPIL
022 |a 2304-8158 
024 7 |a 10.3390/foods14203529  |2 doi 
035 |a 3265899703 
045 2 |b d20250101  |b d20251231 
084 |a 231462  |2 nlm 
100 1 |a Zhang, Chunxu  |u College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310027, China; 747189502zcx@gmail.com 
245 1 |a Computer Vision-Based Deep Learning Modeling for Salmon Part Segmentation and Defect Identification 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a Accurate segmentation of salmon parts and surface defect detection are key steps for increasing the added value of salmon processing. Current mainstream manual inspection methods have low accuracy and efficiency, making it difficult to meet the demands of industrialized production. This paper proposes a machine vision inspection method based on a two-stage fusion network to achieve accurate segmentation of salmon parts and efficient recognition of defects. Fish body images are collected with a purpose-built visual inspection system, and the dataset is constructed through preprocessing and data augmentation. For part segmentation, an improved U-Net incorporating the CBAM attention mechanism is used to strengthen the extraction of fish body texture features. For defect detection, a two-stage fusion architecture is designed: a YOLOv5 augmented with a P2 small-target detection layer first locates the defective region quickly, and the cropped region is then fed into the improved U-Net for precise segmentation. The experimental results demonstrate that the improved U-Net achieves a mean average precision (mAP) of 96.87% and a mean intersection over union (mIoU) of 94.33% in part segmentation, improvements of 2.44% and 1.06%, respectively, over the base model. In defect detection, the fusion model attains an mAP of 94.28% at a processing speed of 7.30 fps, outperforming the single U-Net by 28.02% in accuracy and 236.4% in efficiency. This method provides a high-precision, high-efficiency solution for intelligent salmon processing and offers significant value for advancing automation in the aquatic product processing industry. 
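The abstract names, but does not implement, the CBAM attention block added to U-Net. The following is a minimal PyTorch sketch of a standard CBAM module (channel attention followed by spatial attention) of the kind described; the hyperparameters (reduction=16, kernel_size=7) and module names are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: standard CBAM formulation, not the paper's exact code.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Shared MLP applied to both the average- and max-pooled channel descriptors
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        return torch.sigmoid(avg + mx)


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        # 7x7 conv over the concatenated channel-wise mean and max maps
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg = torch.mean(x, dim=1, keepdim=True)
        mx, _ = torch.max(x, dim=1, keepdim=True)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAM(nn.Module):
    """Channel attention followed by spatial attention, applied to a feature map."""

    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        x = x * self.ca(x)  # re-weight channels
        x = x * self.sa(x)  # re-weight spatial locations
        return x


if __name__ == "__main__":
    feats = torch.randn(1, 64, 128, 128)  # dummy encoder feature map
    print(CBAM(64)(feats).shape)          # torch.Size([1, 64, 128, 128])

In a U-Net, such a block is typically applied to encoder or skip-connection feature maps so that texture-relevant channels and regions are emphasized before decoding; where exactly the authors insert it is not specified in this record.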
653 |a Accuracy 
653 |a Deep learning 
653 |a Defects 
653 |a Inspection 
653 |a Vision systems 
653 |a Salmon 
653 |a Efficiency 
653 |a Computer vision 
653 |a Automation 
653 |a Surface defects 
653 |a Fish 
653 |a Machine learning 
653 |a Raw materials 
653 |a Machine vision 
653 |a Target detection 
653 |a Body image 
653 |a Cuttings 
653 |a Algorithms 
653 |a Processing industry 
700 1 |a Zhao, Yuanshan  |u School of Mechanical Engineering & Automation, Dalian Polytechnic University, Dalian 116039, China; zhaoyuanshan2024@163.com (Y.Z.); 18741357362@163.com (W.Y.); glqjy9@163.com (L.G.); zwy20241029@163.com (W.Z.); ly.19880126@163.com (Y.L.) 
700 1 |a Yang, Wude  |u School of Mechanical Engineering & Automation, Dalian Polytechnic University, Dalian 116039, China; zhaoyuanshan2024@163.com (Y.Z.); 18741357362@163.com (W.Y.); glqjy9@163.com (L.G.); zwy20241029@163.com (W.Z.); ly.19880126@163.com (Y.L.) 
700 1 |a Gao, Liuqian  |u School of Mechanical Engineering & Automation, Dalian Polytechnic University, Dalian 116039, China; zhaoyuanshan2024@163.com (Y.Z.); 18741357362@163.com (W.Y.); glqjy9@163.com (L.G.); zwy20241029@163.com (W.Z.); ly.19880126@163.com (Y.L.) 
700 1 |a Zhang, Wenyu  |u School of Mechanical Engineering & Automation, Dalian Polytechnic University, Dalian 116039, China; zhaoyuanshan2024@163.com (Y.Z.); 18741357362@163.com (W.Y.); glqjy9@163.com (L.G.); zwy20241029@163.com (W.Z.); ly.19880126@163.com (Y.L.) 
700 1 |a Liu, Yang  |u School of Mechanical Engineering & Automation, Dalian Polytechnic University, Dalian 116039, China; zhaoyuanshan2024@163.com (Y.Z.); 18741357362@163.com (W.Y.); glqjy9@163.com (L.G.); zwy20241029@163.com (W.Z.); ly.19880126@163.com (Y.L.) 
700 1 |a Zhang, Xu  |u School of Mechanical Engineering & Automation, Dalian Polytechnic University, Dalian 116039, China; zhaoyuanshan2024@163.com (Y.Z.); 18741357362@163.com (W.Y.); glqjy9@163.com (L.G.); zwy20241029@163.com (W.Z.); ly.19880126@163.com (Y.L.) 
700 1 |a Wang, Huihui  |u School of Mechanical Engineering & Automation, Dalian Polytechnic University, Dalian 116039, China; zhaoyuanshan2024@163.com (Y.Z.); 18741357362@163.com (W.Y.); glqjy9@163.com (L.G.); zwy20241029@163.com (W.Z.); ly.19880126@163.com (Y.L.) 
773 0 |t Foods  |g vol. 14, no. 20 (2025), p. 3529-3547 
786 0 |d ProQuest  |t Agriculture Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3265899703/abstract/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3265899703/fulltextwithgraphics/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3265899703/fulltextPDF/embedded/H09TXR3UUZB2ISDL?source=fedsrch