Individual Identification of Holstein Cows from Top-View RGB and Depth Images Based on Improved PointNet++ and ConvNeXt

Saved in:
Bibliographic Details
Published in: Agriculture vol. 15, no. 7 (2025), p. 710
Main Author: Zhao, Kaixuan
Other Authors: Wang, Jinjin; Chen, Yinan; Sun, Junrui; Zhang, Ruihong
Published:
MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3188771716
003 UK-CbPIL
022 |a 2077-0472 
024 7 |a 10.3390/agriculture15070710  |2 doi 
035 |a 3188771716 
045 2 |b d20250101  |b d20251231 
084 |a 231331  |2 nlm 
100 1 |a Zhao, Kaixuan  |u College of Agricultural Equipment Engineering, Henan University of Science and Technology, Luoyang 471023, China; wangjj@stu.haust.edu.cn (J.W.); chenyn@haust.edu.cn (Y.C.); sjunrui@163.com (J.S.); zhang_rh@stu.edu.cn (R.Z.); Science & Technology Innovation Center for Completed Set Equipment, Longmen Laboratory, Luoyang 471023, China 
245 1 |a Individual Identification of Holstein Cows from Top-View RGB and Depth Images Based on Improved PointNet++ and ConvNeXt 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a The identification of individual cows is a prerequisite and foundation for achieving accurate and intelligent farming, but image-based identification methods are easily affected by the environment and the observation angle. To identify cows more accurately and efficiently, a novel individual recognition method using anchor point detection and body pattern features from top-view depth images of cows was proposed. First, top-view RGB-D images of cows were collected, and the hook and pin bones of the cows were coarsely located with an improved PointNet++ neural network. Second, the curvature variations in the hook and pin bone regions were analyzed to locate the hook and pin bones precisely. Based on the spatial relationship between the hook and pin bones, the key region was determined and transformed from a point cloud into a two-dimensional body pattern image. Finally, the body pattern images were classified with an improved ConvNeXt network model for individual cow identification. A dataset comprising 7600 top-view images from 40 cows was created and partitioned into training, validation, and test subsets in a 7:2:1 ratio. The results showed that the AP50 value of the point cloud segmentation model was 95.5% and the cow identification accuracy reached 97.95%. The AP50 of the improved PointNet++ network exceeded that of the original model by 3 percentage points, and the improved ConvNeXt model achieved a 6.11 percentage point increase in classification precision relative to the original model. The method is robust to the position and angle of the cow in the top-view image. 
653 |a Two dimensional bodies 
653 |a Cameras 
653 |a Accuracy 
653 |a Wavelet transforms 
653 |a Farming 
653 |a Neural networks 
653 |a Dairy cattle 
653 |a Image segmentation 
653 |a Identification methods 
653 |a Dairy farms 
653 |a Computer vision 
653 |a Cooperation 
653 |a Classification 
653 |a Bones 
653 |a Radio frequency identification 
653 |a Image classification 
653 |a Cattle 
653 |a Methods 
653 |a Algorithms 
653 |a Environmental 
700 1 |a Wang, Jinjin  |u College of Agricultural Equipment Engineering, Henan University of Science and Technology, Luoyang 471023, China; wangjj@stu.haust.edu.cn (J.W.); chenyn@haust.edu.cn (Y.C.); sjunrui@163.com (J.S.); zhang_rh@stu.edu.cn (R.Z.) 
700 1 |a Chen, Yinan  |u College of Agricultural Equipment Engineering, Henan University of Science and Technology, Luoyang 471023, China; wangjj@stu.haust.edu.cn (J.W.); chenyn@haust.edu.cn (Y.C.); sjunrui@163.com (J.S.); zhang_rh@stu.edu.cn (R.Z.) 
700 1 |a Sun, Junrui  |u College of Agricultural Equipment Engineering, Henan University of Science and Technology, Luoyang 471023, China; wangjj@stu.haust.edu.cn (J.W.); chenyn@haust.edu.cn (Y.C.); sjunrui@163.com (J.S.); zhang_rh@stu.edu.cn (R.Z.) 
700 1 |a Zhang, Ruihong  |u College of Agricultural Equipment Engineering, Henan University of Science and Technology, Luoyang 471023, China; wangjj@stu.haust.edu.cn (J.W.); chenyn@haust.edu.cn (Y.C.); sjunrui@163.com (J.S.); zhang_rh@stu.edu.cn (R.Z.) 
773 0 |t Agriculture  |g vol. 15, no. 7 (2025), p. 710 
786 0 |d ProQuest  |t Agriculture Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3188771716/abstract/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3188771716/fulltextwithgraphics/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3188771716/fulltextPDF/embedded/L8HZQI7Z43R0LA5T?source=fedsrch
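
For readers who want to reproduce the evaluation setup reported in the abstract above (7600 top-view images of 40 cows partitioned 7:2:1 into training, validation, and test subsets), the following is a minimal Python sketch. It assumes a per-cow split so that every animal appears in all three subsets, which the record does not state explicitly; all function and file names are illustrative and are not taken from the authors' code.

# Minimal sketch of a 7:2:1 per-cow train/validation/test split, matching the
# counts given in the abstract (7600 top-view images of 40 cows).
# Hypothetical helper; not the authors' implementation.
import random
from collections import defaultdict

def split_per_cow(samples, ratios=(0.7, 0.2, 0.1), seed=42):
    """Split (cow_id, image_path) pairs per cow so that every animal
    appears in the training, validation, and test subsets (a closed-set
    individual identification setting)."""
    by_cow = defaultdict(list)
    for cow_id, path in samples:
        by_cow[cow_id].append(path)

    rng = random.Random(seed)
    train, val, test = [], [], []
    for cow_id, paths in by_cow.items():
        rng.shuffle(paths)
        n_train = int(len(paths) * ratios[0])
        n_val = int(len(paths) * ratios[1])
        train += [(cow_id, p) for p in paths[:n_train]]
        val += [(cow_id, p) for p in paths[n_train:n_train + n_val]]
        test += [(cow_id, p) for p in paths[n_train + n_val:]]
    return train, val, test

if __name__ == "__main__":
    # Synthetic stand-in: 40 cows x 190 images = 7600 samples, as in the abstract.
    samples = [(cow, f"cow{cow:02d}_img{i:03d}.png")
               for cow in range(40) for i in range(190)]
    train, val, test = split_per_cow(samples)
    print(len(train), len(val), len(test))  # 5320 1520 760

With 190 images per cow, the split yields 5320/1520/760 samples, consistent with the 7:2:1 proportion stated in the record.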