Identification of Saline Soybean Varieties Based On Trinocular Vision Fusion and Deep Learning

Bibliographic Details
Published in: Gesunde Pflanzen vol. 76, no. 6 (Dec 2024), p. 1693
Main Author: Liu, Hang
Other Authors: Wu, Qiong; Wu, Guangxia; Zhu, Dan; Deng, Limiao; Liu, Xiaoyang; Han, Zhongzhi; Zhao, Longgang
Published: Springer Nature B.V.
Online Access: Citation/Abstract
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3128898161
003 UK-CbPIL
022 |a 0367-4223 
022 |a 1439-0345 
024 7 |a 10.1007/s10343-024-01040-5  |2 doi 
035 |a 3128898161 
045 2 |b d20241201  |b d20241231 
084 |a 109057  |2 nlm 
100 1 |a Liu, Hang  |u Qingdao Agricultural University, College of Grassland Science, Qingdao, China (GRID:grid.412608.9) (ISNI:0000 0000 9526 6338) 
245 1 |a Identification of Saline Soybean Varieties Based On Trinocular Vision Fusion and Deep Learning 
260 |b Springer Nature B.V.  |c Dec 2024 
513 |a Journal Article 
520 3 |a Soybean variety recognition underpins soybean agronomic yield and commodity attributes. To study the recognition performance of deep learning networks under multi-camera fusion more comprehensively, this paper proposes two new strategies for deep-learning-based soybean variety recognition built on three-camera fusion: image-layer fusion and feature-layer fusion. Three cameras form the experimental trinocular vision system. The strategies were evaluated with seven deep learning network models: AlexNet, GoogLeNet, ResNet34, ResNet50, MobileNet, ShuffleNet, and DenseNet. Experimental results show that, for both fusion strategies, network performance improves as the number of cameras increases, and DenseNet outperforms the other models. Under the image-layer fusion strategy, DenseNet achieves a validation accuracy of 0.9831 and a test accuracy of 0.9938 with three cameras; under the feature-layer fusion strategy, it achieves a validation accuracy of 0.9875 with three cameras. In the three-camera setup, image-layer fusion achieved a precision of 0.9729, a recall of 0.9500, and an F1 score of 0.9744, while feature-layer fusion achieved a precision of 0.9756, a recall of 0.9474, and an F1 score of 0.9474. Based on this research, a new mobile application called “Soybean Seed Classifier” was also designed and developed. The results provide a new method for comprehensive soybean seed identification, and the developed software shows practical value in soybean seed identification and breeding. (A minimal implementation sketch of the two fusion strategies follows the MARC record below.) 
653 |a Soybeans 
653 |a Accuracy 
653 |a Recall 
653 |a Cameras 
653 |a Deep learning 
653 |a Crop yield 
653 |a Plant breeding 
653 |a Applications programs 
653 |a Mobile computing 
653 |a Vision 
653 |a Images 
653 |a Environmental 
700 1 |a Wu, Qiong  |u Qingdao Agricultural University, College of Grassland Science, Qingdao, China (GRID:grid.412608.9) (ISNI:0000 0000 9526 6338) 
700 1 |a Wu, Guangxia  |u Qingdao Agricultural University, College of Agronomy, Qingdao, China (GRID:grid.412608.9) (ISNI:0000 0000 9526 6338) 
700 1 |a Zhu, Dan  |u Qingdao Agricultural University, College of Life Sciences, Qingdao, China (GRID:grid.412608.9) (ISNI:0000 0000 9526 6338) 
700 1 |a Deng, Limiao  |u Qingdao Agricultural University, College of Science and Information, Qingdao, China (GRID:grid.412608.9) (ISNI:0000 0000 9526 6338) 
700 1 |a Liu, Xiaoyang  |u The Chinese University of Hong Kong, School of Data Science, Shenzhen, China (GRID:grid.10784.3a) (ISNI:0000 0004 1937 0482) 
700 1 |a Han, Zhongzhi  |u Qingdao Agricultural University, College of Science and Information, Qingdao, China (GRID:grid.412608.9) (ISNI:0000 0000 9526 6338) 
700 1 |a Zhao, Longgang  |u Qingdao Agricultural University, College of Grassland Science, Qingdao, China (GRID:grid.412608.9) (ISNI:0000 0000 9526 6338) 
773 0 |t Gesunde Pflanzen  |g vol. 76, no. 6 (Dec 2024), p. 1693 
786 0 |d ProQuest  |t Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3128898161/abstract/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3128898161/fulltextPDF/embedded/6A8EOT78XXH2IG52?source=fedsrch
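
Implementation sketch

As a reading aid, the following PyTorch sketch shows one plausible way to realise the two trinocular fusion strategies named in the 520 abstract above: image-layer fusion (the three camera views combined before a single backbone) and feature-layer fusion (one backbone per view, with the pooled features concatenated before the classifier). The channel-stacking form of image-layer fusion, the DenseNet-121 backbone, the 224x224 input size, and the 10-class output are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: fusion details, backbone variant, input size,
# and class count are assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torchvision.models as models


class ImageLayerFusion(nn.Module):
    """Image-layer fusion: stack the three camera views along the channel
    axis (3 RGB views -> 9 channels) and feed them to a single backbone."""

    def __init__(self, num_classes: int = 10):  # class count is assumed
        super().__init__()
        backbone = models.densenet121(weights=None)
        # Widen the first convolution to accept 9 input channels.
        backbone.features.conv0 = nn.Conv2d(9, 64, kernel_size=7,
                                            stride=2, padding=3, bias=False)
        backbone.classifier = nn.Linear(backbone.classifier.in_features,
                                        num_classes)
        self.backbone = backbone

    def forward(self, view_a, view_b, view_c):
        x = torch.cat([view_a, view_b, view_c], dim=1)  # (N, 9, H, W)
        return self.backbone(x)


class FeatureLayerFusion(nn.Module):
    """Feature-layer fusion: each camera view gets its own backbone; the
    pooled feature vectors are concatenated before a shared classifier."""

    def __init__(self, num_classes: int = 10):  # class count is assumed
        super().__init__()

        def make_branch():
            branch = models.densenet121(weights=None)
            branch.classifier = nn.Identity()  # expose the pooled features
            return branch

        self.branches = nn.ModuleList([make_branch() for _ in range(3)])
        # DenseNet-121 produces 1024-d pooled features per branch.
        self.classifier = nn.Linear(3 * 1024, num_classes)

    def forward(self, view_a, view_b, view_c):
        feats = [branch(view) for branch, view
                 in zip(self.branches, (view_a, view_b, view_c))]
        return self.classifier(torch.cat(feats, dim=1))


if __name__ == "__main__":
    # Two dummy samples, three 224x224 RGB views each.
    views = [torch.randn(2, 3, 224, 224) for _ in range(3)]
    print(ImageLayerFusion()(*views).shape)    # torch.Size([2, 10])
    print(FeatureLayerFusion()(*views).shape)  # torch.Size([2, 10])

In this sketch the image-layer model keeps roughly the size of a single DenseNet, while the feature-layer model triples the backbone parameters; either module plugs into an ordinary cross-entropy training loop.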