Continuous Growth Monitoring and Prediction with 1D Convolutional Neural Network Using Generated Data with Vision Transformer

Saved in:
Bibliographic Details
Published in: Plants vol. 13, no. 21 (2024), p. 3110
Main Author: Choi, Woo-Joo
Other Authors: Jang, Se-Hun; Moon, Taewon; Seo, Kyeong-Su; Choi, Da-Seul; Oh, Myung-Min
Published: MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3126033617
003 UK-CbPIL
022 |a 2223-7747 
024 7 |a 10.3390/plants13213110  |2 doi 
035 |a 3126033617 
045 2 |b d20240101  |b d20241231 
084 |a 231551  |2 nlm 
100 1 |a Choi, Woo-Joo  |u Division of Animal, Horticultural and Food Sciences, Chungbuk National University, Cheongju 28644, Republic of Korea; ujuchoe79@chungbuk.ac.kr (W.-J.C.); zsh8976@naver.com (S.-H.J.); nari4491@naver.com (K.-S.S.); daseul7312@gmail.com (D.-S.C.) 
245 1 |a Continuous Growth Monitoring and Prediction with 1D Convolutional Neural Network Using Generated Data with Vision Transformer 
260 |b MDPI AG  |c 2024 
513 |a Journal Article 
520 3 |a Crop growth information is typically collected through destructive investigation, which inevitably causes discontinuity in observations of the target. Real-time monitoring and estimation of the same target crops enable dynamic feedback control that accounts for immediate crop growth. Images are high-dimensional data that capture crop growth and developmental stages, and image collection is non-destructive. We propose a non-destructive growth prediction method that uses low-cost RGB images and computer vision. In this study, two methodologies were selected and verified: an image-to-growth model trained on crop images and a growth simulation model driven by the estimated crop growth. The best models for each case were the vision transformer (ViT) and the one-dimensional convolutional neural network (1D ConvNet). For the shoot fresh weight, shoot dry weight, and leaf area of lettuce, ViT showed R² values of 0.89, 0.93, and 0.78, respectively, whereas 1D ConvNet showed 0.96, 0.94, and 0.95, respectively. These accuracies indicate that RGB images and deep neural networks can non-destructively interpret the interaction between crops and their environment. Ultimately, growers can improve resource use efficiency by applying real-time monitoring and prediction to feedback environmental control, yielding high-quality crops. 
610 4 |a Raspberry Pi Ltd 
651 4 |a South Korea 
651 4 |a United States--US 
653 |a Resource efficiency 
653 |a Environmental monitoring 
653 |a Leaf area 
653 |a Humidity 
653 |a Datasets 
653 |a Deep learning 
653 |a Nondestructive testing 
653 |a Investigations 
653 |a Simulation models 
653 |a Color imagery 
653 |a Artificial neural networks 
653 |a Developmental stages 
653 |a Productivity 
653 |a Crops 
653 |a Image processing 
653 |a Feedback 
653 |a Control systems 
653 |a Computer vision 
653 |a Monitoring 
653 |a Lettuce 
653 |a Growth factors 
653 |a Crop growth 
653 |a Farming 
653 |a Image enhancement 
653 |a Predictions 
653 |a Neural networks 
653 |a Data collection 
653 |a Image quality 
653 |a Real time 
653 |a Feedback control 
700 1 |a Jang, Se-Hun  |u Division of Animal, Horticultural and Food Sciences, Chungbuk National University, Cheongju 28644, Republic of Korea; ujuchoe79@chungbuk.ac.kr (W.-J.C.); zsh8976@naver.com (S.-H.J.); nari4491@naver.com (K.-S.S.); daseul7312@gmail.com (D.-S.C.) 
700 1 |a Moon, Taewon  |u Smart Farm Research Center, Korea Institute of Science and Technology (KIST), Gangneung 25451, Republic of Korea; tmoon.hort@kist.re.kr 
700 1 |a Seo, Kyeong-Su  |u Division of Animal, Horticultural and Food Sciences, Chungbuk National University, Cheongju 28644, Republic of Korea; ujuchoe79@chungbuk.ac.kr (W.-J.C.); zsh8976@naver.com (S.-H.J.); nari4491@naver.com (K.-S.S.); daseul7312@gmail.com (D.-S.C.) 
700 1 |a Choi, Da-Seul  |u Division of Animal, Horticultural and Food Sciences, Chungbuk National University, Cheongju 28644, Republic of Korea; ujuchoe79@chungbuk.ac.kr (W.-J.C.); zsh8976@naver.com (S.-H.J.); nari4491@naver.com (K.-S.S.); daseul7312@gmail.com (D.-S.C.) 
700 1 |a Oh, Myung-Min  |u Division of Animal, Horticultural and Food Sciences, Chungbuk National University, Cheongju 28644, Republic of Korea; ujuchoe79@chungbuk.ac.kr (W.-J.C.); zsh8976@naver.com (S.-H.J.); nari4491@naver.com (K.-S.S.); daseul7312@gmail.com (D.-S.C.) 
773 0 |t Plants  |g vol. 13, no. 21 (2024), p. 3110 
786 0 |d ProQuest  |t Agriculture Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3126033617/abstract/embedded/ZKJTFFSVAI7CB62C?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3126033617/fulltextwithgraphics/embedded/ZKJTFFSVAI7CB62C?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3126033617/fulltextPDF/embedded/ZKJTFFSVAI7CB62C?source=fedsrch
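
The abstract describes a two-stage pipeline: a vision transformer estimates growth traits (shoot fresh weight, shoot dry weight, leaf area) from RGB images, and a 1D ConvNet then predicts future growth from the resulting trait time series. The sketch below is a rough illustration of what such a pipeline could look like in PyTorch; the ViT-B/16 backbone choice, layer sizes, seven-day window, and data shapes are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract:
# stage 1 estimates growth traits from RGB images with a ViT; stage 2 feeds
# the resulting trait time series into a 1D ConvNet to predict future growth.
# All architecture sizes and the 7-step window are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models import vit_b_16

N_TRAITS = 3  # shoot fresh weight, shoot dry weight, leaf area

class ImageToGrowth(nn.Module):
    """Stage 1: map one RGB crop image to estimated growth traits."""
    def __init__(self):
        super().__init__()
        self.backbone = vit_b_16(weights=None)           # ViT-B/16 backbone
        self.backbone.heads = nn.Linear(768, N_TRAITS)   # regression head

    def forward(self, images):                           # (B, 3, 224, 224)
        return self.backbone(images)                     # (B, N_TRAITS)

class GrowthSimulator(nn.Module):
    """Stage 2: predict next-step traits from a window of past estimates."""
    def __init__(self, window=7):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(N_TRAITS, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * window, N_TRAITS),
        )

    def forward(self, trait_series):                     # (B, N_TRAITS, window)
        return self.net(trait_series)                    # (B, N_TRAITS)

# Usage: estimate traits from daily images, then roll the estimated series
# into the simulator to forecast the next step.
images = torch.randn(7, 3, 224, 224)                     # one week of images
estimator, simulator = ImageToGrowth(), GrowthSimulator(window=7)
with torch.no_grad():
    traits = estimator(images)                           # (7, N_TRAITS)
    forecast = simulator(traits.T.unsqueeze(0))          # (1, N_TRAITS)
```

Chaining the two models this way is what allows continuous, non-destructive prediction: the same plant supplies every image, so the trait series stays unbroken, which a destructive sampling scheme cannot provide.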