Healthy and Unhealthy Oil Palm Tree Detection Using Deep Learning Method

Saved in:
Bibliographic Details
Published in: International Journal of Advanced Computer Science and Applications vol. 16, no. 4 (2025)
Main Author: PDF
Publication:
Science and Information (SAI) Organization Limited
Subjects:
Online Access: Citation/Abstract
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3206239924
003 UK-CbPIL
022 |a 2158-107X 
022 |a 2156-5570 
024 7 |a 10.14569/IJACSA.2025.0160474  |2 doi 
035 |a 3206239924 
045 2 |b d20250101  |b d20251231 
100 1 |a PDF 
245 1 |a Healthy and Unhealthy Oil Palm Tree Detection Using Deep Learning Method 
260 |b Science and Information (SAI) Organization Limited  |c 2025 
513 |a Journal Article 
520 3 |a Oil palm trees are the world's most efficient and economically productive oil-bearing crop. Their fruit can be processed into components used in a wide range of products, such as cosmetics and biofuel. In Malaysia, the oil palm industry contributes around 2.2% of the nation's GDP annually. The continuous surge in worldwide demand for palm oil has made local plantation owners aware of the need for stricter tree-monitoring standards to increase yield. However, cultivation and monitoring in Malaysia still depend mainly on manual labor, which makes them inefficient and expensive. This scenario has motivated owners to modernize the tree-monitoring process with computer vision techniques. This paper aims to develop an object detection model that differentiates healthy from unhealthy oil palm trees using aerial images collected by drone over an oil palm plantation. Pre-trained models such as Faster R-CNN (Region-Based Convolutional Neural Network) and SSD (Single-Shot MultiBox Detector), with different backbone modules such as ResNet, Inception, and Hourglass, are applied to the images of palm leaves. The models are then compared on the AP (average precision) and AR (average recall) at various scales, together with total loss, to select the best model for differentiating healthy and unhealthy oil palms. The Faster R-CNN ResNet101 FPN model performed best, with AP (area=all) of 0.355, AR (area=all) of 0.44, and a total loss of 0.2296. 
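The model comparison in the abstract rests on AP and AR, which are computed by matching predicted boxes to ground-truth boxes via intersection-over-union (IoU). A minimal illustrative sketch of that matching step (not the paper's code; the (x1, y1, x2, y2) box format, score ordering, and 0.5 threshold are assumptions):

```python
# Illustrative sketch: IoU-based greedy matching of predicted boxes to
# ground-truth boxes, the building block behind detection AP/AR metrics.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def precision_recall(preds, gts, iou_thr=0.5):
    """Match predictions (highest score first) to unused ground-truth boxes.

    preds: list of ((x1, y1, x2, y2), score); gts: list of boxes.
    A prediction is a true positive if it overlaps an unmatched ground-truth
    box with IoU >= iou_thr; each ground-truth box is matched at most once.
    """
    matched, tp = set(), 0
    for box, _score in sorted(preds, key=lambda p: -p[1]):
        best, best_iou = None, iou_thr
        for i, gt in enumerate(gts):
            overlap = iou(box, gt)
            if i not in matched and overlap >= best_iou:
                best, best_iou = i, overlap
        if best is not None:
            matched.add(best)
            tp += 1
    precision = tp / len(preds) if preds else 0.0
    recall = tp / len(gts) if gts else 0.0
    return precision, recall
```

Sweeping a score threshold over such precision/recall pairs and averaging yields AP; averaging the recall over IoU thresholds and area ranges ("area=all" in the reported figures) yields AR.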
651 4 |a Malaysia 
653 |a Biofuels 
653 |a Plantations 
653 |a Computer vision 
653 |a Object recognition 
653 |a Monitoring 
653 |a Machine learning 
653 |a Oil palm trees 
653 |a Artificial neural networks 
653 |a Drone aircraft 
653 |a Datasets 
653 |a Deep learning 
653 |a Agricultural production 
653 |a Computer science 
653 |a Foreign labor 
653 |a Sensors 
653 |a Neural networks 
653 |a Labor shortages 
653 |a Classification 
653 |a Computer engineering 
653 |a Trees 
653 |a Vegetable oils 
653 |a Algorithms 
653 |a Telematics 
773 0 |t International Journal of Advanced Computer Science and Applications  |g vol. 16, no. 4 (2025) 
786 0 |d ProQuest  |t Advanced Technologies & Aerospace Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3206239924/abstract/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3206239924/fulltextPDF/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch