Enhancing Neural Architecture Search Using Transfer Learning and Dynamic Search Spaces for Global Horizontal Irradiance Prediction

Bibliographic Details
Published in: Forecasting vol. 7, no. 3 (2025), p. 43-66
Main Author: Legrene, Inoussa
Other Authors: Wong, Tony; Dessaint, Louis-A.
Published: MDPI AG
Online Access: Citation/Abstract; Full Text + Graphics; Full Text - PDF
Description
Abstract: Neural architecture search (NAS) is used to automate the engineering of neural network models. Several studies have applied this approach, mainly in image processing and natural language processing, but it generally requires very long computing times before converging on an optimal architecture. This study proposes a hybrid approach that combines transfer learning and dynamic search space adaptation (TL-DSS) to reduce the architecture search time. To validate the approach, Long Short-Term Memory (LSTM) models were designed with several evolutionary algorithms, including the artificial bee colony (ABC), genetic algorithm (GA), differential evolution (DE), and particle swarm optimization (PSO), to predict trends in global horizontal irradiance data. The approach is evaluated on two measures: the accuracy of the resulting models, assessed via RMSE over a 24-hour prediction window of the solar irradiance trend, and the CPU time consumed by the architecture search. The results show that, depending on the search algorithm, the proposed approach reduces the search time by up to 89.09% while producing models that are up to 99% more accurate than the non-enhanced approach. This study demonstrates that neural architecture search time can be reduced while still ensuring that the resulting models perform well.
ISSN: 2571-9394
DOI: 10.3390/forecast7030043
Source: ABI/INFORM Global
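
Illustrative sketch: the abstract describes scoring candidate LSTM architectures by RMSE over a 24-hour prediction window. The Python sketch below shows what such an evaluation loop can look like in broad strokes; it is not the paper's TL-DSS method. The synthetic irradiance data, the hyperparameter ranges, the short training budget, and the random sampling that stands in for the evolutionary operators (GA/DE/PSO/ABC) are all placeholder assumptions made for illustration.

# Minimal, illustrative sketch (not the authors' code): score candidate LSTM
# architectures by validation RMSE on a 24-step-ahead horizon. Random sampling
# stands in for the evolutionary search; data and ranges are placeholders.
import math
import random
import numpy as np
import torch
import torch.nn as nn

HORIZON = 24      # 24-hour prediction window
LOOKBACK = 48     # assumed input window length

def make_synthetic_ghi(n_days=60):
    """Synthetic daily-cycle series standing in for global horizontal irradiance."""
    t = np.arange(n_days * 24)
    signal = np.clip(np.sin(2 * np.pi * t / 24), 0, None)   # daylight cycle
    noise = 0.05 * np.random.rand(len(t))
    return (signal + noise).astype(np.float32)

def make_windows(series):
    xs, ys = [], []
    for i in range(len(series) - LOOKBACK - HORIZON):
        xs.append(series[i:i + LOOKBACK])
        ys.append(series[i + LOOKBACK:i + LOOKBACK + HORIZON])
    x = torch.tensor(np.array(xs)).unsqueeze(-1)   # (N, LOOKBACK, 1)
    y = torch.tensor(np.array(ys))                 # (N, HORIZON)
    return x, y

class LSTMForecaster(nn.Module):
    def __init__(self, hidden_size, num_layers):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, HORIZON)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])             # forecast from last hidden state

def fitness(arch, x_train, y_train, x_val, y_val, epochs=3):
    """Fitness of one candidate = validation RMSE after a short training budget."""
    model = LSTMForecaster(arch["hidden_size"], arch["num_layers"])
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x_train), y_train)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return math.sqrt(loss_fn(model(x_val), y_val).item())

def search(n_candidates=5):
    series = make_synthetic_ghi()
    x, y = make_windows(series)
    split = int(0.8 * len(x))
    x_tr, y_tr, x_va, y_va = x[:split], y[:split], x[split:], y[split:]
    # Assumed search space; in the paper the space is adapted dynamically during the run.
    space = {"hidden_size": [8, 16, 32, 64], "num_layers": [1, 2]}
    best = None
    for _ in range(n_candidates):
        arch = {k: random.choice(v) for k, v in space.items()}
        rmse = fitness(arch, x_tr, y_tr, x_va, y_va)
        if best is None or rmse < best[1]:
            best = (arch, rmse)
        print(arch, f"RMSE={rmse:.4f}")
    return best

if __name__ == "__main__":
    print("best:", search())

In the paper's setting, the random sampling above would be replaced by the population-based update rules of the chosen evolutionary algorithm, and the TL-DSS mechanism would warm-start candidates via transfer learning and shrink or shift the hyperparameter ranges as the search progresses; those details are specific to the article and are not reproduced here.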