Balancing Prediction Accuracy and Explanation Power of Path Loss Modeling in a University Campus Environment via Explainable AI

Bibliographic Details
Published in: Future Internet, vol. 17, no. 4 (2025), p. 155
Main Author: Khalili, Hamed
Other Authors: Frey, Hannes; Wimmer, Maria A.
Published: MDPI AG
Online Access: Citation/Abstract; Full Text + Graphics; Full Text - PDF
Description
Abstract: For efficient radio network planning, empirical path loss (PL) prediction models are used to predict signal attenuation in different environments. Machine learning (ML) models have been proposed as an alternative for PL prediction. While empirical models are transparent and require little computational capacity, they cannot produce accurate predictions in complex environments. ML models, by contrast, are precise and can cope with complex terrain, but their opaque nature hampers building trust and relying confidently on their predictions. To bridge the gap between transparency and accuracy, this paper applies glass-box ML, using Microsoft Research's explainable boosting machine (EBM), to PL data measured in a university campus environment. In addition, a polar coordinate transformation is applied, which reveals that the transmitting-angle feature has greater explanatory power than the distance feature. The PL predictions of the glass-box model are compared with those of black-box ML models as well as with those of empirical models, and the glass-box EBM achieves the highest performance. The glass-box model furthermore sheds light on the most important explanatory features and the magnitude of their effects on signal attenuation in the underlying propagation environment.
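
The workflow sketched in the abstract (a polar coordinate transformation of the transmitter-receiver geometry followed by a glass-box EBM fit) can be illustrated as follows. This is a minimal sketch, not the authors' code: it assumes the open-source interpret package's ExplainableBoostingRegressor, and the synthetic measurements, feature names (distance_m, angle_deg), and log-distance noise model are placeholders standing in for the measured campus dataset.

```python
# Minimal illustrative sketch: polar feature transformation + glass-box EBM.
# All data below is synthetic; real use would load the measured campus PL data.
import numpy as np
import pandas as pd
from interpret.glassbox import ExplainableBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical receiver positions relative to the transmitter (metres).
dx = rng.uniform(-300, 300, 1000)
dy = rng.uniform(-300, 300, 1000)

# Polar coordinate transformation: distance and transmitting angle.
distance = np.hypot(dx, dy)
angle = np.degrees(np.arctan2(dy, dx))

# Placeholder path loss values (log-distance trend plus noise),
# purely so the example runs end to end.
pl = 40.0 + 30.0 * np.log10(np.maximum(distance, 1.0)) + rng.normal(0, 4, 1000)

X = pd.DataFrame({"distance_m": distance, "angle_deg": angle})
X_train, X_test, y_train, y_test = train_test_split(X, pl, random_state=0)

# Glass-box model: the per-feature shape functions remain inspectable.
ebm = ExplainableBoostingRegressor(random_state=0)
ebm.fit(X_train, y_train)

print("R^2 on held-out data:", ebm.score(X_test, y_test))

# Global explanation: relative contribution of distance vs. angle
# to the predicted signal attenuation.
explanation = ebm.explain_global()
print(explanation.data()["names"])
print(explanation.data()["scores"])
```

On real measurements, inspecting the per-feature shape functions (e.g. via interpret's dashboard with `show(explanation)`) is what lets the angle feature's explanatory role be compared against distance, as the paper describes.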
ISSN: 1999-5903
DOI: 10.3390/fi17040155
Source: ABI/INFORM Global