Artificial Intelligence for Iteration Count Prediction in Real-Time CORDIC Processing

Bibliographic Details
Published in: Mathematics, vol. 13, no. 24 (2025), pp. 3957-3975
Main Author: Ratheesh, Sudheerbabu
Other Authors: Chandrika, Reghunath Lekshmi; Franzoni, Valentina; Milani, Alfredo; Randieri, Cristian
Publisher: MDPI AG
Subjects:
Online Access: Citation/Abstract; Full Text + Graphics; Full Text - PDF
Description
Abstract: This paper presents the first research attempt to dynamically optimize the CORDIC algorithm's iteration count using artificial intelligence. Conventional approaches rely on a fixed number of iterations, which frequently results in redundant calculations and longer processing times. Our method uses machine learning regression models to predict a near-optimal iteration count for a given input angle, drastically reducing the number of iterations without compromising accuracy. The reduced computational complexity and faster execution improve overall efficiency. We optimized the hyperparameters of several models, including Random Forest, XGBoost, and a Support Vector Machine (SVM) regressor, using grid search and cross-validation. Experimental results show that the SVM regressor performs best, with a mean absolute error of 0.045 and an R² score of 0.998. This AI-driven dynamic iteration prediction thus offers a promising route to efficient and adaptable CORDIC implementations for real-time digital signal processing applications.
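The abstract only outlines the approach, so the following minimal Python sketch illustrates the general idea under stated assumptions; it is not the authors' implementation. It generates labels by running a floating-point rotation-mode CORDIC and recording the smallest iteration count that meets an assumed accuracy tolerance, then fits an SVM regressor with grid search and cross-validation, as the abstract describes. The tolerance, angle range, hyperparameter grid, and helper name min_cordic_iterations are assumptions chosen for illustration.

    # Minimal sketch (not the authors' code): predict a near-optimal CORDIC
    # iteration count per input angle with an SVM regressor.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.metrics import mean_absolute_error, r2_score

    def min_cordic_iterations(theta, tol=1e-6, max_iter=32):
        """Smallest iteration count whose rotation-mode CORDIC result matches
        (cos(theta), sin(theta)) within `tol` (illustrative labeling rule)."""
        angles = np.arctan(2.0 ** -np.arange(max_iter))
        for n in range(1, max_iter + 1):
            x, y, z = 1.0, 0.0, theta
            for i in range(n):
                d = 1.0 if z >= 0 else -1.0
                # Standard CORDIC micro-rotation with shift 2^-i
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * angles[i]
            k = np.prod(np.cos(angles[:n]))  # gain correction for n rotations
            if abs(x * k - np.cos(theta)) < tol and abs(y * k - np.sin(theta)) < tol:
                return n
        return max_iter

    # Build a small dataset of (angle -> minimal iteration count) pairs.
    rng = np.random.default_rng(0)
    thetas = rng.uniform(-np.pi / 2, np.pi / 2, 2000)
    y = np.array([min_cordic_iterations(t) for t in thetas])
    X = thetas.reshape(-1, 1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    # Grid search over SVR hyperparameters with 5-fold cross-validation.
    grid = GridSearchCV(SVR(kernel="rbf"),
                        {"C": [1, 10, 100], "gamma": ["scale", 0.1, 1.0],
                         "epsilon": [0.01, 0.1]},
                        cv=5, scoring="neg_mean_absolute_error")
    grid.fit(X_tr, y_tr)
    pred = grid.predict(X_te)
    print("MAE:", mean_absolute_error(y_te, pred), "R2:", r2_score(y_te, pred))

In a real-time fixed-point design, the learned mapping would typically be distilled into a small lookup table or a lightweight model so that the prediction itself remains cheaper than the iterations it saves.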
ISSN: 2227-7390
DOI:10.3390/math13243957
Source: Engineering Database