PWFS: Probability-Weighted Feature Selection

Saved in:
Bibliographic Details
Published in: Electronics vol. 14, no. 11 (2025), p. 2264
Main Author: Ayanoglu, Mehmet B
Other Authors: Uysal, Ismail
Published: MDPI AG
Subjects: Accuracy, Datasets, Random variables, Principal components analysis, Search methods, Iterative methods, Classification, Statistical methods, Feature selection, Data collection, Methods, Algorithms, Machine learning, Critical components, Ensemble learning, Statistical analysis, Variance analysis
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3217725503
003 UK-CbPIL
022 |a 2079-9292 
024 7 |a 10.3390/electronics14112264  |2 doi 
035 |a 3217725503 
045 2 |b d20250101  |b d20251231 
084 |a 231458  |2 nlm 
100 1 |a Ayanoglu, Mehmet B 
245 1 |a PWFS: Probability-Weighted Feature Selection 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a Feature selection has been a fundamental research area for both conventional and contemporary machine learning since the beginning of predictive analytics. From early statistical methods, such as principal component analysis, to more recent, data-driven approaches, such as deep unsupervised feature learning, selecting input features to achieve the best objective performance has been a critical component of any machine learning application. In this study, we propose a novel, easily replicable, and robust approach called probability-weighted feature selection (PWFS), which randomly selects a subset of features prior to each training–testing regimen and assigns probability weights to each feature based on an objective performance metric such as accuracy, mean-square error, or the area under the receiver operating characteristic curve (AUC–ROC). Using the objective metric scores and a weight-assignment technique based on a golden-ratio-led iteration method, the features that yield higher performance become incrementally more likely to be selected in subsequent train–test regimens, whereas the opposite holds for features that yield lower performance. This probability-based search method demonstrates significantly faster convergence to a near-optimal set of features than a purely random search of the feature space. We compare our method with an extensive list of twelve popular feature selection algorithms and demonstrate equal or better performance on a range of benchmark datasets. The specific approach to assigning weights to the features also allows for expanded applications in which two correlated features can be included in separate clusters of near-optimal feature sets for ensemble learning scenarios. 
653 |a Accuracy 
653 |a Datasets 
653 |a Random variables 
653 |a Principal components analysis 
653 |a Search methods 
653 |a Iterative methods 
653 |a Classification 
653 |a Statistical methods 
653 |a Feature selection 
653 |a Data collection 
653 |a Methods 
653 |a Algorithms 
653 |a Machine learning 
653 |a Critical components 
653 |a Ensemble learning 
653 |a Statistical analysis 
653 |a Variance analysis 
700 1 |a Uysal, Ismail 
773 0 |t Electronics  |g vol. 14, no. 11 (2025), p. 2264 
786 0 |d ProQuest  |t Advanced Technologies & Aerospace Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3217725503/abstract/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3217725503/fulltextwithgraphics/embedded/H09TXR3UUZB2ISDL?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3217725503/fulltextPDF/embedded/H09TXR3UUZB2ISDL?source=fedsrch
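
The 520 abstract above describes the PWFS loop concretely enough to sketch: sample a feature subset at random, run one train-test regimen, and multiplicatively reweight the sampled features by the outcome. The Python below is a minimal sketch of that reading, not the authors' published implementation: the breast-cancer dataset, the logistic-regression scorer, the multiplicative golden-ratio update, and all names in the code are illustrative assumptions.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

PHI = (1 + 5 ** 0.5) / 2  # golden ratio, assumed here as the reweighting factor

X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]
weights = np.ones(n_features)          # uniform selection weights to start
best_score, best_subset = 0.0, None
rng = np.random.default_rng(0)

for round_ in range(100):
    # Draw a feature subset with probability proportional to current weights.
    probs = weights / weights.sum()
    k = int(rng.integers(5, n_features))
    subset = rng.choice(n_features, size=k, replace=False, p=probs)

    # One train-test regimen scored by accuracy (the abstract also allows
    # mean-square error or AUC-ROC as the objective metric).
    X_tr, X_te, y_tr, y_te = train_test_split(
        X[:, subset], y, test_size=0.3, random_state=round_)
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    score = accuracy_score(y_te, model.fit(X_tr, y_tr).predict(X_te))

    # Reweight: features in a subset that matches or beats the best score so
    # far become more likely to be drawn again; the rest become less likely.
    if score >= best_score:
        weights[subset] *= PHI
        best_score, best_subset = score, subset
    else:
        weights[subset] /= PHI

print(f"best accuracy {best_score:.3f} using {len(best_subset)} features")

Multiplying or dividing a feature's weight by the golden ratio is one simple way to keep up- and down-weighting symmetric on a log scale; the paper's actual golden-ratio-led iteration method may assign weights differently.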