Federated Incomplete Multi-View Unsupervised Feature Selection with Fractional Sparsity-Guided Whale Optimization and Tensor Alternating Learning
| Published in: | Fractal and Fractional, vol. 9, no. 11 (2025), p. 717-744 |
|---|---|
| Main author: | |
| Other authors: | |
| Published: | MDPI AG |
| Subjects: | |
| Abstract: | With the widespread application of multi-view data across various domains, multi-view unsupervised feature selection (MUFS) has achieved remarkable progress in both feature selection (FS) and missing-view completion. However, existing MUFS methods typically rely on centralized servers, which not only fail to meet privacy requirements in distributed settings but also suffer from suboptimal FS quality and poor convergence. To overcome these challenges, we propose a novel federated incomplete MUFS method (Fed-IMUFS), which integrates a fractional Sparsity-Guided Whale Optimization Algorithm (SGWOA) and Tensor Alternating Learning (TAL). Within this federated learning framework, each client performs local optimization in two stages. In the first stage, SGWOA introduces an L2,1 proximal projection to enforce row-sparsity in the FS weight matrix, while fractional-order dynamics and a fractal-inspired elite kernel injection mechanism enhance global search ability, yielding a discriminative and stable weight matrix. In the second stage, based on the obtained weight matrix, an alternating optimization framework with tensor decomposition iteratively completes missing views while simultaneously optimizing low-dimensional representations to preserve cross-view consistency, with the objective function gradually minimized until convergence. During federated training, the server employs an aggregation and distribution strategy driven by normalized mutual information (NMI): clients upload only their local weight matrices and quality indicators, and the server adaptively fuses them into a global FS matrix before distributing it back to the clients. This process achieves consistent FS across clients while safeguarding data privacy. Comprehensive evaluations on CEC2022 and several incomplete multi-view datasets confirm that Fed-IMUFS outperforms state-of-the-art methods, delivering stronger global optimization capability, higher-quality feature selection, faster convergence, and more effective handling of missing views. (An illustrative sketch of the L2,1 proximal step and the NMI-weighted aggregation appears after this record.) |
| ISSN: | 2504-3110 |
| DOI: | 10.3390/fractalfract9110717 |
| Source: | Engineering Database |
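
The abstract's two most concrete ingredients are the L2,1 proximal projection that enforces row-sparsity on the local FS weight matrix and the NMI-driven fusion of client weight matrices on the server. The sketch below is not the paper's implementation; the group soft-thresholding form of the proximal step, the convex-combination fusion rule, and all function names are assumptions introduced only to illustrate how these two operations are commonly realized.

```python
# Illustrative sketch only. The exact operators of Fed-IMUFS are not given in
# the abstract; the proximal step and the NMI-weighted fusion below are
# standard formulations assumed for illustration.
import numpy as np


def prox_l21(W, lam):
    """Row-wise proximal operator of lam * ||W||_{2,1} (group soft-thresholding).

    Each row W_i is scaled by max(0, 1 - lam / ||W_i||_2); rows whose l2 norm
    falls below lam are zeroed, which is what enforces row-sparsity in a
    feature-selection weight matrix.
    """
    row_norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(row_norms, 1e-12))
    return scale * W


def aggregate_by_nmi(client_matrices, client_nmi_scores):
    """Fuse client FS weight matrices with weights proportional to their
    normalized-mutual-information quality indicators (assumed to be a simple
    convex combination here).
    """
    scores = np.asarray(client_nmi_scores, dtype=float)
    weights = scores / scores.sum()
    return sum(w * W for w, W in zip(weights, client_matrices))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two hypothetical clients, 10 features, 4-dimensional projection.
    W1, W2 = rng.normal(size=(10, 4)), rng.normal(size=(10, 4))
    W1, W2 = prox_l21(W1, lam=1.5), prox_l21(W2, lam=1.5)   # local sparsification
    W_global = aggregate_by_nmi([W1, W2], client_nmi_scores=[0.62, 0.48])
    print("zero rows in W_global:", int((np.linalg.norm(W_global, axis=1) < 1e-9).sum()))
```

In this reading, clients would only ever transmit their sparsified weight matrices and scalar quality indicators, matching the abstract's claim that raw multi-view data never leave the client.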