Unsupervised Domain Adaptation Method Based on Relative Entropy Regularization and Measure Propagation

Bibliographic Details
Published in: Entropy, vol. 27, no. 4 (2025), p. 426
Main Author: Tan, Lianghao
Other Authors: Peng, Zhuo; Song, Yongjia; Liu, Xiaoyi; Jiang, Huangqi; Liu, Shubing; Wu, Weixi; Xiang, Zhiyuan
Published: MDPI AG
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF
Description
Abstract: This paper presents a novel unsupervised domain adaptation (UDA) framework that integrates information-theoretic principles to mitigate distributional discrepancies between source and target domains. The proposed method incorporates two key components: (1) relative entropy regularization, which leverages Kullback–Leibler (KL) divergence to align the predicted label distribution of the target domain with a reference distribution derived from the source domain, thereby reducing prediction uncertainty; and (2) measure propagation, a technique that transfers probability mass from the source domain to generate pseudo-measures—estimated probabilistic representations—for the unlabeled target domain. This dual mechanism enhances both global feature alignment and semantic consistency across domains. Extensive experiments on benchmark datasets (OfficeHome and DomainNet) demonstrate that the proposed approach consistently outperforms state-of-the-art methods, particularly in scenarios with significant domain shifts. These results confirm the robustness, scalability, and theoretical grounding of our framework, offering a new perspective on the fusion of information theory and domain adaptation.
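As a rough illustration of the relative-entropy regularization idea in the abstract, the sketch below computes the KL divergence between a target-domain prediction distribution and a source-derived reference distribution. It is a minimal sketch, not the paper's implementation: the distributions, class count, and function names here are hypothetical, chosen only to show how the regularizer would be formed.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions over the same label set.

    A small eps guards against log(0) when a class receives zero mass.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical reference label distribution estimated from source-domain labels.
source_reference = [0.5, 0.3, 0.2]

# Hypothetical mean predicted class distribution on unlabeled target data.
target_prediction = [0.2, 0.5, 0.3]

# The relative-entropy regularizer penalizes deviation of the target
# prediction distribution from the source reference; minimizing it pulls
# target predictions toward the source-derived distribution.
reg_loss = kl_divergence(target_prediction, source_reference)
```

In a training loop, `reg_loss` would be added (with a weight) to the task loss; it is zero only when the two distributions coincide.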
ISSN:1099-4300
DOI:10.3390/e27040426
Source: Engineering Database