T-JEPA: Augmentation-Free Self-Supervised Learning for Tabular Data

Bibliographic Details
Published in: arXiv.org (Dec 19, 2024)
Main Author: Thimonier, Hugo
Other Authors: De Melo Costa, José Lucas; Popineau, Fabrice; Rimmel, Arpad; Doan, Bich-Liên
Published: Cornell University Library, arXiv.org
Online Access: Citation/Abstract; Full text outside of ProQuest
Description
Abstract: Self-supervision is often used for pre-training to foster performance on a downstream task by constructing meaningful representations of samples. Self-supervised learning (SSL) generally involves generating different views of the same sample and thus requires data augmentations that are challenging to construct for tabular data. This constitutes one of the main challenges of self-supervision for structured data. In the present work, we propose a novel augmentation-free SSL method for tabular data. Our approach, T-JEPA, relies on a Joint Embedding Predictive Architecture (JEPA) and is akin to mask reconstruction in the latent space. It involves predicting the latent representation of one subset of features from the latent representation of a different subset within the same sample, thereby learning rich representations without augmentations. We use our method as a pre-training technique and train several deep classifiers on the obtained representations. Our experimental results demonstrate a substantial improvement in both classification and regression tasks, outperforming models trained directly on samples in their original data space. Moreover, T-JEPA enables some methods to consistently outperform or match the performance of traditional methods like Gradient Boosted Decision Trees. To understand why, we extensively characterize the obtained representations and show that T-JEPA effectively identifies relevant features for downstream tasks without access to the labels. Additionally, we introduce regularization tokens, a novel regularization method critical for the training of JEPA-based models on structured data.
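The core mechanism described in the abstract (predicting the latents of one feature subset from the latents of a disjoint subset of the same sample) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the module names, dimensions, subset split, pooling, and EMA schedule below are illustrative assumptions, and the paper's regularization tokens are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureEncoder(nn.Module):
    """Embeds each selected feature as a token and contextualizes the tokens."""
    def __init__(self, n_features: int, dim: int = 64):
        super().__init__()
        # One linear embedding per scalar feature (numeric inputs assumed).
        self.embed = nn.ModuleList(nn.Linear(1, dim) for _ in range(n_features))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x: torch.Tensor, idx: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_selected) feature values; idx: original feature indices.
        tokens = torch.stack(
            [self.embed[i](x[:, j:j + 1]) for j, i in enumerate(idx.tolist())],
            dim=1,
        )  # (batch, n_selected, dim)
        return self.backbone(tokens)

n_features, batch, dim = 10, 32, 64
context_enc = FeatureEncoder(n_features, dim)   # trained by gradient descent
target_enc = FeatureEncoder(n_features, dim)    # EMA copy, never back-propagated
target_enc.load_state_dict(context_enc.state_dict())
feat_pos = nn.Embedding(n_features, dim)        # tells the predictor which target
                                                # feature's latent to reconstruct
predictor = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
opt = torch.optim.Adam(
    list(context_enc.parameters())
    + list(predictor.parameters())
    + list(feat_pos.parameters()),
    lr=1e-3,
)

x = torch.randn(batch, n_features)              # dummy tabular batch
perm = torch.randperm(n_features)
ctx_idx, tgt_idx = perm[:6], perm[6:]           # disjoint feature subsets

with torch.no_grad():                           # stop-gradient on the targets
    tgt_latents = target_enc(x[:, tgt_idx], tgt_idx)

ctx = context_enc(x[:, ctx_idx], ctx_idx).mean(dim=1)        # (batch, dim)
queries = ctx.unsqueeze(1) + feat_pos(tgt_idx).unsqueeze(0)  # (batch, n_tgt, dim)
pred = predictor(queries)
loss = F.mse_loss(pred, tgt_latents)            # mask reconstruction in latent space

opt.zero_grad()
loss.backward()
opt.step()
with torch.no_grad():                           # EMA update of the target encoder
    for p_t, p_c in zip(target_enc.parameters(), context_enc.parameters()):
        p_t.mul_(0.996).add_(p_c, alpha=0.004)
```

Because the loss is computed between latent representations rather than raw feature values, no hand-crafted tabular augmentations are needed; the EMA target encoder and the stop-gradient on the targets are standard JEPA-style devices to avoid representational collapse.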
ISSN: 2331-8422
Source: Engineering Database