Convergence of Statistical Estimators via Mutual Information Bounds

Bibliographic details
Published in: arXiv.org (Dec 24, 2024)
Main author: El Mahdi Khribch
Other authors: Alquier, Pierre
Published in:
Cornell University Library, arXiv.org
Online access: Citation/Abstract; full text outside of ProQuest
Description
Abstract: Recent advances in statistical learning theory have revealed profound connections between mutual information (MI) bounds, PAC-Bayesian theory, and Bayesian nonparametrics. This work introduces a novel mutual information bound for statistical models. The derived bound has wide-ranging applications in statistical inference: it yields improved contraction rates for fractional posteriors in Bayesian nonparametrics, and it can also be used to study a wide range of estimation methods, such as variational inference or maximum likelihood estimation (MLE). By bridging these diverse areas, this work advances our understanding of the fundamental limits of statistical inference and of the role of information in learning from data. We hope that these results will not only clarify connections between statistical inference and information theory but also help to develop a new toolbox for studying a wide range of estimators.
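To make the notion of a "fractional posterior" mentioned in the abstract concrete, the following is a minimal, self-contained sketch (not taken from the paper) for a conjugate Beta-Bernoulli model: the likelihood is raised to a power alpha in (0, 1] before being combined with the prior, so conjugacy gives a tempered Beta posterior. All names (`fractional_posterior`, `alpha`, the prior parameters) are illustrative choices, not the authors' notation.

```python
def fractional_posterior(k, n, alpha=0.5, a=1.0, b=1.0):
    """Parameters of the fractional posterior for a Beta(a, b) prior
    and k successes in n Bernoulli trials, with likelihood tempered
    by exponent alpha: posterior = Beta(a + alpha*k, b + alpha*(n-k))."""
    return a + alpha * k, b + alpha * (n - k)


def posterior_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)


# Example: 7 successes in 10 trials, tempered with alpha = 0.5.
# alpha = 1 recovers the usual posterior; smaller alpha shrinks the
# posterior mean toward the prior mean (here 0.5) away from the MLE 0.7.
a_post, b_post = fractional_posterior(7, 10, alpha=0.5)
print(posterior_mean(a_post, b_post))
```

With alpha = 1 this reduces to standard conjugate Bayesian updating; contraction-rate results of the kind the abstract refers to study how fast such tempered posteriors concentrate around the truth as n grows.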
ISSN:2331-8422
Source: Engineering Database