Convergence of Statistical Estimators via Mutual Information Bounds

Bibliographic details
Published in: arXiv.org (Dec 24, 2024), p. n/a
Lead author: El Mahdi Khribch
Other authors: Alquier, Pierre
Published by: Cornell University Library, arXiv.org
Description
Abstract: Recent advances in statistical learning theory have revealed profound connections between mutual information (MI) bounds, PAC-Bayesian theory, and Bayesian nonparametrics. This work introduces a novel mutual information bound for statistical models. The derived bound has wide-ranging applications in statistical inference: it yields improved contraction rates for fractional posteriors in Bayesian nonparametrics, and it can also be used to study a wide range of estimation methods, such as variational inference or maximum likelihood estimation (MLE). By bridging these diverse areas, this work advances our understanding of the fundamental limits of statistical inference and of the role of information in learning from data. We hope that these results will not only clarify connections between statistical inference and information theory, but also help to develop a new toolbox for studying a wide range of estimators.
ISSN:2331-8422
Source: Engineering Database