Convergence of Statistical Estimators via Mutual Information Bounds

Bibliographic details
Publication date: arXiv.org (Dec 24, 2024)
First author: El Mahdi Khribch
Other authors: Alquier, Pierre
Publisher: Cornell University Library, arXiv.org
Online access: Citation/Abstract; full text outside of ProQuest
Additional bibliographic details
Abstract: Recent advances in statistical learning theory have revealed profound connections between mutual information (MI) bounds, PAC-Bayesian theory, and Bayesian nonparametrics. This work introduces a novel mutual information bound for statistical models. The derived bound has wide-ranging applications in statistical inference: it yields improved contraction rates for fractional posteriors in Bayesian nonparametrics, and it can be used to study a broad class of estimation methods, such as variational inference or maximum likelihood estimation (MLE). By bridging these diverse areas, this work advances our understanding of the fundamental limits of statistical inference and of the role of information in learning from data. We hope that these results will not only clarify connections between statistical inference and information theory but also help to develop a new toolbox for studying a wide range of estimators.
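For orientation, the two standard objects behind the abstract's terminology can be sketched as follows. These are textbook definitions (the fractional posterior, and the classical mutual-information generalization bound of Xu and Raginsky, 2017), given here as background context; they are not the specific bound derived in the paper. Given a prior \pi, a likelihood L_n(\theta) for data x_{1:n}, and a temperature \alpha \in (0,1), the fractional posterior is

\[
\pi_{n,\alpha}(d\theta) \propto L_n(\theta)^{\alpha}\, \pi(d\theta),
\]

which reduces to the usual Bayesian posterior at \alpha = 1. The classical MI bound states that if the loss \ell(w, Z) is \sigma-subgaussian under the data distribution, then an estimator W trained on the sample S = (Z_1, \dots, Z_n) satisfies

\[
\bigl| \mathbb{E}\bigl[ R(W) - \hat{R}_n(W) \bigr] \bigr| \;\le\; \sqrt{\frac{2\sigma^2\, I(W; S)}{n}},
\]

where R and \hat{R}_n denote the population and empirical risks and I(W; S) is the mutual information between the estimator and the sample.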
ISSN:2331-8422
Source: Engineering Database