Rethinking Mean Square Error: Why Information is a Superior Assessment of Estimators

Bibliographic Details
Published in: arXiv.org (Dec 11, 2024), p. n/a
Lead author: Vos, Paul
Publisher:
Cornell University Library, arXiv.org
Subjects:
Description
Abstract: James-Stein (JS) estimators have been described as showing the inadequacy of maximum likelihood estimation when assessed using mean square error (MSE). We claim the problem is not with maximum likelihood (ML) but with MSE. When MSE is replaced with a measure \(\Lambda\) of the information utilized by a statistic, likelihood-based methods are superior. The information measure \(\Lambda\) describes not just point estimators but extends to Fisher's view of estimation, so that we reconsider not only how estimators are assessed but also how we define an estimator. Fisher information and his views on the role of parameters, the interpretation of probability, and the logic of statistical inference fit well with \(\Lambda\) as a measure of information.
ISSN: 2331-8422
Source: Engineering Database
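
The abstract refers to the classical result that, under MSE, the James-Stein estimator dominates the ML estimator (the observation itself) for three or more normal means. The following is a minimal Monte Carlo sketch of that baseline comparison only; it is not from the paper, and the information measure \(\Lambda\) the authors propose is not reproduced here.

```python
# Sketch: compare MSE of the ML estimator (X itself) and the James-Stein
# estimator for p normal means with identity covariance. At theta = 0 the
# theoretical risks are p for ML and 2 for JS, illustrating the dominance
# result the abstract refers to. Not an implementation of the paper's Lambda.
import numpy as np

rng = np.random.default_rng(0)
p = 10                      # dimension; JS dominance requires p >= 3
theta = np.zeros(p)         # true mean vector (shrinkage gains largest near 0)
n_rep = 100_000             # Monte Carlo replications

# One observation X ~ N(theta, I_p) per replication
X = rng.normal(loc=theta, scale=1.0, size=(n_rep, p))

ml = X                                                   # ML estimate is X itself
shrink = 1.0 - (p - 2) / np.sum(X**2, axis=1, keepdims=True)
js = shrink * X                                          # James-Stein shrinkage toward 0

mse_ml = np.mean(np.sum((ml - theta) ** 2, axis=1))      # estimate of E||hat(theta) - theta||^2
mse_js = np.mean(np.sum((js - theta) ** 2, axis=1))

print(f"MSE(ML) ~= {mse_ml:.3f}  (theory: p = {p})")
print(f"MSE(JS) ~= {mse_js:.3f}  (theory: 2 at theta = 0)")
```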