Entropic Statistics: Concept, Estimation, and Application in Machine Learning and Knowledge Extraction

Bibliographic Details
Published in: Machine Learning and Knowledge Extraction vol. 4, no. 4 (2022), p. 865
Main Author: Zhang, Jialin
Published:
MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 2756739265
003 UK-CbPIL
022 |a 2504-4990 
024 7 |a 10.3390/make4040044  |2 doi 
035 |a 2756739265 
045 2 |b d20220101  |b d20221231 
100 1 |a Zhang, Jialin 
245 1 |a Entropic Statistics: Concept, Estimation, and Application in Machine Learning and Knowledge Extraction 
260 |b MDPI AG  |c 2022 
513 |a Journal Article 
520 3 |a Demand for machine learning and knowledge extraction methods has been booming due to the unprecedented surge in data volume and data quality. Nevertheless, challenges arise amid the emerging data complexity, as significant chunks of information and knowledge lie within the non-ordinal realm of data. To address these challenges, researchers have developed numerous machine learning and knowledge extraction methods for various domain-specific problems. To characterize and extract information from non-ordinal data, these methods all draw on information theory, the field established by Shannon’s landmark 1948 paper. This article reviews recent developments in entropic statistics, including estimation of Shannon’s entropy and its functionals (such as mutual information and Kullback–Leibler divergence), the concepts of entropic basis and generalized Shannon’s entropy (and its functionals), and their estimation and potential applications in machine learning and knowledge extraction. With knowledge of these recent developments in entropic statistics, researchers can customize existing machine learning and knowledge extraction methods for better performance or develop new approaches to address emerging domain-specific challenges. 
653 |a Machine learning 
653 |a Datasets 
653 |a Hypothesis testing 
653 |a Regression analysis 
653 |a Random variables 
653 |a Entropy (Information theory) 
653 |a Algorithms 
653 |a Information theory 
653 |a Genes 
653 |a Statistics 
653 |a Probability distribution 
653 |a Entropy 
653 |a Statistical methods 
653 |a Variance analysis 
653 |a Bias 
653 |a Nonparametric statistics 
773 0 |t Machine Learning and Knowledge Extraction  |g vol. 4, no. 4 (2022), p. 865 
786 0 |d ProQuest  |t Advanced Technologies & Aerospace Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/2756739265/abstract/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text  |u https://www.proquest.com/docview/2756739265/fulltext/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/2756739265/fulltextPDF/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch