Importance weighted variational graph autoencoder

Bibliographic Data
Published in: Complex & Intelligent Systems vol. 12, no. 1 (Jan 2026), p. 31
Main Author: Tao, Yuhao
Other Authors: Guo, Lin; Zhao, Shuchang; Zhang, Shiqing
Published:
Springer Nature B.V.
Subjects:
Online Access: Citation/Abstract
Full Text
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3278415588
003 UK-CbPIL
022 |a 2199-4536 
022 |a 2198-6053 
024 7 |a 10.1007/s40747-025-02144-9  |2 doi 
035 |a 3278415588 
045 2 |b d20260101  |b d20260131 
100 1 |a Tao, Yuhao  |u Taizhou University, Institute of Intelligent Information Processing, Taizhou, China (GRID:grid.440657.4) (ISNI:0000 0004 1762 5832) 
245 1 |a Importance weighted variational graph autoencoder 
260 |b Springer Nature B.V.  |c Jan 2026 
513 |a Journal Article 
520 3 |a The Variational Graph Autoencoder (VGAE) is a widely explored model for learning the distribution of graph data. In current VGAE-based methods, the approximate posterior distribution is overly restrictive, leaving a significant gap between the variational lower bound and the log-likelihood of the graph data and thereby limiting the expressive power of these models. To address this issue, this paper proposes the Importance Weighted Variational Graph Autoencoder (IWVGAE) and provides a theoretical justification for it. The method makes the posterior distribution more flexible through Monte Carlo sampling and assigns importance weights to the likelihood gradients during backpropagation. In this way, IWVGAE achieves a more flexible optimization objective, enabling richer latent representations of graph data: it not only attains a theoretically tighter variational lower bound but also makes graph density estimation more accurate. Extensive experiments on seven classic graph datasets show that as the number of samples drawn from the approximate posterior increases, (1) the variational lower bound continuously improves, validating the proposed theory, and (2) performance on downstream tasks improves significantly, demonstrating more effective learning and representation of graph data. 
653 |a Lower bounds 
653 |a Teaching methods 
653 |a Learning 
653 |a Graphical representations 
653 |a Recommender systems 
653 |a Graph representations 
653 |a Optimization 
653 |a Neural networks 
653 |a Back propagation 
700 1 |a Guo, Lin  |u Taizhou University, Institute of Intelligent Information Processing, Taizhou, China (GRID:grid.440657.4) (ISNI:0000 0004 1762 5832) 
700 1 |a Zhao, Shuchang  |u Taizhou University, Institute of Intelligent Information Processing, Taizhou, China (GRID:grid.440657.4) (ISNI:0000 0004 1762 5832) 
700 1 |a Zhang, Shiqing  |u Taizhou University, Institute of Intelligent Information Processing, Taizhou, China (GRID:grid.440657.4) (ISNI:0000 0004 1762 5832) 
773 0 |t Complex & Intelligent Systems  |g vol. 12, no. 1 (Jan 2026), p. 31 
786 0 |d ProQuest  |t Advanced Technologies & Aerospace Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3278415588/abstract/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text  |u https://www.proquest.com/docview/3278415588/fulltext/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3278415588/fulltextPDF/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch
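The abstract describes an importance-weighted variational bound that tightens as the number of posterior samples K grows. The record gives no code, so the following is only a minimal NumPy sketch of that general technique (the K-sample bound L_K = E[log (1/K) Σ_k p(x, z_k)/q(z_k|x)]) on a toy Gaussian model, not the paper's graph model; all names and distribution choices here are illustrative assumptions.

```python
import numpy as np

# Toy model (an assumption for illustration, NOT the paper's setup):
#   prior      p(z)   = N(0, 1)
#   likelihood p(x|z) = N(z, 1)
#   proposal   q(z|x) = N(0, 1)  -- deliberately crude, so the K=1 bound is loose
# The exact marginal is p(x) = N(0, 2), so log p(x) is known in closed form
# and we can watch the bound approach it as K grows.

def log_mean_exp(a):
    """Numerically stable log(mean(exp(a))) over a 1-D array."""
    m = a.max()
    return m + np.log(np.mean(np.exp(a - m)))

def iw_lower_bound(x, K, rng):
    """One Monte Carlo estimate of the K-sample importance-weighted bound."""
    z = rng.standard_normal(K)                                    # z_k ~ q(z|x)
    log_p_z = -0.5 * (z ** 2 + np.log(2 * np.pi))                 # log p(z_k)
    log_p_x_given_z = -0.5 * ((x - z) ** 2 + np.log(2 * np.pi))   # log p(x|z_k)
    log_q = -0.5 * (z ** 2 + np.log(2 * np.pi))                   # log q(z_k|x)
    # log of the average importance weight p(x, z_k) / q(z_k|x)
    return log_mean_exp(log_p_z + log_p_x_given_z - log_q)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = 1.0
    true_log_px = -0.5 * (x ** 2 / 2 + np.log(2 * np.pi * 2))     # log N(x; 0, 2)
    for K in (1, 10, 100):
        est = np.mean([iw_lower_bound(x, K, rng) for _ in range(2000)])
        print(f"K={K:3d}  L_K ~ {est:.3f}   (log p(x) = {true_log_px:.3f})")
```

Averaged over repeated runs, L_1 (the ordinary ELBO under this proposal) sits visibly below log p(x), while L_100 is nearly tight, which mirrors the abstract's claim that the bound improves monotonically with the number of posterior samples.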