Improving Pareto Set Learning for Expensive Multi-objective Optimization via Stein Variational Hypernetworks

Saved in:
Bibliographic record details
Published in: arXiv.org (Dec 23, 2024), p. n/a
Main author: Minh-Duc Nguyen
Other authors: Phuong Mai Dinh; Nguyen, Quang-Huy; Hoang, Long P.; Le, Dung D.
Publisher:
Cornell University Library, arXiv.org
Subjects: Pareto optimization; Gaussian process; Solution space; Multiple objective analysis; Learning; Kernel functions; Clustering; Optimization
Available Online: Citation/Abstract
Full text outside of ProQuest

MARC

LEADER 00000nab a2200000uu 4500
001 3148977526
003 UK-CbPIL
022 |a 2331-8422 
035 |a 3148977526 
045 0 |b d20241223 
100 1 |a Minh-Duc Nguyen 
245 1 |a Improving Pareto Set Learning for Expensive Multi-objective Optimization via Stein Variational Hypernetworks 
260 |b Cornell University Library, arXiv.org  |c Dec 23, 2024 
513 |a Working Paper 
520 3 |a Expensive multi-objective optimization problems (EMOPs) are common in real-world scenarios where evaluating objective functions is costly and involves extensive computations or physical experiments. Current Pareto set learning methods for such problems often rely on surrogate models like Gaussian processes to approximate the objective functions. These surrogate models can become fragmented, resulting in numerous small uncertain regions between explored solutions. When using acquisition functions such as the Lower Confidence Bound (LCB), these uncertain regions can turn into pseudo-local optima, complicating the search for globally optimal solutions. To address these challenges, we propose a novel approach called SVH-PSL, which integrates Stein Variational Gradient Descent (SVGD) with Hypernetworks for efficient Pareto set learning. Our method addresses the issues of fragmented surrogate models and pseudo-local optima by collectively moving particles in a manner that smooths out the solution space. The particles interact with each other through a kernel function, which helps maintain diversity and encourages the exploration of underexplored regions. This kernel-based interaction prevents particles from clustering around pseudo-local optima and promotes convergence towards globally optimal solutions. Our approach aims to establish robust relationships between trade-off reference vectors and their corresponding true Pareto solutions, overcoming the limitations of existing methods. Through extensive experiments across both synthetic and real-world MOO benchmarks, we demonstrate that SVH-PSL significantly improves the quality of the learned Pareto set, offering a promising solution for expensive multi-objective optimization problems. 
653 |a Pareto optimization 
653 |a Gaussian process 
653 |a Solution space 
653 |a Multiple objective analysis 
653 |a Learning 
653 |a Kernel functions 
653 |a Clustering 
653 |a Optimization 
700 1 |a Phuong Mai Dinh 
700 1 |a Nguyen, Quang-Huy 
700 1 |a Hoang, Long P 
700 1 |a Le, Dung D 
773 0 |t arXiv.org  |g (Dec 23, 2024), p. n/a 
786 0 |d ProQuest  |t Engineering Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3148977526/abstract/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full text outside of ProQuest  |u http://arxiv.org/abs/2412.17312
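The abstract (field 520) describes particles that interact through a kernel function so that they stay diverse and avoid clustering around pseudo-local optima. That mechanism is the core of Stein Variational Gradient Descent. The following is a minimal, self-contained sketch of a vanilla SVGD update, not the paper's SVH-PSL method; the RBF kernel, the fixed bandwidth `h`, and the toy Gaussian target are illustrative assumptions.

```python
import numpy as np

def svgd_step(X, score_fn, step=0.1, h=1.0):
    """One SVGD update with an RBF kernel: each particle follows the
    kernel-smoothed score of the target, plus a repulsive term
    (the kernel gradient) that keeps the particle set spread out."""
    diff = X[:, None, :] - X[None, :, :]          # pairwise differences x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # RBF kernel matrix
    repulsion = (2.0 / h) * np.einsum('ij,ijk->ik', K, diff)
    phi = (K @ score_fn(X) + repulsion) / X.shape[0]
    return X + step * phi

# Toy target: a standard 2-D Gaussian, whose score is simply -x.
score = lambda X: -X
rng = np.random.default_rng(0)
X0 = rng.normal(size=(50, 2)) * 3.0 + 5.0  # particles start far from the target
X = X0.copy()
for _ in range(200):
    X = svgd_step(X, score, step=0.3)
# The particle cloud drifts toward the origin while the repulsive
# term prevents it from collapsing onto a single mode.
```

In SVH-PSL this kind of kernel-mediated update is applied to hypernetwork-generated Pareto solutions rather than to a toy Gaussian, which is what smooths the fragmented surrogate landscape described in the abstract.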