Selective Multistart Optimization Based on Adaptive Latin Hypercube Sampling and Interval Enclosures

Bibliographic Details
Published in: Mathematics vol. 13, no. 11 (2025), p. 1733
Main Author: Nikas, Ioannis A
Other Authors: Georgopoulos, Vasileios P, Loukopoulos, Vasileios C
Published: MDPI AG
Subjects:
Online Access: Citation/Abstract
Full Text + Graphics
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3217737877
003 UK-CbPIL
022 |a 2227-7390 
024 7 |a 10.3390/math13111733  |2 doi 
035 |a 3217737877 
045 2 |b d20250101  |b d20251231 
084 |a 231533  |2 nlm 
100 1 |a Nikas, Ioannis A  |u Department of Tourism Management, University of Patras, GR 26334 Patras, Greece 
245 1 |a Selective Multistart Optimization Based on Adaptive Latin Hypercube Sampling and Interval Enclosures 
260 |b MDPI AG  |c 2025 
513 |a Journal Article 
520 3 |a Solving global optimization problems is a significant challenge, particularly in high-dimensional spaces. This paper proposes a selective multistart optimization framework that employs a modified Latin Hypercube Sampling (LHS) technique to maintain a constant search space coverage rate, alongside Interval Arithmetic (IA) to prioritize sampling points. The proposed methodology addresses key limitations of conventional multistart methods, such as the exponential decline in space coverage with increasing dimensionality. It prioritizes sampling points by leveraging the hypercubes generated through LHS and their corresponding interval enclosures, guiding the optimization process toward regions more likely to contain the global minimum. Unlike conventional multistart methods, which assume uniform sampling without quantifying spatial coverage, the proposed approach constructs interval enclosures around each sample point, enabling explicit estimation and control of the explored search space. Numerical experiments on well-known benchmark functions demonstrate improvements in space coverage efficiency and enhanced local/global minimum identification. The proposed framework offers a promising approach for large-scale optimization problems frequently encountered in machine learning, artificial intelligence, and data-intensive domains. 
653 |a Big Data 
653 |a Machine learning 
653 |a Deep learning 
653 |a Interval arithmetic 
653 |a Artificial intelligence 
653 |a Adaptive sampling 
653 |a Optimization techniques 
653 |a Global optimization 
653 |a Hypercubes 
653 |a Methods 
653 |a Algorithms 
653 |a Enclosures 
653 |a Efficiency 
653 |a Latin hypercube sampling 
700 1 |a Georgopoulos, Vasileios P  |u Department of Physics, University of Patras, GR 26504 Rion, Greece; vasileios.georgopoulos@upnet.gr (V.P.G.); vxloukop@upatras.gr (V.C.L.) 
700 1 |a Loukopoulos, Vasileios C  |u Department of Physics, University of Patras, GR 26504 Rion, Greece; vasileios.georgopoulos@upnet.gr (V.P.G.); vxloukop@upatras.gr (V.C.L.) 
773 0 |t Mathematics  |g vol. 13, no. 11 (2025), p. 1733 
786 0 |d ProQuest  |t Engineering Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3217737877/abstract/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text + Graphics  |u https://www.proquest.com/docview/3217737877/fulltextwithgraphics/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3217737877/fulltextPDF/embedded/7BTGNMKEMPT1V9Z2?source=fedsrch
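
The abstract above (MARC field 520) outlines a selective multistart scheme: Latin Hypercube Sampling generates candidate starting points, interval enclosures built around those points bound the objective over each cell, and local searches are launched only from the most promising cells. The Python sketch below is purely illustrative and not the authors' algorithm: the objective (a sphere benchmark), the enclosure (a box of one stratum's width centred on each sample point), the ranking rule (smallest interval lower bound first), and all parameter values are assumptions made for the example.

```python
# Illustrative sketch of a selective multistart loop guided by interval
# lower bounds; details are assumptions, not the published method.
import numpy as np
from scipy.optimize import minimize


def sphere(x):
    """Benchmark objective f(x) = sum(x_i^2); global minimum 0 at the origin."""
    return float(np.sum(np.asarray(x) ** 2))


def sphere_interval_lower_bound(lo, hi):
    """Lower bound of the natural interval extension of the sphere function
    over the box [lo, hi]; exact for this separable objective."""
    contains_zero = (lo <= 0.0) & (hi >= 0.0)
    per_dim = np.where(contains_zero, 0.0, np.minimum(lo ** 2, hi ** 2))
    return float(np.sum(per_dim))


def latin_hypercube(n_points, bounds, rng):
    """Plain LHS: one uniformly jittered point per stratum in every dimension."""
    pts = np.empty((n_points, len(bounds)))
    for j, (a, b) in enumerate(bounds):
        strata = (rng.permutation(n_points) + rng.random(n_points)) / n_points
        pts[:, j] = a + strata * (b - a)
    return pts


def selective_multistart(bounds, n_points=64, n_starts=8, seed=0):
    """Rank LHS points by an interval lower bound of the objective over a box
    around each point, then run local searches only from the best-ranked ones."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    pts = latin_hypercube(n_points, bounds, rng)

    # Approximate enclosure: a box of one stratum's width centred on each point.
    half_width = (bounds[:, 1] - bounds[:, 0]) / (2.0 * n_points)
    lower = np.array([sphere_interval_lower_bound(p - half_width, p + half_width)
                      for p in pts])

    best = None
    for idx in np.argsort(lower)[:n_starts]:  # most promising cells first
        res = minimize(sphere, pts[idx], method="L-BFGS-B",
                       bounds=list(map(tuple, bounds)))
        if best is None or res.fun < best.fun:
            best = res
    return best


if __name__ == "__main__":
    result = selective_multistart([(-5.0, 5.0)] * 10)
    print("best point:", result.x, "value:", result.fun)
```

Ranking cells by the smallest interval lower bound mirrors the prioritization idea described in the abstract; the paper itself derives the enclosures from the LHS hypercubes and additionally keeps the search space coverage rate constant as the dimension grows, which this sketch does not attempt to reproduce.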