Physics-Based Neural Deformable Models

Bibliographic data
Published in: ProQuest Dissertations and Theses (2025)
Main author: Liu, Di
Published: ProQuest Dissertations & Theses
Online access: Citation/Abstract; Full Text - PDF
Description
Abstract: This thesis introduces a novel class of physics-inspired neural networks, Physics-based Neural Deformable Models (PNDMs), that integrate traditional physics-based deformable models with modern deep learning to achieve interpretable and flexible 3D shape representations. While classical deformable models offer semantic clarity through parametric primitives, they suffer from limited geometric flexibility and dependence on handcrafted initializations. PNDMs overcome these limitations by learning parameter functions that generalize primitive geometry, employing diffeomorphic mappings to preserve topology, and leveraging external forces for robust training. We extend this paradigm in DeFormer, a transformer-based framework that hierarchically disentangles global and local shape deformations, and in LEPARD, which discovers 3D articulated parts directly from 2D supervision. Finally, we demonstrate the application of our methods to photorealistic avatar reconstruction, including the LUCAS system for layered codec avatars. Together, these contributions bridge interpretable physics-based modeling and scalable neural architectures for shape abstraction, segmentation, and generation.
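The abstract's idea of "parameter functions that generalize primitive geometry" can be illustrated with a minimal sketch (not the thesis's actual formulation): a classic superquadric primitive, a standard parametric shape in physics-based deformable models, whose fixed per-axis scales are replaced by a function of the surface coordinate so the shape can bulge and taper. The `learned_scales` function below is a hand-picked stand-in for what a PNDM-style model would predict with a small network; all names here are illustrative assumptions.

```python
import numpy as np

def superquadric(eta, omega, a=(1.0, 1.0, 1.0), eps=(1.0, 1.0)):
    """Point on a superquadric surface at angles (eta, omega).

    a: per-axis scales; eps: shape exponents. With eps=(1, 1) this
    reduces to an ellipsoid, the simplest deformable-model primitive.
    """
    def f(w, e):
        # Signed power, the standard superquadric exponentiation.
        return np.sign(w) * np.abs(w) ** e

    ce, se = f(np.cos(eta), eps[0]), f(np.sin(eta), eps[0])
    co, so = f(np.cos(omega), eps[1]), f(np.sin(omega), eps[1])
    return np.array([a[0] * ce * co, a[1] * ce * so, a[2] * se])

# Hypothetical "parameter function": instead of one global scale vector,
# let the scales vary smoothly with eta (here a hand-picked function; in
# a learned model, a network would predict it), so the primitive can
# deform beyond what a fixed-parameter superquadric allows.
def learned_scales(eta):
    return (1.0 + 0.3 * np.sin(eta), 1.0, 1.0 - 0.2 * np.cos(eta))

eta, omega = 0.4, 1.1
base = superquadric(eta, omega)                       # rigid unit primitive
flex = superquadric(eta, omega, a=learned_scales(eta))  # generalized primitive
```

With unit scales and exponents of 1, `base` lies on the unit sphere; `flex` departs from it wherever the scale function differs from 1, which is the extra geometric flexibility the abstract attributes to learned parameter functions.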
ISBN: 9798293845903
Source: ProQuest Dissertations & Theses Global