Differentiable 3D Scene Representations With Point-Based Neural Methods

Bibliographic Details
Published in: ProQuest Dissertations and Theses (2025)
Main Author: Börcsök, Barnabás Barney
Published: ProQuest Dissertations & Theses
Subjects:
Online Access: Citation/Abstract; Full Text - PDF
Description
Abstract: This thesis explores reconstructing explicit scene geometry with geometry particles that carry local Lagrangian patches. We formulate a signed distance field as a weighted sum of moving basis functions and describe an optimization framework for fitting target shapes in both 2D and 3D. Experiments on canonical geometry meshes show that, with a modest number of particles, our approach captures coarse geometric structure while providing intuitive control and interpretable local geometry images in a storage-efficient representation. Although these preliminary results do not yet match state-of-the-art accuracy, they highlight the promise of a particle-based, differentiable explicit representation and point to workflow improvements ranging from digital sculpting to generative modeling. We conclude by discussing avenues for further research on particle placement, blending strategies, and interactive editing.
ISBN: 9798263351267
Source: ProQuest Dissertations & Theses Global
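
The abstract above describes the core technique: a signed distance field expressed as a weighted sum of basis functions carried by moving particles, fitted to target shapes through differentiable optimization. The sketch below is only a minimal 2D illustration of that idea, not the thesis implementation; the Gaussian basis, the unit-circle target, the optimizer, and every hyperparameter are assumptions made for the example.

```python
# Minimal sketch, assuming Gaussian radial bases and a circle target SDF
# (illustrative assumptions, not the thesis code).
import jax
import jax.numpy as jnp

def sdf(params, points):
    """Evaluate the particle-based SDF at query points of shape (N, 2)."""
    centers, weights, scales = params  # (P, 2), (P,), (P,)
    d2 = jnp.sum((points[:, None, :] - centers[None, :, :]) ** 2, axis=-1)  # (N, P)
    basis = jnp.exp(-d2 / (scales[None, :] ** 2))  # assumed Gaussian basis per particle
    return basis @ weights                          # weighted sum over particles

def target_sdf(points):
    """Illustrative target shape: signed distance to the unit circle."""
    return jnp.linalg.norm(points, axis=-1) - 1.0

def loss(params, points):
    return jnp.mean((sdf(params, points) - target_sdf(points)) ** 2)

key = jax.random.PRNGKey(0)
k_init, k_sample = jax.random.split(key)
P = 32  # a modest number of particles, as in the abstract
params = (
    jax.random.uniform(k_init, (P, 2), minval=-1.5, maxval=1.5),  # particle centers
    jnp.zeros(P),                                                 # per-particle weights
    jnp.full(P, 0.5),                                             # per-particle support radii
)

grad_fn = jax.jit(jax.value_and_grad(loss))
lr = 0.1
for step in range(500):
    # Sample fresh query points each step and take a gradient descent step
    # on centers, weights, and radii jointly.
    pts = jax.random.uniform(jax.random.fold_in(k_sample, step), (256, 2),
                             minval=-1.5, maxval=1.5)
    val, grads = grad_fn(params, pts)
    params = tuple(p - lr * g for p, g in zip(params, grads))

print("final fitting loss:", float(val))
```

Because the loss is differentiable with respect to the particle centers as well as the weights, gradient descent relocates the particles during fitting, loosely mirroring the "moving basis functions" mentioned in the abstract.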