Unified, Multi-Scale Scene Representations for Scalable Physically Based Rendering
| Published in: | ProQuest Dissertations and Theses (2025) |
|---|---|
| Main Author: | |
| Published: | ProQuest Dissertations & Theses |
| Subjects: | |
| Online Access: | Citation/Abstract Full Text - PDF |
| Abstract: |
Physically based rendering is the computational process of generating realistic images of 3D scenes. As the bar for realism keeps rising, physically based rendering must scale to the ever-growing scene complexity demanded by modern applications. Scene representation is a fundamental building block of rendering that determines both the diversity of content and the efficiency of rendering algorithms. We claim that an ideal scene representation should be unified and multi-scale, properties that mainstream representations consisting of meshes and textures do not satisfy. This dissertation bridges the gap by developing novel scene representations suitable for rendering.

First, we present a novel volumetric representation for complex scene aggregation and level-of-detail (LoD) rendering. The core of our representation is the Aggregated Bidirectional Scattering Distribution Function (ABSDF), which summarizes the far-field appearance of all surfaces inside a voxel. We propose a closed-form factorization of the ABSDF that accounts for spatially varying and orientation-varying material parameters, and we tackle the challenge of capturing the correlation that exists locally within a voxel and globally across different parts of the scene. Our method faithfully reproduces appearance and achieves higher quality than existing scene-filtering methods, while its memory footprint and rendering cost are decoupled from the original scene complexity.

Second, we propose a general-purpose rendering primitive based on the 3D Gaussian distribution. The primitive supports versatile appearance ranging from glossy surfaces to fuzzy elements, as well as physically based scattering to enable accurate global illumination. We formulate a rendering theory for the primitive based on non-exponential transport and derive efficient rendering operations compatible with Monte Carlo path tracing. The new representation can be converted from different sources, including meshes and 3D Gaussian splatting, and, thanks to its differentiability, further refined via transmittance optimization. We demonstrate the versatility of our representation in rendering applications such as global illumination and appearance editing, while naturally supporting arbitrary lighting conditions, and we compare it to existing volumetric representations, highlighting its efficiency in reproducing detail.

A complementary theme of this dissertation is new ways to solve rendering as a high-dimensional integration problem. Standard Monte Carlo integration, while versatile, introduces noise and requires many samples for convergence. In contrast, our scene aggregation method prefilters the integral before rendering, and our Gaussian rendering primitive exploits analytic mathematical operations. We continue to demonstrate the usefulness of specialized analytic solutions with vectorization, an analytic method for computing and differentiating visibility. Our method analytically solves the 2D point-to-region visibility problem by dynamically constructing a VBVH structure that maintains all visible regions, which leads to rendering applications such as anti-aliased visibility, analytic shading, and soft shadows. More importantly, it allows us to effortlessly differentiate visibility with respect to any parameter, previously considered difficult, simply by using automatic differentiation. Compared to methods based on Monte Carlo sampling, our method generates noise-free gradients, which can be readily used in inverse rendering and enable second-order optimization techniques for the first time.
|
| ISBN: | 9798314846339 |
| Source: | Publicly Available Content Database |
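The abstract contrasts standard Monte Carlo integration, whose estimates are noisy and converge slowly, with analytic and prefiltered alternatives. As a minimal illustration of that background point (not code from the dissertation), the sketch below estimates the integral of sin(x) over [0, π], whose exact value is 2, by uniform random sampling; the absolute error shrinks roughly as 1/√N, which is why noise-free analytic solutions are attractive when they exist:

```python
import math
import random

def mc_estimate(f, a, b, n, rng):
    """Monte Carlo estimate of the integral of f over [a, b] with n uniform samples."""
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

if __name__ == "__main__":
    rng = random.Random(0)  # fixed seed for reproducibility
    exact = 2.0  # integral of sin(x) over [0, pi]
    for n in (10, 100, 10000):
        est = mc_estimate(math.sin, 0.0, math.pi, n, rng)
        print(f"N = {n:>6}  estimate = {est:.4f}  |error| = {abs(est - exact):.4f}")
```

Each 100x increase in sample count buys only about a 10x reduction in expected error, so a prefiltered or closed-form evaluation of the same integral removes both the noise and the sample budget entirely.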