Exploring differential geometry in neural implicits
@article{Novello2022ExploringDG,
  title   = {Exploring differential geometry in neural implicits},
  author  = {Tiago Novello and Guilherme Gonçalves Schardong and Luiz Schirmer and Vin{\'i}cius da Silva and H{\'e}lio Lopes and Luiz Velho},
  journal = {Comput. Graph.},
  year    = {2022},
  volume  = {108},
  pages   = {49--60}
}
4 Citations
Neural Implicit Surface Evolution using Differential Equations
- Computer Science
- 2022
This work investigates the use of smooth neural networks for modeling dynamic variations of implicit surfaces under partial differential equations (PDE). For this purpose, it extends the…
Neural Implicit Mapping via Nested Neighborhoods
- Computer Science
- 2022
Neural normal mapping transfers details from a neural SDF to a surface nested in a neighborhood of its zero-level set; it can be used to fetch smooth normals for discrete surfaces such as meshes and to skip later iterations when sphere tracing level sets.
ORCa: Glossy Objects as Radiance Field Cameras
- Computer Science
- 2022
It is shown that recovering the environment radiance enables depth and radiance estimation from the object to its surroundings, in addition to beyond-field-of-view novel-view synthesis, i.e., rendering novel views that are directly visible only to the glossy object in the scene, not to the observer.
Understanding Sinusoidal Neural Networks
- Computer Science, ArXiv
- 2022
It is proved that the composition of sinusoidal layers expands as a sum of sines consisting of a large number of new frequencies given by linear combinations of the weights of the network’s first layer.
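The sum-of-sines expansion described above can be checked numerically: composing two sinusoidal layers yields a signal whose spectrum contains only harmonics of the first layer's frequency. The following is a minimal sketch (not the paper's code) using a one-dimensional composition `sin(a * sin(w * x))` and an FFT.

```python
import numpy as np

# Hypothetical numeric check: composing two sinusoidal layers,
# g(x) = sin(a * sin(w * x)), produces a signal whose spectrum contains
# only (odd) integer multiples of the first layer's frequency w --
# the "sum of sines" expansion the paper proves in general.
w, a = 3.0, 2.0                          # first-layer frequency, second-layer amplitude
n = 4096
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
g = np.sin(a * np.sin(w * x))

spectrum = np.abs(np.fft.rfft(g)) / n    # peak at harmonic k has height ~J_k(a)
peaks = np.flatnonzero(spectrum > 1e-6)  # frequencies actually present

# Every peak sits at an odd multiple of w: 3, 9, 15, ...
assert all(k % int(w) == 0 and (k // int(w)) % 2 == 1 for k in peaks)
print(peaks)
```

By the Jacobi-Anger expansion the peak heights are Bessel-function values J_k(a), which decay quickly, so only a handful of the (infinitely many) new frequencies are numerically visible.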
References
Showing 1-10 of 45 references
Neural Implicit Surfaces in Higher Dimension
- Mathematics, Computer Science, ArXiv
- 2022
This work investigates the use of neural networks admitting high-order derivatives for modeling dynamic variations of smooth implicit surfaces. For this purpose, it extends the representation of…
Implicit Geometric Regularization for Learning Shapes
- Computer Science, ICML
- 2020
It is observed that a rather simple loss function, encouraging the neural network to vanish on the input point cloud and to have a unit norm gradient, possesses an implicit geometric regularization property that favors smooth and natural zero level set surfaces, avoiding bad zero-loss solutions.
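The two-term loss described above can be written down in a few lines. The sketch below (an illustration, not the authors' implementation; `igr_loss`, `lam`, and the finite-difference gradient are choices made here) evaluates it for an analytic signed distance function, where the Eikonal term is exactly satisfied.

```python
import numpy as np

# Minimal sketch of the loss described above: the implicit function f should
# vanish on the input point cloud, and its spatial gradient should have unit
# norm (the Eikonal term). `lam` and the finite-difference gradient are
# illustration choices, not the paper's.
def igr_loss(f, points, lam=0.1, eps=1e-4):
    on_surface = np.mean(np.abs(f(points)))          # f = 0 on the data
    # central finite differences approximate grad f at the sample points
    grads = np.stack([
        (f(points + eps * e) - f(points - eps * e)) / (2 * eps)
        for e in np.eye(3)
    ], axis=-1)
    eikonal = np.mean((np.linalg.norm(grads, axis=-1) - 1.0) ** 2)
    return on_surface + lam * eikonal

# Points on a unit sphere: the true SDF f(x) = |x| - 1 gets near-zero loss,
# while the rescaled 2*f still vanishes on the data but has gradient norm 2,
# so the Eikonal term penalizes it -- this is the regularization at work.
rng = np.random.default_rng(0)
pts = rng.normal(size=(256, 3))
pts /= np.linalg.norm(pts, axis=-1, keepdims=True)
sdf = lambda x: np.linalg.norm(x, axis=-1) - 1.0
print(igr_loss(sdf, pts))                    # ~0
print(igr_loss(lambda x: 2 * sdf(x), pts))   # ~lam: gradient norm is 2
```

In the paper the function f is a neural network and the gradient comes from automatic differentiation; the structure of the loss is the same.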
Implicit Surface Representations As Layers in Neural Networks
- Computer Science, 2019 IEEE/CVF International Conference on Computer Vision (ICCV)
- 2019
This work proposes a novel formulation that permits the use of implicit representations of curves and surfaces, of arbitrary topology, as individual layers in neural network architectures with end-to-end trainability, and proposes to represent the output as an oriented level set of a continuous and discretized embedding function.
MIP-plicits: Level of Detail Factorization of Neural Implicits Sphere Tracing
- Computer Science, ArXiv
- 2022
We introduce MIP-plicits, a novel approach for rendering 3D and 4D Neural Implicits that divides the problem into macro and meso components. We rely on the iterative nature of the sphere tracing…
Geometry Processing with Neural Fields
- Computer Science, NeurIPS
- 2021
This paper proposes the use of neural fields for geometry processing, and introduces loss functions and architectures to show that some of the most challenging geometry processing tasks, such as deformation and filtering, can be done with neural fields.
Discrete Differential-Geometry Operators for Triangulated 2-Manifolds
- Mathematics, Computer Science, VisMath
- 2002
A unified and consistent set of flexible tools to approximate important geometric attributes, including normal vectors and curvatures on arbitrary triangle meshes, using averaging Voronoi cells and the mixed Finite-Element/Finite-Volume method is proposed.
Implicit Neural Representations with Periodic Activation Functions
- Computer Science, NeurIPS
- 2020
This work proposes to leverage periodic activation functions for implicit neural representations and demonstrates that these networks, dubbed sinusoidal representation networks or Sirens, are ideally suited for representing complex natural signals and their derivatives.
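A Siren-style layer is just a linear map followed by a sine. The sketch below (an assumption-laden illustration, not the authors' implementation; `siren_layer` and `w0=30` follow the paper's description of the hidden-layer initialization) shows the forward pass in plain numpy.

```python
import numpy as np

# Minimal numpy sketch of a Siren-style layer: x -> sin(w0 * (W x + b)),
# with the uniform initialization the paper proposes (first layer:
# U(-1/fan_in, 1/fan_in); hidden layers: U(-sqrt(6/fan_in)/w0, +...)).
def siren_layer(fan_in, fan_out, w0=30.0, first=False,
                rng=np.random.default_rng(0)):
    bound = 1.0 / fan_in if first else np.sqrt(6.0 / fan_in) / w0
    W = rng.uniform(-bound, bound, size=(fan_out, fan_in))
    b = rng.uniform(-bound, bound, size=fan_out)
    return lambda x: np.sin(w0 * (x @ W.T + b))

# Two stacked layers mapping 2-D coordinates to 16 features.
layer1 = siren_layer(2, 16, first=True)
layer2 = siren_layer(16, 16)
coords = np.random.default_rng(1).uniform(-1, 1, size=(8, 2))
out = layer2(layer1(coords))
print(out.shape)   # (8, 16)
```

Because sine is smooth, all derivatives of such a network exist in closed form, which is what makes these networks attractive for the curvature and differential-geometry computations discussed in this paper.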
BSP-Net: Generating Compact Meshes via Binary Space Partitioning
- Computer Science, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2020
BSP-Net is a network that learns to represent a 3D shape via convex decomposition, which is unsupervised since no convex shape decompositions are needed for training and the reconstruction quality is competitive with state-of-the-art methods while using much fewer primitives.
Reconstruction and representation of 3D objects with radial basis functions
- Computer Science, SIGGRAPH
- 2001
It is shown that the RBF representation has advantages for mesh simplification and remeshing applications, and a greedy algorithm in the fitting process reduces the number of RBF centers required to represent a surface and results in significant compression and further computational advantages.
CvxNet: Learnable Convex Decomposition
- Computer Science, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2020
This work introduces a network architecture to represent a low dimensional family of convexes, automatically derived via an auto-encoding process, and investigates the applications including automatic convex decomposition, image to 3D reconstruction, and part-based shape retrieval.