# Generalised Implicit Neural Representations

```bibtex
@article{Grattarola2022GeneralisedIN,
  title   = {Generalised Implicit Neural Representations},
  author  = {Daniele Grattarola and Pierre Vandergheynst},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2205.15674}
}
```

We consider the problem of learning implicit neural representations (INRs) for signals on non-Euclidean domains. In the Euclidean case, INRs are trained on a discrete sampling of a signal over a regular lattice. Here, we assume that the continuous signal exists on some unknown topological space from which we sample a discrete graph. In the absence of a coordinate system to identify the sampled nodes, we propose approximating their location with a spectral embedding of the graph. This allows us…
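
The spectral embedding idea from the abstract can be sketched as follows. This is a minimal illustration (the graph, `k`, and the function name are illustrative, not the paper's code): the first few non-trivial eigenvectors of the graph Laplacian serve as surrogate coordinates for the sampled nodes.

```python
import numpy as np

def spectral_embedding(adj, k):
    """Return an (n, k) node embedding from the graph Laplacian's eigenvectors."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                    # combinatorial Laplacian L = D - A
    vals, vecs = np.linalg.eigh(lap)   # eigenvalues in ascending order
    # Skip the constant eigenvector associated with eigenvalue 0.
    return vecs[:, 1:k + 1]

# A 4-cycle graph: nodes 0-1-2-3-0.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
coords = spectral_embedding(adj, k=2)
print(coords.shape)  # (4, 2)
```

Each row of `coords` can then play the role that lattice coordinates play for a Euclidean INR: the input to the network for that node.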

## References

Showing 1-10 of 64 references.

### Seeing Implicit Neural Representations as Fourier Series

- Computer Science
- 2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
- 2022

This work analyzes the connection between the two methods, showing that a Fourier-mapped perceptron is structurally similar to a one-hidden-layer SIREN. It also identifies the relationship between the previously proposed Fourier mapping and the general d-dimensional Fourier series, leading to an integer lattice mapping.

### Geometric Deep Learning: Going beyond Euclidean data

- Computer Science
- IEEE Signal Processing Magazine
- 2017

Deep neural networks are used to solve a broad range of problems in computer vision, natural-language processing, and audio analysis, where the invariances of the underlying data structures are built into the networks used to model them.

### Implicit Geometric Regularization for Learning Shapes

- Computer Science
- ICML
- 2020

It is observed that a rather simple loss function, encouraging the neural network to vanish on the input point cloud and to have a unit-norm gradient, possesses an implicit geometric regularization property that favors smooth and natural zero-level-set surfaces and avoids bad zero-loss solutions.
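
The loss described above can be sketched numerically; this is an illustrative sketch (names assumed, not the paper's code), evaluated on a closed-form function where both terms are exactly satisfied.

```python
import numpy as np

def igr_loss(f, grad_f, points, lam=0.1):
    """Data term: f should vanish on the point cloud.
    Eikonal term: the gradient of f should have unit norm."""
    data_term = np.abs(f(points)).mean()
    grad_norms = np.linalg.norm(grad_f(points), axis=-1)
    eikonal_term = ((grad_norms - 1.0) ** 2).mean()
    return data_term + lam * eikonal_term

# The signed distance to the unit sphere satisfies both properties exactly.
f = lambda x: np.linalg.norm(x, axis=-1) - 1.0
grad_f = lambda x: x / np.linalg.norm(x, axis=-1, keepdims=True)

surface = np.random.randn(256, 3)
surface /= np.linalg.norm(surface, axis=-1, keepdims=True)  # points on the sphere
print(igr_loss(f, grad_f, surface))  # 0.0
```

In practice `f` is a neural network and `grad_f` is obtained by automatic differentiation; a true signed distance function drives both terms to zero, which is why the loss favors distance-like solutions.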

### Phase Transitions, Distance Functions, and Implicit Neural Representations

- Computer Science
- ICML
- 2021

Drawing inspiration from the theory of phase transitions of fluids, a loss for training INRs is suggested that learns a density function converging to a proper occupancy function, while its log transform converges to a distance function.

### Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains

- Computer Science
- NeurIPS
- 2020

An approach for selecting problem-specific Fourier features is suggested that greatly improves the performance of MLPs on low-dimensional regression tasks relevant to the computer vision and graphics communities.
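
The Fourier feature mapping discussed above can be sketched as follows (a minimal illustration with assumed names, not the paper's code): inputs are projected through a random Gaussian matrix and passed through sinusoids before reaching the MLP.

```python
import numpy as np

def fourier_features(x, B):
    """Map (n, d) inputs to (n, 2m) features [cos(2*pi*Bx), sin(2*pi*Bx)]."""
    proj = 2.0 * np.pi * x @ B.T
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
B = rng.normal(scale=10.0, size=(256, 2))  # scale controls frequency bandwidth
x = rng.uniform(size=(4, 2))               # e.g. 2-D pixel coordinates
feats = fourier_features(x, B)
print(feats.shape)  # (4, 512)
```

The standard deviation of `B` acts as a bandwidth knob: larger scales let the downstream MLP fit higher-frequency content.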

### Transferability of Spectral Graph Convolutional Neural Networks

- Computer Science
- J. Mach. Learn. Res.
- 2021

It is shown that if two graphs discretize the same continuous metric space, then a spectral filter or ConvNet has approximately the same effect on both graphs, an analysis more permissive than the standard one.

### Sign and Basis Invariant Networks for Spectral Graph Representation Learning

- Computer Science
- ArXiv
- 2022

SignNet and BasisNet are introduced: new neural architectures that are invariant to all requisite symmetries, process collections of eigenspaces in a principled manner, and can approximate any continuous function of eigenvectors with the proper invariances.
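
The key sign symmetry handled above is simple to illustrate (this is an assumed toy construction, not the SignNet architecture): an eigenvector `v` and its negation `-v` describe the same eigenspace, so a network can process `phi(v) + phi(-v)`, which is unchanged under a sign flip.

```python
import numpy as np

def sign_invariant(phi, v):
    """Symmetrize an arbitrary map phi so its output is invariant to v -> -v."""
    return phi(v) + phi(-v)

phi = lambda v: np.tanh(3.0 * v + 0.5)  # any fixed nonlinear map
v = np.random.randn(8)
a = sign_invariant(phi, v)
b = sign_invariant(phi, -v)
print(np.allclose(a, b))  # True
```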

### Laplacian Eigenmaps for Dimensionality Reduction and Data Representation

- Computer Science
- Neural Computation
- 2003

This work proposes a geometrically motivated algorithm for representing high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction, with locality-preserving properties and a natural connection to clustering.

### Intrinsic Neural Fields: Learning Functions on Manifolds

- Computer Science
- ECCV
- 2022

Intrinsic neural fields can reconstruct high-fidelity textures from images with state-of-the-art quality and are robust to the discretization of the underlying manifold. The versatility of intrinsic neural fields is demonstrated on various applications: texture transfer between deformed and different shapes, texture reconstruction from real-world images with view dependence, and discretization-agnostic learning on meshes and point clouds.

### Vector Neurons: A General Framework for SO(3)-Equivariant Networks

- Computer Science
- 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
- 2021

Invariance and equivariance to the rotation group have been widely discussed in the 3D deep learning community for point clouds. Yet most proposed methods either use complex mathematical tools that…