Corpus ID: 233481308

Deep Convolution for Irregularly Sampled Temporal Point Clouds

@article{Merrill2021DeepCF,
  title={Deep Convolution for Irregularly Sampled Temporal Point Clouds},
  author={Erich Merrill and Stefan Lee and Li Fuxin and Thomas G. Dietterich and Alan Fern},
  journal={ArXiv},
  year={2021},
  volume={abs/2105.00137}
}
We consider the problem of modeling the dynamics of continuous spatial-temporal processes represented by irregular samples through both space and time. Such processes occur in sensor networks, citizen science, multi-robot systems, and many others. We propose a new deep model that is able to directly learn and predict over this irregularly sampled data, without voxelization, by leveraging a recent convolutional architecture for static point clouds. The model also easily incorporates the notion… 
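
To make the idea concrete, here is a minimal sketch (not the authors' code) of treating irregular spatio-temporal samples as a point cloud of (t, x, y) coordinates with attached feature values and applying a PointConv-style continuous convolution at an arbitrary query location; the neighborhood radius, feature sizes, and the tiny weight-generating MLP are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Irregular spatio-temporal samples: each row is (t, x, y) plus a scalar observation.
coords = rng.uniform(0.0, 1.0, size=(200, 3))                  # (N, 3) -> (t, x, y)
feats = np.sin(coords @ np.array([4.0, 2.0, 2.0]))[:, None]    # (N, 1) toy signal

# Hypothetical weight-generating MLP: maps a relative offset (dt, dx, dy) to C_in*C_out weights.
W1, b1 = rng.normal(size=(3, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 1 * 8)), np.zeros(1 * 8)

def weight_net(offsets):
    """Continuous kernel: offsets (M, 3) -> per-neighbor weights (M, C_in=1, C_out=8)."""
    h = np.maximum(offsets @ W1 + b1, 0.0)            # ReLU hidden layer
    return (h @ W2 + b2).reshape(-1, 1, 8)

def continuous_conv(query, coords, feats, radius=0.15):
    """PointConv-style convolution at one (t, x, y) query point."""
    offsets = coords - query                           # relative spatio-temporal offsets
    mask = np.linalg.norm(offsets, axis=1) < radius    # neighborhood in space-time
    if not mask.any():
        return np.zeros(8)
    w = weight_net(offsets[mask])                      # (M, 1, 8) kernel values
    # Sum over neighbors of (feature x kernel) -> C_out features for the query point.
    return np.einsum('mi,mio->o', feats[mask], w) / mask.sum()

query = np.array([0.5, 0.5, 0.5])                      # predict at an arbitrary space-time location
print(continuous_conv(query, coords, feats).shape)     # (8,)
```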

Citations

Unsupervised Hebbian Learning on Point Sets in StarCraft II

A novel Hebbian learning method is proposed to extract a global feature of point sets representing StarCraft II game units and to predict the movement of those points; the work also introduces neuron-activity-aware learning combined with k-Winner-Takes-All.
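
As a generic illustration of the two ingredients named above (not the cited paper's exact rule), the sketch below applies a Hebbian update with weight decay to only the k most active neurons; the neuron count, learning rate, and k are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def kwta(activations, k):
    """k-Winner-Takes-All: keep the k largest activations, zero the rest."""
    out = np.zeros_like(activations)
    winners = np.argsort(activations)[-k:]
    out[winners] = activations[winners]
    return out

# Toy "point set" features (e.g. unit positions) and a bank of 32 neurons.
X = rng.normal(size=(500, 4))            # 500 points, 4 input features
W = rng.normal(size=(32, 4)) * 0.1       # neuron weights
lr, k = 0.05, 4

for x in X:
    y = kwta(W @ x, k)                   # only the k most active neurons respond
    # Hebbian update with decay (Oja-style), applied to the winning neurons only.
    W += lr * np.outer(y, x) - lr * (y ** 2)[:, None] * W

print(np.round(np.linalg.norm(W, axis=1)[:5], 3))  # weight norms stay bounded
```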

References

SHOWING 1-10 OF 25 REFERENCES

4D Spatio-Temporal ConvNets: Minkowski Convolutional Neural Networks

This work introduces an open-source auto-differentiation library for sparse tensors that provides extensive functionality for high-dimensional convolutional neural networks. It also proposes the hybrid kernel, a special case of the generalized sparse convolution, and trilateral-stationary conditional random fields that enforce spatio-temporal consistency in the 7D space-time-chroma space.
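
A minimal sketch of the generalized sparse convolution idea, using a plain dictionary from integer 4D coordinates to features (this is not the library's API; coordinate ranges, channel sizes, and kernel extent are hypothetical). Computation happens only at occupied coordinates, so empty space is skipped entirely.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Sparse 4D tensor: a dict from integer (x, y, z, t) coordinates to feature vectors.
coords = {tuple(c) for c in rng.integers(0, 8, size=(50, 4))}
features = {c: rng.normal(size=3) for c in coords}            # C_in = 3

# One weight matrix per kernel offset in a 3^4 spatio-temporal neighborhood.
offsets = list(product([-1, 0, 1], repeat=4))
kernel = {o: rng.normal(size=(3, 2)) * 0.1 for o in offsets}  # C_in=3 -> C_out=2

def sparse_conv(features, kernel):
    """Generalized sparse convolution: outputs only at occupied coordinates."""
    out = {}
    for c, f in features.items():
        acc = np.zeros(2)
        for o, W in kernel.items():
            nb = tuple(np.add(c, o))
            if nb in features:               # skip empty space entirely
                acc += features[nb] @ W
        out[c] = acc
    return out

out = sparse_conv(features, kernel)
print(len(out), next(iter(out.values())).shape)   # occupied sites, each with 2 output channels
```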

PointConv: Deep Convolutional Networks on 3D Point Clouds

The dynamic filter is extended to a new convolution operation, named PointConv, which can be applied to point clouds to build deep convolutional networks and achieves state-of-the-art results on challenging 3D point cloud semantic segmentation benchmarks.
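
Beyond the learned continuous kernel (sketched after the abstract above), PointConv rescales contributions by an inverse density estimate so that densely sampled regions do not dominate the sum. A minimal sketch of that density term follows; the Gaussian kernel-density estimate and bandwidth are illustrative choices, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.uniform(size=(300, 3))                     # a static 3D point cloud

def inverse_density(points, bandwidth=0.1):
    """Kernel-density estimate per point, inverted so dense regions get smaller weights."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    density = np.exp(-d2 / (2 * bandwidth ** 2)).mean(axis=1)
    return 1.0 / density

s = inverse_density(pts)
print(s.min().round(3), s.max().round(3))  # sparsely sampled points receive larger scale factors
```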

Set Functions for Time Series

This paper proposes a novel approach for classifying irregularly sampled time series with unaligned measurements that focuses on high scalability and data efficiency; it builds on recent advances in differentiable set function learning and is highly parallelizable with a favorable memory footprint.
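
The set-function view can be sketched as follows (assumptions: the tiny per-observation encoder and mean pooling below are stand-ins; the original work uses an attention-based aggregation). Each observation becomes a (time, value, channel) triple, each triple is embedded independently, and a permutation-invariant aggregation yields a fixed-size representation.

```python
import numpy as np

rng = np.random.default_rng(0)

# An irregular series as a set of (time, value, channel-id) observation triples.
obs = [(0.1, 0.7, 0), (0.13, 36.5, 2), (0.4, 0.65, 0), (0.9, 80.0, 1)]

D = 3          # number of channels
W1, b1 = rng.normal(size=(2 + D, 16)), np.zeros(16)   # per-observation encoder
W2, b2 = rng.normal(size=(16, 8)), np.zeros(8)

def encode(t, v, c):
    """Embed one observation; the channel enters as a one-hot indicator."""
    onehot = np.eye(D)[c]
    h = np.maximum(np.concatenate([[t, v], onehot]) @ W1 + b1, 0.0)
    return h @ W2 + b2

# Permutation-invariant set aggregation (mean here; attention pooling in the paper).
z = np.mean([encode(*o) for o in obs], axis=0)
print(z.shape)   # (8,) fixed-size representation, regardless of how many samples arrived
```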

PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space

A hierarchical neural network is presented that applies PointNet recursively on a nested partitioning of the input point set, with novel set learning layers that adaptively combine features from multiple scales to learn deep point set features efficiently and robustly.
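
One level of the hierarchy can be sketched as: sample well-spread centroids, group the points around each centroid, run a shared per-point transform, and max-pool per group. The sketch below uses a random matrix in place of a learned shared MLP; the sampling size, radius, and output width are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(size=(256, 3))

def farthest_point_sampling(points, m):
    """Pick m well-spread centroids."""
    idx = [0]
    dist = np.full(len(points), np.inf)
    for _ in range(m - 1):
        dist = np.minimum(dist, np.linalg.norm(points - points[idx[-1]], axis=1))
        idx.append(int(dist.argmax()))
    return np.array(idx)

def set_abstraction(points, m=32, radius=0.25, out_dim=16):
    """One hierarchical level: sample centroids, group neighbors, shared transform, max-pool."""
    W = rng.normal(size=(3, out_dim)) * 0.5          # stand-in for a learned shared MLP
    centroids = points[farthest_point_sampling(points, m)]
    feats = []
    for c in centroids:
        group = points[np.linalg.norm(points - c, axis=1) < radius] - c  # local coordinates
        h = np.maximum(group @ W, 0.0)               # shared per-point transform
        feats.append(h.max(axis=0))                  # max-pool over the local region
    return centroids, np.stack(feats)

centroids, feats = set_abstraction(pts)
print(centroids.shape, feats.shape)   # (32, 3) (32, 16)
```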

PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation

This paper designs a novel type of neural network that directly consumes point clouds, respects the permutation invariance of points in the input, and provides a unified architecture for applications ranging from object classification and part segmentation to scene semantic parsing.
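
The permutation-invariance property reduces to a shared per-point transform followed by a symmetric aggregation (max-pooling). A minimal sketch, with a random matrix standing in for the learned shared MLP:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 32))                    # stand-in for the learned shared per-point MLP

def global_feature(points):
    """Shared per-point transform followed by a symmetric (max) aggregation."""
    return np.maximum(points @ W, 0.0).max(axis=0)

pts = rng.uniform(size=(128, 3))
perm = rng.permutation(len(pts))
# The symmetric max-pool makes the descriptor independent of point ordering.
print(np.allclose(global_feature(pts), global_feature(pts[perm])))   # True
```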

Transformer Hawkes Process

A Transformer Hawkes Process (THP) model is proposed that leverages the self-attention mechanism to capture long-term dependencies while remaining computationally efficient.
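
For context, the classical self-exciting (Hawkes) intensity and the general shape of its neural variants are shown below; the second line is a schematic, not the paper's exact parameterization.

```latex
% Classical Hawkes intensity with an exponential excitation kernel:
\lambda(t) = \mu + \sum_{t_i < t} \alpha \, e^{-\beta (t - t_i)}, \qquad \mu, \alpha, \beta > 0.
% Schematic neural variant: the handcrafted kernel is replaced by a learned
% history embedding h(t), produced e.g. by self-attention over past events:
\lambda(t) = \operatorname{softplus}\!\big(\mathbf{w}^\top h(t) + b\big).
```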

Latent Ordinary Differential Equations for Irregularly-Sampled Time Series

This work generalizes RNNs to have continuous-time hidden dynamics defined by ordinary differential equations (ODEs), a model called the ODE-RNN, which outperforms its RNN-based counterparts on irregularly sampled data.
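
The mechanism can be sketched as: evolve the hidden state with an ODE solver across the gap between observations, then apply an ordinary RNN update when an observation arrives. The sketch below uses fixed-step Euler integration and random weight matrices as stand-ins for the learned dynamics and update functions.

```python
import numpy as np

rng = np.random.default_rng(0)
H, X = 8, 2
Wf = rng.normal(size=(H, H)) * 0.1                         # stand-in for the learned ODE dynamics f(h)
Wh, Wx = rng.normal(size=(H, H)) * 0.1, rng.normal(size=(H, X)) * 0.1

def ode_step(h, dt, n=10):
    """Evolve the hidden state between observations with simple Euler steps."""
    for _ in range(n):
        h = h + (dt / n) * np.tanh(Wf @ h)
    return h

def rnn_update(h, x):
    """Standard RNN-style update applied only when an observation arrives."""
    return np.tanh(Wh @ h + Wx @ x)

# Irregularly spaced observation times and values.
times = [0.0, 0.4, 0.45, 1.3]
values = rng.normal(size=(4, X))

h, t_prev = np.zeros(H), 0.0
for t, x in zip(times, values):
    h = ode_step(h, t - t_prev)           # continuous-time evolution over the gap
    h = rnn_update(h, x)                  # discrete correction at the observation
    t_prev = t
print(h.round(3))
```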

Self-Attentive Hawkes Processes

The proposed method adapts self-attention to fit the intensity function of Hawkes processes, making it better able to identify complicated dependency relationships between temporal events and to capture longer-range historical information.
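
Complementing the formula given after the Transformer Hawkes Process entry, the sketch below shows attention weights over past events feeding a softplus intensity head; the event embedding, the placeholder query mark, and all weight matrices are illustrative, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8
Wq, Wk, Wv = (rng.normal(size=(D, D)) * 0.3 for _ in range(3))
w_out = rng.normal(size=D)

def event_embedding(t, mark, n_marks=3):
    """Toy embedding of one event: sinusoidal time encoding plus a one-hot mark."""
    enc = np.array([np.sin(t / 10 ** (2 * i / D)) for i in range(D - n_marks)])
    return np.concatenate([enc, np.eye(n_marks)[mark]])

def intensity(query_time, history):
    """Self-attention over past events, then a softplus head gives lambda(t)."""
    E = np.stack([event_embedding(t, m) for t, m in history])
    q = event_embedding(query_time, 0) @ Wq            # mark 0 is an arbitrary query placeholder
    scores = (E @ Wk) @ q / np.sqrt(D)
    attn = np.exp(scores - scores.max()); attn /= attn.sum()
    h = attn @ (E @ Wv)                                 # attention-weighted history summary
    return np.log1p(np.exp(w_out @ h))                  # softplus keeps the intensity positive

history = [(0.2, 1), (0.9, 0), (1.4, 2)]                # (time, event-type) pairs
print(round(float(intensity(2.0, history)), 4))
```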

KPConv: Flexible and Deformable Convolution for Point Clouds

KPConv is a new design of point convolution that operates on point clouds without any intermediate representation and outperforms state-of-the-art classification and segmentation approaches on several datasets.
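
The kernel-point idea can be sketched as storing a weight matrix at each of a handful of kernel points and blending those weights by how close a neighbor's offset falls to each kernel point (the linear correlation of the rigid variant). Kernel point positions, sizes, and the radius below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
K, C_in, C_out, sigma = 5, 3, 4, 0.1
kernel_points = rng.uniform(-0.1, 0.1, size=(K, 3))    # fixed kernel point positions
kernel_weights = rng.normal(size=(K, C_in, C_out)) * 0.2

def kpconv_at(center, points, feats, radius=0.2):
    """Kernel point convolution at one center point (linear correlation, rigid variant)."""
    offsets = points - center
    mask = np.linalg.norm(offsets, axis=1) < radius
    out = np.zeros(C_out)
    for off, f in zip(offsets[mask], feats[mask]):
        # A kernel point's influence decays linearly with its distance to the neighbor.
        corr = np.maximum(0.0, 1 - np.linalg.norm(kernel_points - off, axis=1) / sigma)
        W = np.tensordot(corr, kernel_weights, axes=1)   # (C_in, C_out) blended kernel
        out += f @ W
    return out

points = rng.uniform(size=(200, 3))
feats = rng.normal(size=(200, C_in))
print(kpconv_at(points[0], points, feats).round(3))
```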

Interpolation-Prediction Networks for Irregularly Sampled Time Series

A new deep learning architecture is presented for supervised learning with sparse and irregularly sampled multivariate time series, based on a semi-parametric interpolation network followed by a prediction network.
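
The two-stage idea can be sketched as a kernel smoother that maps sparse observations onto a regular grid of reference time points, followed by any standard predictor on that regular representation. The RBF bandwidth, grid size, and linear readout below are illustrative stand-ins, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_interpolate(t_obs, x_obs, t_ref, bandwidth=0.1):
    """Semi-parametric smoother: kernel-weighted average of observations at reference times."""
    w = np.exp(-((t_ref[:, None] - t_obs[None, :]) ** 2) / (2 * bandwidth ** 2))
    return (w @ x_obs) / (w.sum(axis=1) + 1e-8)

# One sparsely observed channel and a regular grid of reference points.
t_obs = np.sort(rng.uniform(0, 1, size=12))
x_obs = np.sin(2 * np.pi * t_obs) + 0.05 * rng.normal(size=12)
t_ref = np.linspace(0, 1, 50)

x_ref = rbf_interpolate(t_obs, x_obs, t_ref)   # regular series for the prediction network

# Stand-in "prediction network": a linear readout on the interpolated representation.
w_pred = rng.normal(size=50) * 0.1
print(round(float(x_ref @ w_pred), 4))
```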