Corpus ID: 126187239

Graph Element Networks: adaptive, structured computation and memory

Ferran Alet, Adarsh K. Jeewajee, Maria Bauzá, Alberto Rodriguez, Tomás Lozano-Pérez, Leslie Pack Kaelbling
We explore the use of graph neural networks (GNNs) to model spatial processes in which there is no a priori graphical structure. We use GNNs as a computational substrate, and show that the locations of the nodes in space, as well as their connectivity, can be optimized to focus on the most complex parts of the space. Moreover, this representational strategy allows the learned input-output relationship to generalize over the size of the underlying space and run the same model at different levels…
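
The core idea — a set of nodes placed in the continuous space, connected to their neighbours, updated by message passing, and queried at arbitrary spatial locations — can be illustrated with a minimal NumPy sketch. All names, the mean-aggregation update, and the inverse-distance readout below are illustrative stand-ins, not the authors' learned architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: scatter graph nodes at 2-D locations in the unit square.
nodes = rng.random((16, 2))   # node positions (optimized jointly in the paper)
feats = rng.random((16, 4))   # latent state carried by each node

# Connect each node to its k nearest neighbours -- one simple connectivity choice.
def knn_edges(pos, k=3):
    dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    nbrs = np.argsort(dist, axis=1)[:, :k]
    return [(i, j) for i in range(len(pos)) for j in nbrs[i]]

edges = knn_edges(nodes)

# One round of mean-aggregation message passing (a stand-in for the learned GNN update).
def message_pass(h, edges):
    agg = np.zeros_like(h)
    cnt = np.zeros(len(h))
    for i, j in edges:
        agg[i] += h[j]
        cnt[i] += 1
    return 0.5 * h + 0.5 * agg / np.maximum(cnt, 1)[:, None]

for _ in range(3):
    feats = message_pass(feats, edges)

# Spatial readout: answer a query at an arbitrary point by inverse-distance
# weighting of nearby node states (the paper learns this mapping instead).
def query(point, pos, h):
    w = 1.0 / (np.linalg.norm(pos - point, axis=1) + 1e-6)
    return (w[:, None] * h).sum(axis=0) / w.sum()

out = query(np.array([0.5, 0.5]), nodes, feats)
```

Because the readout is defined at any point in space, the same trained model can be run with more or fewer nodes, which is what lets the representation adapt its resolution to the complex parts of the domain.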


Learning Connectivity with Graph Convolutional Networks

  • H. Sahbi
  • Computer Science
    2020 25th International Conference on Pattern Recognition (ICPR)
  • 2021
Experiments conducted on the challenging task of skeleton-based action recognition show the superiority of the proposed framework, in which graph convolutional networks learn the topological properties of graphs, over handcrafted graph design as well as related work.

Neural Operator: Graph Kernel Network for Partial Differential Equations

The key innovation in this work is that a single set of network parameters, within a carefully designed network architecture, may be used to describe mappings between infinite-dimensional spaces and between different finite-dimensional approximations of those spaces.

Simulating Continuum Mechanics with Multi-Scale Graph Neural Networks

MultiScaleGNN is a novel multi-scale graph neural network for learning to infer unsteady continuum mechanics; it can generalise from uniform advection fields to high-gradient fields on complex domains at test time and infer long-term Navier-Stokes solutions within a range of Reynolds numbers.

Isometric Transformation Invariant and Equivariant Graph Convolutional Networks

This paper proposes a set of transformation invariant and equivariant models based on graph convolutional networks, called IsoGCNs, and demonstrates that the proposed model has a competitive performance compared to state-of-the-art methods on tasks related to geometrical and physical simulation data.

Learning the Solution Operator of Boundary Value Problems using Graph Neural Networks

This work designs a general solution operator for two different time-independent PDEs using graph neural networks and spectral graph convolutions, showing that GNNs can learn solution operators that generalize over a range of properties and produce solutions much faster than a generic solver.

Physics-Embedded Neural Networks: E(n)-Equivariant Graph Neural PDE Solvers

This work presents an approach termed physics-embedded neural networks that considers boundary conditions and predicts the state after a long time interval using an implicit method. The model learns phenomena in complex shapes and outperforms both a well-optimized classical solver and a state-of-the-art machine learning model in the speed-accuracy trade-off.

Convergent Graph Solvers

We propose the convergent graph solver (CGS), a deep learning method that learns iterative mappings to predict the properties of a graph system at its stationary state (fixed point) with guaranteed convergence.
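
The fixed-point idea can be sketched in a few lines of NumPy, assuming a contractive linear update over the graph (the paper learns the mapping and establishes convergence conditions; the hand-picked map below is only an illustration of iterating to a stationary state):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative contractive graph mapping: h <- alpha * A_norm @ h + b,
# iterated until it reaches its fixed point.
n = 8
A = rng.random((n, n))
A_norm = A / A.sum(axis=1, keepdims=True)  # row-normalised adjacency
b = rng.random((n, 2))                     # per-node input features
alpha = 0.5                                # alpha < 1 makes the map a contraction

h = np.zeros((n, 2))
for _ in range(200):
    h_next = alpha * A_norm @ h + b
    if np.linalg.norm(h_next - h) < 1e-10:  # converged to the stationary state
        break
    h = h_next

# At the fixed point, h satisfies h = alpha * A_norm @ h + b.
residual = np.linalg.norm(h - (alpha * A_norm @ h + b))
```

Because the row-normalised adjacency has norm at most 1 and alpha < 1, the Banach fixed-point theorem guarantees the iteration converges regardless of the starting state.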

Learning Mesh-Based Simulation with Graph Networks

MeshGraphNets is introduced, a framework for learning mesh-based simulations using graph neural networks that can be trained to pass messages on a mesh graph and to adapt the mesh discretization during forward simulation, and can accurately predict the dynamics of a wide range of physical systems.

Deep learning of material transport in complex neurite networks

A graph neural network (GNN)-based deep learning model is presented to learn the isogeometric analysis (IGA)-based material transport simulation and provide fast material concentration predictions within neurite networks of any topology.

Geometric Deep Learning: Going beyond Euclidean data

Deep neural networks have been used to solve a broad range of problems in computer vision, natural-language processing, and audio analysis; this survey reviews how such methods extend to non-Euclidean domains such as graphs and manifolds, where the invariances of these structures are built into the networks used to model them.

Flexible Neural Representation for Physics Prediction

The Hierarchical Relation Network (HRN) is described, an end-to-end differentiable neural network based on hierarchical graph convolution that learns to predict physical dynamics in this hierarchical particle-based object representation.

The Graph Neural Network Model

A new neural network model, called the graph neural network (GNN) model, extends existing neural network methods to data represented in graph domains; it implements a function τ(G, n) that maps a graph G and one of its nodes n into an m-dimensional Euclidean space ℝ^m.

A Compositional Object-Based Approach to Learning Physical Dynamics

The Neural Physics Engine's (NPE's) compositional representation of the structure in physical interactions improves its ability to predict movement, generalize across variable object counts and different scene configurations, and infer latent properties of objects such as mass.

GEOMetrics: Exploiting Geometric Structure for Graph-Encoded Objects

This paper argues that the graph representation of geometric objects allows for additional structure which should be leveraged for enhanced reconstruction, and proposes a system that exploits the geometric structure of graph-encoded objects by introducing a graph convolutional update that preserves vertex information.

Graph Neural Networks for IceCube Signal Classification

This work leverages graph neural networks to improve signal detection in the IceCube neutrino observatory and demonstrates the effectiveness of the GNN architecture on a task classifying IceCube events, where it outperforms both a traditional physics-based method and classical 3D convolutional neural networks.

Hybrid computing using a neural network with dynamic external memory

A machine learning model called a differentiable neural computer (DNC) is introduced, consisting of a neural network that can read from and write to an external memory matrix, analogous to the random-access memory in a conventional computer.
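
A heavily simplified sketch of the content-based read/write mechanism behind such memory-augmented models, omitting the DNC's learned controller, temporal link matrix, and usage-based allocation (all names below are illustrative):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# External memory: 4 slots, each addressed by a key and storing a 2-D value.
M_keys = np.eye(4)             # addressing keys (fixed here; a DNC emits them)
M_vals = np.zeros((4, 2))      # value storage

def attn(key, beta=20.0):
    # Sharpened similarity between the query key and every slot's key.
    return softmax(beta * (M_keys @ key))

def write(vals, key, vec):
    w = attn(key)[:, None]
    return vals * (1 - w) + w * vec    # soft erase-then-add update

def read(vals, key):
    return attn(key) @ vals            # attention-weighted sum of slots

M_vals = write(M_vals, M_keys[0], np.array([1.0, 2.0]))
M_vals = write(M_vals, M_keys[2], np.array([3.0, 4.0]))
recalled = read(M_vals, M_keys[0])     # ~[1.0, 2.0]
```

Because addressing, reading, and writing are all soft (attention-weighted) operations, the whole memory access is differentiable and can be trained end to end with the controller network.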

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
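
The K-localized filters of this approach can be sketched with the Chebyshev recurrence T_0(x)=1, T_1(x)=x, T_k(x)=2x·T_{k-1}(x)−T_{k-2}(x) applied to the rescaled graph Laplacian, which avoids an explicit eigendecomposition. A minimal NumPy illustration (the coefficients theta would be learned in practice, and the largest eigenvalue is computed directly here only for simplicity):

```python
import numpy as np

rng = np.random.default_rng(2)

# Random symmetric adjacency on 10 nodes and its combinatorial Laplacian.
A = (rng.random((10, 10)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

# Rescale eigenvalues into [-1, 1] so the Chebyshev recurrence applies.
lmax = np.linalg.eigvalsh(L).max()
L_scaled = 2 * L / lmax - np.eye(10)

x = rng.random(10)                    # a signal living on the graph nodes
theta = np.array([0.5, 0.3, 0.2])     # filter coefficients (learned in practice)

# y = sum_k theta_k T_k(L_scaled) x; the k-th term only mixes k-hop neighbourhoods,
# so the filter is exactly K-localized on the graph.
Tx = [x, L_scaled @ x]
y = theta[0] * Tx[0] + theta[1] * Tx[1]
for k in range(2, len(theta)):
    Tx.append(2 * L_scaled @ Tx[-1] - Tx[-2])
    y += theta[k] * Tx[-1]
```

Each recurrence step costs one sparse matrix-vector product, so evaluating a K-tap filter is O(K·|E|) rather than the O(n³) of a full spectral decomposition.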

Geometric Deep Learning on Graphs and Manifolds Using Mixture Model CNNs

This paper proposes a unified framework for generalizing CNN architectures to non-Euclidean domains (graphs and manifolds) and learning local, stationary, and compositional task-specific features. The proposed method is tested on standard tasks from image, graph, and 3D shape analysis and consistently outperforms previous approaches.