In search of more expressive graph learning models, we build upon the recent k-order invariant and equivariant graph neural networks (Maron et al., 2019a,b) and present two results: first, we show that such k-order networks can distinguish between non-isomorphic graphs as well as the k-WL tests, which are provably stronger than the first WL test for k > 2.
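As background, the classical 1-WL (color refinement) test that these k-order networks surpass can be sketched in a few lines; the two 2-regular graphs below (a 6-cycle versus two disjoint triangles) are a standard pair it fails to distinguish. The graphs and function names here are illustrative, not from the paper.

```python
def wl_colors(adj, rounds=3):
    """1-dimensional Weisfeiler-Leman color refinement.
    adj: adjacency list; returns the sorted multiset of refined colors."""
    colors = [0] * len(adj)
    for _ in range(rounds):
        # each node's new color = its color plus the multiset of neighbor colors
        sigs = [(colors[i], tuple(sorted(colors[j] for j in adj[i])))
                for i in range(len(adj))]
        palette = {s: c for c, s in enumerate(sorted(set(sigs)))}
        colors = [palette[s] for s in sigs]
    return sorted(colors)

# 6-cycle vs. two disjoint triangles: both 2-regular, so 1-WL cannot tell them apart
c6 = [[(i - 1) % 6, (i + 1) % 6] for i in range(6)]
two_triangles = [[1, 2], [0, 2], [0, 1], [4, 5], [3, 5], [3, 4]]
print(wl_colors(c6) == wl_colors(two_triangles))  # True: 1-WL fails here

# a path on 6 vertices has a different degree sequence, so it is distinguished
p6 = [[1], [0, 2], [1, 3], [2, 4], [3, 5], [4]]
print(wl_colors(c6) == wl_colors(p6))  # False
```

Pairs like the first one are exactly what motivates the strictly stronger k-WL hierarchy.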

This paper describes a fully automatic pipeline for finding an intrinsic map between two non-isometric, genus-zero surfaces. Our approach is based on the observation that efficient methods exist to…

We provide a characterization of all permutation invariant and equivariant linear layers for (hyper-)graph data, and show that their dimension, in the case of edge-value graph data, is 2 and 15, respectively.
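The stated dimensions can be checked numerically: for edge-value data on n nodes, the space of invariant (resp. equivariant) linear layers has one basis element per orbit of S_n acting on index pairs (resp. 4-tuples), and orbits correspond to equality patterns of the indices. A small sketch under these assumptions (function names are ours):

```python
from itertools import product

def equality_pattern(t):
    # relabel entries by first occurrence; two index tuples lie in the same
    # S_n-orbit exactly when they share this pattern
    seen = {}
    return tuple(seen.setdefault(x, len(seen)) for x in t)

def orbit_count(n, k):
    # number of S_n-orbits on [n]^k (equals the Bell number B(k) once n >= k)
    return len({equality_pattern(t) for t in product(range(n), repeat=k)})

print(orbit_count(5, 2))  # 2  -> dimension of invariant layers on edge-value data
print(orbit_count(5, 4))  # 15 -> dimension of equivariant layers
```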

We introduce an alternative, coordinate-based approach, where rather than solving a large linear system to perform the aforementioned interpolation, the value of the interpolant at each interior pixel is given by a weighted combination of values along the boundary.
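The abstract does not name the coordinate scheme; one classical choice with exactly this form is mean-value coordinates, sketched below for a polygonal boundary in 2-D. This is a simplified illustration under that assumption, not the paper's implementation.

```python
import math

def mean_value_weights(x, poly):
    """Mean-value coordinates of interior point x w.r.t. polygon vertices `poly`.
    The interpolant at x is then sum_i w[i] * f(poly[i])."""
    n = len(poly)

    def angle(i, j):
        # unsigned angle at x between the spokes to vertices i and j
        va = (poly[i][0] - x[0], poly[i][1] - x[1])
        vb = (poly[j][0] - x[0], poly[j][1] - x[1])
        dot = va[0] * vb[0] + va[1] * vb[1]
        cross = va[0] * vb[1] - va[1] * vb[0]
        return math.atan2(abs(cross), dot)

    w = []
    for i in range(n):
        d = math.hypot(poly[i][0] - x[0], poly[i][1] - x[1])
        a_prev = angle((i - 1) % n, i)
        a_next = angle(i, (i + 1) % n)
        w.append((math.tan(a_prev / 2) + math.tan(a_next / 2)) / d)
    s = sum(w)
    return [wi / s for wi in w]

# at the center of the unit square, symmetry forces four equal weights of 0.25
print(mean_value_weights((0.5, 0.5), [(0, 0), (1, 0), (1, 1), (0, 1)]))
```

Because the weights sum to one, evaluating them per pixel replaces the global linear solve with an independent local computation.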

In this paper we advocate the use of linear differential coordinates as a means to preserve the high-frequency detail of the surface by solving a linear least-squares system.
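The idea can be illustrated on a polyline: store each vertex's differential (Laplacian) coordinate, then solve a least-squares system that reproduces those details subject to soft positional anchors. A toy sketch with a uniform Laplacian and an anchor weight of our choosing (not the paper's mesh formulation):

```python
import numpy as np

def uniform_laplacian(n):
    # normalized uniform Laplacian of a path graph on n vertices
    L = np.eye(n)
    for i in range(n):
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
        for j in nbrs:
            L[i, j] = -1.0 / len(nbrs)
    return L

def deform(v, anchors, w=10.0):
    """v: (n, d) vertex positions; anchors: {index: target position}.
    Minimizes ||L v' - delta||^2 plus weighted anchor terms."""
    n = v.shape[0]
    L = uniform_laplacian(n)
    delta = L @ v                      # high-frequency detail to preserve
    rows, rhs = [L], [delta]
    for i, p in anchors.items():
        r = np.zeros((1, n)); r[0, i] = w
        rows.append(r); rhs.append(w * np.asarray(p, float)[None, :])
    A, b = np.vstack(rows), np.vstack(rhs)
    return np.linalg.lstsq(A, b, rcond=None)[0]

# anchoring the endpoints at their original positions recovers the curve
v = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 0.0], [3.0, 0.5], [4.0, 0.0]])
v2 = deform(v, {0: v[0], 4: v[4]})
print(np.allclose(v2, v))  # True
```

Moving the anchors instead deforms the curve while the Laplacian term keeps its local detail.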

Inferred dietary preference is a major component of the paleoecology of extinct primates. Molar occlusal shape correlates with diet in living mammals, so teeth are a potentially useful structure from…

Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning.
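One standard way to realize such a constraint is to project an unconstrained weight matrix onto the G-equivariant subspace by group averaging (the Reynolds operator), sketched below for the cyclic shift group. This is an illustrative construction, not the paper's parameterization.

```python
import numpy as np

def shift_matrix(n, s):
    # permutation matrix for a cyclic shift by s positions
    P = np.zeros((n, n))
    P[np.arange(n), (np.arange(n) + s) % n] = 1.0
    return P

def project_equivariant(W, perms):
    # Reynolds operator: averaging the conjugates P^T W P over the group
    # yields a map satisfying W P = P W for every P in the group
    return sum(P.T @ W @ P for P in perms) / len(perms)

n = 6
group = [shift_matrix(n, s) for s in range(n)]   # cyclic group C_6
W = np.random.default_rng(0).standard_normal((n, n))
We = project_equivariant(W, group)
print(all(np.allclose(We @ P, P @ We) for P in group))  # True
```

For C_n the projected matrix is circulant, recovering the familiar fact that shift-equivariant linear layers are convolutions.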