Fast Graph Kernel with Optical Random Features

@article{Ghanem2021FastGK,
  title={Fast Graph Kernel with Optical Random Features},
  author={Hashem Ghanem and Nicolas Keriven and Nicolas Tremblay},
  journal={ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2021},
  pages={3575-3579}
}
  • Published 16 October 2020
  • Computer Science, Mathematics
The graphlet kernel is a classical method in graph classification. It however suffers from a high computation cost due to the isomorphism test it includes. As a generic proxy, and in general at the cost of losing some information, this test can be efficiently replaced by a user-defined mapping that computes various graph characteristics. In this paper, we propose to leverage kernel random features within the graphlet framework, and establish a theoretical link with a mean kernel metric. If this…
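The pipeline sketched in the abstract can be illustrated in a few lines. Below is a minimal sketch, assuming Gaussian random cosine features as a software stand-in for the paper's optical random projections and plain uniform node sampling for graphlet extraction; all function names and parameter values (sample_graphlet, k, m, n_samples) are illustrative, not taken from the authors' code.

import numpy as np
import networkx as nx

def sample_graphlet(G, k, rng):
    """Adjacency matrix of the subgraph induced by k uniformly sampled nodes."""
    nodes = rng.choice(list(G.nodes), size=k, replace=False)
    return nx.adjacency_matrix(G.subgraph(nodes)).toarray().astype(float)

def random_features(adj, W, b):
    """Random cosine feature map applied to a flattened k x k adjacency matrix."""
    return np.sqrt(2.0 / W.shape[0]) * np.cos(W @ adj.flatten() + b)

def graph_embedding(G, W, b, k, n_samples, rng):
    """Average random features over sampled graphlets: a mean-kernel embedding."""
    return np.mean([random_features(sample_graphlet(G, k, rng), W, b)
                    for _ in range(n_samples)], axis=0)

k, m = 6, 512                                # graphlet size, feature dimension
rng = np.random.default_rng(0)
W = rng.normal(size=(m, k * k))              # Gaussian projection (OPU stand-in)
b = rng.uniform(0.0, 2.0 * np.pi, size=m)

G1 = nx.erdos_renyi_graph(60, 0.10, seed=1)
G2 = nx.barabasi_albert_graph(60, 2, seed=2)
z1 = graph_embedding(G1, W, b, k, 2000, rng)
z2 = graph_embedding(G2, W, b, k, 2000, rng)
print("approximate mean-kernel similarity:", float(z1 @ z2))

The inner product of two such embeddings approximates a mean kernel between the graphs' graphlet distributions, which is the metric the paper links to theoretically.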

Figures and Tables from this paper
[Figure and table thumbnails are not reproduced in this extraction.]

Citations

LightOn Optical Processing Unit: Scaling-up AI and HPC with a Non von Neumann co-processor
Beyond pure von Neumann processing: the scalability of AI/HPC models is limited by the von Neumann bottleneck for accessing massive amounts of memory, driving up power consumption.

References

Showing 1–10 of 26 references
Stochastic Graphlet Embedding
  • A. Dutta, H. Sahbi
  • Computer Science, Medicine
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2019
TLDR: A novel high-order stochastic graphlet embedding that maps graphs into vector spaces and has a positive impact on the performance of pattern comparison and recognition, as corroborated through extensive experiments on standard benchmark databases.
A survey on graph kernels
TLDR: This survey gives a comprehensive overview of techniques for kernel-based graph classification developed in the past 15 years, and describes and categorizes graph kernels based on properties inherent to their design, such as the nature of their extracted graph features, their method of computation, and their applicability to problems in practice.
Efficient graphlet kernels for large graph comparison
TLDR: Two theoretically grounded speedup schemes are introduced, one based on sampling and the other designed specifically for bounded-degree graphs, to efficiently compare large graphs that cannot be tackled by existing graph kernels.
Matching Node Embeddings for Graph Similarity
TLDR: This paper presents a graph kernel based on the Pyramid Match kernel that finds an approximate correspondence between the sets of node vectors of two graphs, evaluates the proposed methods on several benchmark datasets for graph classification, and compares their performance to state-of-the-art graph kernels.
Learning Kernels with Random Features
TLDR: This work presents an efficient optimization problem that learns a kernel in a supervised manner, and proves the consistency of the estimated kernel as well as generalization bounds for the class of estimators induced by the optimized kernel.
How Powerful are Graph Neural Networks?
TLDR: This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
Deep Graph Kernels
TLDR: A unified framework to learn latent representations of sub-structures for graphs, inspired by the latest advances in language modeling and deep learning, which achieves significant improvements in classification accuracy over state-of-the-art graph kernels.
Sampling from large graphs
TLDR: The best-performing methods are those based on random walks and "forest fire"; they match both static and evolutionary graph patterns very accurately, with sample sizes down to about 15% of the original graph.
Random projections through multiple optical scattering: Approximating Kernels at the speed of light
TLDR: This work proposes an analog optical device that performs the random projections literally at the speed of light, without having to store any matrix in memory, and shows that, on the MNIST database, the experimental results closely match the theoretical performance of the corresponding kernel.
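The kernel-approximation idea behind this reference, and behind the paper above, can be reproduced in software with classical random Fourier features (Rahimi & Recht); the optical device computes an analogous projection physically. A minimal sketch, with arbitrary demo values for the input dimension d, feature count m, and RBF width gamma:

import numpy as np

rng = np.random.default_rng(42)
d, m, gamma = 20, 4096, 0.5        # input dim, number of features, RBF width

W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(m, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=m)

def phi(x):
    """Random Fourier features: phi(x) @ phi(y) ~= exp(-gamma * ||x - y||^2)."""
    return np.sqrt(2.0 / m) * np.cos(W @ x + b)

x, y = rng.normal(size=d), rng.normal(size=d)
exact = np.exp(-gamma * np.sum((x - y) ** 2))
approx = phi(x) @ phi(y)
print(f"exact RBF: {exact:.4f}   RFF approximation: {approx:.4f}")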
On Node Features for Graph Neural Networks
TLDR: It is shown that GNNs work well if there is a strong correlation between node features and node labels, and new feature initialization methods are proposed that allow graph neural networks to be applied to non-attributed graphs.