How Powerful are Graph Neural Networks?
TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive in the class of GNNs.
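A minimal NumPy sketch of the sum-aggregation node update behind this result (the injective sum aggregator followed by an MLP is what gives the architecture its maximal expressive power; the function name and the pluggable `mlp` argument here are illustrative, not the paper's code):

```python
import numpy as np

def gin_layer(H, A, mlp, eps=0.0):
    """One GIN-style update.

    H   : (n, d) node feature matrix
    A   : (n, n) binary adjacency matrix
    mlp : callable applied row-wise after aggregation
    eps : learnable/fixed weight on the node's own features
    """
    # Sum over neighbors (injective on multisets of features),
    # combined with the node's own (1 + eps)-scaled features.
    return mlp((1.0 + eps) * H + A @ H)
```

With `mlp` set to the identity and `eps=0`, each node's output is simply its own feature vector plus the sum of its neighbors' vectors.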
Deep Metric Learning via Lifted Structured Feature Embedding
TLDR
An algorithm that takes full advantage of the training batches in neural network training by lifting the vector of pairwise distances within the batch to the matrix of pairwise distances; this enables learning a state-of-the-art feature embedding by optimizing a novel structured prediction objective on the lifted problem.
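The "lifting" step refers to computing the full matrix of pairwise distances among all embeddings in a batch, rather than only the distances along sampled pairs or triplets. A minimal sketch of that computation (the function name is illustrative; the structured loss built on top of this matrix is not shown):

```python
import numpy as np

def pairwise_distance_matrix(X):
    """Lift a batch of embeddings to its full pairwise-distance matrix.

    X : (batch, dim) array of embeddings
    Returns a (batch, batch) matrix of Euclidean distances.
    """
    sq = np.sum(X ** 2, axis=1)
    # ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 <x_i, x_j>
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    # Clamp tiny negatives from floating-point error before the sqrt.
    return np.sqrt(np.maximum(d2, 0.0))
```

Every O(batch²) pair then contributes to the objective, which is the advantage over pair- or triplet-sampling losses.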
Representation Learning on Graphs with Jumping Knowledge Networks
TLDR
This work explores an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation in graphs.
Max-value Entropy Search for Efficient Bayesian Optimization
TLDR
It is observed that MES maintains or improves the good empirical performance of ES/PES while tremendously lightening the computational burden; it is also much more robust to the number of samples used for computing the entropy, and hence more efficient for higher-dimensional problems.
On learning to localize objects with minimal supervision
TLDR
This paper proposes a new method that achieves localization with only image-level labels of whether the objects are present or not, combining a discriminative submodular cover problem for automatically discovering a set of positive object windows with a smoothed latent SVM formulation.
Deep Metric Learning via Facility Location
TLDR
This paper proposes a new metric learning scheme, based on structured prediction, that is aware of the global structure of the embedding space and is designed to optimize a clustering quality metric (normalized mutual information, NMI).
Submodularity beyond submodular energies: Coupling edges in graph cuts
TLDR
This work proposes a new family of non-submodular global energy functions that still use submodularity internally to couple edges in a graph cut and shows it is possible to develop an efficient approximation algorithm that can use standard graph cuts as a subroutine.
Weakly-supervised Discovery of Visual Pattern Configurations
TLDR
This work proposes an approach that automatically identifies discriminative configurations of visual patterns characteristic of a given object class, leading to state-of-the-art weakly-supervised detection results on the challenging PASCAL VOC dataset.
What Can Neural Networks Reason About?
TLDR
This framework offers an explanation for the empirical success of popular reasoning models, suggests their limitations, and unifies seemingly different reasoning tasks via the lens of a powerful algorithmic paradigm, dynamic programming (DP).
Debiased Contrastive Learning
TLDR
A debiased contrastive objective is developed that corrects for the sampling of same-label datapoints, even without knowledge of the true labels, and consistently outperforms the state-of-the-art for representation learning in vision, language, and reinforcement learning benchmarks.
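A minimal sketch of the corrected objective's key idea: the average over sampled "negatives" is contaminated by same-label points, so it is debiased using an assumed class prior (here `tau_plus`) before entering the contrastive loss. The function name and argument shapes are illustrative; the clamp at `exp(-1/t)` keeps the estimate at its theoretical minimum:

```python
import numpy as np

def debiased_contrastive_loss(pos_sim, neg_sims, tau_plus=0.1, t=0.5):
    """Contrastive loss with a debiased negative term.

    pos_sim  : similarity between the anchor and its positive
    neg_sims : similarities to N sampled (possibly same-label) negatives
    tau_plus : assumed prior probability that a sampled negative
               actually shares the anchor's label
    t        : temperature
    """
    neg_sims = np.asarray(neg_sims, dtype=float)
    N = len(neg_sims)
    pos = np.exp(pos_sim / t)
    neg = np.exp(neg_sims / t)
    # Debiased estimate of the true-negative expectation, clamped
    # at its theoretical minimum exp(-1/t).
    g = np.maximum((neg.mean() - tau_plus * pos) / (1.0 - tau_plus),
                   np.exp(-1.0 / t))
    return -np.log(pos / (pos + N * g))
```

Setting `tau_plus=0` recovers the standard (biased) contrastive objective, since the correction term vanishes.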