Publications
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
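The central operation is a masked self-attention over each node's neighborhood. As a rough illustration, here is a minimal single-head graph attention layer in PyTorch; the dense-adjacency formulation, class name, and dimensions are assumptions for the sketch, not the paper's reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Minimal single-head graph attention layer (illustrative sketch)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared linear transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)   # attention scorer

    def forward(self, h, adj):
        # h: (N, in_dim) node features; adj: (N, N) adjacency with self-loops
        z = self.W(h)                                    # (N, out_dim)
        N = z.size(0)
        zi = z.unsqueeze(1).expand(N, N, -1)             # z_i repeated over columns
        zj = z.unsqueeze(0).expand(N, N, -1)             # z_j repeated over rows
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float('-inf'))       # mask: attend to edges only
        alpha = torch.softmax(e, dim=-1)                 # attention coefficients
        return F.elu(alpha @ z)                          # weighted neighborhood sum
```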
Deep Pain: Exploiting Long Short-Term Memory Networks for Facial Expression Classification
TLDR
It is suggested that pain-assessment performance can be enhanced by feeding raw frames to deep learning models, outperforming the latest state-of-the-art results while also directly addressing the problem of imbalanced data.
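A generic raw-frames-to-LSTM pipeline of the kind the summary describes might look like the following sketch; the tiny stand-in frame encoder, hidden size, and class count are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class FramePainClassifier(nn.Module):
    """Per-frame CNN encoder followed by an LSTM over the sequence (sketch)."""
    def __init__(self, feat_dim=256, hidden=128, n_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(                    # stand-in frame encoder
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 16, feat_dim),
        )
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, frames):
        # frames: (B, T, 3, H, W) raw video frames
        B, T = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(B, T, -1)
        out, _ = self.lstm(feats)                        # temporal modeling
        return self.head(out[:, -1])                     # classify from last step
```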
Regularizing CNNs with Locally Constrained Decorrelations
TLDR
It is shown that regularizing negatively correlated features is an obstacle to effective decorrelation, and OrthoReg, a novel regularization technique that locally enforces feature orthogonality, is presented; it reduces the overfitting of state-of-the-art CNNs on CIFAR-10, CIFAR-100, and SVHN.
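One way to read this is as a penalty that pushes apart only positively correlated filters within a layer, leaving negatively correlated pairs untouched. The sketch below follows that reading; the function name and exact loss form are illustrative, not the paper's formulation.

```python
import torch

def orthoreg_penalty(W, eps=1e-8):
    """Penalize positive cosine similarity between filter pairs (sketch).

    W: (n_filters, fan_in) weight matrix of one layer.
    """
    Wn = W / (W.norm(dim=1, keepdim=True) + eps)         # unit-normalize filters
    cos = Wn @ Wn.t()                                    # pairwise cosine similarity
    off = cos - torch.eye(W.size(0), device=W.device)    # drop self-similarity
    return off.clamp(min=0).pow(2).sum()                 # positive correlations only
```

In training, such a term would be added to the task loss, e.g. `loss = task_loss + lam * orthoreg_penalty(conv.weight.flatten(1))`.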
Context-Aware Visual Compatibility Prediction
TLDR
This work proposes a method that predicts compatibility between two items based on their visual features as well as their context, using a graph neural network that learns to generate product embeddings conditioned on that context.
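A minimal sketch of that idea, assuming a dense item graph and one round of neighborhood averaging; the class name and scoring function are hypothetical, not the paper's model.

```python
import torch
import torch.nn as nn

class CompatibilityScorer(nn.Module):
    """Context-conditioned embeddings plus a pairwise score (sketch)."""
    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, emb_dim)

    def forward(self, x, adj, i, j):
        # x: (N, in_dim) visual features; adj: (N, N) item co-occurrence graph
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        ctx = adj @ x / deg                              # average over item context
        z = torch.relu(self.proj(x + ctx))               # context-conditioned embedding
        return torch.sigmoid((z[i] * z[j]).sum(-1))      # compatibility probability
```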
BigBrain 3D atlas of cortical layers: Cortical and laminar thickness gradients diverge in sensory and motor cortices
TLDR
This BigBrain cortical atlas was derived from a 3D histological model of the human brain at 20-micron isotropic resolution (BigBrain), using a convolutional neural network to automatically segment the cortical layers in both hemispheres, and provides an unprecedented level of precision and detail.
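For illustration only, a compact fully-convolutional labeler of the sort such a pipeline might use; the channel counts, 2D patch formulation, and 6-layers-plus-background output are assumptions, not the atlas pipeline itself.

```python
import torch
import torch.nn as nn

# Per-pixel cortical-layer labeling on a histology patch (illustrative sketch)
layer_segmenter = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 7, 1),                       # 6 cortical layers + background
)

patch = torch.randn(1, 1, 64, 64)              # one grayscale histology patch
labels = layer_segmenter(patch).argmax(dim=1)  # per-pixel layer assignment
```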
Pay Attention to the Activations: A Modular Attention Mechanism for Fine-Grained Image Recognition
TLDR
The proposed approach learns to attend to lower-level feature activations without requiring part annotations and uses those activations to update and rectify the output likelihood distribution; well-known networks such as wide residual networks and ResNeXt, when augmented with the approach, systematically improve their classification accuracy and become more robust to changes in deformation and pose.
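A minimal sketch of such a head, assuming a single spatial attention map and a learned gate that mixes the attended prediction with the backbone logits; names and shapes are illustrative.

```python
import torch
import torch.nn as nn

class ActivationAttentionHead(nn.Module):
    """Attention over a feature map that rectifies the output logits (sketch)."""
    def __init__(self, channels, n_classes):
        super().__init__()
        self.attn = nn.Conv2d(channels, 1, kernel_size=1)  # spatial attention map
        self.cls = nn.Linear(channels, n_classes)          # attended classifier
        self.gate = nn.Parameter(torch.zeros(1))           # learned mixing gate

    def forward(self, feats, base_logits):
        # feats: (B, C, H, W) lower-level activations; base_logits: (B, n_classes)
        a = torch.softmax(self.attn(feats).flatten(2), dim=-1)  # (B, 1, H*W)
        pooled = (a * feats.flatten(2)).sum(-1)                 # attention-weighted pool
        g = torch.sigmoid(self.gate)
        return g * self.cls(pooled) + (1 - g) * base_logits     # rectified prediction
```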
Attend and Rectify: a Gated Attention Mechanism for Fine-Grained Recovery
TLDR
Wide Residual Networks augmented with the proposed novel attention mechanism surpass state-of-the-art classification accuracies on CIFAR-10, the Adience gender recognition task, Stanford Dogs, and UEC Food-100.
Convolutional neural networks for mesh-based parcellation of the cerebral cortex
TLDR
It is shown experimentally on the Human Connectome Project dataset that the proposed graph convolutional models outperform the current state-of-the-art and baselines, highlighting the potential and applicability of these methods to tackle neuroimaging challenges and paving the way towards a better characterization of brain diseases.
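As a rough sketch, one graph-convolution layer with symmetric normalization of the mesh adjacency; this is a standard GCN-style operator, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn as nn

class MeshGraphConv(nn.Module):
    """Graph convolution over a cortical mesh (illustrative sketch)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (V, in_dim) per-vertex features; adj: (V, V) adjacency + self-loops
        d_inv_sqrt = adj.sum(-1).clamp(min=1).pow(-0.5)
        norm_adj = d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)
        return torch.relu(self.lin(norm_adj @ x))        # propagate, then transform
```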
...