Publications (sorted by influence)
SURF: Speeded Up Robust Features
TLDR
We present a novel scale- and rotation-invariant interest point detector and descriptor, coined SURF (Speeded Up Robust Features).
  • 8,884 citations
  • 1,334 highly influential citations
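SURF is implemented in OpenCV's contrib module, so its use can be illustrated directly. The sketch below (assuming an opencv-contrib-python build with the non-free algorithms enabled; the image path is a placeholder) detects keypoints with the Fast-Hessian detector and computes the 64-dimensional SURF descriptors.

    import cv2

    # Load a test image in grayscale; "scene.jpg" is a placeholder path.
    img = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)

    # Fast-Hessian detector + SURF descriptor (64-D by default); requires an
    # opencv-contrib build with the non-free algorithms enabled.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    keypoints, descriptors = surf.detectAndCompute(img, None)
    print(len(keypoints), descriptors.shape)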
Speeded-Up Robust Features (SURF)
TLDR
This article presents a novel scale- and rotation-invariant detector and descriptor, coined SURF (Speeded-Up Robust Features).
  • 10,451 citations
  • 1,264 highly influential citations
A Comparison of Affine Region Detectors
TLDR
The paper gives a snapshot of the state of the art in affine covariant region detectors and compares their performance on a set of test images under varying imaging conditions.
  • 3,234 citations
  • 331 highly influential citations
Unsupervised Visual Domain Adaptation Using Subspace Alignment
TLDR
We introduce a new domain adaptation (DA) algorithm in which the source and target domains are represented by subspaces described by eigenvectors.
  • 774 citations
  • 111 highly influential citations
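The TLDR above already names the core mechanism: each domain is summarized by a low-dimensional eigenvector (PCA) subspace, and the source subspace is aligned to the target one with a single matrix product before classification. The NumPy sketch below illustrates that idea; the array names, the subspace dimension d, and the toy data are illustrative, and a classifier would subsequently be trained on the aligned source features and applied to the projected target features.

    import numpy as np

    def subspace_alignment(source, target, d=20):
        """source, target: (n_samples, n_features) arrays, assumed zero-mean."""
        # Top-d principal directions (as columns) of each domain.
        Xs = np.linalg.svd(source, full_matrices=False)[2][:d].T
        Xt = np.linalg.svd(target, full_matrices=False)[2][:d].T
        M = Xs.T @ Xt                     # alignment matrix between the two subspaces
        source_aligned = source @ Xs @ M  # source data in the target-aligned subspace
        target_projected = target @ Xt    # target data in its own subspace
        return source_aligned, target_projected

    rng = np.random.default_rng(0)
    S = rng.normal(size=(100, 50))
    T = rng.normal(size=(80, 50)) + 0.5
    Sa, Tp = subspace_alignment(S - S.mean(0), T - T.mean(0), d=10)
    print(Sa.shape, Tp.shape)   # (100, 10) (80, 10)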
Pose Guided Person Image Generation
TLDR
This paper proposes the Pose Guided Person Generation Network (PG²), which synthesizes person images in arbitrary poses based on an image of that person and a novel pose.
  • 327 citations
  • 78 highly influential citations
An Efficient Dense and Scale-Invariant Spatio-Temporal Interest Point Detector
TLDR
This paper presents, for the first time, spatio-temporal interest points that are simultaneously scale-invariant (both spatially and temporally) and densely cover the video content.
  • 962 citations
  • 63 highly influential citations
Object Detection by Contour Segment Networks
TLDR
We propose a method for object detection in cluttered real images, given a single hand-drawn example as the model.
  • 338 citations
  • 50 highly influential citations
Matching Widely Separated Views Based on Affine Invariant Regions
TLDR
In this paper, we propose a method for finding a relatively sparse set of feature correspondences between wide-baseline images.
  • 710 citations
  • 49 highly influential citations
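The affine invariant regions of this paper are not part of standard libraries, but the task it addresses, finding a sparse set of reliable correspondences between widely separated views, can be sketched with a loose modern analogue in OpenCV: SIFT features, a ratio test, and RANSAC-based geometric verification. This is explicitly not the paper's method, and the image paths are placeholders.

    import cv2
    import numpy as np

    # Two widely separated views of the same scene; paths are placeholders.
    img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Ratio test keeps only distinctive matches, giving a sparse candidate set.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = [m for m, n in matcher.knnMatch(des1, des2, k=2)
               if m.distance < 0.7 * n.distance]

    # Geometric verification with RANSAC discards the remaining outliers.
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print(len(matches), int(inlier_mask.sum()))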
Memory Aware Synapses: Learning what (not) to forget
TLDR
We propose a novel approach to lifelong learning, coined Memory Aware Synapses, which computes the importance of the parameters of a neural network in an unsupervised and online manner.
  • 223 citations
  • 42 highly influential citations
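A minimal PyTorch sketch of the importance estimate described above: for unlabeled inputs, the squared L2 norm of the network output is backpropagated and the absolute gradients are accumulated per parameter. The paper accumulates per example; the mini-batch accumulation here is a coarse approximation, and the toy model and data are placeholders. The resulting weights would then be used to penalize changes to important parameters when learning later tasks.

    import torch

    def mas_importance(model, data_loader):
        """Estimate per-parameter importance weights from unlabeled inputs."""
        importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        n_batches = 0
        for x in data_loader:                  # no labels are required
            model.zero_grad()
            out = model(x)
            out.pow(2).sum().backward()        # gradient of the squared output norm
            for n, p in model.named_parameters():
                if p.grad is not None:
                    importance[n] += p.grad.abs()
            n_batches += 1
        return {n: v / max(n_batches, 1) for n, v in importance.items()}

    # Toy model and unlabeled mini-batches, purely for illustration.
    model = torch.nn.Sequential(torch.nn.Linear(8, 16), torch.nn.ReLU(),
                                torch.nn.Linear(16, 4))
    loader = [torch.randn(32, 8) for _ in range(5)]
    omega = mas_importance(model, loader)
    print({n: round(v.mean().item(), 4) for n, v in omega.items()})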
Dynamic Filter Networks
TLDR
In a traditional convolutional layer, the learned filters stay fixed after training; in a Dynamic Filter Network, by contrast, the filters are generated dynamically, conditioned on the input.
  • 387 citations
  • 41 highly influential citations
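A minimal PyTorch sketch of the dynamic-filtering idea: a small filter-generating network predicts a convolution kernel for each input sample, and a grouped convolution applies each sample's kernel only to that sample. The architectural details here (pooling size, one kernel per channel, kernel size) are illustrative choices, not the configurations used in the paper.

    import torch
    import torch.nn.functional as F

    class DynamicFilterLayer(torch.nn.Module):
        """Applies per-sample filters produced by a filter-generating network."""
        def __init__(self, channels=1, kernel_size=3):
            super().__init__()
            self.k = kernel_size
            # Filter-generating network: maps each input to one k*k kernel per channel.
            self.generator = torch.nn.Sequential(
                torch.nn.AdaptiveAvgPool2d(4),
                torch.nn.Flatten(),
                torch.nn.Linear(channels * 16, channels * kernel_size * kernel_size),
            )

        def forward(self, x):
            b, c, h, w = x.shape
            filters = self.generator(x).view(b * c, 1, self.k, self.k)
            # Grouped convolution applies each sample's kernels only to that sample.
            out = F.conv2d(x.view(1, b * c, h, w), filters,
                           padding=self.k // 2, groups=b * c)
            return out.view(b, c, h, w)

    layer = DynamicFilterLayer(channels=1)
    y = layer(torch.randn(8, 1, 32, 32))
    print(y.shape)   # torch.Size([8, 1, 32, 32])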