Online Learning for Matrix Factorization and Sparse Coding
A new online optimization algorithm is proposed, based on stochastic approximations, which scales up gracefully to large data sets with millions of training samples, and extends naturally to various matrix factorization formulations, making it suitable for a wide range of learning problems.
Online dictionary learning for sparse coding
A new online optimization algorithm for dictionary learning is proposed, based on stochastic approximations, which scales up gracefully to large datasets with millions of training samples, and leads to faster performance and better dictionaries than classical batch algorithms for both small and large datasets.
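The core of the algorithm can be sketched in a few lines: each incoming sample is sparse-coded against the current dictionary, sufficient statistics are accumulated, and the dictionary atoms are refreshed by block-coordinate descent. The sketch below is a minimal illustration, not the paper's implementation; ISTA is used here as a simple choice of sparse-coding step, and the toy dimensions and hyperparameters are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(D, x, lam, n_iter=50):
    """Approximately solve min_a 0.5||x - D a||^2 + lam ||a||_1 via ISTA."""
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth part
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = a - D.T @ (D @ a - x) / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

def online_dictionary_learning(X, n_atoms, lam=0.1):
    """Process samples one at a time, accumulating the sufficient
    statistics A = sum a a^T and B = sum x a^T, then updating each
    dictionary atom by block-coordinate descent."""
    d, n = X.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
    A = np.zeros((n_atoms, n_atoms))
    B = np.zeros((d, n_atoms))
    for i in range(n):
        x = X[:, i]
        a = sparse_code(D, x, lam)
        A += np.outer(a, a)
        B += np.outer(x, a)
        for j in range(n_atoms):            # update atoms one by one
            if A[j, j] < 1e-12:
                continue                    # atom never used yet
            u = (B[:, j] - D @ A[:, j]) / A[j, j] + D[:, j]
            D[:, j] = u / max(np.linalg.norm(u), 1.0)  # project to unit ball
    return D

X = rng.standard_normal((20, 100))  # 100 toy signals of dimension 20
D = online_dictionary_learning(X, n_atoms=10)
```

Because the dictionary update only touches the accumulated statistics A and B rather than all past samples, memory cost stays constant as the number of processed samples grows.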
Unsupervised Learning of Visual Features by Contrasting Cluster Assignments
This paper proposes an online algorithm, SwAV, that takes advantage of contrastive methods without requiring pairwise comparisons to be computed, using a swapped prediction mechanism that predicts the cluster assignment of one view from the representation of another view.
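The swapped prediction idea can be illustrated with a toy computation: each view's features are compared to a set of prototype vectors, and the loss asks view 1's prediction to match view 2's code and vice versa. This is only a schematic sketch with made-up sizes; in the paper the target codes are computed with a Sinkhorn-Knopp equipartition step, for which a plain softmax is substituted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy embeddings of two augmented views of the same batch (hypothetical sizes).
batch, dim, n_prototypes = 8, 16, 4
z1 = rng.standard_normal((batch, dim))
z2 = rng.standard_normal((batch, dim))
C = rng.standard_normal((dim, n_prototypes))  # learnable prototype matrix

# l2-normalize features and prototypes before comparing them.
z1 /= np.linalg.norm(z1, axis=1, keepdims=True)
z2 /= np.linalg.norm(z2, axis=1, keepdims=True)
C /= np.linalg.norm(C, axis=0, keepdims=True)

# Predicted assignments (sharpened by a temperature) and target codes
# (softmax here as a stand-in for the paper's Sinkhorn-based codes).
p1, p2 = softmax(z1 @ C / 0.1), softmax(z2 @ C / 0.1)
q1, q2 = softmax(z1 @ C), softmax(z2 @ C)

# Swapped prediction: view 1 predicts view 2's code, and vice versa.
loss = (-np.mean(np.sum(q2 * np.log(p1 + 1e-12), axis=1))
        - np.mean(np.sum(q1 * np.log(p2 + 1e-12), axis=1)))
```

The key point visible in the last line is that no pair of samples is ever compared directly: each feature interacts only with the small prototype matrix, which is what makes the method cheaper than pairwise contrastive losses.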
Non-local sparse models for image restoration
Experimental results in image denoising and demosaicking tasks with synthetic and real noise show that the proposed method outperforms the state of the art, making it possible to effectively restore raw images from digital cameras at a reasonable speed and memory cost.
Task-Driven Dictionary Learning
  • J. Mairal, F. Bach, J. Ponce
  • IEEE Transactions on Pattern Analysis and Machine…
  • 27 September 2010
This paper presents a general formulation for supervised dictionary learning adapted to a wide variety of tasks, and presents an efficient algorithm for solving the corresponding optimization problem.
Optimization with Sparsity-Inducing Penalties
This monograph covers proximal methods, block-coordinate descent, reweighted l2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provides an extensive set of experiments to compare various algorithms from a computational point of view.
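As a concrete instance of one family covered by the monograph, block-coordinate descent for the l1-penalized least-squares (lasso) problem reduces to cycling over coordinates and soft-thresholding each one. The following is a minimal sketch under stated assumptions (dense NumPy arrays, cyclic order, a fixed iteration budget), not a production solver.

```python
import numpy as np

rng = np.random.default_rng(1)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for min_w 0.5||y - X w||^2 + lam ||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    r = y - X @ w                      # running residual
    col_sq = (X ** 2).sum(axis=0)      # per-column squared norms
    for _ in range(n_iter):
        for j in range(d):
            if col_sq[j] == 0:
                continue
            r += X[:, j] * w[j]        # remove coordinate j's contribution
            rho = X[:, j] @ r
            # closed-form 1-D minimizer: soft-threshold the correlation
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * w[j]        # restore the residual
    return w

# Toy usage: recover a 3-sparse vector from noisy measurements.
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.01 * rng.standard_normal(50)
w = lasso_cd(X, y, lam=0.5)
```

Each coordinate update is exact and costs O(n), which is why coordinate descent is a strong baseline for separable sparsity-inducing penalties.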
Sparse Representation for Computer Vision and Pattern Recognition
This review paper highlights a few representative examples of how the interaction between sparse signal representation and computer vision can enrich both fields, and raises a number of open questions for further study.
Sparse Representation for Color Image Restoration
This work proposes methods for handling nonhomogeneous noise and missing information, paving the way to state-of-the-art results in applications such as color image denoising, demosaicing, and inpainting.
Supervised Dictionary Learning
A novel sparse representation for signals belonging to different classes in terms of a shared dictionary and discriminative class models is proposed, with results on standard handwritten digit and texture classification tasks.
A Universal Catalyst for First-Order Optimization
This work introduces a generic scheme for accelerating first-order optimization methods in the sense of Nesterov, which builds upon a new analysis of the accelerated proximal point algorithm, and shows that acceleration is useful in practice, especially for ill-conditioned problems where the authors measure significant improvements.
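The shape of such a scheme can be sketched as an outer loop that repeatedly minimizes the objective plus a proximal quadratic centered at an extrapolated point, using any inner method, then updates the extrapolation. The sketch below is a loose illustration of this structure only: plain gradient descent stands in for the inner method, and the fixed momentum `beta` replaces the paper's carefully derived extrapolation schedule. All parameter values are toy assumptions.

```python
import numpy as np

def catalyst_sketch(f_grad, x0, kappa=1.0, n_outer=50, n_inner=20,
                    lr=0.1, beta=0.9):
    """Outer loop: approximately minimize f(x) + (kappa/2)||x - y_k||^2
    with a few inner gradient steps, then extrapolate y_k.
    (Fixed momentum beta is a simplification of the paper's schedule.)"""
    x = x_prev = x0.copy()
    y = x0.copy()
    for _ in range(n_outer):
        z = x.copy()
        for _ in range(n_inner):
            # gradient of the regularized subproblem
            z = z - lr * (f_grad(z) + kappa * (z - y))
        x_prev, x = x, z
        y = x + beta * (x - x_prev)   # Nesterov-style extrapolation
    return x

# Toy usage on f(x) = 0.5 ||x - c||^2, whose minimizer is c.
c = np.array([3.0, -2.0])
x_star = catalyst_sketch(lambda z: z - c, np.zeros(2))
```

The design point is genericity: the inner solver is a black box, so any first-order method that can handle the strongly convex subproblem inherits acceleration from the outer loop.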