Optimization with Sparsity-Inducing Penalties
TLDR: Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models.
Structured Variable Selection with Sparsity-Inducing Norms
TLDR: We consider the empirical risk minimization problem for linear supervised learning, with regularization by structured sparsity-inducing norms.
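As a rough sketch of the objective studied in this line of work (notation here is illustrative, not taken verbatim from the paper), the regularized empirical risk minimization problem with a structured sparsity-inducing norm can be written as

$$\min_{w \in \mathbb{R}^p} \; \frac{1}{n}\sum_{i=1}^{n} \ell\big(y_i, w^\top x_i\big) \;+\; \lambda\,\Omega(w), \qquad \Omega(w) = \sum_{g \in \mathcal{G}} \|w_g\|,$$

where $\mathcal{G}$ is a collection of (possibly overlapping) groups of variables and $w_g$ is the restriction of $w$ to group $g$; the choice of groups encodes the structure imposed on the sparsity pattern.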
A latent factor model for highly multi-relational data
TLDR: In this paper, we propose a method for modeling large multi-relational datasets, with possibly thousands of relations.
Proximal Methods for Hierarchical Sparse Coding
TLDR: We propose efficient algorithms that solve the tree-structured sparse approximation problem at the same computational cost as traditional approaches based on the l1-norm, which has proven useful in several applications.
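For context, the l1-norm proximal operator that serves as the computational baseline mentioned above is plain elementwise soft-thresholding; a minimal NumPy sketch of that baseline (not the tree-structured algorithm from the paper) is:

```python
import numpy as np

def prox_l1(u, lam):
    """Proximal operator of lam * ||.||_1, i.e. the closed-form solution of
    argmin_v 0.5 * ||u - v||_2^2 + lam * ||v||_1 (elementwise soft-thresholding)."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

# Example: shrink a small vector with threshold 0.5
print(prox_l1(np.array([1.2, -0.3, 0.7]), 0.5))  # [0.7, 0.0, 0.2] (up to signed zeros)
```

The paper's contribution, per the TLDR above, is an algorithm for the tree-structured penalty whose computational cost matches this simple l1 baseline.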
Structured Sparse Principal Component Analysis
TLDR: We present an extension of sparse PCA, or sparse dictionary learning, where the sparsity patterns of all dictionary elements are structured and constrained to belong to a prespecified set of shapes.
Proximal Methods for Sparse Hierarchical Dictionary Learning
TLDR: We propose to combine two approaches for modeling data admitting sparse representations: dictionary learning on the one hand, and structured sparsity on the other.
Structured sparsity through convex optimization
TLDR: We show that the $\ell_1$-norm can be extended to structured norms built on either disjoint or overlapping groups of variables, leading to a flexible framework that can deal with various structures.
Network Flow Algorithms for Structured Sparsity
TLDR: We consider a class of learning problems that involve a structured sparsity-inducing norm defined as the sum of l∞-norms over groups of variables.
Convex and Network Flow Optimization for Structured Sparsity
TLDR: We show that the proximal operator associated with a sum of l∞-norms can be computed exactly in polynomial time by solving a quadratic min-cost flow problem, allowing the use of accelerated proximal gradient methods.
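Written schematically (symbols here are illustrative), the proximal problem referred to in the last two entries is

$$\operatorname{prox}_{\lambda\Omega}(u) \;=\; \arg\min_{v \in \mathbb{R}^p} \; \frac{1}{2}\|u - v\|_2^2 \;+\; \lambda \sum_{g \in \mathcal{G}} \|v_g\|_\infty,$$

and the stated result is that this problem can be solved exactly in polynomial time by casting it as a quadratic min-cost flow problem, which in turn enables accelerated proximal gradient methods.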