- Lorenzo Rosasco, Sofia Mosci, Matteo Santoro, Alessandro Verri, Silvia Villa
- 2009

In this paper we propose a general framework to characterize and solve the optimization problems underlying a large class of sparsity-based regularization algorithms. More precisely, we study the minimization of learning functionals that are sums of a differentiable data term and a convex non-differentiable penalty. These latter penalties have recently…
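The "differentiable data term plus convex non-differentiable penalty" structure described in this abstract is the standard setting for forward-backward (proximal gradient) iterations. A minimal sketch, instantiated for the ℓ1 penalty with plain NumPy (function names here are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(X, y, lam, step, n_iter=500):
    """Proximal gradient iteration for min_w 0.5*||Xw - y||^2 + lam*||w||_1.
    A gradient step on the smooth data term is followed by the proximity
    operator of the non-differentiable penalty."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                         # gradient of the data term
        w = soft_threshold(w - step * grad, step * lam)  # proximal step on the penalty
    return w
```

The step size must satisfy `step <= 1 / ||X||^2` (squared spectral norm) for convergence; other penalties in the class the paper studies swap in a different proximity operator.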

- Silvia Villa, Saverio Salzo, Luca Baldassarre, Alessandro Verri
- SIAM Journal on Optimization
- 2013

We propose a convergence analysis of accelerated forward-backward splitting methods for composite function minimization, when the proximity operator is not available in closed form, and can only be computed up to a certain precision. We prove that the 1/k^2 convergence rate for the function values can be achieved if the admissible errors are of a certain…
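The accelerated forward-backward scheme analyzed here follows the FISTA pattern: a proximal gradient step taken at a Nesterov-extrapolated point. A minimal sketch with illustrative names, where `prox_g` stands for the (possibly inexactly computed) proximity operator the abstract refers to:

```python
import numpy as np

def accelerated_fb(grad_f, prox_g, x0, step, n_iter=200):
    """Accelerated forward-backward splitting for min f(x) + g(x).
    grad_f: gradient of the smooth part f.
    prox_g(v, t): (approximate) proximity operator of t*g; in the paper's
    setting this may only be computable up to a certain precision."""
    x = x0.copy()
    z = x0.copy()  # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        x_new = prox_g(z - step * grad_f(z), step)       # forward-backward step at z
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)    # Nesterov extrapolation
        x, t = x_new, t_new
    return x
```

With exact proximity operators this recovers the classical O(1/k^2) rate for the function values; the paper's contribution is conditions on the prox errors under which that rate survives.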

- Sofia Mosci, Lorenzo Rosasco, Matteo Santoro, Alessandro Verri, Silvia Villa
- ECML/PKDD
- 2010

Proximal methods have recently been shown to provide effective optimization procedures to solve the variational problems defining the ℓ1 regularization algorithms. The goal of the paper is twofold. First we discuss how proximal methods can be applied to solve a large class of machine learning algorithms which can be seen as extensions of ℓ1 regularization,…

- Curzio Basso, Matteo Santoro, Alessandro Verri, Silvia Villa
- ICANN
- 2011

Recently, considerable research efforts have been devoted to the design of methods to learn from data overcomplete dictionaries for sparse coding. However, learned dictionaries require the solution of an optimization problem for coding new data. In order to overcome this drawback, we propose an algorithm aimed at learning both a dictionary and its dual: a…

- Lorenzo Rosasco, Silvia Villa, Sofia Mosci, Matteo Santoro, Alessandro Verri
- Journal of Machine Learning Research
- 2013

In this work we are interested in the problems of supervised learning and variable selection when the input-output dependence is described by a nonlinear function depending on a few variables. Our goal is to consider a sparse nonparametric model, hence avoiding linear or additive models. The key idea is to measure the importance of each variable in the…

- Lorenzo Rosasco, Andrea Tacchetti, Silvia Villa
- ArXiv
- 2014

We present inexact accelerated proximal point algorithms for minimizing a proper lower semicontinuous and convex function. We carry on a convergence analysis under different types of errors in the evaluation of the proximity operator, and we provide corresponding convergence rates for the objective function values. The proof relies on a generalization of…

- S Eridani, I D Johnston, S Villa
- Annals of internal medicine
- 1982

- Sofia Mosci, Silvia Villa, Alessandro Verri, Lorenzo Rosasco
- NIPS
- 2010

We deal with the problem of variable selection when variables must be selected group-wise, with possibly overlapping groups defined a priori. In particular we propose a new optimization procedure for solving the regularized algorithm presented in [12], where the group lasso penalty is generalized to overlapping groups of variables. While in [12] the…
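For non-overlapping groups, the group lasso penalty sum_g ||w_g||_2 admits a closed-form proximity operator (block-wise soft-thresholding); the difficulty this paper addresses is precisely that with overlapping groups no such closed form exists and the operator must be handled differently. A minimal sketch of the non-overlapping case, with illustrative names:

```python
import numpy as np

def prox_group_lasso(v, groups, t):
    """Proximity operator of t * sum_g ||v_g||_2 for NON-overlapping groups
    (block-wise soft-thresholding). With overlapping groups, as in the paper,
    this operator has no closed form and requires an iterative procedure."""
    w = v.copy()
    for g in groups:                      # g: index list for one group
        norm = np.linalg.norm(v[g])
        scale = max(0.0, 1.0 - t / norm) if norm > 0 else 0.0
        w[g] = scale * v[g]               # shrink the whole block toward zero
    return w
```

Each block is either scaled toward zero or set exactly to zero as a unit, which is what produces group-wise variable selection.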

- Lorenzo Rosasco, Matteo Santoro, Sofia Mosci, Alessandro Verri, Silvia Villa
- AISTATS
- 2010

In this paper we consider a regularization approach to variable selection when the regression function depends nonlinearly on a few input variables. The proposed method is based on a regularized least square estimator penalizing large values of the partial derivatives. An efficient iterative procedure is proposed to solve the underlying variational problem,…