We consider linear inverse problems where the solution is assumed to have a sparse expansion on an arbitrary preassigned orthonormal basis. We prove that replacing the usual quadratic regularizing…
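A minimal numpy sketch of the kind of iterative soft-thresholding scheme studied in this line of work, which replaces quadratic regularization with an l1 (sparsity-promoting) penalty. The function names, step-size choice, and stopping rule are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, tau):
    # Componentwise soft-thresholding: the proximal map of the l1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, y, tau, n_iter=2000):
    # Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + tau*||x||_1.
    # Each step is a gradient step on the quadratic term followed by
    # soft-thresholding, which drives small coefficients exactly to zero.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, tau / L)
    return x
```

On a well-conditioned problem with a genuinely sparse solution, the iteration recovers the sparse coefficient vector up to the small bias introduced by the threshold.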

Many inverse problems arising in optics and other fields like geophysics, medical diagnostics and remote sensing, present numerical instability: the noise affecting the data may produce arbitrarily large errors in the solutions.
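The instability described above can be seen in a two-line experiment: for an ill-conditioned system, a tiny perturbation of the data produces a huge change in the naive (unregularized) solution. The matrix and numbers here are purely illustrative.

```python
import numpy as np

# Nearly singular forward operator: the two rows are almost identical.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
y = np.array([2.0, 2.0001])        # exact data for the solution x = (1, 1)
noise = np.array([0.0, 0.001])     # tiny perturbation of the data

x_exact = np.linalg.solve(A, y)          # recovers (1, 1)
x_noisy = np.linalg.solve(A, y + noise)  # wildly different solution
```

A data perturbation of size 1e-3 shifts the solution by an order of magnitude, which is exactly why regularization is needed.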

We prove that there exists a particular "elastic-net representation" of the regression function such that, if the number of data increases, the elastic-net estimator is consistent not only for prediction but also for variable/feature selection.
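A small numpy sketch of an elastic-net estimator of the kind analyzed above, solved by proximal gradient descent; the function name and parameterization are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def elastic_net(A, y, lam1, lam2, n_iter=2000):
    # Proximal-gradient iteration for
    #   min_x 0.5*||Ax - y||^2 + lam1*||x||_1 + 0.5*lam2*||x||^2.
    # The l1 term promotes sparsity (feature selection); the l2 term
    # stabilizes the estimate when features are correlated.
    L = np.linalg.norm(A, 2) ** 2 + lam2   # Lipschitz constant of smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y) + lam2 * x
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam1 / L, 0.0)  # soft-threshold
    return x
```

With enough data relative to the sparsity level, the estimate concentrates on the true nonzero coefficients, which is the feature-selection consistency the abstract refers to.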

For pt.I. see ibid., vol.1, p.301 (1985). In the first part of this work a general definition of an inverse problem with discrete data has been given and an analysis in terms of singular systems has…

The authors discuss linear methods for the solution of linear inverse problems with discrete data. Such problems occur frequently in instrumental science, e.g. tomography, radar, sonar, optical…

This paper focuses on feature selection for problems dealing with high-dimensional data. We discuss the benefits of adopting a regularized approach with L1 or L1–L2 penalties in two different…

We propose a general framework for selecting features in the computer vision domain (i.e., learning descriptions from data), where the prior knowledge related to the application is confined in the early stages.

We propose a two-stage regularization method able to learn linear models characterized by a high prediction performance, and show its potential as a starting point for further biological investigations.