A specialized interior-point method for solving large-scale ℓ1-regularized least-squares problems (LSPs) that uses the preconditioned conjugate gradients algorithm to compute the search direction and can solve large sparse problems, with a million variables and observations, in a few tens of minutes on a PC.
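The objective behind this class of problems can be sketched compactly. The snippet below minimizes ||Ax - b||² + λ||x||₁ with plain iterative soft-thresholding (ISTA) rather than the paper's interior-point method, just to make the objective concrete; A, b, λ, and the step size are made-up illustration values.

```python
def soft_threshold(v, t):
    """Proximal operator of t * |.|: shrink v toward zero by t."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def ista(A, b, lam, step, iters=2000):
    """Minimize ||A x - b||_2^2 + lam * ||x||_1 by proximal gradient steps."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - b
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        # gradient of ||A x - b||^2 is 2 A^T r
        g = [2.0 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# tiny illustrative problem (not from the paper)
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 0.0, 1.0]
x = ista(A, b, lam=0.5, step=0.1)
```

Note how the ℓ1 penalty drives the second coefficient exactly to zero, which is the sparsity effect the regularizer is used for.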

This paper proposes a variation on Hodrick-Prescott (H-P) filtering, a widely used method for trend estimation, that substitutes a sum of absolute values for the sum of squares used in H-P filtering to penalize variations in the estimated trend.
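The substitution can be written down directly: the trend estimate x minimizes (1/2)||y - x||² + λ||Dx||₁, where Dx is the vector of second differences of x, so the ℓ1 penalty favors piecewise-linear trends. A minimal sketch of evaluating that objective, with made-up data y:

```python
def second_differences(x):
    """D x: x[t-1] - 2 x[t] + x[t+1] for interior points t."""
    return [x[t - 1] - 2.0 * x[t] + x[t + 1] for t in range(1, len(x) - 1)]

def l1_trend_objective(x, y, lam):
    """(1/2) * ||y - x||_2^2 + lam * ||D x||_1."""
    fit = 0.5 * sum((yi - xi) ** 2 for xi, yi in zip(x, y))
    penalty = lam * sum(abs(d) for d in second_differences(x))
    return fit + penalty

y = [0.0, 1.1, 1.9, 3.2, 4.0]       # noisy, roughly linear data (illustrative)
x_line = [0.0, 1.0, 2.0, 3.0, 4.0]  # an exactly linear trend: D x = 0
obj = l1_trend_objective(x_line, y, lam=1.0)
```

An exactly linear trend incurs no penalty at all, which is why minimizers of this objective tend to be piecewise linear with a small number of kinks.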

This paper describes an efficient interior-point method for solving large-scale ℓ1-regularized logistic regression problems, and shows how a good approximation of the entire regularization path can be computed much more efficiently than by solving a family of problems independently.
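One standard way to make path computation cheaper than independent solves is warm starting: solve for a decreasing sequence of λ values, initializing each solve at the previous solution. The sketch below does this with a plain proximal-gradient solver rather than the paper's interior-point method; the data, λ grid, step size, and iteration count are all assumptions for illustration.

```python
import math

def soft_threshold(v, t):
    return v - t if v > t else (v + t if v < -t else 0.0)

def prox_grad_logreg(A, y, lam, x0, step=0.1, iters=500):
    """Minimize sum_i log(1 + exp(-y_i a_i^T x)) + lam * ||x||_1."""
    x = x0[:]
    n = len(x)
    for _ in range(iters):
        g = [0.0] * n
        for a, yi in zip(A, y):
            margin = yi * sum(aj * xj for aj, xj in zip(a, x))
            s = 1.0 / (1.0 + math.exp(margin))   # sigmoid(-margin)
            for j in range(n):
                g[j] -= yi * a[j] * s            # gradient of the logistic loss
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# made-up labeled data: feature 0 is predictive, feature 1 is mostly noise
A = [[1.0, 0.2], [2.0, -0.1], [-1.5, 0.3], [-2.0, 0.1]]
y = [1, 1, -1, -1]

x = [0.0, 0.0]
path = []
for lam in [4.0, 1.0, 0.25]:         # decreasing lambdas, warm-started
    x = prox_grad_logreg(A, y, lam, x)
    path.append(x)
```

At the largest λ the solution is all-zero; as λ decreases, coefficients enter the model one by one, and each solve starts close to its optimum thanks to the warm start.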

A framework for single-period optimization, where the trades in each period are found by solving a convex optimization problem that trades off expected return, risk, transaction cost, and holding costs such as the borrowing cost for shorting assets.
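The trade-off can be sketched as a single scalar objective scored over a candidate trade. The quadratic risk model, linear transaction cost, shorting-cost term, and all numbers below are assumptions for illustration, not the paper's calibrated models.

```python
def trade_value(w, z, mu, sigma, gamma, kappa, s_cost):
    """Expected return minus risk, transaction cost, and holding cost
    for a candidate trade z applied to current weights w."""
    w_new = [wi + zi for wi, zi in zip(w, z)]             # post-trade weights
    ret = sum(m * wn for m, wn in zip(mu, w_new))
    risk = gamma * sum(sigma[i][j] * w_new[i] * w_new[j]  # gamma * w' Sigma w
                       for i in range(len(w_new)) for j in range(len(w_new)))
    t_cost = kappa * sum(abs(zi) for zi in z)             # linear transaction cost
    h_cost = s_cost * sum(max(-wn, 0.0) for wn in w_new)  # borrowing cost on shorts
    return ret - risk - t_cost - h_cost

# hypothetical two-asset example
mu = [0.05, 0.02]
sigma = [[0.1, 0.0], [0.0, 0.1]]
v = trade_value(w=[0.5, 0.5], z=[0.1, -0.1], mu=mu, sigma=sigma,
                gamma=1.0, kappa=0.001, s_cost=0.002)
```

In the framework described, the trade in each period would be the z maximizing this kind of objective, which is a convex problem because the return and cost terms are affine or convex in z.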

A specialized interior-point method for solving compressed sensing (CS) problems that uses a preconditioned conjugate gradient method to compute the search step and can efficiently solve large CS problems by exploiting fast algorithms for the signal transforms used.
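The reason fast transforms help is that conjugate gradients only needs the *action* of the matrix, never its entries. A minimal matrix-free CG sketch: here `apply_A` is a stand-in (a small explicit matrix) for a fast transform such as an FFT-based operator.

```python
def conjugate_gradient(apply_A, b, iters=50, tol=1e-10):
    """Solve A x = b for symmetric positive definite A,
    given only the map x -> A x."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                     # residual b - A x, with x = 0 initially
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = apply_A(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# stand-in operator for A = [[4, 1], [1, 3]]; a CS solver would substitute
# a fast transform here instead of touching matrix entries
apply_A = lambda v: [4.0 * v[0] + 1.0 * v[1], 1.0 * v[0] + 3.0 * v[1]]
x = conjugate_gradient(apply_A, [1.0, 2.0])
```

Because only `apply_A` is needed, an operator costing O(n log n) per application (e.g. a Fourier or wavelet transform) makes each CG step cheap even when the dense matrix would not fit in memory.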

Numerical experiments show that the efficient interior-point method described here outperforms standard methods for solving convex optimization problems as well as other methods specifically designed for ℓ1-regularized logistic regression problems (LRPs).

It is shown that, in a wide variety of kernel-based learning algorithms, the kernel learning problem can be formulated as a convex optimization problem which interior-point methods can solve globally and efficiently.

An efficient interior-point method for solving large-scale ℓ1-regularized convex loss minimization problems that uses a preconditioned conjugate gradient method to compute the search step and can solve very large problems.