• Corpus ID: 1650609

Primal-Dual Active-Set Methods for Isotonic Regression and Trend Filtering

Zheng Han and Frank E. Curtis
Isotonic regression (IR) is a non-parametric calibration method used in supervised learning. […] In addition, we propose primal-dual active-set (PDAS) variants (with safeguarding to ensure convergence) for solving related trend filtering (TF) problems, and provide experimental results illustrating their effectiveness.


A Dual Active-Set Algorithm for Regularized Monotonic Regression

This work introduces a regularization term in the monotonic regression, formulated as a least distance problem with monotonicity constraints, and proves that it converges to the optimal solution in a finite number of iterations that does not exceed the problem size.

A second-order method for convex ℓ1-regularized optimization with active-set prediction

An active-set method is described for minimizing an objective function φ that is the sum of a smooth convex function f and an ℓ1-regularization term; global convergence is established under the assumptions of Lipschitz continuity and strong convexity of f.

Penalized matrix decomposition for denoising, compression, and improved demixing of functional imaging data

An improved approach to compressing and denoising functional imaging data is introduced, based on a spatially-localized penalized matrix decomposition (PMD) of the data to separate (low-dimensional) signal from (temporally-uncorrelated) noise, which facilitates the process of demixing the observed activity into contributions from individual neurons.

Regularized monotonic regression

Monotonic (isotonic) Regression (MR) is a powerful tool used for solving a wide range of important applied problems. One of its features, which poses a limitation on its use in some areas, is that ...



Fast Active-set-type Algorithms for L1-regularized Linear Regression

A fast active-set-type method, called block principal pivoting, accelerates computation by allowing exchanges of several variables among working sets; it is derived by exhibiting a relationship between ℓ1-regularized linear regression and the linear complementarity problem with bounds.

Globally Convergent Primal-Dual Active-Set Methods with Inexact Subproblem Solves

Three primal-dual active-set (PDAS) methods are proposed for solving large-scale instances of an important class of convex quadratic optimization problems (QPs); they allow inexactness in the (reduced) linear system solves at all partitions except the optimal ones.

A globally convergent primal-dual active-set framework for large-scale convex quadratic optimization

This work presents a primal-dual active-set framework for solving large-scale convex quadratic optimization problems (QPs) and explains the relationship between this framework and semi-smooth Newton techniques, finding that this approach is globally convergent for strictly convex QPs.

Nearly-Isotonic Regression

A simple algorithm is devised to solve for the path of solutions, which can be viewed as a modified version of the well-known pool adjacent violators algorithm, and computes the entire path in O(n) operations (n being the number of data points).
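For a chain (total) order, the pool adjacent violators idea is short enough to sketch directly. The following is a minimal Python sketch of the classic unweighted/weighted PAVA for plain isotonic regression — not the paper's path algorithm for the nearly-isotonic problem — which repeatedly merges adjacent blocks whose means violate monotonicity:

```python
def pava(y, w=None):
    """Pool Adjacent Violators: weighted least-squares fit of a
    nondecreasing sequence to y (isotonic regression on a chain)."""
    if w is None:
        w = [1.0] * len(y)
    # Each block stores [weighted mean, total weight, count].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Merge while the last two block means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, c1 + c2])
    # Expand each block back to its original length.
    out = []
    for m, _, c in blocks:
        out.extend([m] * c)
    return out
```

For example, `pava([3, 1, 2])` pools all three points into the common mean 2, while `pava([1, 3, 2, 4])` pools only the middle violating pair.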

Fast and Flexible ADMM Algorithms for Trend Filtering

This article presents a fast and robust algorithm for trend filtering, a recently developed nonparametric regression tool, that is competitive with the specialized interior point methods that are currently in use, and yet is far more numerically robust.
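The ADMM splitting used for trend filtering is easy to sketch in the simplest (first-difference, i.e. fused-lasso / order-0) case, where the β-subproblem reduces to a tridiagonal solve. This is an illustrative pure-Python sketch under those assumptions, not the article's implementation; the penalty λ, step ρ, and iteration count are arbitrary choices:

```python
def soft(v, t):
    # Soft-thresholding: prox operator of t * |.|
    return max(v - t, 0.0) - max(-v - t, 0.0)

def thomas(a, b, c, d):
    # Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal).
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def trend_filter_admm(y, lam, rho=1.0, iters=300):
    """ADMM for min 0.5*||y - beta||^2 + lam*||D beta||_1,
    with D the (n-1) x n first-difference matrix."""
    n = len(y)
    # I + rho * D^T D is tridiagonal with -rho off-diagonals.
    sub, sup = [-rho] * n, [-rho] * n
    diag = [1.0 + rho * (1 if i in (0, n - 1) else 2) for i in range(n)]
    beta, z, u = list(y), [0.0] * (n - 1), [0.0] * (n - 1)
    for _ in range(iters):
        # beta-update: (I + rho D^T D) beta = y + rho D^T (z - u)
        w = [z[i] - u[i] for i in range(n - 1)]
        rhs = [y[i] + rho * ((w[i - 1] if i > 0 else 0.0)
                             - (w[i] if i < n - 1 else 0.0))
               for i in range(n)]
        beta = thomas(sub, diag, sup, rhs)
        # z-update (soft-threshold) and scaled dual update.
        for i in range(n - 1):
            db = beta[i + 1] - beta[i]
            z[i] = soft(db + u[i], lam / rho)
            u[i] += db - z[i]
    return beta

y = [1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.9, 5.1]
beta = trend_filter_admm(y, lam=1.0)
# beta is roughly piecewise constant around the two levels in y
```

Higher-order trend filtering replaces D with a higher-order difference operator, making the β-subproblem banded rather than tridiagonal, but the splitting is the same.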

The Primal-Dual Active Set Strategy as a Semismooth Newton Method

The notion of slant differentiability is recalled, and it is argued that the max-function is slantly differentiable in Lp-spaces when appropriately combined with a two-norm concept, which leads to new local convergence results for the primal-dual active-set strategy.
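As a concrete illustration of the active-set rule this semismooth-Newton view yields, here is a minimal pure-Python PDAS sketch for the bound-constrained QP min ½xᵀQx − bᵀx subject to x ≥ 0. This is a textbook instance of the strategy, not the exact algorithm of any paper listed here, and the example Q and b are arbitrary:

```python
def solve(A, b):
    # Dense Gaussian elimination with partial pivoting (small systems only).
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def pdas_qp(Q, b, max_iter=20):
    """PDAS for min 0.5 x'Qx - b'x s.t. x >= 0.
    KKT: Qx - b - mu = 0, with complementarity mu_i = max(0, mu_i - x_i)."""
    n = len(b)
    active = set()  # indices fixed at the bound x_i = 0
    x = [0.0] * n
    mu = [0.0] * n
    for _ in range(max_iter):
        inact = [i for i in range(n) if i not in active]
        x, mu = [0.0] * n, [0.0] * n
        if inact:
            # Solve the reduced system on the inactive set.
            sub = solve([[Q[i][j] for j in inact] for i in inact],
                        [b[i] for i in inact])
            for i, v in zip(inact, sub):
                x[i] = v
        for i in active:
            mu[i] = sum(Q[i][j] * x[j] for j in range(n)) - b[i]
        # Semismooth-Newton active-set update from the max-function.
        new_active = {i for i in range(n) if mu[i] - x[i] > 0}
        if new_active == active:
            return x, mu
        active = new_active
    return x, mu

Q = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, -2.0]
x, mu = pdas_qp(Q, b)
# x = [0.25, 0.0], mu = [0.0, 2.25]: the second bound is active at the solution
```

On this example the partition stabilizes after two iterations, which is the behavior the local convergence theory describes.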

Active set algorithms for isotonic regression; A unifying framework

The active set approach provides a unifying framework for studying algorithms for isotonic regression, simplifies the exposition of existing algorithms and leads to several new efficient algorithms, including a new O(n) primal feasible active set algorithm.

A family of second-order methods for convex ℓ1-regularized optimization

A new active set method is proposed that performs multiple changes in the active manifold estimate at every iteration, and employs a mechanism for correcting these estimates, when needed.

The Isotonic Regression Problem and its Dual

Abstract The isotonic regression problem is to minimize Σ_{i=1}^{k} w_i (g_i − x_i)² subject to x_i ≤ x_j whenever i ≼ j, where w_i > 0 and the g_i (i = 1, 2, …, k) are given, and ≼ is a specified partial ordering on {1, 2, …, k}.

Reoptimization With the Primal-Dual Interior Point Method

Reoptimization techniques for an interior point method applied to solving a sequence of linear programming problems are discussed and numerical results with OOPS, a new object-oriented parallel solver, demonstrate the efficiency of the approach.