# Efficient evaluation of scaled proximal operators

@article{Friedlander2016EfficientEO, title={Efficient evaluation of scaled proximal operators}, author={Michael P. Friedlander and Gabriel Goh}, journal={ArXiv}, year={2016}, volume={abs/1603.05719} }

Quadratic-support functions [Aravkin, Burke, and Pillonetto; J. Mach. Learn. Res. 14(1), 2013] constitute a parametric family of convex functions that includes a range of useful regularization terms found in applications of convex optimization. We show how an interior method can be used to efficiently compute the proximal operator of a quadratic-support function under different metrics. When the metric and the function have the right structure, the proximal map can be computed with cost nearly…
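The abstract concerns proximal operators evaluated in a non-Euclidean metric. As a minimal illustration only (not the paper's interior-method algorithm for general quadratic-support functions), the scaled proximal operator of the ℓ1 norm under a diagonal metric has a simple closed form, since the problem separates coordinate-wise:

```python
import numpy as np

def scaled_prox_l1(z, lam, d):
    """Proximal operator of lam*||x||_1 in the metric induced by the
    diagonal matrix D = diag(d), with all d_i > 0:

        argmin_x  0.5 * (x - z)^T D (x - z) + lam * ||x||_1

    The objective separates over coordinates, so each component is a
    soft-thresholding step with a per-coordinate threshold lam / d_i.
    """
    t = lam / d  # per-coordinate threshold
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
```

With `d` all ones this reduces to ordinary soft-thresholding; general (non-diagonal) metrics are exactly the harder case the paper addresses.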

## 15 Citations

### Cardinality, Simplex and Proximal Operator

- Mathematics
- 2019

In this report, we consider the proximal operator for the ℓ0 “norm”, which is widely adopted as a sparsity-inducing penalty in e.g. compressed sensing, sparse signal representation and…
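The ℓ0 proximal operator this citation studies has a well-known closed form, hard thresholding. A minimal sketch:

```python
import numpy as np

def prox_l0(z, lam):
    """Proximal operator of the l0 "norm" penalty lam * ||x||_0:

        argmin_x  0.5 * ||x - z||^2 + lam * nnz(x)

    Separable: keeping z_i costs lam, zeroing it costs 0.5 * z_i^2,
    so keep z_i exactly when |z_i| > sqrt(2 * lam) (hard thresholding).
    """
    return np.where(np.abs(z) > np.sqrt(2.0 * lam), z, 0.0)
```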

### Adaptive FISTA for Non-convex Optimization

- Mathematics, Computer Science
- 2017

It turns out that in some situations the proposed algorithm is equivalent to a class of SR1 (identity minus rank-1) proximal quasi-Newton methods; convergence is proved in a general non-convex setting, and hence, as a byproduct, new convergence guarantees for proximal semi-Newton methods are obtained.

### On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence

- Computer Science, Mathematics
- SIAM J. Optim.
- 2019

A framework for quasi-Newton forward–backward splitting algorithms (proximal quasi-Newton methods) with a metric induced by diagonal $\pm$ rank-$r$ symmetric positive definite matrices is introduced, which allows for a highly efficient evaluation of the proximal mapping.

### The Proximal Bootstrap for Finite-Dimensional Regularized Estimators

- Mathematics, Computer Science
- AEA Papers and Proceedings
- 2021

We propose a proximal bootstrap that can consistently estimate the limiting distribution of sqrt(n)-consistent estimators with nonstandard asymptotic distributions in a computationally efficient…

### Optimization of Graph Total Variation via Active-Set-based Combinatorial Reconditioning

- Computer Science
- AISTATS
- 2020

This work proposes a novel adaptive preconditioner driven by a sharp analysis of the local linear convergence rate depending on the "active set" at the current iterate, and shows that nested-forest decomposition of the inactive edges yields a guaranteed local linear convergence rate.

### A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization

- Mathematics, Computer Science
- Math. Oper. Res.
- 2022

A new primal-dual-primal framework for implementing proximal Newton methods that has attractive computational features for a subclass of nonsmooth composite convex minimization problems is suggested.

### Further properties of the forward–backward envelope with applications to difference-of-convex programming

- Computer Science
- Comput. Optim. Appl.
- 2017

The forward–backward envelope first introduced in Patrinos and Bemporad is studied and it is demonstrated how to minimize some difference-of-convex regularized least squares problems by minimizing a suitably constructed forward–backward envelope.

### ProxSkip: Yes! Local Gradient Steps Provably Lead to Communication Acceleration! Finally!

- Computer Science
- ICML
- 2022

ProxSkip is a surprisingly simple and provably effective method for minimizing the sum of a smooth function and an expensive nonsmooth proximable function, and offers an effective acceleration of communication complexity.

### Generalized self-concordant functions: a recipe for Newton-type methods

- Mathematics, Computer Science
- Math. Program.
- 2019

The proposed theory provides a mathematical tool to analyze both local and global convergence of Newton-type methods without imposing unverifiable assumptions as long as the underlying functionals fall into the class of generalized self-concordant functions.

### One-Step Estimation with Scaled Proximal Methods

- Mathematics
- Mathematics of Operations Research
- 2021

We study statistical estimators computed using iterative optimization methods that are not run until completion. Classical results on maximum likelihood estimators (MLEs) assert that a one-step…

## References

Showing 1–10 of 41 references

### A quasi-Newton proximal splitting method

- Computer Science
- NIPS
- 2012

Efficient implementations of the proximity calculation for a useful class of functions are described, and an elegant quasi-Newton method is applied to accelerate convex minimization problems, comparing favorably against state-of-the-art alternatives.

### Optimizing Costly Functions with Simple Constraints: A Limited-Memory Projected Quasi-Newton Algorithm

- Computer Science
- AISTATS
- 2009

An optimization algorithm for minimizing a smooth function over a convex set by minimizing a diagonal plus low-rank quadratic approximation to the function, which substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.

### Proximal Quasi-Newton for Computationally Intensive L1-regularized M-estimators

- Computer Science, Mathematics
- NIPS
- 2014

It is shown that the proximal quasi-Newton method is provably super-linearly convergent, even in the absence of strong convexity, by leveraging a restricted variant of strong convexity.

### IMRO: A Proximal Quasi-Newton Method for Solving ℓ1-Regularized Least Squares Problems

- Computer Science
- SIAM J. Optim.
- 2017

This work presents a proximal quasi-Newton method in which the approximation of the Hessian has the special format of “identity minus rank one” (IMRO) in each iteration, and provides a complexity analysis for variants of IMRO, showing that it matches known best bounds.

### Modular Proximal Optimization for Multidimensional Total-Variation Regularization

- Computer Science
- J. Mach. Learn. Res.
- 2018

1D-TV solvers provide the backbone for building more complex (two- or higher-dimensional) TV solvers within a modular proximal optimization approach, and are provided in an easy-to-use multi-threaded C++, Matlab and Python library.

### Proximal Newton-Type Methods for Minimizing Composite Functions

- Mathematics, Computer Science
- SIAM J. Optim.
- 2014

Newton-type methods for minimizing smooth functions are generalized to handle a sum of two convex functions: a smooth function and a nonsmooth function with a simple proximal mapping, which yields new convergence results for some of these methods.

### An Interior-Point Method for Large-Scale $\ell_1$-Regularized Least Squares

- Computer Science
- IEEE Journal of Selected Topics in Signal Processing
- 2007

A specialized interior-point method for solving large-scale $\ell_1$-regularized LSPs that uses the preconditioned conjugate gradients algorithm to compute the search direction and can solve large sparse problems, with a million variables and observations, in a few tens of minutes on a PC.

### Fast Newton-type Methods for Total Variation Regularization

- Computer Science
- ICML
- 2011

This work studies anisotropic ($\ell_1$-based) TV and also a related $\ell_2$-norm variant, and develops Newton-type methods that outperform the state-of-the-art algorithms for solving the harder task of computing 2- (and higher)-dimensional TV proximity.

### An inexact successive quadratic approximation method for L-1 regularized optimization

- Mathematics, Computer Science
- Math. Program.
- 2016

The inexactness conditions are based on a semi-smooth function that represents a (continuous) measure of the optimality conditions of the problem, and that embodies the soft-thresholding iteration.

### Epi-convergent Smoothing with Applications to Convex Composite Functions

- Mathematics
- SIAM J. Optim.
- 2013

Epi-convergence techniques are used to define a notion of epi-smoothing that allows us to tap into the rich variational structure of the subdifferential calculus for nonsmooth, nonconvex, and nonfinite-valued functions.