# SnapVX: A Network-Based Convex Optimization Solver

@article{Hallac2017SnapVXAN, title={SnapVX: A Network-Based Convex Optimization Solver}, author={David Hallac and C Wong and Steven Diamond and Abhijit Sharang and Rok Sosi\v{c} and Stephen P. Boyd and Jure Leskovec}, journal={Journal of Machine Learning Research}, year={2017}, volume={18}, pages={110--114} }

SnapVX is a high-performance solver for convex optimization problems defined on networks. For problems of this form, SnapVX provides a fast and scalable solution with guaranteed global convergence. It combines the capabilities of two open-source software packages: Snap.py and CVXPY. Snap.py is a large-scale graph processing library, and CVXPY provides a general modeling framework for small-scale subproblems. SnapVX offers a customizable yet easy-to-use Python interface with “out-of-the-box…
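To make the problem class concrete: SnapVX targets objectives that decompose into per-node terms plus per-edge coupling terms over a graph. The sketch below does not use SnapVX itself; it is a minimal NumPy illustration of one such problem, a Laplacian-regularized least-squares fit on a 3-node chain, where the quadratic structure admits a closed-form solution. The data vector `a`, edge list, and weight `lam` are made up for the example.

```python
import numpy as np

# Problem: minimize  sum_i (x_i - a_i)^2  +  lam * sum_{(i,j) in E} (x_i - x_j)^2
# on a 3-node chain graph 0 - 1 - 2. The first sum is the per-node objective,
# the second the per-edge coupling. With quadratic terms throughout, setting the
# gradient to zero gives (I + lam * L) x = a, where L is the graph Laplacian.

a = np.array([0.0, 3.0, 6.0])   # per-node data (node objectives)
edges = [(0, 1), (1, 2)]        # chain graph
lam = 1.0                       # edge-penalty weight

n = len(a)
L = np.zeros((n, n))
for i, j in edges:              # assemble the graph Laplacian
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

x = np.linalg.solve(np.eye(n) + lam * L, a)
print(x)  # each x_i is pulled from a_i toward its neighbors: [1.5, 3.0, 4.5]
```

For general (non-quadratic) node and edge objectives no closed form exists, which is where SnapVX's ADMM-based splitting comes in: each node and edge subproblem is solved separately with CVXPY, and Snap.py manages the graph structure.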


## 19 Citations

Testing Fine-Grained Parallelism for the ADMM on a Factor-Graph

- Computer Science · 2016 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW)
- 2016

This work proposes a problem-independent scheme of accelerating the Alternating Direction Method of Multipliers that can automatically exploit fine-grained parallelism both in GPUs and shared-memory multi-core computers and achieves significant speedup in such diverse application domains as combinatorial optimization, machine learning, and optimal control.

Distributed majorization-minimization for Laplacian regularized problems

- Mathematics, Computer Science · IEEE/CAA Journal of Automatica Sinica
- 2019

A distributed majorization-minimization method is developed that is able to scale to very large problems, and is illustrated on two applications, demonstrating its scalability and accuracy.

A Graph-Based Modeling Abstraction for Optimization: Concepts and Implementation in Plasmo.jl

- Computer Science
- 2020

This work presents a general graph-based modeling abstraction for optimization that is called an OptiGraph, which enables the modular construction of highly complex models in an intuitive manner, facilitates the use of graph analysis tools, and facilitates communication of structures to decomposition algorithms.

Sparse Network Lasso for Local High-dimensional Regression

- Computer Science · ArXiv
- 2016

A simple yet efficient iterative least-squares based optimization procedure for the sparse network lasso is proposed, which does not need a tuning parameter, and is guaranteed to converge to a globally optimal solution.

Localized Lasso for High-Dimensional Regression

- Computer Science · AISTATS
- 2017

The localized Lasso is introduced, which is suited for learning models that are both interpretable and have a high predictive power in problems with high dimensionality and small sample size, and a simple yet efficient iterative least-squares based optimization procedure is proposed.

Solving Fused Penalty Estimation Problems via Block Splitting Algorithms

- Mathematics · Journal of Computational and Graphical Statistics
- 2019

A method is proposed for solving a penalized estimation problem in which the penalty is a function of differences between pairs of parameter vectors; the method introduces a set of equality constraints that connect each parameter vector to a group of auxiliary variables.

HGraph: Parallel and Distributed Tool for Large-Scale Graph Processing

- Computer Science · 2021 1st International Conference on Artificial Intelligence and Data Analytics (CAIDA)
- 2021

HGraph is a parallel and distributed tool that handles large-scale graphs, is built on top of the Hadoop and Spark frameworks, and can be adapted to easily implement algorithms for various graph problems.

Network Inference via the Time-Varying Graphical Lasso

- Computer Science · KDD
- 2017

The TVGL algorithm is introduced, a method of inferring time-varying networks from raw time series data and a scalable message-passing algorithm based on the Alternating Direction Method of Multipliers (ADMM) to solve the problem in an efficient way.

## References

Showing 1–10 of 30 references

Parameter Selection and Preconditioning for a Graph Form Solver

- Computer Science
- 2018

This chapter addresses the critical practical issues of how to select the proximal parameter in each iteration, and how to scale the original problem variables, so as to achieve reliable practical performance.

Network Lasso: Clustering and Optimization in Large Graphs

- Computer Science · KDD
- 2015

The network lasso is introduced, a generalization of the group lasso to a network setting that allows for simultaneous clustering and optimization on graphs and an algorithm based on the Alternating Direction Method of Multipliers (ADMM) to solve this problem in a distributed and scalable manner.

ECOS: An SOCP solver for embedded systems

- Computer Science · 2013 European Control Conference (ECC)
- 2013

In this paper, we describe the embedded conic solver (ECOS), an interior-point solver for second-order cone programming (SOCP) designed specifically for embedded applications. ECOS is written in low…

Large-Scale Nonlinear Optimization

- Computer Science
- 2006

Fast Linear Algebra for Multiarc Trajectory Optimization and Parametric Sensitivity Analysis for Optimal Boundary Control of a 3D Reaction-Diffusion System.

A General Analysis of the Convergence of ADMM

- Computer Science · ICML
- 2015

This work provides a new proof of the linear convergence of the alternating direction method of multipliers when one of the objective terms is strongly convex, and demonstrates that minimizing the derived bound on the convergence rate provides a practical approach to selecting algorithm parameters for particular ADMM instances.

Knitro: An Integrated Package for Nonlinear Optimization

- Computer Science
- 2006

The package provides crossover techniques between algorithmic options as well as automatic selection of options and settings, and it is effective for the following special cases: unconstrained optimization, nonlinear systems of equations, least squares, and linear and quadratic programming.

Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers

- Computer Science · Found. Trends Mach. Learn.
- 2011

It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.

Localized Lasso for High-Dimensional Regression

- Computer Science · AISTATS
- 2017

The localized Lasso is introduced, which is suited for learning models that are both interpretable and have a high predictive power in problems with high dimensionality and small sample size, and a simple yet efficient iterative least-squares based optimization procedure is proposed.

A Matlab toolbox for optimization over symmetric cones

- Computer Science
- 1999

This paper describes how to work with SeDuMi, an add-on for MATLAB, which lets you solve optimization problems with linear, quadratic and semidefiniteness constraints by exploiting sparsity.

Alternating Direction Method with Self-Adaptive Penalty Parameters for Monotone Variational Inequalities

- Mathematics, Computer Science
- 2000

This paper presents a modified alternating direction method that adjusts the penalty parameter at each iteration based on the current iterates; preliminary numerical tests show that the self-adaptive adjustment technique is effective in practice.