Corpus ID: 2212672

SnapVX: A Network-Based Convex Optimization Solver

@article{Hallac2017SnapVXAN,
  title={SnapVX: A Network-Based Convex Optimization Solver},
  author={David Hallac and C. Wong and Steven Diamond and Abhijit Sharang and Rok Sosi{\v c} and Stephen P. Boyd and Jure Leskovec},
  journal={Journal of Machine Learning Research},
  year={2017},
  volume={18},
  pages={110--114}
}
SnapVX is a high-performance solver for convex optimization problems defined on networks. For problems of this form, SnapVX provides a fast and scalable solution with guaranteed global convergence. It combines the capabilities of two open-source software packages: Snap.py and CVXPY. Snap.py is a large-scale graph processing library, and CVXPY provides a general modeling framework for small-scale subproblems. SnapVX offers a customizable yet easy-to-use Python interface with “out-of-the-box…

Citations

Testing Fine-Grained Parallelism for the ADMM on a Factor-Graph
This work proposes a problem-independent scheme of accelerating the Alternating Direction Method of Multipliers that can automatically exploit fine-grained parallelism both in GPUs and shared-memory multi-core computers, and achieves significant speedup in such diverse application domains as combinatorial optimization, machine learning, and optimal control.

Distributed majorization-minimization for Laplacian regularized problems
A distributed majorization-minimization method is developed that is able to scale to very large problems, and is illustrated on two applications, demonstrating its scalability and accuracy.

A Graph-Based Modeling Abstraction for Optimization: Concepts and Implementation in Plasmo.jl
This work presents a general graph-based modeling abstraction for optimization, called an OptiGraph, which enables the modular construction of highly complex models in an intuitive manner, facilitates the use of graph analysis tools, and facilitates communication of structures to decomposition algorithms.

Sparse Network Lasso for Local High-dimensional Regression
A simple yet efficient iterative least-squares based optimization procedure for the sparse network lasso is proposed, which does not need a tuning parameter and is guaranteed to converge to a globally optimal solution.

Localized Lasso for High-Dimensional Regression
The localized Lasso is introduced, which is suited for learning models that are both interpretable and have high predictive power in problems with high dimensionality and small sample size, and a simple yet efficient iterative least-squares based optimization procedure is proposed.
Solving Fused Penalty Estimation Problems via Block Splitting Algorithms (T. Yen, Journal of Computational and Graphical Statistics, 2019)
Proposes a method for solving a penalized estimation problem in which the penalty function depends on differences between pairs of parameter vectors, introducing a set of equality constraints that connect each parameter vector to a group of auxiliary variables.
HGraph: Parallel and Distributed Tool for Large-Scale Graph Processing
HGraph is a parallel and distributed tool that handles large-scale graphs, is built on top of the Hadoop and Spark frameworks, and makes it easy to implement algorithms for various graph problems.
Network Inference via the Time-Varying Graphical Lasso
The TVGL algorithm is introduced, a method of inferring time-varying networks from raw time series data, along with a scalable message-passing algorithm based on the Alternating Direction Method of Multipliers (ADMM) to solve the problem efficiently.

References

Showing 1-10 of 30 references
Parameter Selection and Preconditioning for a Graph Form Solver
This chapter addresses the critical practical issues of how to select the proximal parameter in each iteration, and how to scale the original problem variables, so as to achieve reliable practical performance.
Network Lasso: Clustering and Optimization in Large Graphs
The network lasso is introduced, a generalization of the group lasso to a network setting that allows for simultaneous clustering and optimization on graphs, and an algorithm based on the Alternating Direction Method of Multipliers (ADMM) to solve this problem in a distributed and scalable manner.
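The edge penalty the network lasso adds, lam * ||x_j - x_k||_2, has a closed-form proximal operator, which is what keeps the per-edge ADMM updates cheap. A NumPy sketch of that update (the closed form follows from the first-order optimality conditions; the function name and the sample inputs are ours, not the paper's code):

```python
import numpy as np

def edge_prox(v, w, lam, rho):
    """Minimize lam*||a - b||_2 + (rho/2)*(||a - v||^2 + ||b - w||^2)
    over (a, b). The pair is pulled together; when the penalty is strong
    enough relative to ||v - w||, the two copies fuse exactly (a == b)."""
    e = v - w
    norm_e = np.linalg.norm(e)
    if norm_e <= 2 * lam / rho:            # fused case: both move to the midpoint
        mid = (v + w) / 2
        return mid, mid.copy()
    shrink = 1 - 2 * lam / (rho * norm_e)  # shrink the difference, keep the sum
    d = shrink * e
    s = v + w
    return (s + d) / 2, (s - d) / 2

v = np.array([4.0, 0.0])
w = np.array([0.0, 0.0])
a, b = edge_prox(v, w, lam=1.0, rho=1.0)
print(a, b)  # difference shrunk from 4 to 2 along the first axis
```

Setting the stationarity conditions to zero shows the sum a + b always equals v + w, while the difference is scaled down by a fixed amount, which is why clustering (exact fusion of node variables) emerges at finite penalty values.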
ECOS: An SOCP solver for embedded systems
In this paper, we describe the embedded conic solver (ECOS), an interior-point solver for second-order cone programming (SOCP) designed specifically for embedded applications. ECOS is written in low…
Large-Scale Nonlinear Optimization
Fast Linear Algebra for Multiarc Trajectory Optimization and Parametric Sensitivity Analysis for Optimal Boundary Control of a 3D Reaction-Diffusion System.
A General Analysis of the Convergence of ADMM
This work provides a new proof of the linear convergence of the alternating direction method of multipliers when one of the objective terms is strongly convex, and demonstrates that minimizing the derived bound on the convergence rate provides a practical approach to selecting algorithm parameters for particular ADMM instances.
Knitro: An Integrated Package for Nonlinear Optimization
The package provides crossover techniques between algorithmic options as well as automatic selection of options and settings, and it is effective for the following special cases: unconstrained optimization, nonlinear systems of equations, least squares, and linear and quadratic programming.
Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
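The splitting this survey describes is easy to see on the lasso: minimize (1/2)||Ax - b||^2 + lam*||x||_1 by alternating a ridge-like x-update, an elementwise soft-threshold z-update, and a dual ascent step. A self-contained NumPy sketch (parameter choices and the synthetic data are illustrative):

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k*||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam, rho=1.0, iters=200):
    """ADMM for (1/2)||Ax - b||^2 + lam*||x||_1 via the x/z splitting."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # Factor once: each x-update solves (A^T A + rho I) x = A^T b + rho (z - u)
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        z = soft_threshold(x + u, lam / rho)  # sparsity enters here
        u = u + x - z                         # scaled dual ascent
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = lasso_admm(A, b, lam=0.1)
print(np.round(x_hat, 2))  # close to x_true, with exact zeros in the tail
```

The key property SnapVX exploits is that the x-update decomposes: on a graph, each node solves its own small subproblem (here, via CVXPY) while the z/u steps pass messages along the edges.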
Localized Lasso for High-Dimensional Regression
The localized Lasso is introduced, which is suited for learning models that are both interpretable and have high predictive power in problems with high dimensionality and small sample size, and a simple yet efficient iterative least-squares based optimization procedure is proposed.
A Matlab toolbox for optimization over symmetric cones
This paper describes how to work with SeDuMi, an add-on for MATLAB, which lets you solve optimization problems with linear, quadratic and semidefiniteness constraints by exploiting sparsity.
Alternating Direction Method with Self-Adaptive Penalty Parameters for Monotone Variational Inequalities
This paper presents a modified alternating direction method that adjusts the penalty parameter per iteration based on the iterate message, and preliminary numerical tests show that the self-adaptive adjustment technique is effective in practice.
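One widely used instance of this idea is residual balancing: after each ADMM iteration, compare the primal and dual residual norms and rescale the penalty so that neither dominates. A sketch of the rule (the constants mu and tau are conventional defaults, not values prescribed by this paper; note that the scaled dual variable must also be rescaled by rho_old/rho_new whenever rho changes):

```python
def update_rho(rho, r_norm, s_norm, mu=10.0, tau=2.0):
    """Residual-balancing penalty update for ADMM: grow rho when the
    primal residual dominates, shrink it when the dual residual
    dominates, and leave it unchanged when they are comparable."""
    if r_norm > mu * s_norm:
        return rho * tau   # primal residual too large: penalize disagreement harder
    if s_norm > mu * r_norm:
        return rho / tau   # dual residual too large: relax the penalty
    return rho

print(update_rho(1.0, 100.0, 1.0))  # 2.0
print(update_rho(1.0, 1.0, 100.0))  # 0.5
print(update_rho(1.0, 1.0, 1.0))    # 1.0
```

Keeping the two residuals within an order of magnitude of each other tends to make convergence much less sensitive to the initial choice of rho.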